
Azure Data Modeling

Location:
Avon, IN
Salary:
$135,000
Posted:
November 21, 2023


WASIU OYETOLA

Cell (***) – *******

Clearance level: clearable

ad1b6c@r.postjobfree.com

PROFESSIONAL PROFILE

Dynamic and resilient professional with 7+ years of IT experience in the Healthcare, Financial, and Banking domains. Passionate about projects and adept at gathering business requirements, translating them into clear and concise specifications, and extracting and manipulating data from relational and non-relational databases to support sound, data-driven decisions that improve the organization's outlook. Ready to apply the skills and knowledge acquired over the years to the development and progress of the organization. A strong team player who values collaboration and communication, possesses a solid work ethic, and relates and works efficiently with diverse groups.

SKILLS AND ACCOMPLISHMENTS

Hands-on experience with Microsoft Azure cloud services including Azure Data Factory, Azure Data Lake Storage Gen2, Azure Synapse Analytics, Azure SQL Server, Azure Stream Analytics, and Azure Databricks.

Advocate and champion engineering design and development standards.

Proficiency in using Databricks to build scalable and efficient data processing workflows.

Knowledge of Databricks features such as clusters, notebooks, jobs, and libraries.

Experienced in configuring and managing streaming data inputs (Azure Event Hub, Azure IoT Hub & Blob storage) and outputs, defining query logic, and integrating with other Azure services.

Proficient in utilizing window functions within Azure Stream Analytics and Databricks to perform advanced analytics and calculations over streaming and batch data.
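
As an illustration of the kind of batch-side window calculation referenced above, here is a minimal PySpark sketch; the table and column names are hypothetical, chosen only for the example:

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-demo").getOrCreate()

# Hypothetical transactions table: account_id, txn_ts, amount
txns = spark.table("transactions")

# Running total of amount per account, ordered by transaction time
w = (Window.partitionBy("account_id")
           .orderBy("txn_ts")
           .rowsBetween(Window.unboundedPreceding, Window.currentRow))

running = txns.withColumn("running_total", F.sum("amount").over(w))
running.show()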

Strong ability to write SQL queries using PROC SQL and PL/SQL statements, stored procedures, functions, triggers, and packages.

Expertise in T-SQL (DDL, DML, and DCL), developing new database objects such as tables, views, joins, stored procedures, user-defined functions (UDFs), cursors, and common table expressions (CTEs).

Working knowledge of Python (Jupyter) for data analysis.
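
A small, hypothetical example of the sort of exploratory analysis this refers to, using pandas; the file name and columns are placeholders:

import pandas as pd

# Hypothetical CSV of claim records: claim_id, member_id, paid_amount, service_date
claims = pd.read_csv("claims_sample.csv", parse_dates=["service_date"])

# Basic profiling: row counts, missing values, and paid amount by month
print(claims.shape)
print(claims.isna().sum())
monthly = claims.groupby(claims["service_date"].dt.to_period("M"))["paid_amount"].sum()
print(monthly)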

Performed data modeling and data migration to consolidate and harmonize data. Hands-on data modeling experience with dimensional data modeling, star schemas, fact and dimension tables, and conceptual, logical, and physical data modeling using Erwin 3.x/4.x.

Excellent knowledge and usage of reporting tools such as Tableau, SAS, and Power BI.

Experience in data analysis, migration, cleansing, extraction, transformation, manipulation, visual representation, and import/export through various ETL tools.

Experience in data analysis and ETL techniques for loading large volumes of data using SSIS.

Possess strong computer and office administration skills with a vast knowledge of Windows and Microsoft Office Suite (Word, Excel, PowerPoint, Access, and Visio).

Utilized business intelligence tools including OLAP (Online Analytical Processing), data warehousing, reporting and querying tools, and Excel.

Experience in designing, publishing, and presenting customized interactive dashboards in Tableau using marks, actions, filters, parameters, and calculations.

Working experience and knowledge in creating tables, reports, graphs, and listings using various procedures and handling large databases to perform complex data manipulations.

Great understanding of SDLC methodologies such as Waterfall and Agile (specifically SCRUM).

Good knowledge of healthcare information systems, experience with Practice Management software and HIPAA Guidelines.

Created storage solutions for managing and encrypting PII/PHI data while migrating data from legacy data stores to the cloud and establishing a Delta Lake.

Hands on experience in SAS programming for extracting data from Flat files, Excel spreadsheets and external RDBMS (ORACLE) tables using LIBNAME and SQL PASSTHRU facility.

Hands-on experience with Azure OpenAI prompt engineering.

Exceptional organizational and time management skills and the ability to work independently.

Exceptional team player with demonstrated experience of working diplomatically and collaboratively with all levels of staff to ensure successful implementation of assignments.

Self-motivated individual with strong communication skills, great attention to detail, and the ability to work under pressure.

TECHNICAL SKILLS

OPERATING SYSTEMS:

Windows XP, Vista, 7; Windows Server 2003, 2008 (x32, x64)

DATABASES:

Azure Synapse Analytics, MS SQL Server 2005/2008/2012, Teradata 12/13, MySQL 5.2/5.7, SQLite, Oracle 9i

ETL / OTHER TOOLS:

Azure Data Factory, Azure Databricks, SSIS (Visual Studio), SAS Base 9.x, SAS Enterprise Guide, DMA, Erwin Data Modeler, ER Assistant, Snowpipe

STORAGE: Azure Blob Storage, Azure Data Lake Storage Gen2

STREAMING TOOLS: Azure Stream Analytics, Databricks

QUERYING TOOLS:

SQL Server Management Studio 2008/2012, Snowflake, Teradata SQL Assistant, SQL*Plus, SQL Developer, Python, SAS

BUSINESS INTELLIGENCE TOOLS:

Tableau, MicroStrategy, Excel, PowerPivot, Power BI, Cognos, SSRS, SAS

PROGRAMMING LANGUAGES: Python (Jupyter Notebook, Anaconda), SQL

CLOUD COMPUTING PLATFORMS: Snowflake, Microsoft Azure, AWS, Google Cloud Storage (GCS).

PROFESSIONAL EXPERIENCE

Accenture LLP, Carmel, IN Sep 2022 - Present

Functional Data Integration Arch Specialist/Lead

Data Engineering Lead - Client

Responsibilities:

Work closely with business stakeholders to understand their requirements and align data solutions with the strategic goals of the organization.

Develop comprehensive data models for enhanced data integrity and efficiency.

Design and implement data orchestration strategies on Azure, collaborating with cross-functional teams.

Led data orchestration initiatives on Azure using ADF, overseeing the design and implementation of end-to-end pipelines for a seamless and coordinated data flow across various systems and applications.

Lead the implementation of Azure services for migration and ETL, resulting in a significant reduction in processing time.

Leveraged Unity Catalog for comprehensive metadata management, ensuring efficient organization, discovery, and governance of data assets within the data ecosystem.

Implement the data architectural design by building and maintaining the systems and infrastructure needed to store, process, and analyze data.

Experienced in designing and implementing data orchestration strategies in Azure to automate data movement, transformation and processing across multiple systems and processes.

Proficiently utilized Databricks, including Unity Catalog, Delta Lake, and Databricks SQL, to optimize end-to-end analytics workflows and enhance data processing efficiency.

Integrated data from multiple sources, transformed it into a usable format, and loaded it into target systems.

Designed and implemented optimized real-time stream processing solutions using Azure Stream Analytics and Databricks.
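
A minimal Databricks Structured Streaming sketch in the spirit of this work, writing a stream into a Delta table on a cluster where Delta is available; the paths and the rate source are placeholders standing in for an Event Hub or IoT feed:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-to-delta").getOrCreate()

# Placeholder source: a rate stream standing in for an Event Hub / IoT Hub feed
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Write the stream to a Delta table with checkpointing for fault tolerance
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/events")
         .outputMode("append")
         .start("/tmp/delta/events"))

query.awaitTermination(30)  # run briefly for the example
query.stop()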

Utilized advanced prompt engineering techniques on Azure OpenAI to optimize natural language understanding and guide the generation of precise outputs from language models, enhancing the efficiency and effectiveness of AI-driven applications.
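
As a rough illustration of prompt engineering against Azure OpenAI using the openai Python SDK; the endpoint, API version, and deployment name below are placeholders, not actual project values:

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# A structured prompt that constrains the model's role and output format
response = client.chat.completions.create(
    model="gpt-4o-deployment",  # placeholder deployment name
    messages=[
        {"role": "system", "content": "You are a data quality assistant. Answer in one short sentence."},
        {"role": "user", "content": "Summarize why null member IDs matter in a claims feed."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)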

Manage databases for performance and scalability, utilizing Python and SQL to develop custom scripts and applications for data processing and migrations.

Database Administrator - Child Welfare System (Dept. of Child Services) Client

Responsibilities:

Designed and developed complex stored procedures applying 200 business rules against existing data and writing results to a new table.

Sourcing Data Analyst - TruCare Reporting Client

Responsibilities:

Support the team as a senior SQL resource on data analysis queries and conversions.

Analyze complex current-state SQL queries and conduct comparison against the new cloud models to rebuild reports.

Document current-state SQL queries/source data in source-to-target mappings.

Document future state transformations required to populate the correct data based on future state schema design.

Create and execute UAT to meet acceptance criteria.

Experienced with claims data and other health information systems.

Cross-functional team player with the ability to handle testing needs such as query troubleshooting, understanding test data needs, writing test cases against acceptance criteria, regression testing, SIT, and data validation.

Citizen Bank, Indianapolis, IN Jun 2020 - Aug 2022

Data Analyst /Data Engineer

Responsibilities:

Communicated with stakeholders to gather business requirements and documented them in Business Requirements Documents (BRDs).

Performed daily integration and ETL tasks by extracting, transforming, and loading big data from multiple data sources.

Hands-on experience with cloud offerings such as AWS (S3 buckets, IAM roles and policies), AWS Glue, Lake Formation, Snowflake, Azure, and other cloud concepts.

Created high performance cloud and big data systems to be used with operational and analytic applications.

Hands-on experience in Python for data analysis and data ingestion.

Design application architecture/ technical designs and articulate the data pipeline solutions to the team members.

Performed data modeling, data warehousing, and data migration to consolidate and harmonize data. Hands-on data modeling experience using dimensional data modeling, star schemas, fact and dimension tables, and physical and logical data modeling using Erwin 3.x/4.x.
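
To make the star-schema idea concrete, a small self-contained sketch using Python's built-in sqlite3; the table names are purely illustrative (the actual warehouse work described here was on SQL Server):

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables describe who/what/when; the fact table stores the measures
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT, segment TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT, year INTEGER);
CREATE TABLE fact_transactions (
    txn_id       INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")

# A typical star-schema query: join the fact table to its dimensions and aggregate
cur.execute("""
SELECT d.year, c.segment, SUM(f.amount)
FROM fact_transactions f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_date d     ON d.date_key     = f.date_key
GROUP BY d.year, c.segment;
""")
print(cur.fetchall())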

Ensured that data warehousing design was scalable and maintainable

Designed conceptual data model based on the requirement, interacted with non-technical end-users to understand the business logic.

Wrote complex SQL queries to retrieve data from disparate tables using JOINs, subqueries, CTEs, correlated subqueries, and derived tables on the SQL Server platform.
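
A hedged sketch of the kind of CTE-based query described, run from Python with pyodbc against SQL Server; the connection string, table, and column names are placeholders:

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)

# The CTE ranks each customer's accounts by balance; the outer query keeps the top one
sql = """
WITH ranked AS (
    SELECT customer_id,
           account_id,
           balance,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY balance DESC) AS rn
    FROM dbo.accounts
)
SELECT customer_id, account_id, balance
FROM ranked
WHERE rn = 1;
"""

for row in conn.cursor().execute(sql):
    print(row.customer_id, row.account_id, row.balance)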

Performed ad-hoc reporting analysis and manipulated complex data on MS SQL Server.

Experienced in creating and updating SQL indexes, views, complex stored procedures, triggers, and user-defined functions.

Utilized SSIS for ETL data integration, data migration, and analysis.

Identify and document limitations in data quality that jeopardize internal and external data analysis.

Great understanding of SDLC methodologies such as Waterfall and Agile (specifically SCRUM).

Thorough grounding in all phases of data analysis, including definition and analysis of questions with respect to available data and resources, overview of data and assessment of data quality, selection of appropriate models and statistical tests, and presentation of results.

Wrote complex SQL queries using advanced SQL concepts such as aggregate functions, GROUP BY, and analytic (OLAP) functions.

Supported data governance by defining processes for how data is stored, archived, backed up, and protected from mishaps, theft, or attack.

Keystone Bank, Indianapolis, IN Apr 2018 - May 2020

Tableau Developer, Data Analyst

Responsibilities:

Wrote complex SQL queries using advanced SQL concepts like Aggregate functions and window functions.

Created views, database triggers, stored procedures, and functions using SQL so that information entered by a given WM makes the appropriate changes to the respective tables.

Performed ad-hoc reporting analysis and manipulated complex data on MS SQL Server.

Involved in extensive data validation by writing several complex SQL queries and involved in back-end testing and worked with data quality issues.

Developed queries to retrieve and analyze data from various sources for projects, programs, or reports. Worked with vendors to evaluate data.

Experienced in Data mapping, Data transformation between sources to target data models.

Analyzed reporting requirements and developed various dashboards.

Involved in extraction, transformation and loading of data directly from different source systems like flat files, Excel, Oracle and SQL Server.

Experience in creating various views in Tableau (Tree maps, Heat Maps, Scatter plot).

Experience in creating Filters, quick filters, table calculations, calculated measures and parameters.

Used statistical techniques such as regression, cluster analysis, factor analysis, time series forecasting, experimental design, etc. to solve business problems.
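
For illustration, a tiny scikit-learn regression in the same spirit; the data here is synthetic (the actual work used SAS and production datasets):

import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic example: predict monthly spend from two standardized features
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=0.5, size=200)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_)
print("R^2:", model.score(X, y))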

Used SAS to solve problems with data and analytics.

Reviewed code for SAS and Python application upgrades and extensions.

Aetna Healthcare, Fort Wayne, IN Feb 2016 - Mar 2018

Data Analyst

Responsibilities:

Business requirement gathering and translating them into clear and concise specifications and queries.

Collected data from all clients and made business recommendations based on the data collected to improve business efficiency.

Ad-hoc data extraction through queries in SQL Server 2008, Excel, SAS, Tableau report or PDF.

Work effectively with both technical and non-technical team members across the enterprise to resolve issues in a respectful and timely manner.

Use data visualization tools to provide an easy-to-understand interface for end users to quickly identify key themes within their data.

Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, heat maps, bullet charts, and Gantt charts demonstrating key information for decision making.

Perform data maintenance using SQL queries, assisting with the product portfolio rollout and refreshes and assisting with testing rating system enhancements.

Developed and produced peer quality metric performance: validated data and processes to ensure accuracy, completeness, and consistency of data using SAS.

Analyzed and assessed invalid and missing data in the claims data.

Implemented commercial medical rate sets, managed rating system help desk questions from users, implemented new or updated commercial product portfolios, performed data maintenance using SQL, tested system enhancements before they went live to production, and set up rating structures for quoting.

Worked on analysis of the Facets claims processing application to gather requirements to comply with HIPAA 5010.

EDUCATION

LADOKE AKINTOLA UNIVERSITY OF TECHNOLOGY (LAUTECH)

OGBOMOSO, NIGERIA.

COMPUTER SCIENCE AND ENGINEERING (B.Tech)

UNIVERSITY OF SOUTHERN INDIANA

MBA Data Analytics (2024)
