
Power BI Azure Data

Location:
Houston, TX
Posted:
May 25, 2025

(M)832-***-**** Email ID: *****.***@*****.***

P R O F E S S I O N A L P R O F I L E

An experienced and highly motivated IT professional with 13+ years of experience in the analysis, architecture, design, development, testing and support of Business Intelligence products and services, including MSBI (SSIS, SSRS), SQL Server, data visualization tools such as Power BI Desktop and Power BI Service, Microsoft Power Automate, and Microsoft Azure cloud computing (Azure Data Storage, ADF, Azure SQL, Azure Databricks).

P R O F E S S I O N A L E X P E R I E N C E

Involved in all the stages of Software Development Life Cycle including Requirements gathering, Analysis and Design, Technical Specifications preparation, Implementation, Integration and Testing, Quality Assurance, Deployment, User Training, Performance Tuning and Production Support.

Involved in Sprint Planning, Daily Scrums, Sprint Reviews and Retrospective meetings; created the task board and sprint burndown chart at the start of every sprint; attended all Scrum meetings and continually worked with the team and the business to find and implement improvements.

Strong SQL skills, with in-depth knowledge of MS SQL Server, database query development and stored procedures.

Expertise in upgrading DTS packages to SSIS; logging, SQL Mail, checkpoints, custom control flow, auditing, variables, package deployment, resolving complex issues and error handling in SSIS.

Involved in Data Modeling, Star Schema/Snowflake Schema modeling, FACT & Dimensions tables, Physical & logical data modeling.

Involved in the source-to-target mapping for the dimension and fact tables.

Worked with Dimensional modeling, Data migration, Data cleansing, ETL Processes for the data warehouse.

Technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.

Worked on Various SSIS components like Control Flow, Data Flow Transformations, Error Logging and Event Handlers.

Experience in error and event handling: precedence constraints, breakpoints, checkpoints and logging.

Extensive knowledge of Creating User defined Project Variables, Package Variables and using System Variables for SSIS Packages and Project Configuration wizard to configure SSIS Projects.

Created SSIS Parts (package templates) for reusability.

Experience in performance tuning SSIS packages by choosing among non-blocking (row), semi-blocking and fully blocking transformations.

Creating package configurations to change package properties dynamically in SSIS.

Writing complex queries using CTE (Common Table Expression) and recursive CTE.
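As an illustration of the recursive-CTE pattern mentioned above, the sketch below uses Python's sqlite3 module (SQLite requires the RECURSIVE keyword, unlike T-SQL); the org table and its columns are hypothetical examples, not from any project described here:

```python
import sqlite3

# In-memory database with a hypothetical self-referencing org table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE org (id INTEGER PRIMARY KEY, name TEXT, manager_id INTEGER)")
conn.executemany(
    "INSERT INTO org VALUES (?, ?, ?)",
    [(1, "CEO", None), (2, "VP Sales", 1), (3, "VP Eng", 1), (4, "Engineer", 3)],
)

# Recursive CTE: anchor member selects the root, recursive member walks
# down the hierarchy, tracking each row's depth.
rows = conn.execute("""
    WITH RECURSIVE chain(id, name, level) AS (
        SELECT id, name, 0 FROM org WHERE manager_id IS NULL
        UNION ALL
        SELECT o.id, o.name, c.level + 1
        FROM org o JOIN chain c ON o.manager_id = c.id
    )
    SELECT name, level FROM chain ORDER BY level, id
""").fetchall()
print(rows)  # [('CEO', 0), ('VP Sales', 1), ('VP Eng', 1), ('Engineer', 2)]
```

The anchor/recursive split is the same shape a T-SQL recursive CTE takes, minus the RECURSIVE keyword.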

Excellent experience in creating Indexes, Indexed views, Column Store Index.

Reorganized indexes to fix internal fragmentation and rebuilt indexes to fix both internal and external fragmentation, improving query performance.

Good experience in creating effective Functions, Stored Procedures to assist efficient data manipulation and data consistency.

Experience in optimizing the performance of long-running queries and stored procedures using SQL Profiler, Index Tuning Wizard and Windows Performance Monitor.

Monitored database size and disk space in production; planned and implemented backup and recovery strategies. Good knowledge of full, differential and log backups.

Experience creating, populating and maintaining data marts. Thorough knowledge of Features, Structure, Attributes, Hierarchies, Star and Snowflake Schemas of Data Marts.

Hands on experience in Azure Cloud Services (PaaS & IaaS), Azure Data Factory (ADF), SQL Azure, Azure Data Storage.

Hands-on experience in Azure Analytics Services – Azure Data Lake Store (ADLS), Azure BLOB Storage, Azure SQL DB, Azure Data Factory (ADF) etc.

Excellent knowledge of ADF building components – Integration Runtime, Linked Services, Data Sets, Pipelines, Activities.

Designed and developed data ingestion pipelines from on-premises sources into different layers of the ADLS using Azure Data Factory (ADF V2).

Experience in developing ADF pipelines with various integration runtimes and linked services, using multiple activities such as Lookup, Copy, Stored Procedure, ForEach and Data Flow.

Actively involved in day-to-day ETL progress monitoring, pipeline deployments and validations.

Good understanding of business requirements, Warehouse schemas and mapping rules for the implementation of simple to complex ETL designs.

Orchestrated data integration pipelines in ADF using various activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter and Until.

Implemented incremental loading for multiple tables into the Data Lake within a single pipeline.
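One common way to implement this kind of incremental load is a high-water-mark (watermark) table: each run copies only rows modified after the stored watermark, then advances it. Below is a minimal sketch of the idea using SQLite in place of the actual source and Data Lake stores; all table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, modified INTEGER);
    CREATE TABLE dest (id INTEGER, modified INTEGER);
    CREATE TABLE watermark (tbl TEXT PRIMARY KEY, last_modified INTEGER);
    INSERT INTO watermark VALUES ('src', 0);
""")

def incremental_load(conn):
    """Copy only rows changed since the stored watermark, then advance it."""
    (wm,) = conn.execute(
        "SELECT last_modified FROM watermark WHERE tbl = 'src'").fetchone()
    conn.execute(
        "INSERT INTO dest SELECT id, modified FROM src WHERE modified > ?", (wm,))
    conn.execute("""
        UPDATE watermark
        SET last_modified = (SELECT COALESCE(MAX(modified), ?) FROM src)
        WHERE tbl = 'src'
    """, (wm,))

conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10), (2, 20)])
incremental_load(conn)
conn.executemany("INSERT INTO src VALUES (?, ?)", [(3, 30)])
incremental_load(conn)  # second run picks up only row 3
count = conn.execute("SELECT COUNT(*) FROM dest").fetchone()[0]
print(count)  # 3, with no duplicates for rows 1 and 2
```

In ADF the same pattern is typically built from a Lookup activity (read the watermark), a Copy activity with a filtered source query, and a Stored Procedure activity to advance the watermark.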

Good experience migrating multiple source tables to the Data Lake.

Expert in implementing various business rules for data extraction, transformation and loading (ETL) between homogeneous and heterogeneous systems using Azure Data Factory (ADF) and SQL Server Integration Services (SSIS).

Experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Azure Databricks and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.

Designed and developed a care coordination management analytical dashboard with various measures, KPIs and custom visual metrics in Power BI.

Expert in using Power BI DAX functions such as time intelligence, filter, and date and time functions.

In-depth knowledge of connecting Power BI to different data sources such as SQL Server, Oracle, Excel, CSV, websites and folders, utilizing both DirectQuery and Import connections.

Fluent in Power BI data modeling with appropriate cardinality relationships (one-to-one, one-to-many).

Utilized star schema and snowflake schema architectures, and row-level security for access control.

Expert in creating precise, polished visualizations using appropriate chart types, slicers, filters, drill-down, drill-through and bookmarks.

Experienced in Creating automated workflows using Power Automate.

T E C H N I C A L  S K I L L S

BI Tools: MSBI (SSIS, SSRS), Power BI Desktop, Power BI Service, Power Automate.

Databases: SQL Server 2005-2017, Azure SQL Database, Azure SQL Data Warehouse, Snowflake Data Warehouse.

Query Tools: SQL Server Management Studio, SQL Server Data Tools, Access, Teradata SQL Assistant, Rapid SQL, PL/SQL Developer.

Modeling Tools: MS Visio.

ETL Tools: SQL Server Integration Services (SSIS), Azure Data Factory (ADF), Azure Databricks.

Reporting Tools: SSRS.

Programming: T-SQL, ASP.NET, VB.NET, C and C++.

Operating Systems: Windows XP, 2003, 2007, 2010 and 2010 Pro.

E D U C A T I O N

MCA (Computer Application) - Osmania University, Hyderabad - 2007

B.Sc. Computer Science- Andhra University, Vishakhapatnam - 2004

C E R T I F I C A T I O N S

Microsoft Certified – Power BI Data Analyst Associate (DA-100)

Microsoft Certified – Azure Data Engineer (DP-203)

P R O J E C T  E X P E R I E N C E

Client: Molina Health Care

Role: Sr. Data Engineer

Period: Aug 2023 to Present

ROLES & RESPONSIBILITIES:

Gather and analyze the business requirements and then translate them to technical specifications.

Created packages in SSIS Designer using Control Flow Tasks and Data Flow Transformations to implement Business Rules.

Creating the mappings using Data Flow Transformations such as Sort, Derived Column, Conditional Split, SCD and Lookup.

Using Control Flow elements such as Containers, Execute SQL Task, Execute Package Task, File System Task and Send Mail Task.

Scheduling and monitoring the ETL packages for daily loads via SQL Server Agent jobs.

Debugging and validating the Data Flows and Control Flows.

Developed an exception handling process for each SSIS package.

Implemented performance tuning at various levels.

Developed data links and data loading scripts, including data transformations, reconciliations and accuracy checks.

Used SSIS transformations such as Conditional Split and Derived Column for data scrubbing and data validation checks during staging, before loading the data into the data warehouse.

Used SSIS to implement the Slowly Changing Dimension transformation to maintain historical data in the data warehouse.

Created SSIS packages to load data into the data warehouse using various SSIS tasks such as Execute SQL Task, Bulk Insert Task, Data Flow Task, File System Task, Send Mail Task, ActiveX Script Task and XML Task.

Created SSIS packages to export and import data from CSV files, text files and Excel spreadsheets with different extensions.

Created business logic using various SSIS tasks and transformations.

Created jobs to move SSIS packages from Dev to UAT and from UAT to Production.

Created SSIS packages to move metadata between environments such as Dev, UAT, Test and Production.

Design and development of database scripts and procedures.

Implemented checkpoint configuration, package configuration in packages for better reusability.

Worked on several Azure storage systems such as Blob Storage and Data Lake Storage (ADLS).

Recreated existing application logic and functionality in Azure Data Lake, Azure Data Factory and Azure SQL.

Ingested data into one or more Azure services (Azure Storage, Azure SQL, Azure Data Warehouse) and processed the data in Azure Databricks.

Extracted, transformed and loaded data from source systems to Azure Data Storage services using a combination of Azure Data Factory, T-SQL, Spark SQL and U-SQL (Azure Data Lake Analytics).

Created pipelines in ADF using Linked Services, Datasets and Pipelines to extract, transform and load data between different sources such as Azure SQL, Blob Storage and Azure SQL Data Warehouse, and back.

Used the ADF Copy activity to move files from various sources to the cloud data warehouse.

Preparing and cleaning the data by applying the required data transformations and business logic.

Integrating the data from different sources, creating a data model and using joins where required.

Designing and developing dashboards and dashboard functionalities as per the requirements.

Developed dashboard functionalities using parameters, DAX functions, Power Query, Power Pivot, Power View, etc.

Developed all required visualizations (charts and tables).

Data modelling in Power BI to create a star schema with fact and dimension tables.

Executing unit testing on the data shown in dashboard visuals versus the raw data files.

Implementing row-level security on data, along with an understanding of application security layer models.
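The idea behind row-level security is that each user sees only the rows they are entitled to. In Power BI this is defined with DAX role filters (for example against USERPRINCIPALNAME()), but the underlying filtering concept can be sketched in plain Python; the dataset and the user-to-region entitlements below are hypothetical:

```python
# Hypothetical dataset and user-to-region entitlements.
rows = [
    {"region": "East", "sales": 100},
    {"region": "West", "sales": 200},
    {"region": "East", "sales": 50},
]
entitlements = {"alice@example.com": {"East"}, "bob@example.com": {"West"}}

def visible_rows(user, rows):
    """Return only the rows whose region the user is entitled to see."""
    allowed = entitlements.get(user, set())
    return [r for r in rows if r["region"] in allowed]

# Each user's aggregate is computed over their filtered slice only.
print(sum(r["sales"] for r in visible_rows("alice@example.com", rows)))  # 150
print(sum(r["sales"] for r in visible_rows("bob@example.com", rows)))    # 200
```

The key property, mirrored in Power BI's security model, is that filtering happens before any aggregation, so no user-facing measure can leak rows outside the user's entitlement.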

Client: AmeriHealth

Role: SSIS/Azure/Power BI

Location: Houston, TX

Period: Jul 2022 to Aug 2023

ROLES & RESPONSIBILITIES:

Gather and analyze the business requirements and then translate them to technical specifications.

Created packages in SSIS Designer using Control Flow Tasks and Data Flow Transformations to implement Business Rules.

Creating the mappings using Data Flow Transformations such as Sort, Derived Column, Conditional Split, SCD and Lookup.

Using Control Flow elements such as Containers, Execute SQL Task, Execute Package Task, File System Task and Send Mail Task.

Scheduling and monitoring the ETL packages for daily loads via SQL Server Agent jobs.

Debugging and validating the Data Flows and Control Flows.

Developed an exception handling process for each SSIS package.

Implemented performance tuning at various levels.

Developed data links and data loading scripts, including data transformations, reconciliations and accuracy checks.

Used SSIS transformations such as Conditional Split and Derived Column for data scrubbing and data validation checks during staging, before loading the data into the data warehouse.

Used SSIS to implement the Slowly Changing Dimension transformation to maintain historical data in the data warehouse.

Created SSIS packages to load data into the data warehouse using various SSIS tasks such as Execute SQL Task, Bulk Insert Task, Data Flow Task, File System Task, Send Mail Task, ActiveX Script Task and XML Task.

Created SSIS packages to export and import data from CSV files, text files and Excel spreadsheets with different extensions.

Created business logic using various SSIS tasks and transformations.

Created jobs to move SSIS packages from Dev to UAT and from UAT to Production.

Created SSIS packages to move metadata between environments such as Dev, UAT, Test and Production.

Design and development of database scripts and procedures.

Implemented checkpoint configuration, package configuration in packages for better reusability.

Worked on several Azure storage systems such as Blob Storage and Data Lake Storage (ADLS).

Recreated existing application logic and functionality in Azure Data Lake, Azure Data Factory and Azure SQL.

Ingested data into one or more Azure services (Azure Storage, Azure SQL, Azure Data Warehouse) and processed the data in Azure Databricks.

Extracted, transformed and loaded data from source systems to Azure Data Storage services using a combination of Azure Data Factory, T-SQL, Spark SQL and U-SQL (Azure Data Lake Analytics).

Created pipelines in ADF using Linked Services, Datasets and Pipelines to extract, transform and load data between different sources such as Azure SQL, Blob Storage and Azure SQL Data Warehouse, and back.

Used the ADF Copy activity to move files from various sources to the cloud data warehouse.

Preparing and cleaning the data by applying the required data transformations and business logic.

Integrating the data from different sources, creating a data model and using joins where required.

Designing and developing dashboards and dashboard functionalities as per the requirements.

Developed dashboard functionalities using parameters, DAX functions, Power Query, Power Pivot, Power View, etc.

Developed all required visualizations (charts and tables).

Data modelling in Power BI to create a star schema with fact and dimension tables.

Executing unit testing on the data shown in dashboard visuals versus the raw data files.

Implementing row-level security on data, along with an understanding of application security layer models.

Environment: SSIS, SQL Server, Power BI & Azure.

Title: DaimlerChrysler Financial System

Client: Mercedes-Benz

Role: SSIS/Azure/Power BI

Location: Houston, TX

Period: Dec 2019 to Jul 2022

DaimlerChrysler is unique in the automotive industry. Its product portfolio ranges from small cars to sports cars and luxury sedans, and it offers financial and other automotive services through DaimlerChrysler Services. DaimlerChrysler Services offers tailor-made products designed to accommodate individual finance requirements.

It offers finance and leasing products for personal and company vehicle finance, as well as insurance and warranty protection.

ROLES & RESPONSIBILITIES:

Gather and analyze the business requirements and then translate them to technical specifications.

Created packages in SSIS Designer using Control Flow Tasks and Data Flow Transformations to implement Business Rules.

Creating the mappings using Data Flow Transformations such as Sort, Derived Column, Conditional Split, SCD and Lookup.

Using Control Flow elements such as Containers, Execute SQL Task, Execute Package Task, File System Task and Send Mail Task.

Scheduling and monitoring the ETL packages for daily loads via SQL Server Agent jobs.

Debugging and validating the Data Flows and Control Flows.

Developed an exception handling process for each SSIS package.

Implemented performance tuning at various levels.

Developed data links and data loading scripts, including data transformations, reconciliations and accuracy checks.

Used SSIS transformations such as Conditional Split and Derived Column for data scrubbing and data validation checks during staging, before loading the data into the data warehouse.

Used SSIS to implement the Slowly Changing Dimension transformation to maintain historical data in the data warehouse.

Created SSIS packages to load data into the data warehouse using various SSIS tasks such as Execute SQL Task, Bulk Insert Task, Data Flow Task, File System Task, Send Mail Task, ActiveX Script Task and XML Task.

Created SSIS packages to export and import data from CSV files, text files and Excel spreadsheets with different extensions.

Created business logic using various SSIS tasks and transformations.

Created jobs to move SSIS packages from Dev to UAT and from UAT to Production.

Created SSIS packages to move metadata between environments such as Dev, UAT, Test and Production.

Design and development of database scripts and procedures.

Implemented checkpoint configuration, package configuration in packages for better reusability.

Preparing and cleaning the data by applying the required data transformations and business logic.

Integrating the data from different sources, creating a data model and using joins where required.

Designing and developing dashboards and dashboard functionalities as per the requirements.

Developed dashboard functionalities using parameters, DAX functions, Power Query, Power Pivot, Power View, etc.

Developed all required visualizations (charts and tables).

Data modelling in Power BI to create a star schema with fact and dimension tables.

Executing unit testing on the data shown in dashboard visuals versus the raw data files.

Implementing row-level security on data, along with an understanding of application security layer models.

Used the ADF Copy activity to move files from various sources to the cloud data warehouse.

Worked on several Azure storage systems such as Blob Storage and Data Lake Storage (ADLS).

Recreated existing application logic and functionality in Azure Data Lake, Azure Data Factory and Azure SQL.

Ingested data into one or more Azure services (Azure Storage, Azure SQL, Azure Data Warehouse) and processed the data in Azure Databricks.

Environment: SSIS, SQL Server, Power BI & Azure.

Title: DATA-SOURCING MONTH-END PROCESS

Client: ANZ

Role: Power BI/SSIS/SQL Server Developer

Location: Hyderabad, India

Period: Oct 2016 to Dec 2019

As part of the Data-Sourcing Month-End process, data from multiple source systems such as Cache, cap, ftp, midanz and Esanda for AU, NZ, NY, UK and Asian countries is extracted and hosted in a centralized location called ICIS. Income and tax calculations run on top of it to provide customer metrics.

Maintained the accuracy of the data loaded from multiple systems, which acts as input for Business Intelligence reports. The system is considered the single source of truth for customer profitability and metrics. It also assists in the periodic review of customers' financial performance and behavior and shapes the ongoing banking relationship.

ROLES & RESPONSIBILITIES:

Created around 700 SSIS packages for the Import, Aggregation and Extract processes.

Developed DPE (Data Process Engine) Packages for executing the packages depending on Business day.

Created dependencies between the Import and Aggregation packages and between the Aggregation and Extract packages.

Developed scripts for validating the source file availability for all Import packages.

Created a file transfer process for copying source files into the input folder and sending extract files to end users.

Created an error logging mechanism to capture exception logs, error logs, package execution status logs and process completion logs.

Created a data integrity package for extracting the last six months' data trend, which is used for trend analysis by the BAU team.

Analyzed the data and created data models and data marts as per the requirements.

Loaded hierarchical data structures and created different levels of detail.

Loaded data by order date, which is used to create reports at weekly, monthly, quarterly and yearly levels.

Led multiple teams for new initiatives, enhancements, and support of BI in the Corporate Office of Technology.

Interacted with business stakeholders to understand their requirements and analyzed the available data to build Power BI dashboards.

Configuring the Gateways for On-Premises data access.

Created DAX queries to generate computed columns in Power BI.

Used Power BI, Power Pivot to develop data analysis prototype, and used Power View and Power Map to visualize reports

Published Power BI reports to the required organizations and made Power BI dashboards available in web clients and mobile apps.

Explored data in a variety of ways and across multiple visualizations using Power BI. Used Power BI Gateways to keep the dashboards and reports up to date.

Installed and configured Enterprise Gateway and Personal Gateway in Power BI Service. Published reports and dashboards using Power BI.

Environment: SSIS, SQL Server, Power BI.

Title: DATA - FERMAT (RIS)

Client: CGI

Role: MSBI Developer

Location: Hyderabad, India

Period: Dec 2010 to Oct 2016

Risk Information Store (RIS) is the main repository for all credit risk related data. It is also the source of both management credit risk reporting for internal use, and for all credit risk market disclosure requirements. RIS sources lending related transactions from all ANZ transaction systems used domestically and globally. The RIS upgrade project is replacing the current RIS platform with a credit risk engine from a new system called FERMAT.

The Institutional Customer Information System (ICIS) is a monthly, web-based reporting application that delivers customer and product information and receives a data extract from RIS. Once FERMAT has been installed and gone live, the RIS data extract will be replaced by a new extract from FERMAT. This project will ensure that the new data extract from FERMAT effectively maintains the existing data feed.

ROLES & RESPONSIBILITIES:

Transformed data from various data sources using Flat file, Excel, ADO .net and OLE DB connection by creating various SSIS Packages.

Created packages in SSIS Designer using Control Flow Tasks and Data Flow Transformations to implement Business Rules.

Creating the mappings using Data Flow Transformations such as Sort, Derived Column, Conditional Split, SCD, Pivot and Lookup.

Using Control Flow elements such as Containers, Execute SQL Task, Execute Package Task, File System Task and Send Mail Task.

Scheduling and monitoring the ETL packages for daily loads via SQL Server Agent jobs.

Debugging and validating the Data Flows and Control Flows.

Developed an exception handling process for each SSIS package.

Implemented performance tuning at various levels.

Design and development of database scripts and procedures.

Implemented checkpoint configuration, package configuration in packages for better reusability.

Created manual test cases for SSIS testing.

Involved in UNIT testing.

Compared RIS output data with FERMAT output data: compared dollar-value columns, the number of columns, and the data format of each column.

Developed different types of SSRS reports such as drill-down, sub-reports, linked reports, pivots, charts and tabular reports. Extracted data from SQL Server, applied business logic, and loaded it into the target tables.

Avoided blank pages when exporting reports to PDF by adjusting the CanGrow property so that report controls do not grow beyond the defined report layout.

Environment: SQL Server 2008 R2, SSIS 2008 R2, SSRS 2008 R2.


