Power BI SQL Server

Location:
Dallas, TX, 75225
Posted:
May 22, 2024

Resume:

MANI DEEP MITTAPALLY

609-***-****

ad5vwq@r.postjobfree.com

Professional Summary:

•Eight-plus years of experience in IT as a Business Intelligence Developer across domains such as Finance, Insurance, and Healthcare.

•Extensive experience in T-SQL, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SSAS, Power BI, and Tableau across the Design, Development, Testing, Implementation, and Production/Support phases of the SDLC for Data Warehouse, Client/Server, RDBMS, and web-based applications in Agile, Waterfall, and RUP environments.

•Expertise in data Extraction, Transformation, and Loading (ETL) using SQL Server Integration Services (SSIS), Business Intelligence Development Studio (BIDS), and Data Transformation Services (DTS).

•Strong experience in data warehousing and ETL using SSIS against Teradata, Oracle, and SQL Server data marts, OLAP, and OLTP systems.

•Extensively worked with Control Flow tasks such as Foreach Loop Container, For Loop Container, Data Flow Task, File System Task, Script Task, Execute SQL Task, FTP Task, Execute Process Task, and Send Mail Task.

•Efficiently used Data Flow transformations such as Data Conversion, Lookup, Aggregate, Audit, Multicast, OLE DB Command, Sort, Union All, Copy Column, Derived Column, Row Count, and Recordset Destination in Integration Services (SSIS).

•Experience debugging SSIS packages using breakpoints and checkpoints.

•Hands-on experience with star schemas, snowflake schemas, dimensional modeling and reporting tools, Operational Data Store concepts, data marts, and OLAP technologies.

•Highly proficient in creating data sources, data source views, dimensions, measures, measure groups, perspectives, and partition aggregations in Analysis and Reporting Services, as well as writing MDX queries, building OLAP cubes, and deploying and processing SSAS objects.

•Strong skills in writing MDX queries to access cubes in SSAS and creating feeds from OLTP systems.

•Designed and implemented user-defined hierarchies in SSAS, including parent-child hierarchies.

•Expertise with SSRS and Power BI (tabular reports, parameterized reports, drill-down and drill-through reports, sub-reports, charts, dashboards, etc.).

•Developed custom reports in Power BI and deployed them to the server.

•Conversant with all phases of Software Development Life Cycle (SDLC) involving Systems Analysis, Design, Development, and Implementation.

•Highly proficient in SQL for developing complex stored procedures, triggers, tables, user profiles, user-defined functions, relational database models, data-integrity constraints, SQL joins, and queries.

•Worked extensively on improving SQL query performance and query analysis, writing highly complex joins to populate the desired result sets.

•Good development experience with Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.

•Experience creating dashboard reports in Power BI using Azure databases and cubes as sources.

•Wrote SQL scripts using local and global temp tables, variables and table variables, and Common Table Expressions (CTEs) as requirements dictated (a short sketch follows this summary).

•Worked on SQL and PL/SQL scripts, creating tables, clustered/non-clustered indexes, and joins to retrieve the desired result sets from Oracle databases as per requirements.

•Hands-on experience deploying reports, creating report schedules and subscriptions, and managing and securing reports; proficient in MS Office, including MS Excel for spreadsheets, graphs, and dashboard reports.
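
A minimal T-SQL sketch of the temp-table and CTE pattern referenced above; the table and column names (dbo.Claims, #RecentClaims) are hypothetical placeholders, not objects from any project listed below.

    -- Stage a working set in a local temp table (hypothetical dbo.Claims source).
    SELECT ClaimID, MemberID, ClaimAmount, ServiceDate
    INTO #RecentClaims
    FROM dbo.Claims
    WHERE ServiceDate >= DATEADD(MONTH, -12, GETDATE());

    -- Use a CTE to rank claims per member and keep only the most recent one.
    ;WITH RankedClaims AS
    (
        SELECT MemberID, ClaimID, ClaimAmount, ServiceDate,
               ROW_NUMBER() OVER (PARTITION BY MemberID ORDER BY ServiceDate DESC) AS rn
        FROM #RecentClaims
    )
    SELECT MemberID, ClaimID, ClaimAmount, ServiceDate
    FROM RankedClaims
    WHERE rn = 1;

    DROP TABLE #RecentClaims;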

Skill Set:

RDBMS: MS SQL Server, Oracle, MS Access, DB2, Teradata, OBIEE, SAP

BI and SQL Tools: SSIS, SSAS, SSRS, Business Intelligence Development Studio (BIDS), SSMS, Visual Studio, Power BI, Tableau.

Scripting and Query Languages: T-SQL, HTML, VBScript, C, C#

Operating Systems: MS Windows 2010/07/03/P, Windows Server 2012/2008, MS-DOS, Unix-Ubuntu.

Tools: SQL Alerts, SQL jobs, Erwin, ER studio.

Others: MS Office 2013/2010/2007, Agile and Waterfall methodologies, SharePoint, Power Pivot.

Education:

Bachelor’s in Computer Science, May 2014

Jawaharlal Nehru Technology University, Hyderabad, India.

Master’s in Computer Science, Dec 2016

California State University, Hayward, California.

PhD in Information Technology

University of the Cumberlands, Williamsburg, Kentucky.

Certification:

Azure Data Engineer Associate.

Vaya Health, Asheville, NC 11/2022 – Present

Data Engineer

Worked on the Design, Development, Testing, Implementation, and Production/Support phases of the SDLC for integration into the EDW using SSIS and Azure Data Factory, with reporting work in SSRS and Power BI.

•Developed SSIS packages using various transformations and migrated several packages to Azure Data Factory.

•Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics).

•Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.

•Developed JSON definitions for deploying Azure Data Factory pipelines that process data using SQL activities.

•Developed Power BI reports and dashboards from various data sources, deployed them to the Power BI service, and maintained row-level security on those reports.

•Used the Power BI Query Editor to perform operations such as fetching data from different files.

•Worked on migrating databases from on-premises to the cloud using Azure Data Factory, Data Lake, Databricks, etc.

•Developed numerous custom visuals and wrote DAX calculations implementing business logic to give users interactive visuals.

•Developed mappings that extract, transform, and load source data into the EDW using transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to implement the business logic.

•Migrated data from various vendor sources into the EDW using SSIS and Azure Data Factory.

•Migrated legacy SSIS packages to Azure Data Factory pipelines.

•Rewrote the data sources in SSRS and Power BI reports, and rewrote the stored procedures, views, and functions to use the new EDW tables (see the hedged sketch at the end of this section).

•Developed various reports using charts, KPIs, graphs, grids, drill-throughs, drill-downs, etc. in SSRS and Power BI.

•Maintained reports and SQL code (tables, stored procedures, views, functions, etc.) in Azure DevOps repositories and deployed them through the development-to-production phases.

•Wrote SQL queries, stored procedures, views, functions, etc., and worked on performance tuning.

•Participated in weekly status meetings, conducted internal and external demos and formal walkthroughs among various teams, and documented the proceedings.
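
A hedged illustration of the kind of stored-procedure rewrite against new EDW tables mentioned above; all object names (edw.FactClaim, edw.DimMember, rpt.usp_GetClaimSummary) are hypothetical placeholders, not the actual EDW schema.

    -- Hypothetical reporting procedure rewritten to read from new EDW tables.
    CREATE OR ALTER PROCEDURE rpt.usp_GetClaimSummary
        @StartDate DATE,
        @EndDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT  m.MemberName,
                d.CalendarYear,
                d.CalendarMonth,
                SUM(f.PaidAmount) AS TotalPaid,
                COUNT(*)          AS ClaimCount
        FROM edw.FactClaim AS f
        JOIN edw.DimMember AS m ON m.MemberKey = f.MemberKey
        JOIN edw.DimDate   AS d ON d.DateKey   = f.ServiceDateKey
        WHERE d.FullDate BETWEEN @StartDate AND @EndDate
        GROUP BY m.MemberName, d.CalendarYear, d.CalendarMonth;
    END;

A procedure shaped like this can back both an SSRS dataset and a Power BI import query, which keeps the report layer unchanged while the underlying tables move to the EDW.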

Avanos, Alpharetta, GA 07/2021 – 10/2022

Azure/BI Developer

•Developed mappings that extract, transform, and load source data into the Derived Masters schema using PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to implement the business logic.

•Implemented performance tuning on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.

•Developed dashboards and reports in Tableau and Power BI and migrated the reports from Tableau to Power BI Premium.

•Utilized Power BI to create analytics dashboards depicting critical KPIs such as legal case matters, billing hours, and case proceedings, with slicers enabling end users to apply filters.

•Worked on migrating data from the SAP Data Warehouse to Snowflake and from SAP HANA to Azure SQL Server.

•Implemented Copy activities and custom Azure Data Factory pipeline activities for on-cloud ETL processing.

•Extracted, transformed, and loaded data from source systems to Azure Storage using a combination of Azure Data Factory, T-SQL, and U-SQL.

•Migrated SAP HANA data into Azure SQL Server using Azure Data Factory.

•Utilized Power Query in Power BI to pivot and unpivot the data model for data cleansing and massaging.

•Implemented the Power BI gateway for different sources to match analytics criteria and support further modeling in Power Pivot and Power View.

•Used T-SQL development skills to write complex queries involving multiple tables, stored procedures, triggers, and user-defined functions (see the sketch at the end of this section).

•Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, and Azure DW) and processed the data in Azure Databricks.

•Primarily involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, and SSIS.
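
A hedged sketch of the multi-table query and user-defined-function work mentioned above; the function and table names (dbo.fn_OpenInvoices, dbo.Customer, dbo.Invoice, dbo.Payment) are illustrative placeholders only.

    -- Hypothetical inline table-valued function used by a reporting query.
    CREATE OR ALTER FUNCTION dbo.fn_OpenInvoices (@CustomerID INT)
    RETURNS TABLE
    AS
    RETURN
    (
        SELECT i.InvoiceID, i.InvoiceDate, i.Amount
        FROM dbo.Invoice AS i
        WHERE i.CustomerID = @CustomerID
          AND i.Status = 'OPEN'
    );
    GO

    -- Complex query joining customers, open invoices, and payments (all names hypothetical).
    SELECT c.CustomerName,
           oi.InvoiceID,
           oi.Amount,
           SUM(p.PaidAmount) AS PaidToDate
    FROM dbo.Customer AS c
    CROSS APPLY dbo.fn_OpenInvoices(c.CustomerID) AS oi
    LEFT JOIN dbo.Payment AS p ON p.InvoiceID = oi.InvoiceID
    GROUP BY c.CustomerName, oi.InvoiceID, oi.Amount;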

Environment: Power BI, SSIS, ServiceNow, Azure DevOps, Data Analyzer, SQL, SAP HANA, T-SQL, Python, Azure SQL Server, Azure Data Factory.

Banner Health, Phoenix, AZ 09/2020 – 06/2021

ETL/Power BI Developer

•Created mappings and workflows using SSIS to extract, transform, and load data into the target environment.

•Experience designing, developing, testing, and implementing HL7 interfaces.

•Experience in working with data warehouses and data marts using ETL tools (Designer, Repository Manager, Workflow Manager, and Workflow Monitor).

•Developed mappings that extract, transform, and load source data into the Derived Masters schema using PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to implement the business logic.

•Worked on various Workflow Manager tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.

•Implemented performance tuning on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.

•Developed dashboards and reports in Tableau and Power BI and migrated the reports from Tableau to Power BI Premium.

•Experience setting up repositories and CI/CD pipelines using Azure DevOps.

•Worked with Python data libraries such as Pandas and NumPy.

•Utilized Power BI to create analytics dashboards depicting critical KPIs such as legal case matters, billing hours, and case proceedings, with slicers enabling end users to apply filters.

•Implemented Copy activities and custom Azure Data Factory pipeline activities for on-cloud ETL processing.

•Worked on loading data with Azure Data Factory pipelines using a delta (incremental) load pattern (see the watermark sketch at the end of this section).

•Experience working with Python and SQL notebooks in Azure Databricks.

•Implemented the Power BI gateway for different sources to match analytics criteria and support further modeling in Power Pivot and Power View.

•Involved in building dimensions and cubes with a snowflake schema using Azure Analysis Services.

•Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.

•Primarily involved in data migration using SQL Azure, Azure Storage, Azure Data Factory, and SSIS.

•Worked on optimizing legacy reports using T-SQL, writing and tuning complex queries.

•Provided production support to optimize complex queries and reports.

•Made use of post-Session success and post-Session failure commands in the Session task to execute scripts needed for cleanup and update purposes.

•Tuned SQL queries and stored procedures for faster data extraction and to troubleshoot issues in the OLTP environment.

•Served as designated owner of major tasks, taking responsibility for actions and outcomes to ensure timely and cost-effective results for the team.

•Worked with SQL queries and created stored procedures, packages, triggers, and views using PL/SQL.

•Developed Power BI reports from multiple data sources such as Azure SQL Data Warehouse and cubes.

•Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
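
A minimal T-SQL sketch of the watermark-style delta load pattern mentioned above, of the kind an Azure Data Factory copy activity's source query might use; the control and source table names (etl.WatermarkTable, src.Encounter) are hypothetical.

    -- Read the last successful load's high-water mark (hypothetical control table).
    DECLARE @LastWatermark DATETIME2 =
        (SELECT LastLoadDate FROM etl.WatermarkTable WHERE TableName = 'src.Encounter');

    -- Pull only rows changed since the last load.
    SELECT EncounterID, PatientID, EncounterDate, ModifiedDate
    FROM src.Encounter
    WHERE ModifiedDate > @LastWatermark;

    -- After a successful copy, advance the watermark for the next run.
    UPDATE etl.WatermarkTable
    SET LastLoadDate = (SELECT MAX(ModifiedDate) FROM src.Encounter)
    WHERE TableName = 'src.Encounter';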

Environment: Power BI, Tableau, Talend, Workflow Manager, TFS, Azure SQL, Azure Data Factory, Azure Databricks, Workflow Monitor, Data Analyzer, T-SQL, PL/SQL, Oracle 10g/9i, SQL Server 2009.

Avenue Insights & Analytics, Lexington, KY 05/2019 – 08/2020

SQL /SSIS/SSRS/SSAS Developer

Responsibilities:

•Created SSIS packages using Execute SQL Task, File System Task, Send Mail Task, Foreach Loop Container, and Sequence Container.

•Created SSIS packages with various variables, expressions, and transformations such as Conditional Split, Derived Column, Data Conversion, Lookup, Merge Join, Sort, and Aggregate.

•Debugged SSIS packages using breakpoints, data viewers, and watch windows.

•Involved in data migration from Excel and CSV files into SQL Server using SSIS packages.

•Created various cubes, dimensions, measures, and aggregates in Analysis Services based on business requirements.

•Involved in creation, build, deployment, and processing of SSAS Cubes.

•Created and modified SSRS reports based on T-SQL and stored procedures with input parameters (see the sketch at the end of this section).

•Developed tabular reports, sub-reports, matrix reports, drill-down reports, and charts using SQL Server Reporting Services (SSRS).

•Identified and modified Key Performance Indicators (KPIs) and measures in accordance with requirements, and created calculated members in the MOLAP cube using MDX in Business Intelligence Development Studio with SSAS.

•Designed cubes and built Scorecards and dashboards for different levels of management.

•Designed and coded standard tabular reports, including drill-down and drill-through functionality and graphical presentations such as charts and dashboard-style metrics for effective analysis.

•Built advanced reports using query parameters, report parameters, user access restrictions, filters, and interactive sorts.

•Wrote MDX queries and optimized the performance of SSAS cubes; built dimensions and cubes with star and snowflake schemas using SQL Server Analysis Services.

•Provided production support for SQL queries, stored procedures, packages, triggers, jobs, etc.

•Used Team Foundation Server (TFS) to push code into the repository and update user stories.
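
A hedged sketch of a parameterized stored procedure of the kind an SSRS report dataset might call, as referenced above; the procedure and table names (rpt.usp_InvoiceDetail, dbo.Invoice) are illustrative placeholders.

    -- Hypothetical report procedure; @Region is an optional SSRS report parameter.
    CREATE OR ALTER PROCEDURE rpt.usp_InvoiceDetail
        @StartDate DATE,
        @EndDate   DATE,
        @Region    VARCHAR(50) = NULL   -- NULL means all regions
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT InvoiceID, CustomerName, Region, InvoiceDate, Amount
        FROM dbo.Invoice
        WHERE InvoiceDate BETWEEN @StartDate AND @EndDate
          AND (@Region IS NULL OR Region = @Region)
        ORDER BY InvoiceDate, InvoiceID;
    END;

Defaulting the optional parameter to NULL lets a single dataset serve both the filtered and the "all regions" views of the report.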

Environment:

MS SQL SERVER 2005/2008/2012/2014, SQL Server Reporting Service (SSRS), SQL Server Integration Service (SSIS), SQL Server Analysis Service (SSAS), Business Intelligence Studio, T-SQL, SQL Profiler, Team Foundation Server (TFS), Microsoft Visio, MS Office 2003.

Crump Life Insurance, Harrisburg, PA 04/2017 - 04/2019

SQL /SSIS/SSRS/SSAS Developer

Responsibilities:

•Actively involved in gathering and analyzing project requirements, and responsible for writing Business Approach Documents (BAD) and technical documents for new projects.

•Created SSIS packages to validate, extract, transform, and load data into a centralized SQL Server from existing diversified data sources.

•Performed incremental loads while transferring data from OLTP systems to the data warehouse using various data flow and control flow tasks in Business Intelligence Development Studio with SSIS (a hedged MERGE sketch follows this section).

•Debugged SSIS packages using data viewers and breakpoints.

•Used various transformations such as Derived Column, Pivot, Unpivot, Copy Column, Merge Join, Union All, Lookup, Audit, Aggregate, etc.

•Created reports based on requirements and deployed them to the Report Server.

•Created cubes, dimensions, facts, measures, and aggregates in Analysis Services based on business requirements.

•Involved in defining Referenced Relationships and in Identifying KPIs in SSAS.

•Involved in the performance and database optimization of T-SQL queries.

•Developed user-defined functions, stored procedures, and SQL queries for ad-hoc processes.

•Created OLAP cubes, facts, and dimensions for the analysis of sales in various locations and processed data from the data mart using SSAS.

•Involved in building the dimensions, cubes with star schema and snowflake schema using SSAS.

•Worked with RDBMS concepts for creating database objects such as tables, functions, stored procedures, indexes, views.

•Involved in performance tuning of SQL queries; experienced with Power Pivot and used SQL Server Agent and AutoSys to schedule packages.

•Created sub-reports, drill down reports, summary reports, parameterized reports, and ad-hoc reports using SSRS.

•Created a sustainable reporting practice and a means of storing, sharing, and publishing information and reports using subscriptions.
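
A minimal T-SQL sketch of the incremental (upsert) load pattern referenced above, as it might run inside an Execute SQL task; the staging and target table names (stg.Policy, dw.DimPolicy) are hypothetical.

    -- Upsert staged OLTP rows into the warehouse dimension (hypothetical tables).
    MERGE dw.DimPolicy AS tgt
    USING stg.Policy   AS src
        ON tgt.PolicyNumber = src.PolicyNumber
    WHEN MATCHED AND (tgt.Status <> src.Status OR tgt.Premium <> src.Premium) THEN
        UPDATE SET tgt.Status      = src.Status,
                   tgt.Premium     = src.Premium,
                   tgt.UpdatedDate = GETDATE()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (PolicyNumber, Status, Premium, UpdatedDate)
        VALUES (src.PolicyNumber, src.Status, src.Premium, GETDATE());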

Environment: MS SQL Server 2014/2012, MS SQL Server Management Studio (SSMS), MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), SSAS Tabular, TFS, Cubes, MS SQL Server Analysis Services (SSAS), MS Visual Studio 2010, MS Excel, T-SQL, AutoSys.

HSBC Bank, Hyderabad, India 06/2014 - 07/2015

SQL Developer

Responsibilities:

•Developed, maintained, and executed SQL scripts/queries and validated completeness, integrity, and accuracy of data (a hedged validation sketch follows this section).

•Designed, developed, and tested stored procedures, views, and complex queries for creation of Reports.

•Designed and developed data transformation processes (extraction, transformation, and loading) from heterogeneous sources to the destination (SQL Server).

•Created SSIS packages using Sequence Container, Data Flow, Execute SQL, Bulk Insert, Send Mail tasks, etc.

•Used SQL Agent for job scheduling.

•Used advanced T-SQL features to design and tune code that interfaces with the database and other applications as efficiently as possible.

•Built and performed user acceptance testing on new and modified reports; generated several report types, including tabular, matrix, and list.
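
A brief, hedged example of the kind of completeness and integrity checks described above; the source and target table names (stg.Transactions, dbo.Transactions, dbo.Account) are hypothetical.

    -- Completeness: row counts should match between staging and target (hypothetical tables).
    SELECT (SELECT COUNT(*) FROM stg.Transactions) AS StagedRows,
           (SELECT COUNT(*) FROM dbo.Transactions) AS LoadedRows;

    -- Integrity: every loaded row must reference an existing account.
    SELECT t.TransactionID
    FROM dbo.Transactions AS t
    LEFT JOIN dbo.Account AS a ON a.AccountID = t.AccountID
    WHERE a.AccountID IS NULL;

    -- Accuracy: totals should reconcile between source and target.
    SELECT SUM(Amount) AS StagedTotal FROM stg.Transactions;
    SELECT SUM(Amount) AS LoadedTotal FROM dbo.Transactions;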

Environment: MS SQL Server 2008/2005, SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS), MS Excel, and Microsoft Office Suite (Word, Excel, and PowerPoint).
