
Azure Data Engineer

Location:
Holly Springs, NC
Posted:
April 18, 2024

Resume:

Name: Soujanya Muppana

Role: Azure Data Engineer

Mobile: +1-409-***-**** Email: ad43uz@r.postjobfree.com

LinkedIn: https://www.linkedin.com/in/soujanya-muppana0127/

PROFESSIONAL SUMMARY:

• 10+ years of experience working in BI Development and Application Maintenance using Microsoft Technologies.

• Experience in Azure SQL Server, Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse.

• Extensive experience in the extraction, transformation, and loading of data directly from heterogeneous source systems.

• Strong experience with data migration, cloud migration, and ETL processes.

• Hands-on experience in ETL processes using Azure Data Factory.

• Experience using Power BI (Power Pivot/Power View) to design multiple dashboards displaying information required by different departments and upper-level management.

• Experience developing formatted, complex, reusable reports in Power BI with advanced features such as conditional formatting, built-in/custom functions, and multiple grouping levels.

• Hands-on experience in Logic Apps.

• End-user training experience.

• Hands-on experience designing and implementing relational and non-relational databases.

• Good knowledge of JSON and XML.

• Good knowledge of and hands-on experience in Azure Databricks.

• Good knowledge of and hands-on experience in PySpark and Python.

• Hands-on experience working with Microsoft SQL Server databases.

• Hands-on experience working with AWS.

• Experience in creating ADF pipelines and SSIS packages.

• Experience with C#.NET and JavaScript.

• Hands-on experience in creating stored procedures, functions, triggers, indexes, views, and CTEs.

• Expertise in monitoring and analyzing the issues related to Jobs implemented using ADF Pipelines.

• Experience in the extraction, transformation, and loading of data directly from heterogeneous source systems such as flat files, CSV, XLS, REST APIs, relational databases, and Salesforce.

• Experience in DevOps integration for CI/CD.

• Hands-on experience writing SQL, Spark, or KQL code and integrating it with enterprise CI/CD processes.

• Worked with the GitHub version control system.

• Good communication, analytical, and problem-solving skills, and a good team player.

• Proficient in technical writing and presentations.

• Good team player with strong interpersonal and communication skills, combined with self-motivation, initiative, and the ability to think outside the box.

• Hands-on experience in design and documentation, project cutover, and sanity checks.

• Experience in preparing the project scope, deliverables timeline, and project execution plan.

• Collaborated closely with cross-functional teams to gather integration requirements and design robust, scalable integration solutions with performance and high availability in mind.

• Worked in an onsite-offshore model.

• Excellent analytical and communication skills.

Roles & Responsibilities Summary:

• Understanding the functional requirements of the customers.

• Analysis of Azure services and deciding on the services suitable for the project through R&D and POCs.

• Developer responsible for the Azure project, creating dataflows and pipelines using ADF.

• Expert in creating data pipelines (jobs) using Azure Data Factory.

• Expert in creating PySpark notebooks.

• Designing and implementing relational and non-relational databases.

• Implemented best practices and performance tuning to improve the ETL process and code readability.

• Developed mappings using transformations such as Expression, Filter, Join, and Lookup for better data mapping.

• Expertise in writing complex DAX functions in Power BI and Power Pivot.

• Developed Power BI dashboards such as booking YoY comparisons by region/product/customer group.

• Used Power BI gateways to keep dashboards and reports up to date; published reports and dashboards using Power BI.

• Working on creating Logic Apps.

• Working on creating Notebooks using Azure Databricks.

• Created jobs to load historical data and incremental data (a minimal sketch follows this list).

• Extensive experience scheduling and maintaining ADF data pipelines using Azure Data Factory triggers.

• Team Lead/Developer for handling the MSBI project.

• Good knowledge of star schema design.

• Attending the walkthroughs for the project/change requirements.

• Developing and implementing the SSIS Packages for the different stages of ETL and Reporting.

• Created ETL packages using SSIS to move data from Excel and SQL Server sources to SQL Server destinations.

• Writing and validating Unit Test Cases.

• Working on the Enhancements of the application.

• Writing Stored Procedures, functions as per the requirement.

• Production support, fixing bugs, issues.

• Creating various reports for projects like Weekly status report, Monthly status report, Job issue tracker, defect tracker and CR Tracker.

• Involved in Effort estimation of the requirements.
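The historical/incremental load pattern noted above can be outlined in a short PySpark (Databricks) notebook. This is only an illustrative sketch, not the project's actual code: the paths, dataset names, and watermark column are hypothetical placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("incremental_load_sketch").getOrCreate()

    # Watermark from the previous successful run, e.g. read from a control table
    # or passed in as an ADF pipeline parameter (placeholder value here).
    last_watermark = "2024-01-01"

    # A historical load would read everything; the incremental load keeps only
    # rows changed since the watermark.
    source_df = (
        spark.read.parquet("/mnt/raw/orders")              # hypothetical raw path
        .filter(F.col("load_date") > F.lit(last_watermark))
    )

    # Light cleanup before landing in the curated zone.
    curated_df = (
        source_df.dropDuplicates(["order_id"])
                 .withColumn("ingested_at", F.current_timestamp())
    )

    curated_df.write.mode("append").parquet("/mnt/curated/orders")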

Key Achievements

• Received the Star Performer award for outstanding performance.

Education details & Certifications:

Education

• Master of Computer Applications.

• Adikavi Nannaya University

Professional Affiliations / Certifications

• Microsoft Certified Azure Data Engineer Associate

Training

• Azure Data Factory, Azure Databricks, and Azure Synapse

• SQL Server and PL/SQL training from SQL Star training institute

• MSBI: SSIS, SSAS, SSRS, Power BI

• Python, PySpark, Logic Apps

System Experience:

Database: SQL Server, Oracle, MongoDB

Cloud Technologies: Azure Data Factory, Azure Data Warehouse, Azure SQL Server, Azure Data Lake, Azure Databricks, Azure Synapse, Logic Apps

Reporting Tools: SSRS, Crystal Reports, Power BI, Tableau

Data Integration Tools: Azure Databricks, Azure Data Factory

Code Repositories: GitHub

Data Sources: Flat files, ADLS, Azure Blob Storage

Scrum Boards: Azure DevOps, Synapse Kanban board

Domains Worked: Insurance, E-Commerce

Programming Languages: C#.NET, Python

Scheduling Tools: ADF scheduled triggers

PROFESSIONAL EXPERIENCE:

Client: CDW, Chicago, USA

Role: Data Engineer

Duration: May 2021 to date.

Responsibilities:

• Analyzed the requirements and created complex dataflows to load data.

• Gathered business requirements and prepared technical design documents, target-to-source mapping documents, and mapping specification documents.

• Worked on parameters and transformations such as Lookup, Aggregate, Join, Window, Filter, Alter Row, and Derived Column with complex expressions (illustrated in the sketch after this list).

• Worked on designing the RADDB, APPDB, CloudDWDB and CPRADB databases in Microsoft SQL Server.

• Loaded incremental data into Azure SQL Server.

• Created pipelines in ADF using linked services, datasets, and dataflows to move data from source to the data lake and from the data lake to Azure SQL Server.

• Created and debugged stored procedures.

• Created PySpark and Python notebooks.

• Created Logic Apps.

• Created Power BI dashboards and published reports and dashboards using Power BI.

• Created DAX expressions to generate calculated columns in Power BI.

• Extensively used different types of Power BI visuals such as stacked column charts, pie charts, multi-row cards, slicers, and filters.

• Created drill-through dashboards to drill down from yearly to quarterly and monthly views.

• Scheduling and Monitoring the ADF Pipelines in Production.

• Fixing the production issues for the Pipeline failures.

• Performance-tuned stored procedures in Azure SQL Server.

• Understanding the client requirement and working on POCs.

• Fixed/resolved the production issues.

• Implemented ETL pipelines using Azure Data Factory.

• Developed stored procedures, functions, and views using SQL and PL/SQL.

• Performed end-to-end extraction, ingestion, and consumption of large volumes of data from multiple sources.

• Collaborated with business analysts, other developers, and business users throughout the project life cycle to gather and understand requirements.

• Involved in business calls with product owners for a better understanding of the requirements.

• Analyzed and evaluated data sources, data volumes, and business rules.

• Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.

• Performed basic and unit testing.

• Assisted in UAT Testing and provided necessary reports to the business users.
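As a rough illustration of the dataflow transformations listed above (Lookup/Join, Filter, Alter Row, Derived Column, Window, Aggregate), similar logic can be expressed in a PySpark notebook. This is a hedged sketch only; the dataset and column names (orders, customers, net_amount, and so on) are hypothetical, not the client's actual objects.

    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dataflow_transform_sketch").getOrCreate()

    orders = spark.read.parquet("/mnt/datalake/orders")        # hypothetical source
    customers = spark.read.parquet("/mnt/datalake/customers")  # hypothetical lookup

    result = (
        orders
        .join(customers, on="customer_id", how="left")                 # Lookup / Join
        .filter(F.col("order_status") == "Completed")                  # Filter / Alter Row
        .withColumn("net_amount",                                      # Derived Column
                    F.col("gross_amount") - F.col("discount"))
        .withColumn("latest_order_rank",                               # Window
                    F.row_number().over(
                        Window.partitionBy("customer_id")
                              .orderBy(F.col("order_date").desc())))
    )

    # Aggregate: yearly totals by region, the kind of shape a YoY dashboard consumes.
    summary = (
        result.groupBy("region", F.year("order_date").alias("order_year"))
              .agg(F.sum("net_amount").alias("total_bookings"))
    )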

Technical Platform: Azure Data Lake, Azure Data Factory, SQL Server, Power BI, Logic Apps, Databricks

Client: TARDIS WAREHOUSE, Torus, UK

Role: Data Engineer

Duration: Jun 2017 – Apr 2021

TARDIS is a centralized data warehouse for the whole of Torus Insurance; all end users connect to the SSAS cube and generate their own reports. TARDIS consists of enterprise data that comes from different sources (GENIUS, DUCK CREEK, CARLOS, FIG, and CLAIM CENTER) and is stored in the TARDIS warehouse. TARDIS is critical to the entire company as it is the single version of truth for every line of business Torus Insurance operates across the globe. Torus Insurance operates in many lines of business, such as Property, Casualty, Liability, and Specialty. TARDIS refreshes every 15 minutes, so near-current data is available to end users.

Responsibilities:

• On this project I worked as an ETL developer and reporting developer and collected all the required specifications for the input data.

• Used control flow tasks such as the For Loop Container, Foreach Loop Container, Sequence Container, Execute SQL Task, and Data Flow Task.

• Worked on data extraction, transformation, and loading, as well as package development.

• Created SSIS packages using data transformations such as Derived Column, Lookup, Conditional Split, Merge Join, and Multicast while loading data into the DWH.

• Worked frequently with the Data Flow Task and Execute SQL Task.

• Developed and configured Power BI reports and dashboards from multiple data sources using data blending.

• Explored data in a variety of ways across multiple visualizations using Power BI.

• Created configurations and deployed packages to the target server.

• Created SQL Server Agent jobs with SSIS packages to run daily ETLs at scheduled times.

• Implemented transformations in data flows to clean data.

• Created reports from the DWH using SSRS and Power BI.

• Created multiple transformations and loaded data from multiple sources into Power BI dashboards.

• Created tabular, matrix, and chart reports in SSRS per end-user requirements.

• Created report subscriptions to email reports on a schedule.

• Performed Power BI Desktop data modeling to clean, transform, and mash up data from multiple sources.

• Installed, configured, and implemented the enterprise gateway in the Power BI service.

• Provided report-level security for end users.

• Hands-on experience with Unix/Linux servers and commands.

• Created PySpark and Python notebooks (a brief sketch follows this list).

• Involved in processing the cube data.
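A minimal sketch of the kind of PySpark notebook referenced above, landing cleansed data in the warehouse over JDBC. The connection string, table names, and paths are placeholders and do not reflect the actual TARDIS objects.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dwh_load_sketch").getOrCreate()

    policies = spark.read.csv("/mnt/staging/policies.csv", header=True, inferSchema=True)

    # Conditional-split style routing: valid rows go to the warehouse,
    # rejected rows are parked for review.
    valid = policies.filter(F.col("policy_id").isNotNull())
    rejects = policies.filter(F.col("policy_id").isNull())

    jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<dwh>"  # placeholder

    (valid.write.format("jdbc")
          .option("url", jdbc_url)
          .option("dbtable", "dbo.PolicyStaging")     # hypothetical target table
          .option("user", "<user>")
          .option("password", "<password>")
          .mode("append")
          .save())

    rejects.write.mode("overwrite").parquet("/mnt/staging/rejects/policies")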

Technical Platform: Azure Data Lake, Azure Data Factory, SQL Server, Power BI, SQL, PL/SQL, Azure Databricks

Client: PBM Project, Trademark

Role: Software Engineer

Duration: Mar 2014 – Apr 2017

Description:

Caremark is one of the nation's leading pharmacy benefit management (PBM) companies, providing services to over 2,000 health plans, including corporations, managed care organizations, and insurance companies. CVS Caremark Corporation is an integrated pharmacy services provider that combines a United States pharmaceutical services company with a U.S. pharmacy chain. The purpose of this project is to load data into data warehouse dimensions and facts from flat files and SQL Server source systems, and to create reports used by management for data analysis.

Responsibilities:

• Involved in creating database objects such as tables, stored procedures, views, triggers, and user-defined functions for this project.

• Gathered requirements from end users, developed the logical model, and implemented the requirements in SQL Server.

• Created workflows with Data Flow, Execute SQL, and Execute Package tasks and containers.

• Created package configurations to handle data source connection information.

• Implemented logging for each package.

• After loading the data, generated reports according to the client requirements.

• Responsible for creating reports based on the requirements using Reporting Services 2008R2.

• Identified the database tables for defining the queries for the reports.

• Identified and defined the Datasets for report generation.

• Wrote queries for drill-down reports to get the datasets required to build the reports (see the sketch after this list).

• Deployed the generated reports to the report server for access through the browser.
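The drill-down dataset queries mentioned above might look roughly like the following, shown here in Python with pyodbc purely for illustration. The server, database, and FactClaims table are hypothetical placeholders, not the client's actual schema; the real reports pulled these datasets through SSRS data sources.

    import pyodbc

    # Placeholder connection string for a SQL Server reporting database.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<server>;DATABASE=<reporting_db>;Trusted_Connection=yes;"
    )

    # Year and quarter parameters drive the drill-down from the yearly summary.
    sql = """
    SELECT Region, ProductGroup, SUM(ClaimAmount) AS TotalClaims
    FROM dbo.FactClaims
    WHERE YEAR(ClaimDate) = ? AND DATEPART(QUARTER, ClaimDate) = ?
    GROUP BY Region, ProductGroup
    """

    for region, product_group, total in conn.cursor().execute(sql, 2016, 2).fetchall():
        print(region, product_group, total)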


