
VENKATA KIRAN KONIKI

Email: adu0pn@r.postjobfree.com

Phone: 603-***-****

LinkedIn: www.linkedin.com/in/KonikiKiran

SUMMARY:

Over 8 years of IT experience on the Microsoft modern data platform, with business-domain knowledge in dairy ERP, HRM, healthcare, and sales.

Azure Data Engineer Summary:

Implementation of Azure cloud components: Azure Data Factory, Azure Data Lake Analytics, Azure Data Lake, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Store, tabular models in Azure Analysis Services, Azure SQL DB/DW, Power BI, U-SQL, and T-SQL.

Good experience developing BI applications with the MSBI stack: SSIS, SSRS, T-SQL, SQL Server, and SQL Data Warehouse.

Expertise in creating Azure Blob storage and Azure SQL databases and launching Windows virtual machines in Azure.

Expertise in Azure Logic Apps.

Use Power BI to build reports and dashboards that visualize recently processed data, and deploy those reports.

Experience moving data from RDBMS sources to Cosmos DB.

Experience with Azure DevOps CI/CD pipelines.

Good knowledge of data warehousing concepts and data/dimensional modeling: star/snowflake schemas, dimension and fact tables.

Hands-on experience with Azure Analysis Services and tabular cube models.

Hands-on experience creating dynamic ADF pipelines for full and incremental loads (a watermark sketch follows this summary).

Hands-on experience installing the self-hosted integration runtime to migrate data from on-premises to the cloud.

Experience with Python and big data concepts.

Deployment of SSIS packages in Azure Data Factory and scheduling of jobs.

Experience creating SSIS packages, SSRS reports, and SSAS models.

Well experienced in both Waterfall and Agile methodologies.

Experienced with the Azure DocumentDB Data Migration Tool.

Certified DP-203 Azure Data Engineer (certification number I415-5360).
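
For illustration, a minimal T-SQL sketch of the watermark pattern behind those full/incremental-load pipelines; all object names here are hypothetical rather than drawn from a specific project:

-- Watermark table read by an ADF Lookup activity.
CREATE TABLE dbo.WatermarkTable (
    TableName      sysname      NOT NULL PRIMARY KEY,
    WatermarkValue datetime2(3) NOT NULL
);
GO

-- Source query issued by the Copy activity; in the pipeline these two values
-- arrive as parameters from Lookup activities rather than local variables.
DECLARE @LastWatermark    datetime2(3) = '2023-01-01',
        @CurrentWatermark datetime2(3) = SYSUTCDATETIME();

SELECT *
FROM dbo.SourceOrders
WHERE ModifiedDate >  @LastWatermark
  AND ModifiedDate <= @CurrentWatermark;
GO

-- Called from a Stored Procedure activity after a successful copy.
CREATE PROCEDURE dbo.usp_UpdateWatermark
    @TableName sysname,
    @NewValue  datetime2(3)
AS
BEGIN
    UPDATE dbo.WatermarkTable
    SET WatermarkValue = @NewValue
    WHERE TableName = @TableName;
END;
GO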

SKILLS:

Azure Data Lake, Azure Data Factory, Azure Synapse Analytics.

Azure Databricks, Azure SQL database, Azure SQL Data warehouse.

MSBI.

Azure DocumentDB Data Migration Tool.

Python.

Data Visualization (Power BI).

Data Migration.

DB2, SQL, Snowflake, Databricks.

Analytic Problem-Solving.

EDUCATION:

B. Tech from JNTU Kakinada (Andhra Pradesh) in 2015

PROFESSIONAL EXPERIENCE:

Title: ICP

Client: Optum

Role: Data Engineer

Duration: July 2022 – Present

Environment: Azure Data Factory V2, Azure Databricks, Azure SQL DB, Power BI, Azure DevOps

Improve the identification of claims overpayments in the post-pay workflow. The overpayment detection and auditing process for post-pay claims needed improvement: roughly 50% of overpayment identification was being lost to vendors, especially for high-value hospital bill audits.

Responsibilities:

Design and implement database solutions in Azure SQL Data Warehouse and Azure SQL Database.

Migrate data from traditional database systems to Azure databases.

Analyze, design, and build modern data solutions using Azure PaaS services to support data visualization. Understand the current production state of the application and determine the impact of new implementations on existing business processes.

Extract, transform, and load data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics). Ingest data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and process it in Azure Databricks.

Create CI/CD pipelines in Azure DevOps.

Experienced in DWH/BI project implementation using Azure Data Factory.

Interact with business analysts, users, and SMEs to elaborate requirements.

Design and implement end-to-end data solutions (storage, integration, processing, and visualization) in Azure.

Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data from different sources such as Azure SQL, Blob storage, and Azure SQL Data Warehouse.

Responsible for estimating cluster size and for monitoring and troubleshooting the Spark Databricks cluster.

Experienced in performance tuning of Spark applications: setting the right batch interval, the correct level of parallelism, and appropriate memory settings.

Wrote UDFs in Scala and PySpark to meet specific business requirements.

Developed JSON definitions for deploying ADF pipelines that process data using SQL activities (a sketch of one such procedure follows this list).

Created builds and releases for multiple projects (modules) in the production environment using Visual Studio Team Services (VSTS).

Propose architectures considering cost/spend in Azure and develop recommendations to right-size data infrastructure.

Set up and maintain Azure SQL Database, Azure Analysis Services, Azure SQL Data Warehouse, and Azure Data Factory.

Develop conceptual solutions and create proofs of concept to demonstrate their viability.

Implement Copy activities and custom Azure Data Factory pipeline activities.

Responsible for creating requirements documentation for various projects.
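
A hedged sketch of the kind of T-SQL procedure such SQL activities invoke after a Copy activity lands data in staging; the claims tables and columns are assumptions for illustration, not the actual Optum schema:

CREATE PROCEDURE dbo.usp_LoadClaims
    @BatchId uniqueidentifier
AS
BEGIN
    SET NOCOUNT ON;

    -- Upsert staged rows into the target table.
    MERGE dbo.Claims AS tgt
    USING stg.Claims AS src
          ON tgt.ClaimId = src.ClaimId
    WHEN MATCHED THEN
        UPDATE SET tgt.Amount       = src.Amount,
                   tgt.Status       = src.Status,
                   tgt.ModifiedDate = SYSUTCDATETIME()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ClaimId, Amount, Status, ModifiedDate)
        VALUES (src.ClaimId, src.Amount, src.Status, SYSUTCDATETIME());

    -- @@ROWCOUNT still holds the MERGE row count at this point.
    INSERT INTO audit.LoadLog (BatchId, TableName, RowsAffected, LoadedAt)
    VALUES (@BatchId, N'dbo.Claims', @@ROWCOUNT, SYSUTCDATETIME());
END;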

Title: GSK BAU Support

Client: GSK

Role: Data Engineer

Duration: Apr 2021 – June 2022

Environment: Azure Data Factory V2, Azure Databricks, Azure Data Lake, Synapse, Azure SQL DB, Azure DevOps, SSIS

L2 and L3 production support for the applications that run on the platform, covering incident management, service request management, change management, and enhancement of the existing platform based on business requirements.

Responsibilities:

Analyzed the requirements and discussed functionality with managers and leads.

Understood existing source systems and DB tables.

Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environment; gained DWH/BI project implementation experience using Azure Data Factory and Databricks.

Participated in team meetings to ensure a mutual understanding among business, development, and test teams.

Responsible for creating requirements documentation for various projects. Strong analytical skills, a proven ability to work well in a multi-disciplined team environment, and an aptitude for learning new tools and processes with ease.

Designed SSIS packages for loading data from different source systems to the staging DB.

Built pipelines in Azure Data Factory to copy data from on-premises sources to their destinations.

Created stored procedures and scheduled them in the Azure environment.

Monitored produced and consumed datasets in ADF.

Pulled data into Power BI from various sources such as SQL Server, Excel, Oracle, and SQL Azure.

Implemented ad-hoc analysis solutions using Azure Data Lake Analytics/Store and HDInsight.

Collaborated with application architects on moving infrastructure-as-a-service (IaaS) applications to platform-as-a-service (PaaS).

Encapsulated frequently executed SQL statements in stored procedures to reduce query execution time (see the sketch after this list).

Created Azure Data Lake and ADF resources for data migration into Azure cloud storage.

Developed and monitored ADF pipelines and handled errors in Azure Databricks notebooks.

Extracted data from source platforms to Azure Synapse and Azure SQL DB.

Created SQL scripts and stored procedures for data transformations and formulas.

Prepare deployment artifacts for non-prod environments.
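
A minimal sketch of the encapsulation pattern mentioned above: a frequently run query wrapped in a parameterized stored procedure so callers reuse one cached plan instead of ad-hoc SQL. The incident tables are hypothetical:

CREATE PROCEDURE dbo.usp_GetOpenIncidents
    @Application nvarchar(100),
    @FromDate    date
AS
BEGIN
    SET NOCOUNT ON;

    SELECT IncidentId, Application, Severity, OpenedDate
    FROM dbo.Incidents
    WHERE Application = @Application
      AND OpenedDate >= @FromDate
      AND Status = N'Open'
    ORDER BY OpenedDate DESC;
END;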

Title: OCC

Client: Nokia

Role: Data Engineer

Duration: Oct 2018 – Mar 2021

Environment: Azure Data Factory V2, Azure SQL DB, Azure Analysis Services, Azure DevOps

Off-Cycle Compensation Management Tool (RFP)

Manages off-cycle and mandatory salary increases outside of the annual salary increase cycle.

Responsibilities:

Analyzed the requirements and discussed functionality with managers and leads.

Worked with database models and designs; created metadata for required objects.

Extracted data from different source systems and file formats into Azure SQL Database.

Created ADF pipelines for data transfer and implemented various activities for data cleansing.

Implemented error logging and audit tables for the different data processes in ADF.

Implemented SCD types in the ADF pipeline process (a Type 2 sketch follows this list).

Sent notification emails using Logic Apps.

Worked on SQL Data Warehouse and created dimension and fact tables.

Worked on Azure Analysis Services.

Created Power BI reports.
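
A hedged T-SQL sketch of the SCD Type 2 handling referenced above; the employee dimension and the change-tracked columns are assumptions for illustration:

-- Expire the current row when tracked attributes change...
UPDATE d
SET d.IsCurrent = 0,
    d.EndDate   = SYSUTCDATETIME()
FROM dbo.DimEmployee AS d
JOIN stg.Employee    AS s
  ON d.EmployeeId = s.EmployeeId
WHERE d.IsCurrent = 1
  AND (d.Grade <> s.Grade OR d.BaseSalary <> s.BaseSalary);

-- ...then insert a fresh current version for new and changed employees.
INSERT INTO dbo.DimEmployee (EmployeeId, Grade, BaseSalary, StartDate, EndDate, IsCurrent)
SELECT s.EmployeeId, s.Grade, s.BaseSalary, SYSUTCDATETIME(), NULL, 1
FROM stg.Employee AS s
LEFT JOIN dbo.DimEmployee AS d
  ON d.EmployeeId = s.EmployeeId
 AND d.IsCurrent = 1
WHERE d.EmployeeId IS NULL;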

Title: Midtronics

Client: Midtronics

Role: Software Developer

Duration: Jun 2017 – Sep 2018

Environment: Azure Data Factory V2, Azure SQL DB, Azure Analysis Services, Azure DevOps, Azure DocumentDB Data Migration Tool, Cosmos DB, SSIS, Power BI

Midtronics BMIS is a customizable, web-based service that provides a centralized collection of battery and electrical testing data from multiple sources, test data analysis, web-based reports for review, and automatic software updates for test equipment in the field.

Responsibilities:

Analyzed the requirements and discussed functionality with managers and leads.

Encapsulated frequently executed SQL statements in stored procedures to reduce query execution time.

Created SSIS packages to implement error/failure handling with event handlers, row redirects, and logging.

Managed packages in the SSISDB catalog with environments; automated deployment and execution with SQL Agent jobs.

Involved in designing the data warehouse using star-schema methodology and converted data from various sources into SQL tables (see the schema sketch after this list).

Obtained client approval of the collected requirements to ensure a shared understanding between the development team and the business.

Extensively used SSIS packages to build the complete ETL process and load data into the database used by Reporting Services.

Designed and developed various SSIS (ETL) packages to extract and transform data, and was involved in scheduling them.

Created OLAP applications with OLAP services in SQL Server and built cubes with many dimensions using both star and snowflake schemas.

Moved data from one Azure SQL database to another.

Deployed SSIS packages in Azure Data Factory and scheduled jobs.

Implemented SCD types in the ADF pipeline process.

Created CI/CD Azure DevOps pipelines.

Moved test data from Azure SQL Database to Cosmos DB.

Created Power BI reports.
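
A simplified sketch of the star-schema design described above, with one fact table keyed to conformed dimensions; the battery-test tables are hypothetical stand-ins for the BMIS warehouse:

CREATE TABLE dbo.DimBattery (
    BatteryKey   int IDENTITY(1,1) PRIMARY KEY,
    SerialNumber nvarchar(50) NOT NULL,
    Model        nvarchar(50) NOT NULL
);

CREATE TABLE dbo.DimDate (
    DateKey  int  PRIMARY KEY,  -- smart key, e.g. 20230130
    FullDate date NOT NULL
);

CREATE TABLE dbo.FactBatteryTest (
    TestId        bigint IDENTITY(1,1) PRIMARY KEY,
    BatteryKey    int NOT NULL REFERENCES dbo.DimBattery (BatteryKey),
    DateKey       int NOT NULL REFERENCES dbo.DimDate (DateKey),
    Voltage       decimal(6,3) NOT NULL,
    ColdCrankAmps int NULL
);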

Title: SWAYAM(MHRD)

Client: Microsoft

Role: Database Developer

Duration: Jun 2016 – Nov 2016

Environment: SSIS, SSRS, MS SQL Server, Azure VM

The Ministry of Human Resource Development (MHRD), Government of India has envisioned the need to offer a state-of-the-art digital learning platform to the Indian student community. The idea is to bring in “Quality”, “Parity”, “Affordability” and “Reach” to the learners from all demographics.

Responsibilities:

•Wrote stored procedures, user-defined functions, and views according to requirements.

•Worked on database testing.

•Created datasets using stored procedures and reports using multi-value parameters (see the sketch after this list).

•Created reports such as drill-down, drill-through, and table reports according to client requirements.

•Increased the disk allocation unit size from 4 KB to 64 KB to improve performance on Azure VMs.

•Moved all databases, including system databases, to another drive.

•Supported the team.
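
A sketch of the multi-value parameter pattern noted above: SSRS joins the selected values into one comma-separated string, and the dataset procedure splits it back into rows (STRING_SPLIT requires SQL Server 2016+; names are hypothetical):

CREATE PROCEDURE dbo.usp_CourseEnrollments
    @CourseIds nvarchar(max)  -- e.g. '101,205,310' via =Join(Parameters!CourseId.Value, ",")
AS
BEGIN
    SET NOCOUNT ON;

    SELECT e.EnrollmentId, e.CourseId, e.LearnerName, e.EnrolledOn
    FROM dbo.Enrollments AS e
    JOIN STRING_SPLIT(@CourseIds, ',') AS p
      ON e.CourseId = CAST(p.value AS int)
    ORDER BY e.EnrolledOn;
END;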

Title: GST

Client: GST

Role: Power BI developer

Duration: Aug 2016 – Nov 2016

Environment: Power BI

The tax base of the Goods and Services Tax ("GST") is very wide, extending comprehensively over all goods and services up to the final consumer. The project performs sentiment analysis on GST data from social networks, i.e., Facebook and Twitter.

Responsibilities:

•Imported data from different sources such as Excel, Azure, and on-premises databases.

•Wrote DAX functions and measures.

•Created relationships between tables.

•Created drill-down, table, chart, and dashboard reports.

•Dumped all server log files into the OMS repository and connected it to Power BI.

Title: Dairy ERP

Client: Sangam Dairy

Role: MS SQL Server Developer

Duration: July 2014 – June 2016

Environment: MS SQL Server, Oracle, SSIS, SSRS, SSAS.

Dairy ERP on Web for Dairy Sales and Distribution comprises industry-leading functionality for enterprise resource planning (ERP). Milk sales is a complex process that must be executed swiftly on a continuous basis. Dairy ERP provides a sophisticated solution for distributing milk and milk products to retail outlets and to end customers. The dairy procures milk from farmers/societies in villages and gives them benefits such as insurance, loans, veterinary services, cattle feed, fertilizers, fodder seed, and various schemes. The system manages all procured milk and benefits, and at the end the dairy calculates the bill for all societies/farmers.

Responsibilities:

•Analyzed the requirements and discussed functionality with managers and leads.

•Wrote SQL queries using joins and subqueries per business requirements, both for developers and for generating reports.

•Wrote stored procedures, user-defined functions, and views according to requirements.

•Created and modified table schemas as required.

•Worked on database testing.

•Extracted data from Oracle to SQL Server using import/export.

•Created triggers to enforce data and referential integrity (see the sketch after this list).

•Created constraints and indexes for performance.

•Optimized and tuned indexes, queries, and stored procedures.

•Created SSIS packages implementing ETL with various control-flow items (e.g., Execute SQL) and data-flow transformations (Conditional Split, Multicast, Derived Column, Lookup, Sort, Union All, etc.), and performed error handling while moving the data.

•Implemented package configurations for better reusability.

•Worked on deployment and scheduling of SSIS packages.

•Scheduled jobs.

•Created datasets using stored procedures and reports using multi-value parameters.

•Created drill-down and drill-through reports according to client requirements.

•Supported the team.
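
An illustrative sketch of the integrity triggers mentioned above; the milk-procurement table and the fat-percentage limits are hypothetical:

CREATE TRIGGER trg_MilkProcurement_CheckFat
ON dbo.MilkProcurement
AFTER INSERT, UPDATE
AS
BEGIN
    -- Reject any batch containing an out-of-range fat percentage.
    IF EXISTS (SELECT 1 FROM inserted WHERE FatPercent NOT BETWEEN 0.0 AND 15.0)
    BEGIN
        RAISERROR ('FatPercent must be between 0 and 15.', 16, 1);
        ROLLBACK TRANSACTION;
    END;
END;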

Place:

Date: (K.V. Kiran)


