Azure Data Engineer

Location: Atlanta, GA


Mahesh Popuri

E-Mail: adx1vk@r.postjobfree.com Mobile: +1-479-***-****

PROFESSIONAL SUMMARY

Having 7 years of IT experience in data modeling, data warehousing, business intelligence, data analytics, and cloud technologies such as Azure, Snowflake, SQL, Python, and ETL tools.

Requirement Gathering - Proficient in all phases of analysis, design, development, testing, and deployment; highly competent in gathering user requirements.

Working as a cloud data engineer involved in end-to-end design, development, data migration, and support projects.

Proficient in designing and developing the processes required to extract, load, and transform data into the data warehouse using Snowflake and Azure cloud environments.

Snowflake DBA - Experienced in creating Snowflake database objects such as databases, schemas, and tables.

Experienced in handling semi-structured and unstructured data loads and unloads into the Snowflake data warehouse.

Experienced in validating data before loading into Snowflake to avoid operational issues and save compute resources.

Experienced in query performance optimization and in restoring and querying data in disaster scenarios using Snowflake utilities.

Experienced in migrating data from traditional on-premises systems to the cloud using ETL tools.

Experienced in working with Snowflake multi-cluster warehouses, Snowpipe, internal/external stages, stored procedures, tasks, and streams.

Experienced in configuring Snowpipe data pipelines to load data from different cloud storage services and setting up email alert notifications that trigger on pipeline failures.

Experienced in configuring CI/CD pipelines to deploy changes to downstream environments.

Experienced in creating data pipelines and orchestration jobs in ETL tools like Matillion to ensure smooth data transitions.

Experienced in Azure DevOps project methodologies.

Experienced in creating data factories and developing pipelines in ADF using various activities (copy data activity, data flows, logic apps, variables, lookups, etc.).

Extracted data from XML files into SQL tables using Databricks (Python).

Worked on Azure data modeling, extraction, and reporting.

Experienced in creating stored procedures to handle junk values in Azure SQL tables.

Worked on extracting data from various file formats such as JSON, XML, and CSV.

Worked on extracting data from source systems and the HANA database.

Experience in storing files from various sources, such as flat files and REST APIs, into ADLS (Gen2).

Experience in end-to-end implementation of data flows from scratch.

Experience in creating ADLS resources (storage accounts and blob storage).

Experience in migrating data from source systems to ADLS using Datavard Glue.

Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark and/or Azure Databricks.

Delivering and presenting proofs of concept for key technology components to project stakeholders.

Developing scalable and reusable frameworks for ingesting datasets.

Working with other members of the project team to support delivery of additional project components (API interfaces, search).

Working within an agile delivery/DevOps methodology to deliver proofs of concept and production implementations in iterative sprints.

Designing and implementing the various SQL resources available in Azure: SQL DB, Managed Instance databases, elastic pools, Synapse Analytics (formerly Azure SQL Data Warehouse), and on-demand SQL DW.

Implementing partition-based data storage in databases.

Leveraging high-performance views and stored procedures for reporting capabilities.

SAP HANA data modeling, extraction, and data administration.

Creating various types of views (attribute views, analytic views, and calculation views) to provide real-time analytics to users.

Experienced in designing table functions and stored procedures to meet business requirements.

Experienced with Advanced DSOs and CompositeProviders.

Experienced in lifecycle implementations of BI/BW that include:

Data modeling, data extraction, and reporting with BEx - designed data models with LSA and LSA++ architecture.

Function module extraction, user exits, customer exits, routines, and enhancements - extraction methods such as flat file, LO Cockpit, Business Content extraction, generic extraction, and DB Connect.

Expertise in developing mid- to large-scale applications on R/3, ECC 5.0/6.0, and legacy systems.

Strong experience in analyzing business processes, identifying the right metrics and KPIs, and building top-management decision support systems that deliver business value.

Methodical approach to implementing best-practice KPIs and converting requirements into functional and technical specifications; experienced in defining test criteria and conducting User Acceptance Testing (UAT).

Experienced in production support and troubleshooting reported issues.

BW performance tuning to improve query and data loading performance.

Experience working with Open DSOs, Advanced DSOs, and composite InfoProviders.

TECHNICAL SKILLS

Data Warehousing: Snowflake, Azure Synapse, Azure Databricks

Cloud Platform: Microsoft Azure

ETL/ELT Tools: Matillion, Azure Data Factory

ERP: ECC

Functional Modules: SD, MM, FICO, COPA

Reporting Tools: Web Intelligence (WebI), BEx Query Designer, BEx Analyzer, Power BI

Programming: SQL, Python

Databases: Oracle 10g/9i/8i, Azure SQL

PROFESSIONAL EXPERIENCE

Client: Nestle Nutrition.

Duration: August 2021 to date

Role: Data Engineer

Responsibilities:

Experienced in gathering end-user requirements and preparing functional and technical build specification documents.

Evaluated Snowflake design considerations for any changes in the application.

Built the logical and physical data models for Snowflake as per the required changes.

Defined roles and privileges required to access different database objects.

Defined virtual warehouse sizing in Snowflake for different types of workloads.

Performed data validation checks before loading into Snowflake tables.

Configured the Snowpipe mechanism for continuous ingestion of data from different cloud providers.

Created Matillion ETL jobs to migrate data from on-premises systems to Snowflake.

Created data pipelines in Matillion ETL and performed data orchestration.

Enabled CDC data pipelines to capture delta records from on-premises systems to Snowflake.

Configured CI/CD pipelines to deploy new and updated software safely.

Created integration objects, file formats, and stages, and used COPY INTO/Snowpipe to ingest CSV/Parquet/JSON data continuously from Azure data lakes (see the sketch below).
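A minimal sketch of that setup via the Snowflake Python connector; the object names (azure_raw_int, json_ff, raw_stage, raw_events_pipe, raw_events) and all credentials are placeholders for illustration, not the project's actual objects:

    import snowflake.connector  # pip install snowflake-connector-python

    # Placeholder connection details for illustration only.
    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW")
    cur = conn.cursor()

    # Storage integration pointing at the Azure data lake container.
    cur.execute("""
        CREATE STORAGE INTEGRATION IF NOT EXISTS azure_raw_int
          TYPE = EXTERNAL_STAGE
          STORAGE_PROVIDER = 'AZURE'
          ENABLED = TRUE
          AZURE_TENANT_ID = '<tenant-id>'
          STORAGE_ALLOWED_LOCATIONS = ('azure://myaccount.blob.core.windows.net/raw/')""")

    # File format and external stage for the JSON feed.
    cur.execute("CREATE FILE FORMAT IF NOT EXISTS json_ff TYPE = JSON")
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_stage
          URL = 'azure://myaccount.blob.core.windows.net/raw/'
          STORAGE_INTEGRATION = azure_raw_int
          FILE_FORMAT = json_ff""")

    # Snowpipe that auto-ingests new files announced by an Event Grid notification integration.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS raw_events_pipe
          AUTO_INGEST = TRUE
          INTEGRATION = 'AZURE_EVENTGRID_INT'
          AS COPY INTO raw_events FROM @raw_stage""")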

Configured streams and scheduled tasks to automate complex queries and meet business requirements (a sketch follows).
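A minimal stream-plus-task sketch under assumed table names (raw_orders, curated_orders) and an assumed warehouse, using the same connector pattern:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="TRANSFORM_WH", database="ANALYTICS", schema="CURATED")
    cur = conn.cursor()

    # Stream records inserts, updates, and deletes on the raw table.
    cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")

    # Task wakes every 15 minutes but only runs when the stream has new data.
    cur.execute("""
        CREATE TASK IF NOT EXISTS merge_orders_task
          WAREHOUSE = TRANSFORM_WH
          SCHEDULE = '15 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
          AS MERGE INTO curated_orders t
             USING orders_stream s ON t.order_id = s.order_id
             WHEN MATCHED THEN UPDATE SET t.amount = s.amount
             WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)""")

    cur.execute("ALTER TASK merge_orders_task RESUME")  # tasks are created suspended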

Handled data migration from on-premises to the cloud by defining data pipelines and orchestration jobs in the ETL tool.

Designed complex queries following query optimization techniques to improve performance.

Developed pipelines in Azure Data Factory for various scenarios to meet business requirements, using blob storage and ingesting the data into Azure Synapse Analytics.

Handled the delta-load mechanism by creating a watermark table through Azure Data Factory (the pattern is sketched below).
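Outside ADF, the watermark pattern can be sketched in Python with pyodbc; in the actual pipelines it maps to Lookup, Copy Data, and Stored Procedure activities. The table and column names (dbo.watermark, dbo.src_orders, last_modified) and the connection string are assumptions for illustration:

    import pyodbc  # pip install pyodbc

    # Placeholder Azure SQL connection string.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myserver.database.windows.net;DATABASE=staging;UID=etl_user;PWD=***")
    cur = conn.cursor()

    # 1. Look up the old watermark and the new high-water mark.
    old_wm = cur.execute(
        "SELECT watermark_value FROM dbo.watermark WHERE table_name = ?", "src_orders").fetchval()
    new_wm = cur.execute("SELECT MAX(last_modified) FROM dbo.src_orders").fetchval()

    # 2. Copy only the delta (in ADF this is the Copy Data activity's source query).
    delta = cur.execute(
        "SELECT * FROM dbo.src_orders WHERE last_modified > ? AND last_modified <= ?",
        old_wm, new_wm).fetchall()
    # ... write `delta` to the sink (Synapse / data lake) ...

    # 3. Advance the watermark so the next run picks up where this one stopped.
    cur.execute("UPDATE dbo.watermark SET watermark_value = ? WHERE table_name = ?",
                new_wm, "src_orders")
    conn.commit()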

Created Azure data storage to hold raw data extracted from on-premises systems.

Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, and U-SQL (Azure Data Lake Analytics).

Developed SQL for data extraction, transformation, and aggregation from multiple formats to uncover insights into customer usage patterns.

Responsible for estimating cluster size, and for monitoring and troubleshooting the Spark cluster.

Experienced in application tuning: setting the right batch interval, the correct level of parallelism, and memory configuration.

Developed JSON definitions for deploying pipelines in Azure Data Factory that process data using SQL activities.

Hands-on experience developing SQL scripts for automation purposes.

Client: Sportech Inc

Company: Sparity Soft Technologies Pvt Limited

Duration: July 2017 to November 2019

Role: Data Engineer

Responsibilities:

Snowflake lead developer and Snowflake administrator.

Bulk-loaded data from the external stage (Azure Data Lake) and the internal stage into Snowflake using the COPY command.

Loaded data into Snowflake tables from the internal stage using SnowSQL.

Used the COPY, LIST, PUT, and GET commands for staging and validating internal stage files (see the sketch below).
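A minimal sketch of that staging and validation flow through the Python connector, with an assumed stage (my_int_stage), local file path, and target table (raw_orders):

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW")
    cur = conn.cursor()

    cur.execute("CREATE STAGE IF NOT EXISTS my_int_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")

    # PUT uploads and compresses the local file into the internal stage.
    cur.execute("PUT file:///tmp/orders.csv @my_int_stage AUTO_COMPRESS = TRUE")

    # LIST confirms what actually landed in the stage.
    for name, size, *_ in cur.execute("LIST @my_int_stage"):
        print(name, size)

    # Dry-run the load first, then perform the real COPY.
    cur.execute("COPY INTO raw_orders FROM @my_int_stage VALIDATION_MODE = RETURN_ERRORS")
    cur.execute("COPY INTO raw_orders FROM @my_int_stage")

    # GET pulls staged files back down locally for inspection.
    cur.execute("GET @my_int_stage file:///tmp/stage_download/")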

Imported and exported data between the internal stage (Snowflake) and the external stage (Azure Data Lake).

Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.

Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns (sketched below).
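A small FLATTEN example, assuming a hypothetical raw_orders table with a VARIANT column named payload that contains a line_items array:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="QUERY_WH", database="ANALYTICS", schema="RAW")
    cur = conn.cursor()

    # Explode each element of the line_items array into its own row.
    rows = cur.execute("""
        SELECT o.payload:order_id::string AS order_id,
               f.value:sku::string        AS sku,
               f.value:qty::int           AS qty
        FROM raw_orders o,
             LATERAL FLATTEN(input => o.payload:line_items) f
    """).fetchall()

    for order_id, sku, qty in rows:
        print(order_id, sku, qty)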

Used Snowpipe for continuous data ingestion from Azure Blob Storage.

Developed Snowflake stored procedures for executing branching and looping logic (a sketch follows).
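A minimal Snowflake Scripting sketch of branching and looping inside a procedure (the production procedures may have been written in JavaScript instead); the procedure name and the load_audit table are assumptions:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="TRANSFORM_WH", database="ANALYTICS", schema="CURATED")
    cur = conn.cursor()

    cur.execute("""
        CREATE OR REPLACE PROCEDURE backfill_days(days INTEGER)
        RETURNS STRING
        LANGUAGE SQL
        AS
        $$
        DECLARE
          i INTEGER DEFAULT 0;
        BEGIN
          -- Looping: process one day offset per iteration.
          WHILE (i < days) DO
            INSERT INTO load_audit(msg) VALUES ('processed day offset ' || i);
            i := i + 1;
          END WHILE;
          -- Branching: classify the run based on its size.
          IF (days > 30) THEN
            RETURN 'large backfill';
          ELSE
            RETURN 'incremental load';
          END IF;
        END;
        $$
    """)

    print(cur.execute("CALL backfill_days(7)").fetchone()[0])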

Created cloned objects using zero-copy cloning.

Loaded on-premises JSON/XML data into Snowflake using the SnowSQL CLI.

Experienced with the Continuous Data Protection lifecycle: Time Travel and the Fail-safe zone.

Queried historical results/data based on timestamp, offset, and query ID (see the sketch below).
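The three Time Travel forms, sketched against an assumed curated_orders table; the statement ID is a placeholder that must be swapped for a real query ID from the query history before running:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="QUERY_WH", database="ANALYTICS", schema="CURATED")
    cur = conn.cursor()

    # By relative offset: table contents one hour ago.
    cur.execute("SELECT COUNT(*) FROM curated_orders AT (OFFSET => -60*60)")

    # By absolute timestamp.
    cur.execute("""
        SELECT COUNT(*) FROM curated_orders
        AT (TIMESTAMP => '2023-06-30 09:00:00'::timestamp_ltz)
    """)

    # By query ID: state of the table just before a given statement ran.
    cur.execute("SELECT COUNT(*) FROM curated_orders BEFORE (STATEMENT => '<query-id>')")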

Experience with streams for change data capture from existing data sources into Snowflake.

Worked on zero-copy cloning: cloned databases, schemas, tables, pipes, stages, file formats, streams, sequences, tasks, etc. (see the sketch below).
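A few representative zero-copy clone statements under assumed object names (analytics, curated_orders, raw); clones share storage with the source until either side changes:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="ADMIN_WH", database="ANALYTICS", schema="PUBLIC")
    cur = conn.cursor()

    # Clone a whole database for a dev environment.
    cur.execute("CREATE DATABASE IF NOT EXISTS analytics_dev CLONE analytics")

    # Clone a single table as a cheap pre-release backup.
    cur.execute("CREATE TABLE IF NOT EXISTS curated.curated_orders_backup CLONE curated.curated_orders")

    # Combine cloning with Time Travel: the schema as it looked 24 hours ago.
    cur.execute("CREATE SCHEMA IF NOT EXISTS raw_qa CLONE raw AT (OFFSET => -86400)")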

Hands-on in deploying Snowflake features such as secure storage integration and data sharing.

Created views, secure views, and materialized views.

Experienced with Snowpipe for automating data loads.

Created data pipelines into Snowflake from different partner connects such as Matillion.

Developed SQL for data extraction, transformation, and aggregation from multiple formats to uncover insights into customer usage patterns.

Engaged end users and assisted them (troubleshooting, training on new features) with the adoption of new systems, configuration and testing, and data migration; translated user needs into technical features and engaged users to validate the MVP; gained exposure to Azure development and CI/CD.

Experienced in developing solutions using Azure Functions, Azure Blob Storage, SQL Data Warehouse, Azure Event Hubs, Azure Databricks, and Data Lake Analytics.

Client: Sportech Inc

Company: Sparity Soft Technologies Pvt Limited

Duration: October 2015 to June 2017

Role: Data Engineer

Responsibilities:

Designed the essential core building blocks that make up a technical solution, producing a modular and flexible design that meets the business requirements.

Designed solutions that are as elegant, optimized, modular, and reusable as possible, ensuring the design is flexible and extensible.

Guided the effective use of solutions and architectures to achieve business outcomes in alignment with business requirements and the overall enterprise architecture.

Provided end-to-end solution architecture design and recommendations to the vendor.

Collaborated with stakeholders to understand business goals and support delivery outcomes.

Collaborated with other architects for successful delivery of program streams and individual projects.

Collaborated with the enterprise architect to ensure that solution architectures are aligned with the overall EA and technology strategy.

Upheld architectural governance principles and good implementation design for architectural deliverables.

Contributed to the effective management of risks and issues associated with solution designs.

Explored and prototyped new tools and technologies and identified how to leverage them when industrializing solutions.

Designed and implemented the management, monitoring, security, and privacy of data using the full stack of Azure data services to satisfy business needs.

Involved in requirements-gathering sessions and supported preparing technical specification documents.

Experienced in creating the datasets required in pipelines.

Experience in developing pipelines in ADF using various activities (copy data activity, data flows, logic apps, variables, lookups, etc.).

Experienced in performing operations such as adding and merging columns using data flows as per business requirements.

Extracted data from XML files into SQL tables using Databricks with Python (sketched below).
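A minimal PySpark sketch of that flow, assuming the spark-xml package is attached to the cluster and using placeholder paths, row tags, column names, and JDBC connection details:

    from pyspark.sql import SparkSession

    # On a Databricks cluster `spark` already exists; this keeps the sketch self-contained.
    spark = SparkSession.builder.getOrCreate()

    # Read the XML file; "order" is an assumed row tag for the repeating element.
    df = (spark.read.format("xml")
          .option("rowTag", "order")
          .load("abfss://raw@mystorageaccount.dfs.core.windows.net/orders/orders.xml"))

    # Light reshaping before landing the data in SQL.
    clean = df.selectExpr("order_id", "customer_id", "cast(amount as double) as amount")

    # Write to an Azure SQL table over JDBC (placeholder server, database, and credentials).
    (clean.write.format("jdbc")
          .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=staging")
          .option("dbtable", "dbo.orders")
          .option("user", "etl_user")
          .option("password", "***")
          .mode("append")
          .save())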

Worked on Azure data modeling, extraction, and reporting.

Experienced in creating stored procedures to handle junk values in SQL tables.

Experienced in creating flat files and designing the corresponding flows in ADF.

Worked on extracting data from various file formats such as JSON, XML, and CSV.

Migrated data to Microsoft Azure data lakes and then to Snowflake. Experienced in configuring the Azure CI/CD setup.

Created pipelines in ADF using linked services, datasets, and pipelines to extract, transform, and load data between sources such as Azure SQL, Blob Storage, and Azure SQL Data Warehouse, including write-back.

Client: Multiple betting clients

Company: North Alley India Pvt Limited, India

Duration: January 2014 to August 2015

Role: Solution Designer

Responsibilities:

Worked on data modeling, extraction, and data administration.

Created various types of views (attribute views, analytic views, and calculation views) to provide real-time analytics to users.

Created build specifications and technical test scripts.

Created or validated CRs and manual actions and followed up on CR implementation in pre-production/production.

Supported functional teams for new projects, meetings/workshops/telcos, and pilot or feasibility checks for new projects and prototype BA projects.

Coordinated design and build, making use of nearshore/offshore resources.

Supported hypercare and projects in the pilot or prototype phase.

Raised and followed up on customer messages.

Helped the FICO D&B and RBS teams with day-to-day activities.

Validated reports for decommissioning and coordinated with the offshore team for deletion.

Attended scrum calls to review development progress, issues, and preparation of the product backlog.

Designed and developed implementation strategies for the Project.

Organized daily meetings with offshore and onshore teams to gather requirements.

Prepared and implemented the LSA architecture design across all modules.

Performed testing at various stages of implementation and validated the results against the requirements and the existing landscape.

Conducted meetings with Basis to release the BW transports to production environments.

Attended daily meetings with the onsite and COE teams to give updates on development.

Involved in performance tuning of information models.

Loaded data into the developed BW objects to perform testing in lower environments.

Scheduled the Web Intelligence reports at the frequency users needed.

Provided end-user assistance to resolve report data issues.

Worked on Solution Manager to upload the technical and functional specs.

Extensively worked on data modeling using Attribute View, Analytic View, and Calculation View.

EDUCATION DETAILS:

Completed a bachelor's degree in computers in May 2014 at Karunya University, Coimbatore, India.

Completed a master's degree in computer information systems at New England College, United States, in May 2021.


