
Azure Data Engineer

Location: Kirkland, WA
Salary: $100k/annum
Posted: August 16, 2023

Resume:

Tejaswini (Azure Data Engineer)

https://www.linkedin.com/in/tejaswini-nandam-195b39255/

Contact: 425-***-****
Address: Redmond, WA
Email: adyzew@r.postjobfree.com

PROFESSIONAL SUMMARY:

• 8+ years of IT experience as a Data Engineer and Data Test Engineer, spanning ETL development, ETL testing, BI development and testing across various data sources, data quality, and BI platforms such as Power BI.

• Designed and developed high-quality data pipelines using Azure Data Factory and Databricks to support reporting and data analysis.

• Experience migrating several on-premises databases to the cloud.

• Experienced in migrating on-premises databases to Azure SQL Database and Managed Instance.

• Experienced in migrating existing pipelines from Azure Data Factory to Databricks Workflows.

• Experienced in developing pipelines and working with Azure Cosmos DB.

• Experience working with Cosmos DB and Storage Tables when creating pipelines.

• In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, and Spark MLlib.

• Worked with Spark RDDs to perform transformations on data.

• Worked on data processing and transformation actions in Spark using Python (PySpark); an illustrative sketch follows this summary.

• Good knowledge of optimization techniques in Spark.

• Worked on performance tuning and optimization of SQL queries.

• Experienced in working with Delta Tables for data analysis.

• Work experience with REST APIs in Databricks to extract, clean, and analyze data.

• Exceptional analytical and critical thinking skills. Delivers a proactive approach, great work ethic, and ability to function well in fast-paced/deadline-driven team environments.

• Collaborated with data architects/engineers to establish data governance and security (Key Vault, Service Principal, SAS, network security at the schema and row level), resource groups, integration runtime settings, and aggregate functions for Databricks development.

• Hands-on experience using Git and Bitbucket for source code branching and merging.

• Good Experience in developing Stored Procedures, Functions, Views, Exception Handling and DB Queries.

• Using JIRA and Agile Process to plan and track the requirements of the Project.

• Wrote and optimized SQL scripts for extracting and processing large volumes of data.

• Worked with Azure Logic Apps alongside Azure Data Factory.

• Strong working knowledge of MS Excel and advanced data manipulation features (Pivot Tables, VLOOKUP, ODBC, VBA).

• Experienced in SQL optimization to improve the performance of reports.

• Great communication skills and the ability to work across organizational boundaries.

• Played a key role in gathering requirements from clients with effective communication skills.

• Implemented Power BI reports for data analysis.
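
A minimal sketch of the kind of PySpark transformation work summarized above; the dataset, column names, and paths are hypothetical, and Delta Lake is assumed to be available (as in Databricks):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-cleanse").getOrCreate()

# Read raw CSV files (path and schema are placeholders).
raw = spark.read.option("header", True).csv("/mnt/raw/sales/*.csv")

# Typical cleansing steps: cast types, drop bad rows, parse dates.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Aggregate for reporting and persist as a Delta table for downstream analysis.
daily = cleaned.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_sales")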

EDUCATION:

• Bachelor of Technology [Computer Science Engineering], JNTU University, India, 2010

TECHNICAL SKILLS:

BI Reporting Tools: Microsoft Power BI Desktop, Power BI Service, Tableau 2021.x
Databases: SQL, NoSQL, SQL Server, MS Access, Oracle, MySQL, PostgreSQL, Azure Cosmos DB (NoSQL), Azure SQL DB, Azure SQL Data Warehouse
Version Control: Git, Bitbucket, Microsoft Azure DevOps/Repos
Programming Languages: Python (Pandas, NumPy, Matplotlib, NLTK), PySpark, Scala

Cloud Platform: Microsoft Azure - ADFv2, Blob Storage, Azure SQL DB, SQL Server, Azure Analytics services, Databricks, Azure Cosmos DB, Azure Data Lake (Gen2), Azure Event Hub, Git repository management, ARM Templates
ETL: Azure Data Factory, Databricks

Certification(s):

• Azure Data Engineer Associate Certified.

WORK EXPERIENCE:

Data Engineer
Client: Corteva Agriscience, Iowa (Dec 2021 - Present)
Roles and Responsibilities:

• Designed and developed data pipelines using Azure Data Factory and Databricks.

• Involved in migrating on-premises databases to Azure SQL Database and Managed Instance.

• Involved in migrating existing pipelines from ADF to Databricks Workflows.

• Created pipelines, data flows, and data transformations/manipulations using ADF.

• Mounted Azure Data Lake Storage in Databricks and created an ADF pipeline with a Databricks notebook activity to cleanse the data (see the sketch at the end of this section).

• Created and analyzed data using Delta tables.

• Developed data mapping, transformation, and cleansing rules for data management involving OLTP and OLAP systems.

• Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.

• Designed pipelines against Azure Cosmos DB and other data stores.

• Used Cosmos DB and Storage Tables as both sources and sinks when creating pipelines.

• Managed Azure Data Lake Storage (ADLS) Gen2 and Data Lake Analytics, including integration with other Azure services.

• Reviewed and analyzed existing SQL code and stored procedures.

• Experience working in an Agile development environment.
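
A minimal sketch of the ADLS mount and notebook cleansing pattern referenced above, as it might appear in a Databricks notebook invoked by an ADF notebook activity; the secret scope, storage account, container, and paths are all hypothetical placeholders:

# Mount an ADLS Gen2 container with a service principal (all names are placeholders).
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get("kv-scope", "sp-client-id"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("kv-scope", "sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://raw@examplestorageacct.dfs.core.windows.net/",
    mount_point="/mnt/raw",
    extra_configs=configs,
)

# Cleansing step run by the ADF Databricks notebook activity: deduplicate,
# drop rows missing a key, and land the result as a Delta table.
df = spark.read.json("/mnt/raw/events/")
df = df.dropDuplicates().na.drop(subset=["event_id"])
df.write.format("delta").mode("append").save("/mnt/curated/events")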

Data Engineer
Client: 7-Eleven, Texas (Apr 2020 - Dec 2021)
Roles and Responsibilities:

• Planned, designed, and developed an application on the Microsoft Azure cloud.

• Designed and developed Azure Data Factory pipelines to extract data from relational sources such as Teradata, Oracle, SQL Server, Sybase, Postgres, and DB2, and from non-relational sources such as flat files and SharePoint.

• Developed Azure Databricks notebooks to apply business rules and perform data profiling.

• Designed and developed Databricks Scala and Python notebooks to join and aggregate datasets and to process files stored in Azure Data Lake Storage (a short sketch appears at the end of this section).

• Developed interfaces in Azure Databricks to import/export datasets to and from Azure Data Lake Storage.

• Created external tables in Azure Synapse Analytics (Azure SQL Data Warehouse) for reporting purposes.

• Designed and developed stored procedures to apply complex business transformations in Azure SQL Data Warehouse.

• Developed Azure Data Factory pipelines to copy historical data from an on-prem Teradata database into ADLS.

• Designed and developed Oracle GoldenGate components (manager, extract, replicat, data pump, collector) to enable real-time data movement between source and destination databases.

• Followed Agile methodology for the development, test, and deployment process.

• Interacted daily with business stakeholders and product owners to understand and convert business requirements into technical specifications.
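
A minimal sketch of the join/aggregate notebook pattern mentioned above; the datasets, columns, and mount paths are hypothetical, and spark is the session Databricks provides:

from pyspark.sql import functions as F

# Both datasets are assumed to live under an ADLS Gen2 mount (paths are placeholders).
orders = spark.read.parquet("/mnt/datalake/orders/")
stores = spark.read.parquet("/mnt/datalake/stores/")

# Join transactional data to store reference data, then aggregate per region.
sales_by_region = (
    orders.join(stores, on="store_id", how="inner")
          .groupBy("region")
          .agg(
              F.countDistinct("order_id").alias("order_count"),
              F.sum("order_total").alias("revenue"),
          )
)

# Write the aggregate back to the lake for reporting (e.g., to load into Synapse).
sales_by_region.write.mode("overwrite").parquet("/mnt/datalake/curated/sales_by_region/")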

Data Engineer
Client: Ameriprise Financial, Minnesota (Dec 2017 - Apr 2020)
Roles and Responsibilities:

• Worked with the credit team to understand the context and requirements of the dashboard.

• Used Power BI to consume data from different sources and communicated the data anomalies to the source teams.

• Created measures and calculated columns using time-intelligence DAX queries, which was helpful for trend analysis.

• Published the report to the cloud and set up a deployment lifecycle for future enhancements.

• Created a Key Influencers report in Power BI to understand factors contributing to delinquencies.

• Implemented Power BI reports for data analysis.

• Followed Agile methodology for the development, test, and deployment process.

ETL Test Engineer
Client: HP, Oregon (Aug 2015 - Dec 2017)
Roles and Responsibilities:

• Designed and developed data pipelines using SSIS.

• Executed SQL queries to perform backend database testing and to verify/modify user data (a brief validation sketch follows this section).

• Scheduled and managed jobs.

• Executed test scripts against source databases and validated rules against the mapping documentation.

• Worked with extraction routines to verify that all required data loads into target systems.

• Performed technical testing and analysis using SQL and automated testing tools.

• Performed data validations and data transformations.

• Implemented Power BI reports for data analysis.

• Reviewed and analyzed existing SQL code and stored procedures.

• Performed unit testing and documented unit test results.

• Performed performance testing and tuning of various reports and dashboards.

• Experience working in an Agile development environment.

• Involved in sprint planning, daily calls, and retrospectives.
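
A minimal sketch of the kind of backend validation described above, expressed as a small Python check; pyodbc is one common choice, and the DSNs, credentials, and table names are hypothetical:

import pyodbc

# Connection strings are placeholders; real credentials would come from configuration.
SRC = "DSN=source_db;UID=tester;PWD=placeholder"
TGT = "DSN=target_dw;UID=tester;PWD=placeholder"

def scalar(conn_str: str, sql: str):
    # Run a query that returns a single value and fetch it.
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(sql).fetchone()[0]

def check_row_counts(table: str) -> None:
    # Verify the target table loaded the same number of rows as the source.
    src = scalar(SRC, f"SELECT COUNT(*) FROM {table}")
    tgt = scalar(TGT, f"SELECT COUNT(*) FROM {table}")
    assert src == tgt, f"{table}: source={src}, target={tgt}"

for t in ["dim_customer", "fact_orders"]:  # tables are illustrative
    check_row_counts(t)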


