Post Job Free

Data Warehouse Engineer

Location:
Boston, MA
Posted:
April 18, 2024

VENKATA VARUN BATHINI

ad432u@r.postjobfree.com

+1-508-***-****

New England College, Henniker, NH May 2022

MBA, Cybersecurity Major.

New York Institute of Technology, Old Westbury, NY Dec 2016

M.S., Information, Network and Computer Security.

Vellore Institute of Technology (VIT) University, Vellore, India May 2014

Bachelor of Technology in Computer Science and Engineering.

Skills

ETL Tools

Informatica PowerCenter 10.x/9.6.1, B2B Data Exchange

Databases

Oracle 11g/10g, MS SQL Server, Teradata 15/14/13/12, MySQL, DB2, Sybase, flat files, XML.

Programming Languages/Scripting

SQL, PL/SQL, Unix shell, C

Tools

Toad, SQL*Plus, SQL Assistant, SQL Developer, AOTS, ServiceNow, Rally, Advanced Excel, MS Access, TWS Maestro, DB Viewer, Citrix, JIRA, Pix4D, ArduPilot, ADF, Databricks

Operating Systems

Windows, Unix, Linux

Summary

Data Engineer with around seven years of experience orchestrating seamless system onboarding, executing data migrations, and developing robust ETL processes. Proficient in crafting SQL scripts, collaborating with cross-functional teams, and leveraging technologies including Informatica and a range of databases. Adept in Maestro scheduling, Control-M scheduling, and Unix scripting. Experienced in populating staging tables, managing data copy requests, and architecting data marts for streamlined reporting. Skilled in translating intricate business requirements into efficient mapping and workflow solutions.

Extensively used ETL to load data from Oracle database, Flat files to Data Warehouse.

Involved in production support for the Data Warehouse Informatica jobs.

Modified the existing mappings and workflows as per requirement changes and migrated them back to production environment.

Created robust and complex workflows and worklets using Informatica Workflow Manager and troubleshot data load problems.

Wrote SQL queries, triggers, and shell scripts to apply and maintain business rules.

Translated business requirements into Informatica mappings/workflows.

Developed mappings to extract data from ODS to Data Mart, and monitored daily, weekly, and monthly loads.

Created and monitored sessions/batches using Informatica Workflow Manager/Workflow Monitor to load data into the target Oracle database.

Worked with mapping variables, mapping parameters, and variable functions such as SetVariable, SetCountVariable, SetMinVariable, and SetMaxVariable.

Created ingestion pipelines using ADF (Azure Data Factory) based on source requirements and scheduled them with event-based and schedule triggers.

Tested data pipelines using standard test templates before deploying them to higher environments.

Developed notebooks for handling transaction data and other transformations using Azure Databricks.

Onboarded the Phoenix system to the existing data warehouse and was responsible for automating new files to feed downstream systems.

Successfully implemented data migration from the existing retirement system, Reflex, to Empower.

Extensively worked on Data Extraction, Transformation, and Loading with CSV files, flat files, and mainframe files.

Possess in-depth knowledge of developing SQL scripts using SQL functions, grouping operations, subqueries, analytical functions, and joins to test DW projects.
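
As an illustration (not taken from the resume), a typical DW test script of this kind reconciles row counts between a staging table and its warehouse target; the table names below are placeholders:

```shell
#!/bin/sh
# Hypothetical sketch: generate a row-count reconciliation query for a
# source table vs. its warehouse target. Table names are illustrative.

build_count_check() {
  # $1 = source table, $2 = target table
  cat <<SQL
SELECT s.src_cnt, t.tgt_cnt,
       CASE WHEN s.src_cnt = t.tgt_cnt THEN 'PASS' ELSE 'FAIL' END AS status
FROM (SELECT COUNT(*) AS src_cnt FROM $1) s
CROSS JOIN (SELECT COUNT(*) AS tgt_cnt FROM $2) t;
SQL
}

# Print the query for a sample source/target pair.
build_count_check stg_orders dw_orders
```

The generated statement would then be run through SQL*Plus or SQL Assistant against the relevant database.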

Worked with the release team to perform migration activities.

Created build requests to perform code check-ins and promote code to higher environments.

Created deployment groups to perform migration activities.

Integrated various applications into the consolidated routing system.

Created ISQL scripts to load data into stage tables and the final tables.

Created Maestro scripts to schedule Informatica workflows.
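
A Maestro/TWS job typically invokes a small wrapper script that starts the workflow through pmcmd. The sketch below is a hypothetical example; the service, domain, folder, and workflow names are placeholders, and credentials would normally come from an encrypted variable:

```shell
#!/bin/sh
# Hypothetical wrapper a TWS/Maestro job might call to start an Informatica
# workflow via pmcmd. All connection values below are placeholders.

status_message() {
  # Map a pmcmd exit code to a log message (0 = success).
  if [ "$1" -eq 0 ]; then
    echo "workflow completed successfully"
  else
    echo "workflow failed with exit code $1"
  fi
}

run_workflow() {
  # -wait blocks until the workflow finishes so the scheduler sees the
  # real completion status.
  pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" \
      -p "$INFA_PWD" -f FOLDER_SALES -wait "$1"
  status_message $?
}

# Example invocation (commented out; pmcmd exists only on an Informatica
# client host):
# run_workflow wf_load_orders
```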

Involved in all vital phases of the software development life cycle, including Business Requirements Analysis, Application Design, Development, Testing, Implementation, and Support for Enterprise Data Warehouse and Client/Server applications.

Populated the staging tables with various sources like flat files (Fixed Width and Delimited) and Sybase.

Effectively used JIRA to document requirements, technical design, and test cases.

Conducted meetings with end-users to detail business needs and determine/define the scope of data and code fixes and supported user acceptance testing.

Used Sybase load utilities for data loading and UNIX shell scripting for the file validation process.
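
A minimal sketch of the kind of flat-file validation described above, assuming a pipe-delimited file with a header row (the delimiter and layout are assumptions): reject an empty file or any record whose field count differs from the header.

```shell
#!/bin/sh
# Hypothetical pre-load validation for a pipe-delimited flat file.

validate_file() {
  # $1 = path to a pipe-delimited file with a header row
  f=$1
  [ -s "$f" ] || { echo "FAIL: $f is empty or missing"; return 1; }
  # Field count expected on every record = field count of the header.
  expected=$(head -n 1 "$f" | awk -F'|' '{print NF}')
  # Count records whose field count differs from the header's.
  bad=$(awk -F'|' -v n="$expected" 'NF != n {c++} END {print c+0}' "$f")
  if [ "$bad" -eq 0 ]; then
    echo "PASS: all records have $expected fields"
  else
    echo "FAIL: $bad record(s) with unexpected field count"
    return 1
  fi
}
```

A scheduler job would run this check and load the file only on a zero exit status.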

Worked on TWS scheduling and Unix scripting.

Modified the existing mappings and workflows as per requirement changes and migrated them back to a higher environment.

Collaborated with the release team to execute migration activities.

Worked with the DBA team to increase space in the respective databases to perform DML operations.

Heavily involved in performing DCR (Data Copy Request) activities to copy data from production to the test environment.

Populated the staging tables with various sources like flat files (Fixed Width and Delimited), Oracle, and MySQL.

Analyzed SCD1 and SCD2 implementations as per the project requirements to load Dimension and Fact tables.
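
For context, the SCD2 pattern mentioned above can be sketched as the SQL a load script might submit: expire the current dimension row when a tracked attribute changes, then insert the new version. The table and column names (cust_dim, stg_cust, eff_date, end_date, curr_flag) are illustrative, not from the resume:

```shell
#!/bin/sh
# Hedged sketch: emit the two-statement SCD Type 2 load SQL.

scd2_sql() {
  cat <<'SQL'
-- Expire the current row for customers whose tracked attribute changed.
UPDATE cust_dim d
SET    end_date = CURRENT_DATE, curr_flag = 'N'
WHERE  d.curr_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_cust s
               WHERE s.cust_id = d.cust_id
               AND   s.address <> d.address);

-- Insert a new current version for new and changed customers
-- (changed customers no longer have a curr_flag = 'Y' row).
INSERT INTO cust_dim (cust_id, address, eff_date, end_date, curr_flag)
SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_cust s
WHERE  NOT EXISTS (SELECT 1 FROM cust_dim d
                   WHERE d.cust_id = s.cust_id
                   AND   d.curr_flag = 'Y');
SQL
}

scd2_sql
```

An SCD1 load would instead overwrite the changed attribute in place, with no end-date bookkeeping.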

Configured Managed File Transfer (MFT) to securely transfer files to various targets.

Collaborated with System Analysts (SAs) to convert requirements and present them to the client.

Used various Teradata load utilities for data loading and employed UNIX shell scripting for the file validation process.

Analyzed data warehouse data to build various data marts for reporting purposes.

Developed and supported the Extraction, Transformation, and Load (ETL) process using Informatica Power Center to populate Teradata tables and flat files.

Worked on Control-M scheduling and Unix Scripting.

Extensively utilized ETL processes to load data from the Oracle database and flat files into the Data Warehouse.

Modified existing mappings and workflows in response to requirement changes, migrating them back to a higher environment.

Created robust and complex workflows and worklets using Informatica Workflow Manager and troubleshot data load problems.

Wrote SQL queries, triggers, and shell scripts to apply and maintain business rules.

Translated business requirements into Informatica mappings and workflows.

Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map sources to the target.

Analyzed SCD1 and SCD2 implementations based on project requirements to load Dimension and Fact tables.

Employed Informatica Designer to create complex mappings using different transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Filter, and Router transformations to pipeline data to Data Warehouse/Data Marts.

Developed mappings to extract data from ODS to Data Mart, and monitored daily, weekly, and monthly loads.

Created and monitored sessions/batches using Informatica Workflow Manager/Workflow Monitor to load data into the target Oracle database.

Worked with mapping variables, mapping parameters, and variable functions such as SetVariable, SetCountVariable, SetMinVariable, and SetMaxVariable.

Data Engineer, MassMutual, Springfield, MA Apr 2019 – Dec 2022

Sr. ETL Developer, Anthem, Richmond, VA Dec 2018 – Mar 2019

ETL Informatica Developer, Perfume Center of America Inc, Ronkonkoma, NY Mar 2017 – Nov 2018


