
ETL Developer (Informatica)

Location:
Parma Heights, OH
Posted:
April 29, 2023


Priya Saravana Kumar

440-***-**** adwtaf@r.postjobfree.com Cleveland, OH

Summary:

ETL Developer with 5+ years of experience building enterprise-grade, data-intensive applications, tackling architectural and scalability problems, and collecting and sorting data. Experienced in working across all stages of the data pipeline, including acquisition, integration, ODS, and real-time data marts. Adept at working quickly and efficiently in close collaboration with analytics, engineering, and business stakeholders.

Skills:

Databases: Oracle 11g, MS SQL Server, Snowflake DB

Languages: Python, SQL

ETL/ELT: Informatica PowerCenter 10.2/9.6, Azure Data Factory

Scripting: UNIX, Linux, Shell Scripting (Bash)

Scheduling Tools: Tidal, Autosys, Control-M

Analytics/Visualization: Power BI, Tableau

Cloud Data Warehouses: Snowflake, Azure Synapse

Cloud Data Engineering Platforms: Azure, AWS, IICS

Education/Certification:

● Master of Science in Information Technology, Bharathidasan University, India

● Bachelor of Computer Applications, Bharathidasan University, India

● Microsoft Azure Fundamentals Certification

Work Experience:

ETL Developer, FiveBrains, India May 2012 to Apr 2016

Responsibilities:

● Gathered requirements from business owners and wrote specifications based on the required business rules; structured, analyzed, and visualized the enterprise's current state; designed the future state; and guided the change process from current to future state.

● Prepared technical design documents for various processes, following Informatica best practices, to meet the business requirements.

● Designed and developed complex mappings to load data using various transformations such as Source Qualifier, Sequence Generator, Expression, Lookup, Aggregator, Router, Rank, Filter, Update Strategy, and Stored Procedure.

● Responsible for identifying bottlenecks and tuning the performance of Informatica mappings and sessions.

● Tuned complex mappings to reduce the total ETL processing time.

● Used parallel processing capabilities, session partitioning, and target table partitioning utilities.

● Worked on CDC (Change Data Capture) and implemented SCD (Slowly Changing

Dimensions) Type 1, Type 2, and Type 3.

● Migrated code changes to QA/PROD environments, provided support through QA/PROD sign-off, and troubleshot and fixed issues.

● Automated ETL processes across millions of rows of data, reducing monthly manual workload by 32%.
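The SCD Type 2 pattern mentioned above can be sketched in plain Python with the standard-library sqlite3 module. This is a minimal illustration of the general technique, not code from the projects described here; the table and column names (dim_customer, cust_id, city, eff_date, end_date, is_current) are hypothetical.

```python
import sqlite3

# Hypothetical dimension table for illustrating SCD Type 2:
# each business key keeps a history of rows, with exactly one
# row flagged is_current=1 at any time.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        cust_id  INTEGER,
        city     TEXT,
        eff_date TEXT,
        end_date TEXT,
        is_current INTEGER)
""")

def scd2_upsert(cust_id, city, load_date):
    """If the tracked attribute changed, expire the current row
    and insert a new current row (SCD Type 2 versioning)."""
    cur.execute(
        "SELECT city FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,))
    row = cur.fetchone()
    if row and row[0] == city:
        return  # no change: nothing to do
    if row:
        # close out the previous version of the record
        cur.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 "
            "WHERE cust_id=? AND is_current=1",
            (load_date, cust_id))
    # insert the new current version with an open-ended end date
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (cust_id, city, load_date))
    conn.commit()

scd2_upsert(101, "Cleveland", "2023-01-01")
scd2_upsert(101, "Parma Heights", "2023-04-01")  # change -> new version
print(cur.execute(
    "SELECT COUNT(*) FROM dim_customer WHERE cust_id=101").fetchone()[0])  # 2
```

In an Informatica mapping the same decision logic is typically expressed with a Lookup on the dimension's current row feeding an Update Strategy transformation, rather than hand-written SQL.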

Informatica Developer, Curvet Tech, India Feb 2011 to May 2012

Responsibilities:

● Involved in all phases of the SDLC, from requirements gathering, design, development, and testing to user training and warranty support.

● Actively interacted with business users to record user requirements and perform business analysis.

● Translated requirements into business rules and made recommendations for innovative IT solutions.

● Outlined the complete process flow and documented the data conversion, integration

and load mechanisms to verify specifications for this data migration project.

● Involved in documenting the data mapping and ETL specifications for source-to-target development.

● Maintained warehouse metadata, naming standards and warehouse standards for future

application development.

● Created the design and technical specifications for the project's ETL process.

● Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.

● Responsible for mapping and transforming existing feeds into the new data structures and standards, using Router, connected and unconnected Lookup, Expression, Aggregator, and Update Strategy transformations.

● Performed performance tuning of the process at the mapping, session, source, and target levels.

● Created workflows containing command, email, session, and a variety of other tasks.

● Tuned mappings against performance criteria and created partitions to resolve performance issues.

● Performed end-to-end testing using scripts to verify and diagnose mapping failures.
