REENA PATEL
ETL Informatica Developer
Contact: 209-***-****
Email: **************@*****.***
PROFESSIONAL SUMMARY:
●ETL Informatica expert with over 8 years of IT experience, specializing in the analysis, design, development, implementation, and troubleshooting of Data Warehouse applications.
●Extensive experience in Informatica PowerCenter 10.2, 9.6.1, 9.1, along with a strong background in ETL skills for Data Warehousing using Informatica.
●Strong understanding of the Data Warehouse project development life cycle, with expertise in documenting all phases of DWH projects.
●Excellent knowledge in identifying performance bottlenecks and tuning Informatica Load for better performance and efficiency.
●Strong experience in implementing CDC using Informatica PowerCenter.
●Extensively worked on Informatica PowerCenter transformations such as Expression, Joiner, Sorter, Filter, Router, and other transformations as required.
●Good understanding of Data Warehouse Concepts like Dimension Tables, Fact tables, Slowly Changing Dimensions, Data Marts, and Dimensional modeling schemas.
●Experience in data modeling: dimensional modeling, E-R modeling, and OLTP and OLAP in data analysis. Very familiar with SCD1 and SCD2 in snowflake and star schemas.
●Strong experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Marts and Data Warehouse using Informatica PowerCenter components.
●Strong Experience in developing Sessions/Tasks, Worklets, and Workflows using Workflow Manager tools - Task Developer, Workflow & Worklet Designer.
●Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
●Experience in Performance tuning of the ETL process using pushdown optimization and other techniques. Reduced the execution time for huge volumes of data for company merger projects.
●Good experience in writing UNIX shell scripts, automation of ETL processes, error handling, and auditing purposes.
●Skilled in creating & maintaining ETL Specification Documents, Use Cases, Source to Target mapping, Requirement Traceability Matrix, performing Impact Assessment, and providing Effort estimates, deployment artifacts.
●Assisted other ETL developers in solving complex scenarios and coordinated with source system owners on day-to-day ETL progress monitoring.
●Have 4 years of coordination experience with offshore/onshore, Data Analyst, and Data Modeler teams for requirement gathering, issue fixing, design processes, and analysis of new and existing sources.
●Provide work estimation for change requests with deliverables timeline.
●Worked on Change Requests in ServiceNow with deployment tools (IBM uDeploy, Artifactory, GIT, Harvest) for end-to-end delivery, including signoff activities from TCoE (Testing Team) and the ETL Architect for ETL and SQL performance improvements. Able to meet tight deadlines through quick learning, status reporting, cooperativeness, and hard work.
●Performed ad-hoc requirements and POCs for new initiatives and projects, including a POC on migration of ETL Informatica PowerCenter 9.6 to 10.2 and an environment upgrade.
EDUCATION:
●Bachelor of Computer Applications, Gujarat University, India, 2014
●CERTIFICATION: Azure AZ-900
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 10.2, 10.1, 9.6.1, 9.1.1, Informatica PowerExchange
Databases: Oracle 12c/11g/10g, Teradata 16/14, SQL Server 2008, MySQL
Scheduling: Control-M, CA7, Informatica Scheduler
Scripting: UNIX Shell Scripting
Tools: uDeploy, Artifactory, GIT, Harvest, ServiceNow, JIRA, VersionOne, Erwin, Putty/WinSCP

PROFESSIONAL EXPERIENCE:
Client: PNC, Cleveland, OH
Role: ETL Informatica Developer
Duration: Sep 2021 to Present
Project: PNC Enterprise Data Management technology (EDMT)
Project Description:
The PNC Enterprise Data Management Technology (EDMT) project aims to establish an enterprise architecture compliant with Basel II regulations. The EDMT platform, known as the Enterprise Data Warehouse (EDW), facilitates the flow of data from source systems into application-specific data marts. These data marts store information required by downstream applications for calculations and reporting purposes. The architecture, inspired by IBM Banking Data Warehouse (BDW) design, provides an integrated data environment supporting decision-making, reporting, and compliance requirements. The project involves sourcing data into a central hub and utilizing business intelligence functionality to distribute data across different lines of business.
Responsibilities:
●Conducted requirement gathering and analysis sessions with Data Analyst and Data Modeler teams for new source systems and changes to existing ones.
●Developed and managed code using Version One project management tool, ensuring efficient Agile project management from backlog to reporting.
●Participated in daily, weekly, and bi-weekly status meetings with Project Leads, Project Managers, Clients, and offshore/onshore teams, providing updates on project development.
●Resolved technical and functional issues, escalating critical matters to supervisors as needed, and performed estimation analysis of code design and deliverables.
●Conducted performance tuning for Informatica and Teradata processes, optimizing mapping, session, and workflow levels to enhance overall system performance.
●Conducted code reviews, unit test case documentation, and defect prevention activities, ensuring adherence to quality standards.
●Scheduled ETL processes using CA7 automation, deployed developed code using uDeploy and Artifactory, and coordinated production setup and implementation with application support teams.
●Assisted the Support Team in resolving production issues and conducted knowledge transfer sessions for existing applications
Environment: Informatica PowerCenter 10.1, Teradata, Teradata TPT utility, Performance tuning, Unit test case documents, Design, IQA, EQA documents, Code design, POC, DDL level, uDeploy, Artifactory, VersionOne, Putty/WinSCP, Teradata Data Mover, CA7 scheduling, IGC data lineage, Agile, Scrum, ALM tool for defect tracking
Client: Horizon BCBSNJ, Newark, NJ
Role: Informatica Developer
Duration: Apr 2019 to Sep 2021
Project: BCBS enterprise data warehouse
Horizon Blue Cross Blue Shield of New Jersey provides health insurance coverage. This project integrates data from multiple applications into an enterprise data warehouse, consolidating data from a variety of sources into a central warehouse to solve multiple use cases and support downstream applications. The project falls under the government programs Medicaid and Medicare and integrates data from various legacy source systems such as Facets, QNXT, Leon, etc.
Responsibilities:
●Involved in all phases of the SDLC: requirement gathering, design, development, testing, production deployment, user training, and production support.
●Actively involved in interacting with business users to record user requirements and Business Analysis.
●Outlined the complete process flow and documented the data conversion, integration, and load mechanisms to verify specifications for this project.
●Translated high-level design specifications into simple ETL code and mapping standards.
●Worked with Power Center Designer tools in developing mappings and Mapplets to extract and load the data from flat files and SQL server database.
●Maintained warehouse metadata, naming standards and warehouse standards for future application development.
●Created the design and technical specifications for the ETL process of the project.
●Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform and load data into staging tables from various sources.
●Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Connected and Unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
●Worked with various complex mappings and designed Slowly Changing Dimension Type 1 and Type 2.
●Performance tuning of the process at the mapping level, session level, source level, and the target level.
●Created Workflows containing command, email, session, decision, and a wide variety of tasks.
●Tuned mappings against performance criteria and created session partitions when performance issues arose.
●Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.
●Resolved tickets raised by the QA team based on their priority levels.
●Developed Parameter files for passing values to the mappings as per requirement.
●Scheduled batches and sessions within Informatica using the Informatica scheduler and customized pre-written shell scripts for job scheduling.
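The Slowly Changing Dimension Type 2 logic mentioned above can be sketched outside the tool in plain Python. This is an illustrative sketch only, not the Informatica implementation (which used Update Strategy transformations); the `cust_id` and `addr` column names are hypothetical placeholders.

```python
from datetime import date

def apply_scd2(dim, incoming, today=date(2020, 1, 1)):
    """Sketch of SCD Type 2: expire the current row when a tracked
    attribute changes, then insert a new current version (history kept)."""
    out = list(dim)
    current = {r["cust_id"]: r for r in out if r["is_current"]}
    for src in incoming:
        cur = current.get(src["cust_id"])
        if cur is None:
            # brand-new key: insert its first version
            out.append({**src, "eff_date": today, "end_date": None, "is_current": True})
        elif cur["addr"] != src["addr"]:
            # attribute changed: close the old version, open a new one
            cur["end_date"] = today
            cur["is_current"] = False
            out.append({**src, "eff_date": today, "end_date": None, "is_current": True})
        # unchanged rows are left alone; Type 2 never overwrites history
    return out
```

The same decision (new key, changed attribute, unchanged row) is what the Router and Update Strategy transformations express inside a PowerCenter mapping.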
Environment: Informatica Power Center 10.2, Oracle 12c, UNIX Shell Scripts, Unix, Erwin 9.3, Control M, Teradata 16
Client: Prudential Financial, Newark, NJ
Role: Informatica Developer
Duration: Sept 2017 to Apr 2019
Project: Customer Data Mart
Prudential Financial, Inc., together with its subsidiaries, provides insurance, investment management, and other financial products and services in the United States and internationally.
The objective of this project is to integrate customer data from multiple source systems into a data mart and support stakeholders in making better decisions. It helps users and businesses make decisions by consolidating their data.
Responsibilities:
●Involved in all phases of SDLC from requirement gathering, design, development, testing, UAT, deployment to production environment.
●Actively involved in interacting with business users to record user requirements.
●Involved in Analysis, profiling and cleansing of source data and understanding the business process.
●Translated requirements into business rules & made recommendations for innovative IT solutions.
●Outlined the complete process flow and documented the data conversion, integration, and load mechanisms to verify specifications for this data migration project.
●Involved in documentation of Data Mapping & ETL specifications for development from source to target.
●Implemented Change Data Capture (CDC) while integrating the enterprise data sources.
●Translated high-level design specifications into simple ETL code and mapping standards.
●Worked with Power Center Designer tools in developing mappings and Mapplets to extract and load the data from flat files and Oracle database.
●Maintained naming standards and warehouse standards for future application development.
●Created the design and technical specifications for the ETL process of the project.
●Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform and load data into staging tables from various sources.
●Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Connected and Unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
●Performance tuning of the process at the mapping level, session level, source level, and the target level.
●Strong in exception-handling mappings for data quality, data cleansing, and data validation.
●Created Workflows containing command, email, session, decision, and a wide variety of tasks.
●Tuned mappings against performance criteria and created session partitions when performance issues arose.
●Performed end-to-end tests to verify failures in the mappings using scripts.
●Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.
●Resolved tickets raised by the QA team based on their priority levels.
●Facilitated the development of testing procedures, test cases, and User Acceptance Testing (UAT).
●Developed parameter files for passing values to the mappings for each type of client.
●Scheduled batches and sessions within Informatica using the Informatica scheduler and wrote shell scripts for job scheduling.
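The change-data-capture work referred to above reduces, conceptually, to diffing a previous and a current snapshot of a source table by key. The sketch below is a simplified hand-written illustration with assumed dict-shaped rows and an assumed `id` key column; the project itself used Informatica's CDC capabilities, not custom code.

```python
def capture_changes(previous, current, key="id"):
    """Minimal CDC sketch: diff two snapshots of a source table into
    insert / update / delete sets for an incremental load."""
    prev = {r[key]: r for r in previous}
    curr = {r[key]: r for r in current}
    inserts = [r for k, r in curr.items() if k not in prev]
    updates = [r for k, r in curr.items() if k in prev and r != prev[k]]
    deletes = [r for k, r in prev.items() if k not in curr]
    return inserts, updates, deletes
```

In practice the "previous snapshot" is replaced by database logs or timestamps so the full table never has to be rescanned, which is what makes tool-based CDC scale.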
Environment: Informatica PowerCenter 9.6.1, Oracle 11g, MySQL, UNIX, Control M, JIRA.
Client: Johnson & Johnson, New Brunswick
Role: ETL Informatica Developer
Duration: Dec 2016 to Sept 2017
Project: Sales performance and Reporting
Description: Johnson & Johnson researches, develops, manufactures, and sells a range of healthcare products worldwide. This project collects data from various source systems using Informatica tools and builds Cognos reports and dashboards to help customers improve reporting and sales performance.
Responsibilities:
●Used Informatica for data profiling and data cleansing, applying rules and developing mappings to move data from source to target systems.
●Designed and implemented ETL mappings and processes as per the company standards, using Informatica PowerCenter.
●Extensively worked on complex mappings which involved slowly changing dimensions.
●Developed several complex mappings in Informatica using a variety of transformations.
●Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update strategy, Sequence generator and Joiners.
●Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
●Worked on developing a Change Data Capture (CDC) mechanism using Informatica PowerExchange for some of the interfaces, based on the requirements and limitations of the project.
●Implemented performance and query tuning on all the objects of Informatica using SQL Developer.
●Worked in the ETL Code Migration Process from DEV to QA and to PRODUCTION.
●Created the design and technical specifications for the ETL process of the project.
●Responsible for mapping and transforming existing feeds into the new data structures and standards.
●Involved in scheduling using Informatica pre/post-session operations.
●Created different parameter files and started sessions with them using the pmcmd command to change session parameters, mapping parameters, and variables at runtime.
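A PowerCenter parameter file of the kind described above typically looks like the fragment below. The folder, workflow, session, connection, and parameter names are placeholders for illustration, not the actual project values.

```
[DWH_Folder.WF:wf_load_customer.ST:s_m_load_customer]
$DBConnection_Source=SRC_ORACLE
$DBConnection_Target=TGT_DM
$$LOAD_DATE=2017-06-30
$$CLIENT_CODE=ABC
```

Such a file is supplied at runtime, for example `pmcmd startworkflow -sv Int_Service -d Domain -u user -p pass -f DWH_Folder -paramfile /opt/infa/param/wf_load_customer.par wf_load_customer`, so session and mapping parameters can vary per run or per client without editing the mapping itself.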
Environment: Informatica PowerCenter 9.1.1, SQL Server 2008, Oracle 10g, Unix, Teradata 14