ETL Informatica Developer

Location:
Fairfax, VA
Posted:
February 27, 2025

Resume:

DEEPALI THAKRE

ETL Informatica Developer

202-***-**** • **********@*****.***

SUMMARY:

8+ years of IT experience in ETL Informatica development and all phases of the Software Development Life Cycle (SDLC), including business requirement analysis, application design, development, implementation, and testing of data warehousing and database business systems for the insurance and banking domains.

Excellent experience in designing and development of ETL Methodology using Informatica PowerCenter 10.5, 10.4.

Experience in performance tuning on both the source and target sides.

Performed performance tuning of SQL queries.

Experience working with PostgreSQL, SQL Server 2012/2014, and Oracle databases.

Worked extensively in design and analysis of Requirements, Development, Testing and Production deployment.

Experience designing and managing complex ETL workflows.

Expert in writing SQL queries across multiple databases, including Oracle, SQL Server, and PostgreSQL.

Extensive knowledge of Informatica tuning and SQL tuning.

Experience in integration of various data sources into staging area.

Extensive experience with Data Extraction, Transformation, and Loading (ETL) from Multiple Sources.

Performed data profiling, cleansing, and validation of source data.

Designed, developed, and tuned complex mappings, tasks, and workflows.

Experience handling and tuning high-volume data loads.

Performed root cause analysis and troubleshooting of code issues.

Completed ETL development within committed timeframes.

Performed QA functions for ETL applications, ensuring that unit and integration test plans were developed and executed.

Provided code migration instructions in standard documentation.

Experience in Debugging and Performance tuning of targets, sources, mappings and sessions in Informatica.

Streamlined ETL procedures to improve efficiency and reduce processing time.

Reconciled data against requirements across different functional areas.

Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations.

Experience in performance tuning of ETL processes, reducing execution time for huge data volumes on company merger projects. Created numerous user-defined functions, reusable transformations, and lookups.

Working experience in Agile and Waterfall methodologies.

Excellent communication skills and the ability to communicate effectively with executive and management teams; strong analytical and problem-solving skills.

Designed and implemented end-to-end data integration workflows using Informatica Cloud Data Integration, improving the ETL process.

Hands-on experience implementing data pipelines on Azure Blob Storage, Azure SQL Server, and Azure Data Lake.

Worked with different sources and sinks, linked services, datasets, and data flows in Azure Data Factory.

Worked on SSIS tasks such as Data Flow, Execute SQL, Execute Package, and File System, and containers such as For Loop and Foreach Loop in Integration Services development.

Hands-on experience creating, deploying, troubleshooting, scheduling, and monitoring ETL jobs.

Developed reusable mappings, tasks, and parameterized workflows to streamline data transformation.

Extensive experience in writing and debugging stored procedures, views, functions, and triggers.

Migrated legacy ETL pipelines to IICS, reducing operational costs and improving system reliability.

Migrated on-premises ETL workflows to AWS Cloud, ensuring minimal downtime and optimizing costs.

Designed and implemented ETL pipelines using Informatica PowerCenter integrated with AWS.

Utilized AWS S3 as a staging area for ETL processes, integrating Informatica workflows.

Implemented custom data validation scripts in Python to ensure data integrity before loading into target systems.

Scripting experience with Linux Shell scripting.

Worked with various SSIS transformations such as Lookup, Conditional Split, Derived Column, and Merge Join.

Hands-on experience with SQL, Oracle, and PostgreSQL queries for implementing mappings.

Experienced in working with Azure Synapse Analytics to load data from on-premises SQL Server into a dedicated SQL data warehouse, Azure SQL, and Data Lake Store.

Experienced in creating Synapse pipelines to load data from different sources into dedicated SQL pools.

Experience in writing SQL objects such as stored procedures, views, and joins.

Good knowledge of creating dimensions and facts in data warehouse modeling, with experience in Azure Synapse.

TECHNICAL SKILLS:

ETL TECHNOLOGY: Informatica PowerCenter 10.5, 10.4, IICS

DATA WAREHOUSE: Star Schema, Snowflake schema

DATABASES: Postgres, Oracle 12c/11g/10g, MS SQL Server 2016/2014, MySQL

Others: SQL, WinSCP, Azure Data Factory, Azure DevOps, ServiceNow, Python, UNIX shell scripting, AWS, Ubuntu, Cygwin

PROFESSIONAL EXPERIENCE:

Infinite Computer Solutions, Rockville, MD (Remote) Nov 2022 to Present

Client: Conduent

Role: Informatica Developer

Responsibilities:

Worked with business analysts for requirement gathering, business analysis, testing, business process descriptions, scenarios and workflow analysis.

Created Technical Design Document from business requirements document (BRD).

Analyzed business and system requirements to identify system impacts.

Created the Detail Technical Design Documents which have the ETL technical specifications for the given functionality.

Performed validation, standardization, and cleansing of data while implementing business rules using the Informatica Data Quality (IDQ) tool.

Worked on Informatica Data Quality transformations like Address validator, Parser, Labeler, Match, Exception, Association, Standardizer transformations.

Strong ability to identify data anomalies, inconsistencies, and patterns to improve data governance and integrity.

Used IDQ to complete initial data profiling and remove duplicate data.

Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accurately check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.

Utilized Informatica Data Explorer (IDE) to conduct data quality assessments, identifying null values and duplicates and improving data accuracy.

Integrated IDE with Informatica PowerCenter and Data Quality (IDQ) to streamline ETL processes and enhance data governance frameworks.

Implemented business rule validation in IDE to flag data discrepancies before ETL processing, improving data accuracy.

Utilized Informatica Data Explorer (IDE) to support regulatory compliance initiatives by ensuring data completeness and consistency across the project.

Designed custom data validation rules in IDE, enhancing data integrity and minimizing processing errors in downstream applications.

Prepared source-to-target mappings and conducted meetings with the business to understand data and transformation logic.

Analyzed the existing mapping logic to determine the reusability of the code.

Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.

Performed extensive performance tuning by identifying bottlenecks at various points such as targets, sources, mappings, sessions, and the system, leading to improved performance.

Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test and supported the UAT for the client.

Performed ETL and database code migrations across environments using deployment groups.

Implemented business rules in mappings to populate the target tables.

Involved in end-to-end system testing, performance and regression testing and data validation.

Managed performance and tuning of SQL queries and fixed the slow running queries in production.

Created batch scripts for automated database build deployment.

Expert in all phases of Software development life cycle (SDLC) - Project Analysis, Requirements, Design Documentation, Development, Unit Testing, User Acceptance Testing, Implementation, Post Implementation.

Designed, Developed and Implemented ETL processes using IICS Data integration.

Created IICS connections using various cloud connectors in IICS administrator.

Designed and built IICS mappings to extract data from Oracle and SQL Server and load it into target tables.

Designed the ETL flows for generating flat file extracts as per the business needs.

Worked on converting Informatica PowerCenter ETL code to IICS using PC to Cloud Conversion service.

Implemented Slowly Changing Dimension Type 1 and Type 2 logic in IICS.
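
The Type 2 logic referenced above can be sketched in plain Python. The column names (customer_id, address, eff_start, eff_end, is_current) are illustrative assumptions, not the project's actual schema; in practice this logic would live in IICS mappings or SQL rather than a script.

```python
from datetime import date

def apply_scd_type2(dimension, incoming, today=None):
    """Apply SCD Type 2 to an in-memory dimension (list of dicts).

    Changed records expire the current row (eff_end set, is_current
    cleared) and insert a new current version; new keys get a first
    version; unchanged records are left alone.
    """
    today = today or date.today()
    for row in incoming:
        current = [d for d in dimension
                   if d["customer_id"] == row["customer_id"] and d["is_current"]]
        if not current:
            # New business key: insert the first version.
            dimension.append({**row, "eff_start": today, "eff_end": None,
                              "is_current": True})
        elif current[0]["address"] != row["address"]:
            # Tracked attribute changed: expire old row, insert new version.
            current[0]["eff_end"] = today
            current[0]["is_current"] = False
            dimension.append({**row, "eff_start": today, "eff_end": None,
                              "is_current": True})
    return dimension
```

The key Type 2 property shown here is that history is preserved: every change adds a row, and exactly one row per business key stays current.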

Performed loads into Snowflake instance using Snowflake connector in IICS.

Extensively used Informatica PowerCenter and IICS to create mappings, sessions, and workflows for populating data into dimension, fact, and lookup tables from different source systems (SQL Server, Oracle, flat files, tables).

Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the data.

Created complex data transformations using IICS mappings, including aggregations, lookups, and joins, to standardize data across multiple sources.

Designed error-handling workflows and alerts to monitor and resolve data pipeline issues.

Maintained compliance by implementing data masking and encryption for sensitive information.

Migrated on-premises ETL workflows to AWS Cloud, ensuring minimal downtime and optimizing costs.

Utilized AWS S3 as a staging area for ETL processes, integrating Informatica workflows and transforming data from various sources.

Developed Python-based utilities to monitor, trigger, and log Informatica workflows for enhanced operational efficiency.

Implemented custom data validation scripts in Python to ensure data integrity before loading into target systems via Informatica.
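
A pre-load validation script of the kind described above might look like the following minimal sketch; the column names (id, amount) and rules are hypothetical, not the project's actual checks.

```python
def validate_rows(rows, required=("id", "amount")):
    """Return (good_rows, rejects): rejects carry the reason they failed.

    Rules shown (required fields present, amount parses as a number)
    are illustrative placeholders for project-specific checks.
    """
    good, rejects = [], []
    for row in rows:
        missing = [col for col in required if row.get(col) in (None, "")]
        if missing:
            rejects.append((row, f"missing: {', '.join(missing)}"))
            continue
        try:
            float(row["amount"])
        except (TypeError, ValueError):
            rejects.append((row, "amount is not numeric"))
            continue
        good.append(row)
    return good, rejects
```

Separating accepted rows from rejects with reasons lets the load proceed with clean data while bad records are routed to an error file for review.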

Used Python to create transformation logic for non-standard data formats, reducing complexity in Informatica mappings.
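
As one example of a non-standard format, a fixed-width feed can be normalized in Python before it reaches a mapping; the layout below (field names and byte positions) is purely illustrative.

```python
# Hypothetical fixed-width layout: (name, start, end) positions are
# assumptions for illustration, not an actual project feed.
LAYOUT = [("account", 0, 8), ("name", 8, 28), ("balance", 28, 38)]

def parse_fixed_width(line, layout=LAYOUT):
    """Turn one fixed-width record into a dict, trimming padding and
    casting the balance to float, so downstream mappings see clean columns."""
    record = {name: line[start:end].strip() for name, start, end in layout}
    record["balance"] = float(record["balance"]) if record["balance"] else 0.0
    return record
```

Pre-parsing like this keeps the Informatica mapping simple: it receives typed, named columns instead of raw positional text.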

Environment: Informatica PowerCenter 10.5, 10.4, IICS, Snowflake, SQL, Oracle 12c, MySQL, Autosys.

Client: CareFirst BlueCross BlueShield, Baltimore, MD Jul 2020 to Oct 2022

Role: Informatica Developer

Responsibilities:

Maintained warehouse metadata, naming standards and warehouse standards for future application development.

Developed complex Power Center mappings by using different transformations and components respectively to meet the data integration and data quality requirements.

Created and monitored workflows/sessions using Informatica Server Manager/Workflow Monitor to load data into target Oracle database.

Performed Unit testing, Integration Testing, and User Acceptance testing to pro-actively identify data discrepancies and inaccuracies.

Involved in performance tuning at source, target, mapping and session level.

Prepared design documents, ETL Specifications and migration documents.

Maintained daily Tech Tracker for the updates from the team regarding their objects, issues and progress.

Created the design and technical specifications for the ETL process of the project.

Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.

Responsible for mapping and transforming existing feeds into the new data structures and standards, utilizing Router, Connected and Unconnected Lookup, Expression, Aggregator, and Update Strategy transformations.

Worked with various complex mappings and designed Slowly Changing Dimension Type 1 and Type 2.

Performance tuning of the process at the mapping level, session level, source level, and the target level.

Tuning the mappings based on criteria, creating partitions in case of performance issues.

Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.

Resolving the tickets based on the priority levels raised by QA team.

Environment: Informatica PowerCenter 10.2, SQL, Oracle 12c, MySQL, Autosys

Client: BOK Financial Corporation, Richmond, VA Apr 2018 to Jul 2020

Role: Informatica Developer

Responsibilities:

Actively involved in interacting with business users to record user requirements and Business Analysis.

Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems such as flat files, SQL Server 2012, and Oracle 11g into an Oracle 11g target database.

Documented Data Mappings/ Transformations as per the business requirement.

Prepared Technical Design documents and Test cases.

Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.

Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.

Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.

Created Sessions and extracted data from various sources, transformed data according to the requirement and loaded it into data warehouse tables.

Developed several reusable transformations and mapplets that were used in other mappings.

Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.

Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.

Migrated Mappings, Sessions, Workflows from Development to Test and then to UAT environment.

Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.

Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.

Improved performance at the mapping and session levels.

Environment: Informatica PowerCenter 9.6.1, SQL Server 2016, ORACLE 11g, SQL, UNIX, SQL Developer, Autosys, JIRA

Client: UHC, Minneapolis, MN Jan 2017 to Apr 2018

Role: Junior Informatica Developer

Responsibilities:

Coordinated with various business users, stakeholders, and SMEs for functional expertise, design and business test scenario reviews, UAT participation, and validation of data from multiple sources.

Worked on Power Center Designer tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

Used Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot the invalid mappings.

Created Mappings using Mapping Designer to load the data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter transformations.

Used SQL to analyze source data, perform data analysis, and validate the data.

Scheduled Informatica Jobs through Autosys scheduling tool.

Studied the existing system and conducted reviews to provide a unified view of the program.

Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.

Involved in creating Informatica mappings, mapplets, worklets, and workflows to populate data from different sources into the warehouse.

Responsible for facilitating load testing and benchmarking the developed product with the set performance standards.

Involved in testing the database using complex SQL scripts and handled the performance issues effectively.

EDUCATION:

Bachelor's in Computer Technology, Nagpur University, India


