Senior ETL Informatica Developer

Peoria, Illinois, United States
January 11, 2018


ETL Informatica Developer



Around 8 years of IT experience in developing, testing, and validating data warehouse applications using ETL and Business Intelligence tools such as Informatica PowerCenter, PowerExchange, HP Quality Center, and ALM.

Experience in installation, configuration, and administration of Informatica PowerCenter 6.x/7.x/8.x/9.x client and server on Windows and UNIX.

Full Software Development Life Cycle (SDLC) experience, involved in requirements analysis, design, development, testing, and maintenance, with working experience in Agile, Scrum, and Waterfall environments.

Highly experienced in extraction, transformation, and loading of legacy data into the data warehouse using ETL tools.

Strong experience in data analysis, data migration, data cleansing, transformation, data integration, and data import/export using multiple ETL tools such as Informatica PowerCenter and DataStage.

Experience working with SDLC, RUP, Waterfall, and Agile methodologies.

Involved in test planning and effort estimation. Responsible for test status reporting and documentation; single point of responsibility for releases and defect fixes.

Excellent understanding of ETL, Dimensional Data Modeling techniques, Slowly Changing Dimensions (SCD) and Data Warehouse Concepts - Star and Snowflake schemas, Fact and Dimension tables, Surrogate keys, and Normalization/Denormalization.

Designing and executing unit test plans and gap analysis to ensure that business requirements and functional specifications are tested and fulfilled.

Used transformations such as Expression, Filter, Aggregator, Lookup, Router, Normalizer, and Sequence Generator to load consistent data into Oracle and Teradata databases.

Good knowledge of normalization (1NF, 2NF, and 3NF) and denormalization techniques for optimum performance in XML, relational, and dimensional database environments.

Expertise in SQL and PL/SQL programming: developing and executing packages, stored procedures, functions, triggers, table partitioning, and materialized views.

Good experience with Teradata SQL Assistant; extracted data from Teradata and loaded it into various other target systems, and also used Teradata as a target.

Extensive success in translating business requirements and user expectations into detailed specifications employing Unified Modeling Language (UML).

Experience in performance tuning of sources, targets, mappings, transformations, and sessions; created CDC datamaps using PowerExchange and achieved incremental loading into target tables.
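
A minimal sketch of the incremental-loading pattern (illustrative only; the path, parameter file, and $$LAST_RUN_TS mapping parameter are hypothetical): after a successful run, a shell step stamps the load time into the workflow's parameter file so the next run's source filter picks up only new or changed rows.

    #!/bin/ksh
    # Hypothetical post-run step: write the last successful load time into
    # the parameter file; the mapping's source filter compares against
    # $$LAST_RUN_TS so the next run reads only new/changed rows.
    PARAM_FILE=/opt/infa/params/wf_load_incr.param
    RUN_TS=$(date '+%Y-%m-%d %H:%M:%S')

    cat > "$PARAM_FILE" <<EOF
    [Global]
    \$\$LAST_RUN_TS=$RUN_TS
    EOF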

Designed both 3NF data models for ODS/OLTP systems and dimensional data models using Star and Snowflake schemas.

Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.

Worked with PowerExchange sources, extracting non-relational data into Informatica via datamaps.

Experience with Type 1, Type 2, and Type 3 slowly changing dimensions.
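
A minimal set-based sketch of the Type 2 pattern (expire the changed current row, then insert a new version), run through SQL*Plus; the dim_customer and stg_customer tables, columns, and sequence are hypothetical, not from a specific project.

    #!/bin/ksh
    # Illustrative SCD Type 2: expire changed rows, then insert new versions
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE ROLLBACK;

    -- Expire the current row for customers whose tracked attribute changed
    UPDATE dim_customer d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Insert a new current version for changed and brand-new customers
    INSERT INTO dim_customer
           (customer_key, customer_id, address,
            eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

    COMMIT;
    EOF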

Experience with Teradata as the target for the data marts. Worked with BTEQ, FastLoad and MultiLoad.

Experience in integration of various data sources such as Oracle, Teradata, SQL Server, DB2, and flat files in formats including fixed width, CSV, and Excel.

Expertise in error handling and reprocessing of error records during extraction and loading of data into enterprise warehouse objects.

Good hands-on experience creating, modifying, and implementing UNIX shell scripts for running Informatica workflows and for pre-processing and post-processing validations.
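
A minimal sketch of such a wrapper script (service, domain, folder, workflow, and file names are hypothetical); pmcmd startworkflow with -wait lets the script propagate the workflow's success or failure back to the scheduler.

    #!/bin/ksh
    # Illustrative wrapper: validate the inbound file, start the workflow,
    # and fail the job if the workflow fails
    SRC_FILE=/data/inbound/customers.dat

    # Pre-processing validation: source file must exist and be non-empty
    [ -s "$SRC_FILE" ] || { echo "missing or empty $SRC_FILE" >&2; exit 1; }

    pmcmd startworkflow -sv Int_Svc_Dev -d Dom_Dev \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f FOLDER_SALES -wait wf_load_customers
    rc=$?

    # pmcmd returns 0 on success; any other code aborts the schedule
    [ $rc -eq 0 ] || { echo "workflow failed, rc=$rc" >&2; exit $rc; }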

Good knowledge of scheduling jobs in Autosys/Control-M and defining clear interdependencies between jobs.
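
A minimal Autosys JIL sketch of such a dependency (job names, scripts, and machine are hypothetical): the load job carries a success condition on the extract job, so it runs only after the extract completes cleanly.

    # Illustrative JIL, loaded through the jil utility
    jil <<'EOF'
    insert_job: etl_extract_cust
    job_type: c
    command: /apps/etl/bin/run_extract.ksh
    machine: etlhost1

    insert_job: etl_load_cust
    job_type: c
    command: /apps/etl/bin/run_load.ksh
    machine: etlhost1
    condition: s(etl_extract_cust)
    EOF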

Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.


ETL Tools

Informatica PowerCenter 6.x/7.x/8.x/9.x, PowerExchange 8.6.1, DataStage 8.1/8.5

Data Modeling

Dimensional data modeling, Star schema modeling, Snowflake modeling, fact and dimension tables, physical and logical data modeling using Erwin.

Testing Tools

HP Quality Center, Silk Test, SCM tools


Databases

Oracle 11g/10g/9i/8.x, SQL Server 2008/2005/2000, MySQL, DB2, MS Access

Database Tools

TOAD, SQL*PLUS, SQL Developer, SQL*Loader, Teradata SQL Assistant


Programming Languages

SQL (2012 and 2014), PL/SQL, Shell Scripting, C, C++, Perl, Python

Operating Systems

MS-DOS, Windows 7/Vista/XP/2003/2000/NT, UNIX, AIX, Sun Solaris

Caterpillar, Peoria, IL April ’17 - Present

Sr. ETL Developer

Caterpillar is the world's leading manufacturer of construction and mining equipment, diesel and natural gas engines, industrial gas turbines, and diesel-electric locomotives. The ETL project supports the portal application that enables business users to make the right decisions.

Identified the sources and analyzed the source data.

Created and stored metadata in the repository using Informatica Repository Manager.

Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings using Informatica PowerCenter Designer.

Involved in creating, editing, scheduling, and deleting sessions using Informatica Workflow Manager.

Worked with Cognos Developers during requirements gathering

Implemented various Data Transformations using slowly changing dimensions

Monitored workflows and collected performance data to maximize the session performance

Provided fixes for critical bugs and assisted with code moves to QA.

Extensively worked with SQL queries. Created Stored Procedures, packages, Triggers, Views using PL/SQL Programming.

Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
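
A minimal FastLoad sketch (server, credentials, table, and file names are hypothetical): FastLoad bulk-loads a delimited file into an empty staging table.

    #!/bin/ksh
    # Illustrative FastLoad run against a hypothetical staging table
    fastload <<'EOF'
    .LOGON tdprod/etl_user,etl_pass;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(10)),
           cust_name (VARCHAR(50))
    FILE = /data/inbound/customers.dat;
    BEGIN LOADING stg.customers
          ERRORFILES stg.customers_e1, stg.customers_e2;
    INSERT INTO stg.customers (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    END LOADING;
    .LOGOFF;
    EOF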

Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.

Created Logical/Physical Data models in 3NF in the Warehouse area of Enterprise Data Warehouse.

Optimized the performance of the Informatica mappings. Configured session properties and target options for better performance.

Performed unit testing and integration testing on the mappings; worked with the QA team to resolve QA issues.

Created the system to extract, transform, and load market data and to check the correctness of data loading, using UNIX Korn shell, Perl, and Oracle stored procedures.

Validated complex mappings involving Filter, Router, Expression, Lookup, Update Strategy, Sequence generator, Joiner and Aggregator transformations.

Expertise in defining and documenting business process flows (UML) such as use case diagrams, activity diagrams, sequence diagrams, and data flow diagrams.

Used the Debugger to test mappings and fix bugs, creating Debugger sessions and breakpoints for better analysis of mappings.

Created mapping variables and parameters and used them appropriately in mappings.

Extensively used all PowerCenter components, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Used Informatica to load data to and extract data from Teradata efficiently via various connection types.

Designed and developed several ETL scripts using Informatica and UNIX shell scripting.

Developed banking management scripts in Python to support the Chase website in creating user profiles and transactions for withdrawals and deposits.

Did QA of ETL processes, migrated Informatica objects from development to QA and production using deployment groups.

Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

Designed and developed pre-session and post-session routines for Informatica sessions to drop and recreate indexes and key constraints for bulk loading.
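
A minimal sketch of the pattern (index, table, and connection names are hypothetical): the pre-session command drops the index so the bulk load avoids index maintenance, and the post-session command rebuilds it.

    #!/bin/ksh
    # Illustrative pre-session step: drop the index before the bulk load
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE;
    DROP INDEX idx_fact_sales_dt;
    EOF

    # ... the Informatica session bulk-loads fact_sales here ...

    # Illustrative post-session step: recreate the index after the load
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE;
    CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_dt) NOLOGGING;
    EOF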

Involved in Performance Tuning at various levels including Target, Source, Mapping and Session for large data files.

Involved in conducting and leading team meetings and providing status reports to the project manager.

Environment: Informatica PowerCenter 9.6.1, IDQ 9.6.1, Oracle 11g, MySQL, DB2, UML, MS SQL Server 2005, Erwin, TOAD, Korn shell scripts, Perl, Python, UltraEdit, WinSQL, Autosys, Tidal Scheduler, Putty, WinSCP, Notepad++, JIRA, Cognos 10.x.

Wells Fargo, San Francisco, CA Sep’15- March’17

Sr. ETL Developer

Wells Fargo & Company is an American international banking and financial services holding company headquartered in San Francisco, California. It is the world's second largest bank by market capitalization and the third largest bank in the U.S. by assets.

Interacted with Business Analysts to understand the business requirements and analyzed requirements to refine transformations.

Provided technical guidance for re-engineering functions of Oracle warehouse operations.

Collaborated with architects to align the ETL design to the business case and the overall solution.

Developed UNIX shell scripts for master workflows.

Designed and developed new and enhanced functionality for existing applications.

Created the ETL source to target mapping documents working with the business Analysts.

Designed, developed, and maintained Enterprise Data Architecture for enterprise data management including business intelligence systems, data governance, data quality, enterprise metadata tools, data modeling, data integration, operational data stores, data marts, data warehouses, and data standards.

Performed source data analysis and captured metadata, reviewed results with business. Corrected data anomalies as per business recommendation.

Worked with Informatica PowerCenter 9.5 to extract data from IBM mainframe DB2 sources into Teradata.

Created design standards for the ETL code by applying SCD logic.

Created Parameter files, mapplets, and worklets for reusability in the code.

Reviewed and maintained the ETL coding standards.

Maintained an understanding of XML, XSD, DOM/SAX parsing, XPath and XSLT

Used the persistent cache option wherever required to reuse the cache for larger table lookups.

Created 3NF business-area data models with denormalized physical implementations, and performed data and information requirements analysis using Erwin.

Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.

Extensively used transformations like router, lookup (connected and unconnected), update strategy, source qualifier, joiner, expression, stored procedures, aggregator and sequence generator transformation.

Created Oracle stored procedures for capturing the ETL run statistics for the daily delta loads.
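
A minimal sketch of such a procedure (the etl_run_stats table and log_etl_run name are hypothetical); each delta load would call it once at end of run.

    #!/bin/ksh
    # Illustrative run-statistics logger installed via SQL*Plus
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    CREATE OR REPLACE PROCEDURE log_etl_run (
        p_workflow  IN VARCHAR2,
        p_rows_read IN NUMBER,
        p_rows_load IN NUMBER,
        p_status    IN VARCHAR2)
    AS
    BEGIN
        -- One row per run; queried later for trend and reconciliation checks
        INSERT INTO etl_run_stats
               (workflow_name, rows_read, rows_loaded, status, run_ts)
        VALUES (p_workflow, p_rows_read, p_rows_load, p_status, SYSTIMESTAMP);
        COMMIT;
    END log_etl_run;
    /
    EOF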

Performance tuned the database stored procedures and changed the updates to deletes and inserts to reduce the costly operations on the database.

Performed performance tuning at the session, mapping, and database levels.

Used the ‘Organize On’ option on frequently joined Netezza tables.

Created DB objects using best practices to avoid data skew.

Performance tuning of SQL scripts.

Created and executed the unit test plans based on system and validation requirements.

Worked on migration of the code from DEV to QA, UAT, and PROD using XML migrations.

Effectively communicated project expectations to team members in a timely and clear fashion.

Environment: Informatica 9.1/9.6.1, Teradata, DB2, Oracle 11g, MySQL, Perl, Python, MS SQL Server, UML, Cognos, Erwin 9.5.02, Teradata SQL Assistant, Toad 12.1.0, Aginity Workbench 4.3, flat files, XML files.

Mayo Clinic July ’13 - Sep ’15

Role: ETL Developer

Work Location: Rochester, MN

Mayo Clinic is a nonprofit medical practice and medical research group based in Rochester, Minnesota. It employs more than 4,500 physicians and scientists and 57,100 allied health staff. It is widely regarded as one of the United States' greatest hospitals and ranked No. 1 in the country on the 2016–2017 U.S. News & World Report List of "Best Hospitals" of the United States.


Extensively developed various Mappings that incorporated business logic, which performed Extraction, Transformation and Loading of Source data into OLAP schema.

Extensively used Transformations like Source Qualifier, Expression, Lookup, Update Strategy, Aggregator, Stored Procedure, Filter, Router, Joiner etc. Implemented Lookup transformation to update already existing target tables.

Extensively used Re-Usable Objects like Mapplets, sessions and Transformations in the Mappings.

Developed sequential, concurrent sessions and validated them. Scheduled these sessions by using Workflow manager.

Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.

Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.

Tuned target, source, transformation, mapping and session to increase Session performance.

Analyzed the reporting requirements.

Developed stored procedures and tested the application with TOAD. Developed PL/SQL procedures and functions to build business rules for extracting data from sources and loading data to targets.

Extensively wrote SQL Queries (Sub queries, correlated sub queries and Join conditions) for Data Accuracy, Data Analysis and Data Extraction needs.

Extensively used debugger to find out errors in mappings and later fixed them.

Ran and controlled workflows using the pmcmd command-line utility.
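
A minimal sketch of pmcmd-based control (service, domain, folder, and workflow names are hypothetical): wait for a running workflow to finish, then capture its run details in a log.

    #!/bin/ksh
    # Illustrative pmcmd control: block until the workflow completes,
    # then append its run details to a log file
    pmcmd waitworkflow -sv Int_Svc_Prod -d Dom_Prod \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f FOLDER_CLAIMS wf_load_claims

    pmcmd getworkflowdetails -sv Int_Svc_Prod -d Dom_Prod \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f FOLDER_CLAIMS wf_load_claims >> /logs/wf_load_claims_run.log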

Proactively evaluated the quality and integrity of data through unit and system testing of Informatica mappings according to business needs.

Managed open defects and brought them to closure by working closely with Developers during SIT, UAT and Performance testing.

Actively involved in creating test data for performance testing using various data-generator tools.

Wrote test cases for business-rule validations during the UAT phase and logged defects in Quality Center.

Environment: Informatica PowerCenter 9.1.0, Teradata, UML, MS SQL Server, Perl, Python, Oracle, flat files, MySQL, Autosys, ALM tools for testing, and the client's internal data-generator tool.

Exilant Technologies, INDIA Feb 11 – June 13

Client: US Bank, Minnesota, MN

Role: ETL Developer

Created a data warehouse to align the HR strategy with the overall business strategy. It presents an integrated view of the workforce and helps in designing retention schemes, improving productivity, and curtailing costs. This included extraction of data from different platforms, metadata management, and integration of mapping and loading to target tables.


Worked with Business Analyst in requirements gathering, business analysis and project coordination.

Responsible for developing complex Informatica mappings using different transformations.

Responsible for creating workflows and sessions using Informatica Workflow Manager and monitoring workflow runs and statistics in Informatica Workflow Monitor.

Created and Monitored Sessions and various other Tasks such as Event-Raise Task, Event-Wait Task, Decision Task, Email Task, Assignment Task, Command Task etc. using Informatica Workflow.

Responsible for defining mapping parameters and variables and session parameters according to requirements, and for using workflow variables to trigger emails.

Responsible for tuning the Informatica mappings to increase the performance.

Implemented complex ETL logic using SQL overrides in the source Qualifier.

Performed unit-test development work and validated results with Business Analysts.

Developed UNIX scripts for updating control-table parameters based on the environment.
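
A minimal sketch of such a script (TNS aliases, control table, and process name are hypothetical): the environment argument selects the connection, and the control table's batch date is advanced after a successful load.

    #!/bin/ksh
    # Illustrative environment-driven control-table update
    ENV=${1:?usage: update_ctl.ksh DEV|QA|PROD}

    case "$ENV" in
        DEV)  TNS=dwdev  ;;
        QA)   TNS=dwqa   ;;
        PROD) TNS=dwprod ;;
        *)    echo "unknown environment: $ENV" >&2; exit 1 ;;
    esac

    sqlplus -s "$DB_USER/$DB_PASS@$TNS" <<EOF
    WHENEVER SQLERROR EXIT FAILURE;
    UPDATE etl_control
       SET batch_dt   = batch_dt + 1,
           updated_ts = SYSTIMESTAMP
     WHERE process_name = 'HR_DAILY_LOAD';
    COMMIT;
    EOF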

Responsible for providing written status reports to management regarding project status, task, and issues/risks, testing.

Analyzed requirements to create test cases and obtained client approval for execution.

Used defect-tracking tools such as ATLAS for proper management and reporting of identified defects.

Wrote various SQL queries for validating test and production data from source systems before loading, for performance testing and UAT.

Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process.

Environment: Informatica 8.6.1, Oracle 10g, SQL, PL/SQL, MySQL, Teradata, TOAD, Shell Scripts, UNIX (AIX), Autosys, XSLT, MQ Migration Tools, Defect Tracking Tools (ATLAS).

ICICI Bank May ’09 - Feb ’11

Role: ETL Developer

Location: Hyderabad, India


Developed ETL and source to target mappings.

Developed the transformation/business logic to load data into data warehouse.

Created several Informatica mappings to populate data into dimension and fact tables.

Involved in the development of Informatica mappings and tuned them for optimum performance, Dependencies and Batch Design.

Identified the fact tables and slowly changing dimension tables.

Extensively used ETL to load data from multiple sources to the staging area (Oracle 9i) using Informatica PowerCenter. Worked with pre- and post-sessions, and extracted data from the transaction system into the staging area.

Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level. Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy and Sequence Generator.

Used Debugger to test the data flow and fix the mappings.

Tuned the workflows and mappings.

Used Informatica designer for designing mappings and mapplets to extract data from Oracle sources.

Used various transformations such as Expression, Filter, Router, Joiner, Lookup, Update Strategy, and Source Qualifier in many mappings.

Designed Complex mappings for Slowly Changing Dimensions using Lookup (connected and unconnected), Update strategy and filter transformations for retaining consistent historical data.

Wrote PL/SQL procedures for processing business logic in the database.

Performed query-based and cost-based optimization.

Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.

Executed Workflows and Sessions using Workflow Monitor.

Environment: Informatica Power Center V8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), SQL, PL/SQL, MS SQL Server, Oracle 11g/10g, DB2, Flat files, Shell Scripting, UNIX, Mainframes, Windows.

Education: Bachelor's in Computer Science and Engineering from Jawaharlal Nehru Technological University.
