10+ years of extensive experience in Analysis, Design, Development, Implementation, and Maintenance of Informatica ETL, OLTP, and Data Warehouse systems, Business Intelligence, and Enterprise Application Integration.
Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, Deployment and Support.
Extensive experience using Informatica tools such as Power Center, Power Exchange, Workflow Manager, Mapping Designer, Mapplet Designer, and the Informatica Administrator console.
In-depth knowledge of data warehousing, operational data stores (ODS), analytical systems, data marts, decision support systems, business information systems, and dimensional modeling concepts.
Strong expertise in relational database systems such as Oracle, SQL Server, MS Access, DB2, and Teradata.
Designed, developed, and implemented AutoSys jobs to trigger UNIX shell scripts that import data from source systems and bring it into HDFS through AWS S3 storage.
Vast experience in designing and developing complex mappings using various transformations like Update Strategy, Router, Filter, Expression, Aggregator, Joiner, Unconnected and Connected lookups, etc.
Extensively worked in coding using SQL, PL/SQL procedures/functions and triggers.
Extensively worked on developing, monitoring, and scheduling jobs using shell scripts for real-time and batch processing.
Good familiarity with various Business Intelligence tools such as Oracle Business Intelligence, Business Objects, Cognos, and MicroStrategy.
Good knowledge of Informatica Power Exchange CDC for relational database sources on Linux, UNIX, and Windows operating systems.
Implemented and extensively worked on slowly changing dimensions (SCD Type 1, Type 2, and Type 3), using Change Data Capture (CDC) to maintain account and transaction history per business requirements.
Proficient in interacting with data architects, database administrators, and other team members to understand dimensional requirements and translate them into functional specifications.
Ensured data quality and reliability, and provided feedback to business and IT teams on how to improve the quality of the data.
Experience supporting production teams through troubleshooting, fixing, and maintaining ETL jobs.
Good knowledge of Agile methodology and the Scrum process.
Extensive functional and technical exposure. Experience working on high-visibility projects.
Experience using templates, following a test-driven development process, and analyzing test results to correctly align development efforts with problem areas.
Ability to work independently or in a team environment, and to effectively manage time and resources and prioritize tasks.
ETL Tools: Informatica PowerCenter 10.x/9.x/8.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Administrator, IDQ, Informatica Power Exchange CDC components, Informatica MDM.
Databases: Oracle 11g/10g/9i, DB2, MS SQL Server 2008/2005/2000, Teradata, MongoDB, MS Access, DB2 UDB, MySQL
Operating Systems: Windows 7, Windows XP, UNIX, IBM AIX
Tools & Utilities: SQL*Plus, Toad for Oracle, PuTTY, WinSCP, Microsoft Visual Studio
BI Tools: Oracle Business Intelligence, Cognos, MicroStrategy, Tableau, QlikView
Languages & Scripting: SQL, PL/SQL, UNIX Shell Scripting, AWS S3
Methodologies: Waterfall, Agile, Spiral, Prototype Model
Client: Hyundai Capital Group Sept 2019 – Till Date
Role: ETL Developer
Location: Irvine, CA
Headquartered in Irvine, CA, Hyundai Capital America is a top-10 U.S. auto-finance company supporting the financial services needs of Hyundai Motor America and Kia Motors America. Through the Hyundai Motor Finance and Kia Motors Finance brands, the company provides financial products to Hyundai and Kia dealerships nationwide, including dealer inventory and facility financing, as well as indirect vehicle financing for retail and lease customers of Hyundai, Kia, and Genesis vehicles. Through its subsidiary, Hyundai Protection Plan, the company offers vehicle service contracts and other vehicle protection products under the Hyundai Protection Plan and Power Protect brands.
Collaborate with project analysts and data architects to create both high-level and detailed ETL design & mapping specifications in line with established standards
Develop and support code for a variety of data migrations.
Apply strong SQL knowledge and understanding.
Extract data from relational databases and flat files using ETL processes.
Develop the ETL mappings for the Stage and Reporting loads.
Build mappings and workflows in a cost-effective manner.
Analyze the technical documentation and finalize the design of the ETL loads.
Work on all phases of data warehouse development life-cycle, ETL design and implement and support new and existing applications.
Import mapplets and mappings from Informatica developer (IDQ) to Power Center.
Create generic rules in Informatica developer as per the business.
Design complex UNIX scripts and automate them to run the workflows daily, weekly, and monthly.
Use Debugger in Informatica Power Center Designer to check the errors in mapping.
Use Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
Capture error records, correct them, and load them into the target system.
Implement efficient and effective performance tuning procedures.
Resolve production issues for any scheduled jobs failures and report issues to the concerned teams.
Environment: Informatica Power Center 10.X, Oracle, UNIX shell scripting, SQL Server, Flat Files, XML, Informatica Power Exchange, TWS, T-SQL.
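The daily/weekly workflow automation described above can be sketched as a pmcmd wrapper script. The domain, integration service, user, folder, and workflow names below are illustrative assumptions, not the actual production configuration, and password handling (normally done via pmcmd environment variables) is omitted:

```shell
#!/bin/sh
# Sketch: trigger an Informatica workflow via pmcmd with exit-code checking.
# All INFA_* values and workflow names are assumed placeholders.
INFA_DOMAIN="${INFA_DOMAIN:-Domain_ETL}"
INFA_INTSVC="${INFA_INTSVC:-IS_ETL}"
INFA_USER="${INFA_USER:-etl_user}"
INFA_FOLDER="${INFA_FOLDER:-FIN_DW}"
DRY_RUN="${DRY_RUN:-1}"   # 1 = print the pmcmd command instead of running it

run_workflow() {
    wf="$1"
    cmd="pmcmd startworkflow -sv $INFA_INTSVC -d $INFA_DOMAIN -u $INFA_USER -f $INFA_FOLDER -wait $wf"
    if [ "$DRY_RUN" = "1" ]; then
        echo "$cmd"        # dry run: show what would be executed
        return 0
    fi
    $cmd
    rc=$?
    [ $rc -ne 0 ] && echo "workflow $wf failed with exit code $rc" >&2
    return $rc
}

run_workflow "wf_daily_stage_load"
```

In practice a script like this is invoked from the scheduler (cron, TWS, AutoSys) once per frequency, with the workflow name as a parameter.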
Client: Wells Fargo Aug 2017 – Aug 2019
Role: ETL Consultant
Location: Irvine, CA
Wells Fargo & Company is an American multinational financial services company. The Company has three operating segments: Community Banking, Wholesale Banking, and Wealth and Investment Management. The Company provides retail, commercial and corporate banking services through banking locations and offices, the Internet and other distribution channels to individuals, businesses and institutions in all 50 states. This project includes developing Data warehouse from different data feeds and other operational data sources. Built a central Database where data comes from different sources like oracle, SQL server and flat files. Actively involved as an Analyst for preparing design documents and interacted with the data modelers to understand the data model and design the ETL logic.
Gathered requirements and created ETL Mapping Specification Documentation.
Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
Parsed high-level design specification to simple ETL coding and mapping standards.
Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time.
Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
Created mapping documents to outline data flow from sources to targets.
Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
Extracted data from flat files and other RDBMS databases into the staging area and populated the Data Warehouse.
Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Developed mapping parameters and variables to support SQL override.
Created mapplets to use them in different mappings.
Developed mappings to load into staging tables and then to Dimensions and Facts.
Used existing ETL standards to develop these mappings.
Worked on different tasks in Workflows, such as sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer, and scheduling of the workflow.
Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
Extracted data from various source systems like Oracle, SQL Server and DB2 to load the data into Landing Zone and then by using Java copy command loaded into AWS-S3 Raw Bucket.
Modified existing mappings for enhancements of new business requirements.
Used Debugger to test the mappings and fixed the bugs.
Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Prepared migration document to move the mappings from development to testing and then to production repositories.
Involved in system study and transferred knowledge to the offshore team.
Distributed work between onsite and offshore teams and handled follow-ups.
Environment: Informatica Power Center 9.6, IDQ, Oracle, UNIX shell scripting, SQL Server, Flat Files, XML, Informatica Power Exchange, TWS, T-SQL.
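The FTP shell scripts for pulling source files from a remote server, mentioned above, can be sketched as follows. The remote host, directories, and the `*.dat` file pattern are illustrative assumptions:

```shell
#!/bin/sh
# Sketch: build an sftp batch file that fetches source flat files from a
# remote server into the local staging directory. Paths are assumed.
REMOTE_DIR="${REMOTE_DIR:-/data/outbound}"
LOCAL_DIR="${LOCAL_DIR:-/infa/srcfiles}"

make_batch() {
    batch="$1"
    {
        echo "cd $REMOTE_DIR"    # remote landing directory (assumed)
        echo "lcd $LOCAL_DIR"    # local staging directory (assumed)
        echo "get *.dat"         # pull all flat files
        echo "bye"
    } > "$batch"
}

batch_file=$(mktemp)
make_batch "$batch_file"
cat "$batch_file"
# A real run would then execute (requires credentials and a reachable host):
#   sftp -b "$batch_file" etl_user@remote.host
rm -f "$batch_file"
```

Generating the batch file in the script keeps the transfer steps auditable and lets the same wrapper be reused for different source feeds.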
Client: Ameriprise Financial Jan 2017 – Jun 2017
Role: IDQ/MDM Developer
Location: Minneapolis, MN
Ameriprise Financial Inc. is an investment adviser and a broker-dealer. The company provides personalized services and offers advice on securities, with an emphasis on mutual funds, annuities, life insurance, and municipal funds. The company provides the infrastructure that helps the financial services industry operate. It serves a client base across its four businesses: Bank/Broker-Dealer Communications, Mutual Fund and Retirement Solutions, Corporate Issuer Solutions, and Finance Dealer Technology and Operations. The main goal of the project is to create a central repository of customer data. The project was designed to load data into the Customer Master from seven source systems to get the best version of the truth.
Participated in data discovery, data profiling and requirements analysis around master data management activities for the Customer, Vendor, and Product domains.
Worked with Data architect to understand the Match needs, source system trust scores, data model and table structures.
Defined, configured, and optimized various MDM processes, including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries, custom queries, cleanse functions, batch groups, and packages, using the Informatica MDM Hub Console.
Installed the Informatica MDM Hub Server and Cleanse Server.
Worked on MDM Hub configurations: data modeling and data mappings, data validation, cleansing, and Match and Merge rules.
Used IDQ toolkit Informatica Analyst and Informatica Developer to perform extensive Data profiling during initial/mid stage of the project to develop a thorough knowledge of source data content, structure and data quality.
Cleansed and scrubbed the data into uniform data types and formats using Informatica MDM and IDQ tools, loaded it into STAGE and HUB tables, then into the EDW, and finally into Dimension tables, rolling up/aggregating the data by business grain into FACT tables.
Primary activities included data analysis and identifying and implementing data quality rules in IDQ.
Understood the data quality rules defined by the business/functional teams, proposed optimizations of these rules where applicable, then designed and developed the rules in IDQ, including thorough unit-test planning and execution.
Performed Data Profiling using Informatica Data Quality Tool.
Expertise in Debugging and Performance tuning of targets, sources, mappings and sessions. Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations and mapplets.
Extracted data from sources like fixed width and Delimited Flat files transformed the data according the business requirement and then loaded into Target database.
Created sessions, workflows and used various tasks like Email, Event-wait and raise in workflow manager.
Created data profiles and scorecards using Analyst tool to analyze and measure the quality of data.
Defined Trust and validation rules before loading the data into the base tables.
Developed User Exits for implementing custom business rules.
Defined System Trust and Validation rules for the base object columns.
Configured Address Doctor to validate, parse, cleanse, and standardize address data. Created and deployed new applications in Informatica Data Director and bound the applications to a specific ORS.
Ran the Stage, Load, and Match and Merge jobs using the Batch Viewer and automation processes.
Designed use cases for member data search, add, and update.
Used Metadata Manager xConnectors and built-in connectors to bring in metadata from various source systems: Oracle, SAP PowerDesigner, Teradata, PowerCenter, BTEQs, SSRS, etc.
Automated metadata loads and refreshes for xConnectors and Business Glossaries.
Defined and maintained taxonomies for managing and organizing metadata.
Established and maintained end-to-end data lineage, accurately reflecting production processes and data flows.
Established and maintained Business/Technical/Operational definitions and Business Glossaries within enterprise Business Glossary.
Organized business glossaries and mapped technical terms to data fields in MM for viewing data lineages.
Established governance processes for publishing Business Glossary.
Documented standards and best practices around Metadata Manager.
Assisted in UAT Planning and Issues Resolution.
Environment: Informatica MDM 10.1, Informatica Data Quality IDQ 9.6.1/10.1, Informatica Metadata Manager 9.6.1/10.1, Business Glossary 9.6.1/10.1, Oracle 11g, UNIX shell scripting, SQL Server 2012, Flat Files, XML, Informatica Power Center 10.0, Informatica PowerExchange 9.6, Tableau, Rally.
Client: Humana Inc. Jan 2016 – Dec 2016
Role: Sr. ETL Informatica Analyst
Location: Louisville, KY
Humana Inc. is a for-profit American health insurance company with over 13 million customers in the U.S. It provides services that include Medicare Advantage (MA) and Prescription Drug Plans (PDP), and handles patient demographic data, insurance information, medical history, and appointments.
Each source system provides a huge volume of data with a different dataset: Provider, Claims, electronic medical records, medical diagnostics, patient monitoring systems, and drug discovery, which are exact replicas of the source systems. The major job is to integrate the different datasets from the different sources, then cleanse, conform, and transform the data into the staging area, load it into the Data Mart (Data Warehouse), and make the relevant data available for reporting.
Developed ETL programs using Informatica to implement the business requirements.
Communicated with business customers to discuss the issues and requirements.
Responsible for design, development and maintenance of Data Marts including Sales, Policy, Customer Reporting and Claims leveraging Informatica Power Center ETL tool, Oracle, DB2 and PL/SQL.
Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
Cleansed data and validated addresses through Address Doctor for the Facets system.
Used the Case Converter for contact data such as First Name and Last Name.
Extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
Designed and developed exception handling, data standardization procedures, and quality assurance controls.
Exported mappings from IDQ to Informatica Power Center.
Responsible for Data Warehouse Architecture, ETL and coding standards.
Developed capacity planning, architecture, strategic roadmaps, and implementation standards.
Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into databases.
Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
Created Mapplets and used them in different Mappings.
Performed extensive bulk loading into the target using Oracle SQL*Loader.
Managed scheduling of tasks to run at any time without operator intervention.
Developed a stored procedure to compare source data with warehouse data, write unmatched records to a spool table, and use the spool table as a lookup in transformations.
Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
Created SQL views and table pairs, and created DVO jobs for validating source and target table data as part of automation testing.
Developed and executed automated test scripts in Informatica DVO.
Executed the Test cases, Test Scenarios in Test lab tab of Quality Center.
Scheduled the DVO jobs to run every day in PROD for data reconciliation purposes.
Leveraged workflow manager for session management, database connection management and scheduling of jobs.
Assisted in developing different kinds of reports from existing Universes using Business Objects.
Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
Experienced in Debugging and Performance tuning of targets, sources, mappings and sessions. Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations and mapplets.
Delivered all the projects/assignments within specified timelines.
Environment: Informatica Power Center 9.1, Oracle, shell scripting, SQL Server, Flat Files, XML, Informatica Power Exchange, Erwin 4.1.2, WinSCP, Control-M, DB2.
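The daily DVO-style source-vs-target reconciliation described above can be sketched, at its simplest, as a row-count comparison between two extracts. The extract files below are throwaway demo data, not real source or target extracts:

```shell
#!/bin/sh
# Sketch: compare row counts between a source extract and a target extract,
# reporting MATCH or MISMATCH, as a minimal stand-in for a DVO table pair.
reconcile_counts() {
    src_cnt=$(wc -l < "$1" | tr -d ' ')
    tgt_cnt=$(wc -l < "$2" | tr -d ' ')
    if [ "$src_cnt" -eq "$tgt_cnt" ]; then
        echo "MATCH: $src_cnt rows"
        return 0
    fi
    echo "MISMATCH: source=$src_cnt target=$tgt_cnt"
    return 1
}

# Demonstration with temporary files standing in for real extracts:
tmp=$(mktemp -d)
printf 'r1\nr2\nr3\n' > "$tmp/source_extract.txt"
printf 'r1\nr2\nr3\n' > "$tmp/target_extract.txt"
reconcile_counts "$tmp/source_extract.txt" "$tmp/target_extract.txt"
rm -rf "$tmp"
```

A scheduled job would run this after each load and alert on the MISMATCH path; DVO itself adds column-level and value-level comparisons on top of this basic count check.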
Client: TrueValue Feb 2015 - Dec 2015
Role: Informatica Developer
Location: Duluth, GA.
The True Value Company is an American retailer-owned hardware cooperative with over 5,000 independent retail locations worldwide. The database maintains data related to all purchase orders and warehouse inventory, spanning technology, marketing, sales, product management, finance, and more. For this project there were multiple systems holding customer information. The client wanted a single source of truth for customer information to enable better customer segmentation for targeted sales and marketing. This data mart/data warehouse is an integrated data mine that feeds extensive reporting.
Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
Created new mapping designs using various tools in Informatica Designer, such as Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.
Developed the mappings using the needed transformations in Informatica according to technical specifications.
Created complex mappings that involved implementation of Business Logic to load data in to staging area.
Used Informatica reusability at various levels of development.
Developed mappings/sessions using Informatica Power Center 9.1 for data loading.
Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
Built reports according to user requirements.
Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
Implemented slowly changing dimension methodology for accessing the full history of accounts.
Wrote shell scripts to run workflows in the UNIX environment.
Performed performance tuning at the source, target, mapping, and session levels.
Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
Environment: Informatica Power Center 9.1, Oracle, Teradata, UNIX shell scripting, Flat Files, XML, Informatica Power Exchange.
Client: Mercedes Benz Mar 2011 - Dec 2014
Role: ETL Developer
Location: Chennai, India.
The primary objective of this project is to capture Sales, Product, and Finance-related data from multiple OLTP systems and flat files. Data was extracted, transformed, and loaded into the data warehouse using Informatica Power Center, and various reports were generated on a daily, weekly, monthly, and yearly basis. These reports give details of the various products sold.
Analysis of the specifications provided by the clients.
Used data from various sources (Flat Files, Oracle, XML, SQL Server) with different kinds of transformations, such as Router, Sorter, Stored Procedure, Source Qualifier, Joiner, Aggregator, Lookup (Connected and Unconnected), Expression, XML, Sequence Generator, Union, and Update Strategy, to load data into target tables.
Performed code reviews, assigned development work to the offshore team, and guided them during development and unit-test phases to implement logic and troubleshoot the issues they were experiencing.
Followed the required client security policies and obtained the required approvals to move code from one environment to another.
Worked on Data Cleansing, Data Conversion and process implementation.
Worked on Incremental Aggregations for improving performance in Mappings and Sessions.
Worked with Memory Cache for static and dynamic cache for better throughput of sessions containing Lookup, Joiner, Aggregator and Rank transformations.
Developed ETL mappings, transformations using Informatica Power center 8.x.
Used Informatica partitioning to improve performance.
Validated data to maintain referential integrity.
Developed UNIX shell scripts to move source files to an archive directory.
Developed shell scripts for job automation that generate a log file for every job.
Created an Informatica framework for audit and error balancing.
Worked on back end programming such as PL/SQL procedures, functions and packages.
Prepared a production monitoring and support handbook for ETL process.
Coordinated offshore and onsite team with a total of 7 ETL developers.
Environment: Informatica Power Center 8.6, Oracle 10g, DB2, UNIX, Flat Files, Shell Scripting, PL/SQL.
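The source-file archival with per-job logging described above can be sketched as follows. The directory layout, the `*.dat` file pattern, and the log-line format are illustrative assumptions:

```shell
#!/bin/sh
# Sketch: move processed source files to an archive directory and append a
# timestamped entry to a per-job log file, then print the count moved.
archive_files() {
    src_dir="$1"; arch_dir="$2"; log_file="$3"
    mkdir -p "$arch_dir"
    moved=0
    for f in "$src_dir"/*.dat; do
        [ -e "$f" ] || continue      # glob matched nothing
        mv "$f" "$arch_dir/"
        moved=$((moved + 1))
    done
    echo "$(date '+%Y-%m-%d %H:%M:%S') archived $moved file(s) from $src_dir" >> "$log_file"
    echo "$moved"
}

# Demonstration against a throwaway directory:
base=$(mktemp -d)
mkdir -p "$base/src"
touch "$base/src/orders.dat" "$base/src/sales.dat"
archive_files "$base/src" "$base/archive" "$base/job.log"
rm -rf "$base"
```

Run as the final step of each ETL job, a script like this keeps the landing directory clean and leaves an audit trail of what was processed and when.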
Client: Nationwide Insurance May 2009 – Feb 2011
Role: Data warehouse Developer
Location: Chennai, India.
This project is about building business intelligence. Data from various online transaction processing (OLTP) applications and other sources is selectively extracted, related, transformed, and loaded into the data warehouse using the Informatica Power Center 8.1 ETL tool. The transformed data from the data warehouse is then loaded into an OLAP server to provide Business Intelligence Analysis Services.
Designed and developed end-to-end ETL process from various source systems to staging area, from staging to Data Marts.
Involved in Business Analysis and requirement gathering.
Designed the mappings between sources to operational staging targets.
Used Informatica Power Center as an ETL tool for building the data warehouse.
Employed Aggregator, Sequence Generator, and Joiner transformations in the population of the data.
Used Teradata as a source and a target for a few mappings. Worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
Transformed data from SQL Server databases and loaded it into an Oracle database.
Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
Creation of test cases and execution of test scripts as part of Unit Testing.
Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
Environment: Informatica Power Center 8.1, Teradata, Oracle 9i, DB2, SQL*Loader, SQL*Plus, PL/SQL, UNIX, XML.
Bachelor's in Electronics & Communication Engineering