
Informatica Developer

San Diego, California, United States
August 29, 2017





Contact: 408-***-****


Around 9 years of IT experience across all phases of the Software Development Life Cycle (SDLC), with expertise in Enterprise Data Warehousing, Data Integration, and Master Data Management.

Professional Overview

Experienced in all stages of the software development lifecycle (Waterfall and Agile models) for building a data warehouse.

Well versed in Informatica PowerCenter 10.x/9.x/8.x/7.x Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager tools (Task Developer, Worklet and Workflow Designer), Repository Manager, and the Admin Console.

Involved in troubleshooting data warehouse bottlenecks and performance tuning: session and mapping tuning, session partitioning, and implementing pushdown optimization.

Extensively worked on various transformations like Lookup, Joiner, Router, Rank, Sorter, Aggregator, Expression.

Experience in using Informatica Client Tools - Designer, Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor.

Experience in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts, and Decision Support Systems (DSS) using Data modeling tool ERWin, MS Visio and Dimensional modeling techniques (Kimball and Inmon), Star and Snowflake schema addressing Slowly Changing Dimensions (SCDs).

Proficient in data warehousing techniques such as data cleansing (DVO), Slowly Changing Dimensions (SCD Types I, II, and III), surrogate key assignment, and Change Data Capture (CDC).

Expertise in Installing, Managing and configuring Informatica MDM Hub Server, Informatica MDM Hub Cleanse, Cleanse Adapters (Address Doctor), Informatica MDM Hub Resource Kit.

Expertise in B2B Data Transformation (DT) Studio, parsing unstructured healthcare data standards for use by ETL in mapping source data to populate the Data Warehouse and Data Marts.

Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.

Experience in the successful implementation of ETL solutions between OLTP and OLAP databases in support of Decision Support Systems/Business Intelligence.

Experience in file transfer through SFTP and B2B.

Analyzed regulatory restrictions and the need to identify the golden record for Master Data.

Experience in integrating various data sources such as Oracle, SQL Server, DB2, Flat Files, and XML files.

Strong knowledge in Data Warehousing concepts, hands on experience using Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, Teradata Administrator, SQL Assistant, Pmon, Data mover), UNIX.

Expert knowledge of Integration Services (SSIS), Analysis Services (SSAS), and Reporting Services (SSRS).

Experience in UNIX shell scripting, FTP, Change Management process and EFT file management in various UNIX environments.

Experience with PL/SQL collections such as Records and PL/SQL Tables using bulk operations, and LOBs (BLOB/CLOB/NCLOB).

Ability to write complex SQL for ETL jobs and data analysis; proficient with databases and sources including Oracle 12c/11g/10g, SQL Server 2005, Teradata, DB2, MS SQL, Excel sheets, Flat Files, Sybase, COBOL files, and XML files.

Experience in Performance Tuning of mappings, ETL procedures and process and involved in complete life cycle implementation of data warehouse.

Proficient in Post production support activities with effective change management configurations, Handling Incident tickets, change requests and Problem tickets using Remedy Incident Management tool and using the scheduling tools like Tivoli, IPM, and Cron job.

Ensured that user requirements were effectively and accurately communicated to other members of the development team, and facilitated communication between business users, developers, and testing teams.

Areas of Expertise:

Informatica Tools: Informatica 10.1 (PowerCenter, Power Mart, Power Exchange), Informatica Data Quality (IDQ) 9.6.1, IDE/Informatica Analyst, Informatica MDM 10.1, IDD, SIF

Databases: Oracle 12c, 11g/10g, SQL Server 2012/2008, MySQL, Teradata 14

Data Modeling Tools: Erwin 7.0/4.0, MS Visio, Star Schema, Snowflake Schema, Dimension Data Modeling, Fact tables, Dimension Tables

Query Tools: SQL Developer, Toad, SQL*Plus, SQL*Loader, T-SQL.

OLAP/Reporting Tools: OBIEE 10.1.3, Tableau

Languages: Unix Shell Scripting, Perl, Python, XML, JSON, WSDL, SQL*Plus, Pro*C

IDE/Development tools: Eclipse, Toad

Version Control: Perforce, Clear Case, Visual Source Safe

Development Methodologies: Agile, Waterfall, Test Driven Development

Operating systems: Linux, Unix, Windows 10/7, Windows 2003/2007, Windows XP

Web Technology: J2EE, HTML, CSS, VB Script, JavaScript, WordPress

Professional Experience

LPL Financial - San Diego, CA Sep 2016 to Present

Senior Informatica Developer

Description: LPL Financial is the largest organization of independent financial advisors in the United States. LPL Financial was formed in 1989 through the merger of two brokerage firms - Linsco (established in 1968) and Private Ledger (established in 1973). LPL Financial advisors help clients meet investment goals with a number of financial services, including equities, bonds, mutual funds, annuities, insurance, and fee-based programs.


•Documented user requirements, translated requirements into system solutions and developed implementation plan and schedule.

•Extracted data from relational databases DB2, Oracle and Flat Files.

•Developed complex transformations and Mapplets using Informatica to extract, transform, and load data into the DPP system.

•Used Azure cloud computing and Microsoft technologies in designing, developing, and implementing intranet, internet, and client/server applications.

•Developed data Mappings, Transformations between source systems.

•Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.

•Used debugger to test the mapping and fixed the bugs.

•Created sessions, sequential and concurrent batches for proper execution of mappings using server manager.

•Developed mappings to load Fact and Dimension tables (SCD Type 1 and SCD Type 2 dimensions) with incremental loading, and unit tested the mappings.
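As context for the SCD work above, the Type 2 pattern (expire the current row and insert a new version when a tracked attribute changes) can be sketched in plain Python. This is an illustrative stand-in, not the project's actual Informatica mapping, and all table and column names here are invented:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked_cols, today=None):
    """Apply SCD Type 2: expire the current row when a tracked
    attribute changes, then insert a new current version."""
    today = today or date.today()
    # index the current version of each business key
    current = {r[key]: r for r in dimension if r["is_current"]}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # brand-new key: insert as the current version
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
        elif any(existing[c] != row[c] for c in tracked_cols):
            # tracked attribute changed: expire old row, add new one
            existing["end_date"] = today
            existing["is_current"] = False
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
    return dimension
```

In a real warehouse the same logic runs as an Update Strategy plus insert path; the in-memory version just makes the expire/insert steps explicit.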

•Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.

•Worked with Azure and SQL Azure on Azure web and database deployments.

•Responsible for designing, developing, and testing software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (extract, transform, and load data).

•Extracted the data from mainframe to data warehouse using Informatica Power Exchange (CDC).

•Used Informatica Power Exchange Change Data Capture (CDC) for creation of data maps using mainframe tables.

•Used Python scripts to update content in the database and manipulate files.
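The Python-driven database updates mentioned above might look roughly like the following sketch; sqlite3 stands in for the project's actual database driver, and the table and column names are hypothetical:

```python
import sqlite3

def bulk_update_status(conn, updates):
    """Apply (status, record_id) pairs in one transaction.
    executemany keeps round-trips down for large batches."""
    with conn:  # commits on success, rolls back on error
        conn.executemany(
            "UPDATE records SET status = ? WHERE id = ?", updates)
    # return how many rows are now marked done (illustrative check)
    return conn.execute(
        "SELECT COUNT(*) FROM records WHERE status = 'done'").fetchone()[0]
```

Wrapping the batch in a single transaction means a partial failure leaves the table unchanged, which matters when the script feeds downstream ETL.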

•Worked with a team of developers on Python applications for risk management.

•Organized data in reports by inserting filters, sorting, ranking, and highlighting data.

•Included data from different sources like Oracle Stored Procedures and Personal data files in the same report.

•Responsible for daily verification that all scripts, downloads, and file copies were executed as planned, troubleshooting any steps that failed, and providing both immediate and long-term problem resolution.

•Responsible for managing Data Modeling, physical and logical databases using ER Studio.

•Check in Logical Database in ER Studio and Physical in Star Team to manage compatibility.

•Designed, developed, implemented and maintained Informatica Power Center and IDQ 9.6.1 application for matching and merging process.

• Utilized MS Excel as a reporting tool for Access databases using VBA and internal Excel query function

•Performed database management using SQL Server Management Studio, MS Excel, and Rational Rose.

•Utilized Informatica IDQ 9.6.1 for initial data profiling and for matching/removing duplicate data.

•Automated the Informatica jobs using UNIX shell scripting.

•Wrote PMCMD and PMREP commands and Korn Shell scripts to automate various administrative tasks.
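For illustration, the kind of pmcmd automation described above (written in Korn shell on the project) can be sketched in Python; the service, folder, and workflow names are placeholders, and passing the password through an environment variable (pmcmd's -pv option) keeps it off the command line:

```python
import subprocess

def start_workflow(domain, service, user, pwd_env_var, folder,
                   workflow, run=False):
    """Build (and optionally run) a pmcmd startworkflow call."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-pv", pwd_env_var,  # password read from env var
           "-f", folder, "-wait", workflow]
    if run:
        # -wait makes pmcmd block until the workflow finishes,
        # so a nonzero exit code signals a failed run
        subprocess.run(cmd, check=True)
    return cmd
```

Example (command construction only): `start_workflow("Domain_Dev", "IS_Dev", "etl_user", "PMPASS", "DW_Folder", "wf_daily_load")`.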

Environment: Informatica PowerCenter 10.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Mapplets, Mappings, Workflows, Sessions, Re-usable Transformations), Python, Azure, Excel, Informatica Power Exchange CDC, Oracle 12c, SQL Developer, IBM DB2, Control Center

Insight Enterprises Inc - Chicago, IL Jun' 2015 to Sep' 2016

Senior ETL Informatica Developer

Description: Insight is a leader in providing smart, cutting-edge technology solutions for organizations of all sizes. From developing unique strategies to delivering the products, services and expertise, we’ll help your business run more efficiently and modernize through Intelligent Technology Solutions.


•Involved in design, development and maintenance of database for Data warehouse project.

•Involved in Business Users Meetings to understand their requirements.

•Converted business requirements into technical documents (BRD) and explained business requirements in technology terms to the developers.

•Used Kettle for data integration, which delivers powerful extraction, transformation, and loading (ETL) capabilities.

•Worked with IBM DB2 database servers supporting the relational model, object-relational features, and non-relational structures such as JSON and XML.

•Used the SnapLogic tool for Big Data integration and cloud integration.

•SnapLogic provides visibility into integrations, including performance, reliability, and utilization.

•Controlled and monitored the performance of SnapLogic orchestrations and administered the lifecycle of data and process flows.

•Used Kettle-based tools such as GeoKettle ETL and JasperSoft ETL.

•Used DB2 within an MVC framework and for traditional product packaging.

•Leveraged Kettle's Java-based architecture and open, XML-based configuration, which included support for integrating security and data management tools.

•Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).

•Used MongoDB, an open-source database that avoids the traditional table-based relational structure in favor of JSON-like documents with dynamic schemas (a format MongoDB calls BSON).

•Used Informatica B2B to convert structured and unstructured data to and from more broadly consumable data formats to support business-to-business and multi-enterprise transactions.

•Developed an object system for a particular attribute or entity, involving classes and class-based objects.

•Leveraged MongoDB support for regular-expression searches; queries can return specific fields of documents, perform range queries, and include user-defined JavaScript functions.

•Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and pre-processing.

•Used RESTful APIs, generating documentation with the same tool.

•Loaded home mortgage data from the existing DWH tables (SQL Server) to HDFS using Sqoop.

•Developed Data Flow diagrams to create Mappings and Test plans.

•Populated HDFS and Cassandra with huge amounts of data using Apache

•The Data flow diagrams ranged from OLTP systems to staging to Data warehouse.

•Developed test plans to verify the logic of every mapping in a session. The test plans included count verification, lookup hits, transformation of each data element, filters, aggregation, and targets.

•Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Aggregator, Sequence Generator, and reusable Sequence Generator transformations.

•Extensively used SCDs (Slowly Changing Dimensions) to handle incremental loading for Dimension and Fact tables.

•Designed various mappings for extracting data from various sources involving Flat files, Oracle, Sybase and SQL Server, IBM DB2.

•Used DataFlux for data integration, migration, consolidation, and data quality.

•Used DataFlux Data Management Studio, which provides a unified development and delivery environment with a single point of control to manage data quality, data integration, master data management (MDM), and other data initiatives.

•Used InfoSphere MDM to design functions and web services.

•Managed IBM InfoSphere master data for single and multiple domains: customers, patients, citizens, suppliers, locations, products, service offerings, and accounts.

•Created Informatica Power Exchange registrations and data maps.

•Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.

•Designed and developed real-time data warehouse.

•Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM Hub using ETL tool that was coming from various source systems.

•Worked on Informatica Cloud and extracted data from a Salesforce source.

•Worked on Debugging and Troubleshooting of the Informatica application.

•Utilized the Informatica Debugger for debugging.

•Performed optimization and wrote pre- and post-load stored procedures to drop and rebuild constraints.

•Built the repository (RPD) using the OBIEE toolset.

•Developed OBIEE components such as Answers, Dashboards, and BI Publisher.

•Identified automated decision-making possibilities.

•Processed machine-generated data using Hadoop, Hive, and Pentaho; summarized data was then loaded into MySQL for reporting.

•Worked on Teradata utilities BTEQ, MLOAD, FLOAD, and TPUMP to load the staging area.

•Loaded data from files to the Hadoop cluster, summarized Big Data using Hive (via Pentaho), and loaded the summarized data into Oracle and MySQL for data visualization and ad-hoc reporting.

•Worked on audit logging, error logging, and reference checks while loading data to the Data Warehouse.

•Created UNIX scripts for ETL jobs, session log cleanup, and dynamic parameter files.

•Performed Unit testing, Integration testing and System testing of Informatica mappings.

•Wrote Unix scripts, Perl scripts for the business needs.

•Coded Unix Scripts to capture data from different relational systems to flat files to use them as source file for ETL process.
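A minimal sketch of the export pattern above (relational query to delimited flat file for downstream ETL); sqlite3 stands in for the actual source databases, and the delimiter and file layout are illustrative:

```python
import csv
import sqlite3

def export_to_flat_file(conn, query, path, delimiter="|"):
    """Dump a query result to a delimited flat file with a header
    row, suitable as a source file for an ETL process."""
    cur = conn.execute(query)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, delimiter=delimiter)
        writer.writerow([d[0] for d in cur.description])  # header row
        writer.writerows(cur)  # stream rows without loading all in memory
    return path
```

Writing a header row lets the downstream mapping validate column order before loading, which is one way such extract scripts guard against upstream schema drift.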

Environment: Informatica PowerCenter 9.5/9.1, Informatica Cloud, Kettle, IDQ, MongoDB, HDFS, DB2, Informatica Power Exchange 9.1, HQL Metadata Editor, Shell Scripting, Perl Scripting, IBM MDM 8.5, Windows XP, Putty, InfoSphere, DataFlux MDM, DB2 Mainframe, Informatica MDM, OBIEE 11g, Oracle Exadata, Teradata 13, MS SQL Server 2008 R2, T-SQL, Erwin 8, SQL Server 2008, TOAD, PL/SQL Developer, Linux, Unix.

Connecture, Inc - Brookfield, WI Oct’ 2014 to Jun’ 2015

Sr. Informatica Developer

Description: Connecture, Inc. (CNXR) is a leading web-based consumer shopping, enrollment and retention platform for health insurance distribution. Its solutions offer a personalized health insurance shopping experience that recommends the best-fit insurance plan based on an individual's preferences, health status, preferred providers, medications and expected out-of-pocket costs. Its customers are payers, brokers, government agencies, and web-based insurance marketplace operators, who distribute health and ancillary insurance.

The project (CMS Medicare) aimed to develop an integrated data mart for long-term decision-making, strategic planning, and support, giving mainstream users the ability to access and analyze data and helping stakeholders automate key functions in the insurance distribution process, allowing customers to price and present plan options accurately to consumers and to efficiently enroll, renew, and manage plan members.


Coordinated with various business users, stakeholders, and SMEs to obtain functional expertise, review designs and business test scenarios, participate in UAT, and validate data from multiple sources.

Created Mappings using Mapping Designer to load the data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter transformations.

Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.

Worked on Power Center Designer tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

Used Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot the invalid mappings.

Worked with existing Python Scripts, and also made additions to the Python script to load data from CMS files to Staging Database and to ODS.

Administered Informatica Power Center and Power Exchange CDC for multiple source/target databases including Oracle, SQL Server and DB2.

Worked on SQL tools like TOAD and SQL Developer to run SQL Queries and validate the data.

Scheduled Informatica Jobs through Autosys scheduling tool.

Extensively used advanced chart visualizations in Tableau, such as Dual Axis, Box Plots, Bullet Graphs, Tree Maps, Bubble Charts, Waterfall Charts, and Funnel Charts, to assist business users in solving complex problems.

Developed key indicators and appropriate tracking reports with graphical and written summations, using summaries and annotations to assist in quality-improvement initiatives.

Created Quick Filters, customized calculations, and conditional formatting for various analytical reports and dashboards.

Studied the existing system and conducted reviews to provide a unified view of the program.

Involved in creating Informatica mappings, Mapplets, worklets and workflows to populate the data from different sources to warehouse.

Created Global Prompts, Narrative Views, Charts, Pivot Tables, Compound layouts to design the dashboards.

Responsible for facilitating load testing and benchmarking the developed product against the set performance standards.

Used MS Excel, Word, Access, and Power Point to process data, create reports, analyze metrics, implement verification procedures, and fulfill client requests for information.

Used Teradata utility scripts (BTEQ, FLOAD, MLOAD, and TPUMP) to load data into Teradata tables.

Involved in testing the database using complex SQL scripts and handled the performance issues effectively.

Involved in Onsite & Offshore coordination to ensure the completeness of Deliverables.

Environment: Informatica PowerCenter 8.6.1/8.1.1, Tableau 9.2, Cognos 9, SQL Server 2008, IDQ 8.6.1, Oracle 11g, PL/SQL, TOAD, Putty, Autosys Scheduler, UNIX, Teradata 13, Erwin 7.5, ESP, WinScp

Farmers Insurance Group - Chicago, IL Jun’ 2012 to Oct’ 2014

ETL Developer

Description: Farmers Insurance Group is one of the main insurance groups in the US. They offer various kinds of policies for various lines of business and personal use, such as AUTO, FIRE, and LIFE. This project involved planning, development, implementation, maintenance, and support of BI and ETL applications for the Enterprise-wide Data Warehouse (EDW) section, extracting data from E-CMS, SCV, APPS, and FIPPS. The project, titled MLCDM (Multi-line Customer Data Mart), is intended to provide the information about quote and policy counts needed to make appropriate decisions about business and agents.


•Partly involved in gathering business requirements and worked closely with various Application and Business teams to develop Data Model, ETL procedures to design Data Warehouse.

•Involved in designing and developing the star schema model for the target database using ERwin data modeling.

•Extensively used ETL Informatica tool to extract data stored in MS SQL 2000, Excel, and Flat files and finally loaded into a single Data Warehouse.

•Used T-SQL to develop stored procedures and queries that served as sources for pulling data via Informatica.

•Used various active and passive transformations such as Aggregator, Expression, Sorter, Router, Joiner, connected/unconnected Lookup, and Update Strategy transformations for data control, cleansing, and data movement.

•Designed and developed Mapplets for faster development, standardization and reusability purposes.

•Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating Target tables for maintaining the history.

•Used Debugger to validate transformations by creating break points to analyze, and monitor Data flow.

•Tuned performance of Informatica Session by increasing block size, data cache size, sequence buffer length and Target based commit interval, and mappings by dropping and recreation of indexes.

•Worked along with the QA Team and provided production support by monitoring the processes running daily.

•Involved in pre and post session migration planning for optimizing data load performance.

•Interfaced with the Portfolio Management and Global Asset Management Groups to define reporting requirements and project plan for intranet applications for Fixed Income and Equities.

•Performed Unit testing during the mapping phase to ensure proper and efficient implementation of the transformations.

•Wrote UNIX Shell Scripts and pmcmd command line utility to interact with Informatica Server from command mode.

Environment: Informatica PowerCenter 8.6, MS SQL 2000, Oracle 9i, SQL, T-SQL, SQL Navigator, Erwin 4.1, UNIX Shell Scripting, Flat Files, Windows XP/2000/NT.

Hanover Insurance Group, India Apr’ 2011 to Jun’ 2012

ETL Developer

Description: This project collects quote data from POS applicable to Home Quote (such as information about the quote, prospect, and primary residence), structures it so it can be reported, and provides analysts with the ability to query and report quote information. These capabilities enable analysts to measure the effectiveness of price actions and their impact on loss performance and retention.


•Performed code reviews and test-results reviews.

•Writing the SQL Scripts for extracting data in the application and applying those Scripts in the Database of the Application according to the Requirement.

•Reviewed tests against unit test plans for other team members' development activities.

•Made code changes to existing Informatica mappings.

•Unit tested modified and new reports.

•Responsible for all deliverables of the scheduling application, including bug fixes as well as enhancements to the current application.

•Responsible for work allocation and monitoring during development phase.

•Responsible for all the clarifications/technical issues raised at offshore.

•Responsible for Unit tests and issue resolution as a result of new configuration / Code changes going into production.

•Responsible for ensuring code/functionality conformance to Hanover Insurance Group standards & guidelines.

•Responsible for achieving stability in the project by effective planning, scheduling and execution of activities.

•Interacted with end users to resolve issues at the time of issuing policies.

Environment: Informatica 8.6, Oracle, DB2, UNIX, Tivoli Maestro scheduler, QMF, Toad.

Veeda Clinical Research, India Jan’ 2009 to Apr’ 2011

Informatica Developer


•Performed initial analysis of raw data received from business users; the data differed from county to county and state to state.

•Worked on SQL*Loader to load data from flat files obtained from various vendors.

•Wrote SQL and PL/SQL programs to retrieve required information from the database.

•Created indexes on tables for faster data retrieval and improved database performance.

•Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, SQL Developer, and SQL*Plus.

•Handled errors extensively using exception handling, for ease of debugging and for displaying error messages in the application.

•Involved in ETL in order to meet the requirements for extraction, transformation, cleansing, and loading from source to target.

•Involved in ETL process from development to testing and production environments.

•Extracted data from various sources and loaded it into target database using Informatica Power Center 7.1.

•Tuned mapping and SQL queries for better performance and efficiency.

•Performed unit testing and validated the Data.

•Provided weekly status reports to the manager on progress and timelines.

•Performed performance tuning of the process at the mapping level, session level, source level and the target level.

Environment: Informatica 7.1/8.6, Oracle, SQL, PL/SQL, Unix, SQL Developer, SQL*Loader.


Bachelor of Technology, Computer Science Engineering

References available upon request
