
Informatica ETL Developer

Country: United States
State: Nebraska
City: Omaha
ZIP: 68124
Salary: 100k
Posted date: 9/29/2009
Contact Info: *********@*****.***

Resume Text:

Tincy Treesa Francis
Ph: 402-***-****

• 7+ years of experience in data warehousing technology: requirement analysis, data analysis, application design, and application development, including requirement gathering, customization, maintenance, and testing of various business applications in data warehouse environments; also experienced as a training specialist.
• Extensive hands-on expertise with Informatica 8.x/7.x/6.x/5.x and Erwin Data Modeler, with strong business understanding of the Banking, Insurance, Finance, Pharmaceuticals/Healthcare, and Telecom sectors.
• Good understanding of relational database management systems such as Oracle, Teradata, DB2, and SQL Server; worked extensively on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems and mainframe COBOL and VSAM files.
• Familiar with the Ralph Kimball and Bill Inmon methodologies and with automation of ETL processes using scheduling tools and exception-handling procedures.
• Good knowledge of installing and configuring PowerCenter 7/8.
• Experience writing shell scripts for various ETL needs; worked on integrating data from (mainframe) flat files.
• Development experience on Windows NT/2000/XP and UNIX (Solaris, HP-UX, AIX) platforms.
• Experience in Informatica mapping specification documentation and in tuning mappings to increase performance; proficient in using Informatica Server Manager to create and schedule workflows; expertise in Autosys and ESP scheduling.
• Experience leading teams; excellent communication and interpersonal skills, with the ability to quickly grasp new concepts, both technical and business-related, and apply them as needed.
• Strong skills in requirement gathering and study, gap analysis, scope definition, recommendations for business process improvements, development of procedures, forms & documentation, effort estimation, resource planning, project tracking, and UAT.

Educational Background

• BE (Bachelor of Engineering) in Electrical & Electronics from VTU, Karnataka, India


• Teradata V2R5 Certified Professional
• Informatica PowerCenter 7 Certified Designer (Level N Certification)

Technical Skills

ETL Tools: Informatica 5.x/6.x/7.x/8.x (PowerMart & PowerCenter)
Data Modeling Tools: Erwin 3.x/4.x
Databases: SQL Server 2005, Oracle 10g/9i, Teradata V2R5, DB2
Teradata Utilities: BTEQ, FastLoad, MultiLoad
Programming Languages: COBOL (Mainframe), SQL, PL/SQL, Unix Shell Scripting, C
Operating Systems: MS Windows 9x/NT/2000/XP, Red Hat Linux, UNIX (Sun Solaris 5.8)
Other Tools: Siebel, Cognos, TOAD 8.x, SQL Plus, Autosys

Professional Experience

BlueCross and Blue Shield of NE, Omaha, NE Sep’08 – present
ETL Developer

EDW, Medicare, Capsule and Recon: The Blue Cross enterprise data warehouse deals with different kinds of health claims, categorized as Facility, Professional, and FEP. Data comes from various sources such as Oracle, SQL Server, and mainframe, and is loaded into the EDW at different frequencies according to requirements. The entire ETL process spans source systems, a staging area, an ODS layer, the data warehouse, and data marts.

Roles & Responsibilities:

• Gathered requirements and created design documents and mapping documents.
• Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner; used PowerExchange for mainframe sources.
• Created metadata queries to support workflow restarts and to detect missing link conditions or disabled tasks in a workflow.
• Fine-tuned existing Informatica maps for performance optimization.
• Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table so they can be corrected and reloaded into the target system.
• Involved in unit testing and system testing.
• Worked on bug fixes on existing Informatica Mappings to produce correct output.
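The error-severity pattern in the bullets above can be sketched as follows. This is a minimal Python illustration; the column names, severity levels, and rules are invented for the example, not taken from the actual BCBS NE mappings.

```python
# Hypothetical sketch of severity-based error routing: each failed row gets a
# severity and is written to an error "table" (here a plain list) so it can be
# corrected and reloaded. All field names below are illustrative assumptions.

def classify_row(row):
    """Return a severity for a claim row, or None if the row is clean."""
    if row.get("claim_id") is None:
        return "FATAL"      # unusable without a key
    if row.get("amount", 0) < 0:
        return "ERROR"      # needs correction before reload
    if not row.get("member_id"):
        return "WARNING"    # loadable, but flagged for follow-up
    return None

def route_rows(rows):
    """Split rows into (target_rows, error_rows_with_severity)."""
    target, errors = [], []
    for row in rows:
        severity = classify_row(row)
        if severity in ("FATAL", "ERROR"):
            errors.append({**row, "severity": severity})
        else:
            target.append(row)
    return target, errors
```

In the real mappings this routing would be implemented with Informatica transformations (e.g. Expression and Router) writing to an error table; the sketch only mirrors the control flow.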

Environment: Informatica PowerCenter 8.6, Informatica PowerExchange, Oracle 10g, MS SQL Server 2005, Windows, Mainframe.

AvivaUSA, Des Moines, IA Jan’08 – Aug’08
Senior ETL Programmer

AdminServer Reporting, ProductionServer Reporting & 403B Reporting: AvivaUSA gets source data for insurance policy owners, agents, beneficiaries, payees, etc. from various source systems such as AS400, Oracle, SQL Server, DB2, and mainframe. Data from these sources is acquired into the stage database, where it is cleansed and aggregated, then populated into the ODS layer and on to the data warehouse (for maintaining history) and data marts (for reporting requirements). Data integration is done using Informatica.

Roles & Responsibilities:

• Involved in gathering of business scope and technical requirements and created technical specifications.
• Developed complex mappings, including SCD Type I, Type II, and Type III mappings, in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL transformations. Created complex mapplets for reuse.
• Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
• Created synonyms for copies of time dimensions, used the Sequence Generator transformation to create sequences for generalized dimension keys, the Stored Procedure transformation for encoding and decoding functions, and the Lookup transformation to identify slowly changing dimensions.
• Fine-tuned existing Informatica maps for performance optimization.
• Worked with the Informatica Designer tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - and with Server Manager to create and monitor sessions and batches.
• Developed Informatica mappings and tuned them for better performance.
• Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table so they can be corrected and reloaded into the target system.
• Involved in unit testing, event & thread testing, and system testing.
• Analyzed existing system and developed business documentation on changes required.
• Made adjustments in Data Model and SQL scripts to create and alter tables.
• Extensively involved in end-to-end testing of the system to ensure the quality of the adjustments made to accommodate the source system upgrade.
• Worked on various issues in existing Informatica mappings to produce correct output.
• Documented all phases: analysis, design, development, testing, and maintenance.
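The SCD Type II behavior referenced in the bullets above (expire the current row, insert a new current version so history is preserved) can be illustrated with a small in-memory sketch; the key/attribute layout is an assumption for the example, not Aviva's schema.

```python
# Minimal in-memory sketch of SCD Type II versioning. In Informatica this is
# typically built with Lookup + Update Strategy transformations; here the
# "dimension" is just a list of dicts with invented column names.
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """dimension: rows with keys key, attrs, eff_date, end_date, current.
    incoming: dict with the natural key plus the tracked attributes."""
    today = today or date.today()
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attrs"] == incoming["attrs"]:
                return dimension            # no change: nothing to do
            row["current"] = False          # expire the old version
            row["end_date"] = today
            break
    dimension.append({"key": incoming["key"], "attrs": incoming["attrs"],
                      "eff_date": today, "end_date": None, "current": True})
    return dimension
```

A Type I mapping would instead overwrite the attributes in place, and Type III would keep a single "previous value" column; Type II is the only variant that keeps full history as new rows.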
Environment: Informatica PowerCenter 8.5, Oracle 10g, MS SQL Server 2005/2000, Linux/UNIX Shell Scripting, DB2, AS400, Mainframe, TOAD, SQL Plus, ESP Scheduler.

Pfizer, New London, CT Oct’06 – Dec’07
ETL Lead Programmer

Pfizer DIF QDAM (Development Information Factory Quality Data Acquisition Management): The goal of this project was to create a Development Information Factory consisting of an ODS, a data warehouse, and data marts. Data from the different source systems is integrated into the staging area; the QDAM rules, exceptions log, and tracking log are acquired from the QDAM application database into staging, and the ODS, DW, and DM are then built. ODS: the Operational Data Store contains the cleaned, transformed, and integrated data from the 4 different sources. DW: the data warehouse maintains historical data from the source systems along with the history of the quality rules, exceptions log, and tracking log. DM: the data marts provide a consolidated reporting environment for WWD, meeting the decision-support information needs of the organization. Development of the DIF proceeded in stages:
• Stage 1 – ETL to acquire the data from the 4 sources and the QDAM application data into the stage DB (about 150 mappings).
• Stage 2 – ETL to load tracking, exception, and QDAM rules history data from staging into the DIF data warehouse (about 10-15 mappings).
• Stage 3 – ETL to apply the QDAM rules and load the data passing the rules into the DW (about 10-15 mapplets).
• Stage 4 – Modified existing ETL in the Acquisition and Integration groups to include a join/filter to the QDAM exception tables so that this data is reprocessed (113 Integration mappings, 23 Acquisition mappings, 208 Acquisition/Integration mappings).
• Stage 5 – ETL to load the exceptions data mart (10-15 mappings).
Roles & Responsibilities:

• Participated in all phases, including requirement analysis, client interaction, design, coding, testing, and documentation.
• Developed Informatica SCD Type I, Type II, and Type III mappings and tuned them for better performance. Extensively used nearly all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets, and others.
• Debugged mappings by creating logic that assigns a severity level to each error and routes error rows to an error table so they can be corrected and reloaded into the target system.
• Database design and development.
• Involved in writing test scripts, unit testing, system testing, and documentation.
• Designed and developed UNIX shell scripts to handle pre- and post-session processes.
• Extracted data from legacy systems and loaded it into the Oracle data warehouse.
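Pre/post-session wrapper scripts like those described above typically end up driving PowerCenter's pmcmd CLI. The sketch below only builds and echoes the command so it can be inspected without a live Integration Service; pmcmd and its startworkflow mode are real, but the exact flag set varies by PowerCenter version, so treat the flags and all names here as assumptions.

```python
# Hedged sketch of a session wrapper around PowerCenter's pmcmd CLI.
# Service, domain, folder, and workflow names are invented examples.
import subprocess

def build_pmcmd_start(service, domain, user, folder, workflow):
    """Assemble a pmcmd startworkflow invocation (flag names per PC 8.x docs;
    verify against your installed version)."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-pv", "PM_PASSWORD",  # password from an env variable
            "-f", folder, "-wait", workflow]   # -wait blocks until completion

def run_workflow(cmd, dry_run=True):
    """Echo the command by default; set dry_run=False on a real install."""
    if dry_run:
        return " ".join(cmd)
    return subprocess.run(cmd, check=True)
```

Keeping the password in an environment variable (rather than on the command line) is the usual reason wrapper scripts exist at all; the post-session half would typically check the exit status and archive session logs.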

Environment: Informatica PowerCenter 8.1.1, Oracle 10g, MS SQL Server 2005, Linux/UNIX Shell Scripting, ERWIN 4.x, TOAD, SQL Plus, Cognos.

Charles Schwab, San Francisco, CA Jan’05 – Sep’06
Sr. ETL Programmer

Integrated Datawarehouse (IDW): Charles Schwab embarked on a strategic initiative to build an Integrated Data Warehouse (IDW) to enable fact-based decision-making. The data warehouse will subsequently serve as a firm foundation for business intelligence and data analytics. The subject areas covered in this phase are CUSTOMER, TRADES, ORDER, ITEM, BPT, ACCOUNT, PROFITABILITY, ASSET TRACKING, and SECURITY. The process loads bank customer and account details and household-wise/category-wise Exclusive, Qualified, and Practice revenue amounts, which are used by the decision support system for analysis.

Roles & Responsibilities:

• Gathered and analyzed requirements in meetings with business stakeholders and other technical team members prior to ETL design.
• Interpreted high-level ETL requirement specifications to develop low-level designs (LLDs) for SCD Type I, Type II, and Type III mappings, and performed rigorous testing across various test cases.
• Prepared QA and PROD migration documents after development and testing in the development environment, and executed quality assurance testing.
• Developed complex mappings as per the requirement using almost all transformations and effectively used Debugger for identifying errors and resolving them.
• Performed Informatica performance tuning and query tuning.
• Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
• Served as team leader responsible for assigning tasks to team members, creating metrics reports and weekly status reports, and preparing the project plan, quality-related documents, and migration documents. Used SharePoint for document maintenance and tracking.
Environment: Informatica PowerCenter 7.1.2, Oracle 9i, Teradata V2R5, Linux/UNIX Sun Solaris 5.8, UNIX Shell Scripting, TOAD, Mainframe, Teradata SQL Assistant, SQL Plus.

WellPoint Insurance, San Diego, CA Nov’03 – Dec’04
ETL Programmer

WellPoint is the nation's leading health benefits company, serving the needs of approximately 34 million medical members nationwide and offering a broad range of medical and specialty products. WellPoint is a Blue Cross or Blue Cross Blue Shield licensee in 14 states: California, Colorado, Connecticut, Georgia, Indiana, Kentucky, Maine, Missouri, Nevada, New Hampshire, New York, Ohio, Virginia, and Wisconsin.
The goal of the Enterprise Data Warehouse (EDW) program is to create a single, authoritative source for analytical reporting. The project gets data into the EDW using the Persistent Data Source (PDS) as input: taking the PDS as input and applying all business rules and transformations, the data is loaded into the target database, which runs on Universal Database (UDB). Loading of data into the EDW is done with the ETL tool Informatica and is scheduled using the Informatica scheduler.

Roles & Responsibilities:

• Replicated operational tables into staging tables, transformed and loaded data from the legacy systems into enterprise data warehouse tables using Informatica, and loaded the data into targets through the ETL process by scheduling the workflows.
• Developed Informatica objects - mappings, sessions, and workflows - based on the prepared low-level design documents.
• Developed the SQL used to apply all business rules to the data before loading into the target tables.
• Implemented optimization techniques for performance tuning and wrote the necessary pre- and post-session shell scripts.
• Performed testing and knowledge transfer, and provided technical support and hands-on mentoring in the use of Informatica.
• Debugged Informatica mappings and validated the data in the target tables after each load.
• Used Erwin Data Modeler to maintain consistency between the dimensional model and the database.
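The "apply business rules before load" step in the bullets above can be sketched as an ordered list of rule predicates and transforms applied to each staged row before it reaches the target table. The rules themselves are invented for illustration; the real rules lived in SQL and Informatica mappings.

```python
# Illustrative rule pipeline: each rule is (name, applies?, transform),
# applied in order to a staged row. Rule names and fields are assumptions.

RULES = [
    ("uppercase_state", lambda r: True,
     lambda r: {**r, "state": r["state"].upper()}),
    ("default_plan",    lambda r: r.get("plan") is None,
     lambda r: {**r, "plan": "STANDARD"}),
]

def apply_rules(row, rules=RULES):
    """Return a new row with every applicable rule applied, in order."""
    for _name, applies, transform in rules:
        if applies(row):
            row = transform(row)
    return row
```

Ordering matters in this shape: a later rule sees the output of earlier ones, which mirrors how chained Expression transformations behave inside a mapping.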
Environment: Informatica (Power center 6.1), Oracle 9i, TOAD 7.8, UNIX, SQL Server, Erwin Data Modeler 4.1, Cognos.

Procter & Gamble, Cincinnati Nov’02 – Oct’03
ETL Programmer/Analyst

CPG, RSS, and Delphi are three projects that populate the P&G data warehouse to meet different P&G user needs. This engagement covered production support for CPG, RSS, and Delphi, along with any future enhancements.
Procter & Gamble is customizing the Siebel application to bring existing Commercial Product Group (CPG) customers, products, and other associated entities into Siebel, giving CPG sales people and managers more information about their accounts. CPG handles the non-retailing part of the P&G business; its customers are mainly organizations that buy P&G products for non-retailing purposes. RSS is the division of P&G that handles the retailing part of the business. RSS has retail merchandisers who visit stores to make sure P&G products are displayed optimally to maximize sales, fix out-of-stocks, fix display tags, etc.
Delphi is an analytics dashboard intended to showcase one complete version of key business metrics to P&G senior management; specifically, the target audience is senior managers in Customer Business Development across the US, Canada, and Puerto Rico. Delphi currently enables these three entry points to analyze the data holistically from each of their perspectives.

Roles & Responsibilities:

• Converted user-defined functions of the business process into Informatica-defined functions.
• Developed and supported the extraction, transformation, and load (ETL) process for a data warehouse from legacy systems using Informatica, and provided technical support and hands-on mentoring in the use of Informatica.
• Debugged Informatica mappings and validated the data in the target tables after each load.
• Prepared and implemented data verification and testing methods for the data warehouse, designed and implemented data staging methods, and stress-tested ETL routines to make sure they don't break under heavy loads.
• Performed ETL administrator tasks such as maintaining folder/user-level privileges and scheduling jobs based on target-system dependencies.
Environment: Informatica (Power center 6.1), Oracle 9i, TOAD 7.8, UNIX, Erwin Data Modeler 4.1, Cognos

HP, Bangalore, India Sept’01 – Oct’02
ETL Programmer/Analyst

GSCB – BIDW Tool (Global Solutions Center – Bangalore) is a web-based application developed to integrate various data sources across businesses so that GSCB operations have a centralized and consistent data store. It provided a detailed cockpit view of daily/weekly/monthly/quarterly/annual reports for each operational tower and its processes, giving all stakeholders a single window for tracking and reviewing metrics and reporting details, and automating the reporting process. It is a central repository of data/metrics for historical reference and statistical analysis, and provides automated mail alerts to stakeholders for monitoring and tracking of operations.

Roles & Responsibilities:

• Involved in the design and development of data models using Erwin Data Modeler.
• Developed mappings, sessions, and workflows (ETL) for SCD Types I, II, and III to meet the above requirements.
• Involved in analysis, documentation, and testing of workflows.
• Used mapplets and various transformations to implement complex logic.
Environment: Informatica (Power center 6.1), Oracle 8i, TOAD 7.8, UNIX, Erwin Data Modeler, Cognos.

CDAC, Trivandrum, India Mar’01 – Aug’01
Mainframe Developer.

Credit Online Information System is designed for business-to-business and business-to-customer transactions; the business problem dealt with financial transfers and management of customers. When an account is opened, the account number is generated automatically, and different types of reports can be generated. Software was developed to inquire, add, update, browse, and delete the details held in a DB2 table called Account Master; for every debit or credit entry against a valid account number, the master is updated. Software was also developed to report the least-transacted accounts based on last transaction date, and to handle credit based on available credit and the maximum credit period. A VSAM file is maintained for archiving records.

Roles & Responsibilities: Developed programs in COBOL with embedded DB2 SQL and CICS maps.
Environment: Mainframe OS/390, VS COBOL, DB2, CICS.
