Post Job Free

Informatica Developer

Location:
Arizona City, AZ, 85123
Posted:
August 05, 2016


Resume:

Vinesh

214-***-**** x *** ******@****************.*** /**************@*****.***

SUMMARY:

Experience as a Software Engineer / Informatica Developer for over nine years in developing large-scale Data Warehouse and Client/Server Applications, including Data Profiling, Data Migration, Data Modeling, Business Process, Design and Development, Integration and Testing, Database Programming, SDLC, Project Management and Production Support.

Experience in Health, Financial, and Insurance domains.

Experience in complete Software development life cycle (SDLC) with a strong background in Design/Modeling, database development and implementation of various business intelligence and data warehouse / datamarts projects that cover gathering Business Requirements, Development, Implementations and Documentation.

Strong work experience with business users to analyze the business process model and make necessary changes to schema objects to cater to users' reporting needs.

Extensive experience in ETL methodologies for supporting Data Migration, Data Exchange, Data Transformation using Informatica Power Center v 9.x/8.x/7.x/6.x/5.x Suite.

Implemented Informatica naming standards and best practices for the full life cycle of data warehouse projects, from design and development through go-live and support.

Well acquainted with Informatica Designer Components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet and Mapping Designer, Workflow Manager and Monitor.

Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) - Star and Snowflake schemas, SCDs, surrogate keys, and normalization/de-normalization.

Data modeling experience in designing and implementing Data Mart and Data Warehouse applications using Erwin.

Experience in integration of various data sources like Oracle, DB2, COBOL copybooks, SQL Server, Flat Files, Excel, and XML files

Worked on Slowly Changing Dimensions (SCDs) and their implementation (Type 1, Type 2, and Type 3) to keep track of historical data.

Experience in using Business Objects to create reports

Extensively used various Performance Tuning techniques to improve the performance for data loads

Experience interacting with business users and architecting high-level requirements, high-level design, and detailed design documents.

Strong Quality Assurance and debugging skills in ETL Process

Solid understanding of OLAP concepts and challenges with large sets of data

Proficient in Toad, SQL navigator, and UNIX Shell Scripting

TECHNICAL SKILLS:

Data Warehousing ETL : Informatica Data Quality (IDQ) 9.x, Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor and Informatica Server), Power Exchange 8.0, Repository, Metadata, Data Mart, OWB 11gR2, OLAP, OLTP

Data Modeling : Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Entities, Attributes, Cardinality, ER Diagrams

Databases : Oracle 11g/10g/9i/8i/7.3, MS SQL Server 7.0/2000/2005/2008, Teradata, DB2, OLTP

Programming : SQL, T-SQL, PL/SQL, SQL*Loader, UNIX Shell Scripting, SQL Tuning/Optimization, C, Java, HTML, Perl, PHP

Tools : TOAD, Business Objects, SQL Developer, Excel, Word, Autosys, Control-M, MS Access

Reporting Tools : Oracle Business Intelligence, Cognos, Business Objects.

Environment : UNIX, Windows XP/Vista, Linux.

PROFESSIONAL EXPERIENCE:

American Express, AZ

Sr ETL Developer

March '15 – Present

American Express, also known as Amex, is a multinational financial services corporation. The project was a migration of CCP data from the Avaya system to the Cisco system. This data is used to evaluate each CCP and provide incentives.

Responsibilities:

Worked closely with Business Analyst and the end users in writing the functional specifications based on the business requirement needs.

Responsible for the development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 9.1.

Involved in massive data cleansing prior to data staging

Experience with high-volume datasets from various sources like text files and relational tables, and XML targets.

Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.

Extensively used transformations like Router, Aggregator, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure. Knowledge in the use of SQL and Java transformations.

Implemented Type 1 and Type 2 loading methodologies to keep historical data in the data warehouse.
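
The Type 2 pattern above can be sketched in SQL. This is a minimal illustration, not the project's actual code: the dimension table, columns, bind variables, and sequence names are all hypothetical placeholders.

```sql
-- Hypothetical Type 2 load step: expire the current row, then insert the
-- new version. Table, column, and sequence names are illustrative only.
UPDATE dim_customer
   SET eff_end_date = CURRENT_DATE - 1,
       current_flag = 'N'
 WHERE customer_id = :in_customer_id
   AND current_flag = 'Y';

INSERT INTO dim_customer
    (customer_key, customer_id, customer_name,
     eff_start_date, eff_end_date, current_flag)
VALUES
    (seq_dim_customer.NEXTVAL, :in_customer_id, :in_customer_name,
     CURRENT_DATE, DATE '9999-12-31', 'Y');
```

In PowerCenter this logic typically lives in an Update Strategy transformation routing rows to DD_UPDATE and DD_INSERT rather than in hand-written SQL.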

Implemented incremental logic to load the data.

Created e-mail notification tasks using post-session scripts.

Worked with the command-line program pmcmd to interact with the server to start and stop sessions and batches, stop the Informatica server, and recover sessions.
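
Such pmcmd calls can be sketched as below. The service, domain, folder, and workflow names are hypothetical placeholders, and the script only builds and prints the commands rather than executing them, so it runs without an Informatica installation.

```shell
#!/bin/sh
# Illustrative sketch only: the service, user, folder, and workflow names
# below are hypothetical placeholders, not the project's actual values.
INFA_SERVICE="IntSvc_Dev"
INFA_USER="etl_user"
FOLDER="CCP_MIGRATION"
WORKFLOW="wf_ccp_daily_load"

# Build the pmcmd start command (-pv names an env var holding the password);
# a real scheduler would execute it, here we only print it.
START_CMD="pmcmd startworkflow -sv $INFA_SERVICE -d Domain_Dev -u $INFA_USER -pv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"
echo "$START_CMD"

# Stopping a running workflow follows the same pattern:
STOP_CMD="pmcmd stopworkflow -sv $INFA_SERVICE -d Domain_Dev -u $INFA_USER -pv INFA_PASSWD -f $FOLDER $WORKFLOW"
echo "$STOP_CMD"
```

The `-wait` flag makes pmcmd block until the workflow finishes, which lets a calling script check the exit status before kicking off dependent loads.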

Designed and Developed the Crystal Reports using the Views and Stored Procedures.

Created procedures to drop and recreate the indexes in the target Data warehouse before and after the sessions.
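
A minimal sketch of such procedures, assuming an Oracle target; the index and table names are hypothetical, not the project's actual objects.

```sql
-- Illustrative PL/SQL, called from pre-/post-session stored-procedure steps.
-- Index and table names are placeholders only.
CREATE OR REPLACE PROCEDURE drop_target_indexes IS
BEGIN
  EXECUTE IMMEDIATE 'DROP INDEX idx_fact_sales_date';
END;
/

CREATE OR REPLACE PROCEDURE rebuild_target_indexes IS
BEGIN
  EXECUTE IMMEDIATE
    'CREATE INDEX idx_fact_sales_date ON fact_sales (sale_date)';
END;
/
```

Dropping indexes before a bulk load and rebuilding them afterwards avoids per-row index maintenance during the session, which is usually faster for large loads.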

Debugged invalid mappings using breakpoints; tested stored procedures and functions, Informatica sessions and batches, and the target data.

Environment: Informatica PowerCenter 9.1/9.6, SQL Server 2008, DB2, UNIX, PL/SQL, Windows 7, Business Objects, SQL Developer 2005

UNUM, Portland, ME May '13 – Feb '15

Sr ETL/ Informatica Developer

Unum is a leading provider of financial protection benefits in the United States and the United Kingdom. Unum's employee benefits portfolio includes disability, life, accident and critical illness insurance, which help protect millions of working people and their families in the event of illness or injury.

Responsibilities:

Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets

Used IDQ to perform data profiling and create mappings.

Involved in migration of the mappings from IDQ to PowerCenter.

Developed rules and mapplets that are commonly used in different mappings

Used various transformations like Address Validator, Parser, Joiner, Filter, and Match to develop the mappings.

Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.

Worked on PowerCenter tools like Designer, Workflow Manager, Workflow Monitor and Repository Manager.

Implemented SCD Type 1 and Type 2 mappings to capture new changes and maintain historic data.

Worked on extract and load type of mappings

Worked on SQL Server, Oracle, DB2, and Teradata databases.

Worked with the command-line program pmcmd to interact with the server to start and stop sessions and batches, stop the Informatica server, and recover sessions.

Involved in creating stored procedures and using them in Informatica.

Developed UNIX shell scripts to schedule Informatica sessions.

Responsible for Performance Tuning at the Mapping Level and Session level.

Used Debugger to troubleshoot the mappings.

Created Web Intelligence Ad-hoc and canned reports

Created alerts, filters and inserted Breaks in designing reports.

Responsible for migration of the work from dev environment to testing environment

Responsible for solving the testing issues

Created documents with detailed explanations of the mappings, test cases, expected results and actual results.

Environment: Informatica PowerCenter 9.1, Informatica Data Quality, SQL Server 2008, DB2, UNIX, PL/SQL, Windows 7, Business Objects, Oracle 11g, SQL Developer 2005

MVP Health Care, Schenectady, NY Aug’11 – April’13

ETL Developer

MVP is a nationally recognized, not-for-profit health plan that has provided benefits to members for more than 30 years. The current project, FRDM Stabilization, is a conversion from the existing ETL tool to Informatica.

Responsibilities:

Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets

Followed the old ETL tool, Sagent, to develop technical design documents, then used them to develop mappings in Informatica.

Worked on PowerCenter tools like Designer, Workflow Manager, Workflow Monitor and Repository Manager.

Created new mappings and enhancements to the old mappings according to changes or additions to the Business logic.

Worked on PROVIDER, CLAIMS, MEMBER, EDI, EITTR, WEBPROVIDERS subject areas

Developed complex mappings using different transformations like Source Qualifier, connected Lookup, unconnected Lookup, Expression, Aggregator, Joiner, Filter, Normalizer, Sequence Generator and Router.

Worked on several extracts and loads type of mappings.

Worked on XML Source Files.

Worked with SQL Override in the Source Qualifier and Lookup transformation.

Extensively used various Functions like LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, Decode, Substr, Instr and IIF function.
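
Typical port expressions built from these functions might look like the following sketch. This uses Informatica expression syntax, and the port names (MEMBER_CODE, SVC_DATE_STR, STATUS_CD) are hypothetical, not the project's actual ports.

```
-- Trim a code and default it when null:
IIF(ISNULL(LTRIM(RTRIM(MEMBER_CODE))), 'UNKNOWN', LTRIM(RTRIM(MEMBER_CODE)))

-- Convert a string to a date only when it parses:
IIF(IS_DATE(SVC_DATE_STR, 'YYYYMMDD'), TO_DATE(SVC_DATE_STR, 'YYYYMMDD'), NULL)

-- Map a status code to a description, with a default:
DECODE(STATUS_CD, 'A', 'Active', 'T', 'Terminated', 'Other')
```

Guarding TO_DATE with IS_DATE prevents session failures on malformed source dates; the bad rows can instead be routed to an error target.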

Developed Re-Usable Transformations and Mapplets.

Used Update Strategy DD_INSERT and DD_UPDATE to insert and update data when implementing the Slowly Changing Dimension logic.

Developed SCD 1 and SCD 2 to capture new changes while maintaining the historic information.

Reduced the amount of data moving through flows, which significantly improved mapping performance.

Designed workflows that use multiple sessions and command tasks (which are used to run the UNIX scripts).

Worked on SQL Server, Oracle, and Sybase databases.

Used UNIX scripts for scheduling and executing Informatica workflows.

Responsible for solving testing issues

Worked extensively on unit testing and prepared efficient unit test documentation for the developed code to make sure the test results match the client requirements.

Prepared detailed documentation of the developed code for QA, to be used as a guide for future migration work.

Environment: Informatica 9.1/8.6.3, SQL Server 2008/2005, Oracle 11g, PL/SQL, UNIX, Virtual Windows XP, Windows 7, IBM Lotus, SQL Developer 2005, Sun Solaris

CMA (MDWNY-DOH), Albany, NY Dec’10 – June’11

ETL/Informatica Developer

The Medicaid Data Warehouse (MDW) is the replacement for the eMedNY Medicaid data warehouse, which accepts data from the eMedNY Medicaid online transaction processing (OLTP) system, external agency transactional and reporting systems, and external vendor databases. This data is transformed and loaded into an Oracle database designed to support analysis, research and reporting. The process that converts this operational data into analytical data is called Extract, Transform, and Load (ETL).

Responsibilities:

Collected requirements from Business Users, analyzed and prepared the technical specifications

Used the ETL tool Informatica 8.6.3/9.1 to extract data from source systems, then cleanse, transform, and load the data into target databases.

Developed mappings using PowerCenter Designer for data transformation per the technical requirements.

Designed mappings for different subject areas: CLAIMS, DENIED CLAIMS, PROCEDURE, PROVIDER, REFERENCE, MEMBER, WMS.

Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.

Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.

Used Update Strategy DD_INSERT, DD_UPDATE, DD_DELETE and DD_REJECT to insert, update, delete and reject items based on the requirement.

Implemented slowly changing dimensions (SCD) to access the full history of customers and their transactions. Implemented Type 1, 2, and 3 changes in slowly changing dimension tables.

Worked extensively with Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading
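
An incremental-load parameter file for such a setup might look like the fragment below. The folder, workflow, session, and parameter names are hypothetical placeholders, not the project's actual objects; the `[folder.WF:workflow.ST:session]` heading scopes the values to one session.

```
[MDW_FOLDER.WF:wf_claims_incremental.ST:s_m_load_claims]
$$LAST_EXTRACT_DATE=2011-06-01
$$SOURCE_SCHEMA=STG
$DBConnection_Source=Oracle_Src_Conn
```

At each run the mapping reads $$LAST_EXTRACT_DATE to filter the source to new rows, and the value is advanced (often by a post-session script or persisted mapping variable) before the next run.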

Created mappings using Teradata, COBOL, and DB2 sources.

Used Reusable Transformations and Reusable Mapplets for different validations

Built PL/SQL procedures and functions as part of custom transformations.

Analyzed newly converted data to establish a baseline measurement for data quality in data warehouse.

Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Server Manager.

Involved in Version control of the jobs to keep track of the changes in the Development Environment.

Designed views and materialized views to create accumulated data based on the detail facts.

Extensively worked on UNIT TESTING and created different unit test case documents for different subject areas.

Designed workflows that run a set of mappings one after another; results such as execution time, start time, end time and status are stored in a run-time repository that other users can access.

Created and scheduled sessions and jobs (on demand, at a set time, and run-once) using Workflow Manager.

Used debugger to test the mapping and fixed the bugs

Developed UNIX shell scripts to schedule the jobs for running

Tuned and optimized mappings to reduce ETL run times thereby ensuring the mappings ran within the designated load window.

Documented the purpose of each mapping to help personnel understand the process and incorporate changes when necessary.

Coordinated with QA and BI reporting teams on any data-related issues.

Environment: Informatica 9.1/8.6.3, SQL Server 2008/2005, Oracle 11g, OBIEE, PL/SQL, Teradata, UNIX, Virtual Windows XP, Windows 7, SQL Developer 2005, Sun Solaris

Citigroup, Wall Street, NY Apr '09 – Nov '10

ETL/ Informatica Developer

The project, for Citigroup, involved disbursal of loan amounts for various purposes: personal loans, vehicle loans, housing loans, consumer durable loans, etc. The company requires different levels of analysis regarding loan amount, type of customers, type of payment schedules, interest rates (variable or fixed), defaulters list, penal interest calculations, etc. The data warehouse captures data from their transactional database maintained under a client/server architecture.

Responsibilities:

Worked closely with Business Analyst and the end users in writing the functional specifications based on the business requirement needs.

Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin 4.5 to design the business process, dimensions and measured facts.

Responsible for the development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 8.6.1.

Migrated mappings from Informatica 8.1.1 to Informatica 8.6.1, which includes grid technology.

Gained an understanding of the domain and nodes, and used the Informatica Integration Service to run the workflows in 8.6.1.

Involved in massive data cleansing prior to data staging.

Created mapping using XML, COBOL sources.

Worked on PowerExchange to access data from different sources and avoid manual coding.

Designed an innovative change data capture process for suppliers and millions of accounts every business day.

Experience with high-volume datasets from various sources like Oracle, text files and relational tables, and XML targets.

Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.

Extensively used transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure. Knowledge in the use of SQL and Java transformations.

Created e-mail notification tasks using post-session scripts.

Worked with the command-line program pmcmd to interact with the server to start and stop sessions and batches, stop the Informatica server, and recover sessions.

Created procedures to drop and recreate the indexes in the target Data warehouse before and after the sessions.

Implemented Type 1 and Type 2 methodologies in ODS table loading to keep historical data in the data warehouse.

Debugged invalid mappings using breakpoints; tested stored procedures and functions, Informatica sessions and batches, and the target data.

Tuned complex mappings at the source, target, mapping, and session levels.

Involved in performance tuning of the ETL processes.

Wrote SQL, PL/SQL, stored procedures & triggers, cursors for implementing business rules and transformations.

Developed UNIX Shell Scripts for scheduling the sessions in Informatica.

Used SQL tools like TOAD 9.5 to run SQL queries and validate the data in warehouse.

Environment: Informatica PowerCenter 8.6.1/8.1.1, OBIEE 10.1.3.4, Oracle 11g, Informatica PowerConnect/PowerExchange, MS Access, TOAD, XML, PL/SQL, SQL Server 2008, Windows, UNIX

Mutual of Omaha, Omaha, NE Aug 08 – Mar 09

ETL/Informatica Developer

Mutual of Omaha is one of the nation's leading P&C insurance providers, offering a variety of products and services including life, property, auto, business and home insurance. The primary objective of the project was to build a data warehousing system for life insurance customers holding different insurance policies.

Responsibilities:

Developed several Informatica load maps using PowerCenter/PowerMart for extraction, loading and transformation (ETL) of data into Oracle database.

Developed several Informatica error maps to perform Error handling by capturing errors and error messages and inserting them into the Error tables.

Used Star schema and Snowflake schema in relational, dimensional and multidimensional modeling

Extracted data from flat files and an Oracle database, applied business logic using several transformations, and loaded the data into the central Oracle database.

Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, batches/worklets and workflows.

Used Informatica to load into the Teradata database.

Created parameter files for load start time, load end time and status code.

Wrote SQL, PL/SQL, stored procedures & triggers, cursors for implementing business rules and transformations.

Used debugger to test the data flow and fix the bugs before running the sessions

Tuned the mappings at session level by setting the buffer size, cache size, increasing commit intervals etc.

Used TOAD to import data from different schemas and to export the query results into XLS sheets, text files and several other formats.

Created complex mappings using Aggregator, Expression, Router, Sequence Generator, Update Strategy, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.

Developed UNIX scripts to automate load process and email the status to specific distribution lists.

Used SSIS to automate SQL Server database loads.

Migrated the sessions, workflows into production.

Generated reports based on user requirements.

Environment: Informatica 7.1.3, Oracle 10.2, MS SQL Server 2005, PL/SQL, TOAD, DB2, Microsoft Excel, OBIEE, Windows NT, Rational Rose.

Wesgrow Software Ltd, India Nov 06 – July 08

ETL/Informatica Developer

Wesgrow SW is a customer-oriented and quality-conscious software house that strongly believes in the customer's business needs driving the technology and application needs. Credit Card Management System (CCMS) is a web-based software system for credit card management that provides Internet access to bank commercial card customers. CCMS allows cardholders to view their statements and associated transactions, and provides billing and invoicing to customers.

Responsibilities:

Extensively involved in implementing the customer data mart as well as the data warehouse.

Analyzed the source data coming from Oracle, SQL server, and flat files.

Extracted data from different source systems like Oracle, SQL server, flat file.

Worked with Star schema and Snowflake schema for the data warehouse.

Extensively used Informatica PowerCenter for extracting, transforming and loading into different databases.

Designed, developed and tested the different Mappings according to Requirements.

Analyzed the load dependencies and scheduled the workflows accordingly to avoid the loading conflict issues.

Analyzed Session Log files in case the session fails in order to resolve errors in mapping or session configurations.

Developed UNIX shell scripts as part of the ETL process for scheduling.

Extensively worked in Oracle SQL, PL/SQL, SQL*Plus and SQL*Loader; created database objects like tables and indexes.

Performed unit testing, knowledge transfer and mentored other team members.

Environment: Informatica PowerCenter 5.1.1, Oracle 9i, MS SQL SERVER 2000, SQL, PL/SQL, SQL* Loader, UNIX Shell Script.
