
ETL Informatica

Location:
Mount Laurel, NJ
Posted:
December 01, 2016


Professional Summary:

** ***** ** **** ********** in ETL (Extraction, Transformation and Loading) of data from various sources into EDW, ODS and Data marts using Informatica Tools in Insurance, Retail, Banking and Health care domain projects.

3 years of experience in data conversion projects for existing clients using the Pentaho 5.2 ETL suite (Kettle/Spoon). Spoon is a GUI for the Kettle Extraction, Transformation and Loading (ETL) engine, which uses a metadata-driven approach.

2 years of work experience in BI Tools (Actuate, SSRS, Tableau Reporting) and Java

Experience in full-lifecycle implementation of Data Warehouses, Operational Data Stores (ODS) and business Data Marts using dimensional modeling techniques (Star Schema and Snowflake Schema).

Knowledge of HIPAA transactions and ANSI X12 code sets 837 (I/P/D), 835, 270 and 271.

Good exposure to and experience with Java technologies such as OOP, Core Java and JDBC.

Experience in Software Development Life Cycle (SDLC) and agile methodology.

Experience in design, development and maintenance of software applications in Information Technology, Data warehouse and RDBMS concepts.

Expertise in Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.

Experience in Performance Tuning of Targets, Sources, Sessions, Mappings and Transformations.

Worked on exception-handling mappings for data quality, data profiling, data cleansing and data validation using IDQ.

Good exposure to Informatica MDM, where data cleansing, de-duping and address correction were performed.

Good exposure to Informatica DVO for data validation: writing SQL queries against source and target databases, scheduling tasks and reporting test results.

Experience in developing Informatica Reusable components for using them across the projects.

Extensively worked with Informatica Mapping Variables, Mapping Parameters and Parameter Files.

Worked on Slowly Changing Dimensions - Type 1, Type 2 and Type 3 in different mappings as per the requirements

Worked with databases such as Oracle, Teradata, SQL Server and Microsoft Access, and integrated data from flat files (fixed-width/delimited) and XML files.

Experience in writing stored procedures, functions, triggers and views on Oracle (PL/SQL), Teradata and SQL Server.

Extensively worked on Monitoring and Scheduling of Jobs using UNIX Shell Scripts

Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts.

Experience in migrating the environment from Informatica 8.x to 9.x.

Experience with ER data modeling tools such as Erwin, ER/Studio and Visio for developing fact and dimension tables and logical and physical models.

Good skills in understanding and developing business rules for the standardization, cleansing and validation of data in various formats.

Very strong knowledge of Informatica Data Quality (IDQ) transformations such as Address Validator, Parser, Match, Exception, Merge and Standardizer.

Expertise on tools like Toad, Autosys, and SQL Server Management Studio. Involved in Unit testing, Functional testing and User Acceptance testing on UNIX and Windows Environment.

Completed documentation covering detailed work plans and mapping documents.

Experience in managing onsite-offshore teams and coordinating test execution across locations.

Excellent communication, documentation, team problem-solving, analytical and programming skills in a high-speed, quality-conscious, multi-tasking environment.

Knowledge of SSIS, ALM, JIRA, SVN and SharePoint tools and applications for project repository support.

EDUCATION

Bachelor's degree in Electrical and Electronics Engineering, JNTU, Hyderabad.

CERTIFICATIONS:

NCFM Certification from NSE (National Stock Exchange, India).

Core Java Certification from Sun Microsystems.

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 9.5/8.x (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformations, Workflow Manager, Workflow Monitor, Worklets), Informatica MDM, Data Profiling, Data Cleansing, Informatica DVO, OLAP, OLTP.

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.1

Database: Oracle Exadata 11g/10g/9i/8i, SQL Server 2012/2008/2005, Teradata, Postgres 9.4.

DB Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, Teradata BTEQ and SQL Assistant

Reporting Tools: Cognos 10 (Framework Manager, report tools), Actuate 11/10, SSRS, BIRT, Tableau 8.2

Programming: SQL, PL/SQL, Java, UNIX Shell Scripting

Environment: Windows 7/XP/2000, MS DOS.

Others: Autosys, SSIS, Pentaho 5.2, JIRA, SVN, MS Office products (Excel, Word, Outlook)

PROFESSIONAL EXPERIENCE:

Client: TD Bank, Mount Laurel, NJ

Duration: May 2015 – Present

Title: Sr. Informatica Developer

Project: Collection & Recovery: Implementation of a best-in-class Collections and Recovery system solution to reduce process variance and non-compliance with regulatory requirements. The Loss Prevention Department is responsible for the collection and recovery activity of delinquent and problem loans. Loss Prevention is pursuing a single-system solution for Collection and Recovery activities including inbound/outbound collections, workflow management for Bankruptcy, Litigation, Foreclosure, Repossession, Probate, Loss Mitigation and Charge Off, as well as lettering capabilities, credit loss processing and other special activities.

Responsibilities

Involved in implementing the end-to-end ETL life cycle.

Gathered and understood requirements through business-user meetings.

Translated requirements into design documents (HLD, LLD).

Re-designed multiple existing Power Center mappings to implement change request (CR) representing the updated business logic

Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes

Involved in designing the dimensional model like STAR schema, snowflake schema

Created Source and Target column mapping sheet

Prepared the DDL Scripts for creating the tables and supporting table records

Improvement in operational efficiencies and controls through workflow automation

Enhanced performance management with reportable data elements

Integration of risk based models into collection strategies developed by Loss Preventions analytics team, utilizing both internal and external data elements

Stronger overall quality assurance and compliance controls using system rules

Modified or replaced certain existing processes that move data from the US Staging environment to the Dodd-Frank regulatory reporting tables and extracts, so that data fields currently sourced from the RMSY database are instead sourced from CAS/CACS tables in US Staging.

Analyzed requirements, assigned work to the team and tracked status reports daily.

Spent roughly 40% of the time on this project as team lead and the remaining 60% on ETL coding and testing.

Extracted data from multiple databases and flat files, staged it in a single place and applied business logic to load it into the central Oracle database.

Data in this project is processed through multiple layers: Source, Staging, Temp, DDM and Reports.

Used the HP ALM defect tracker to track defect status from the testing team.

Used the CA AutoSys tool to schedule and monitor Informatica jobs for production support.

Used shell scripts for file transfers and archiving old files.

Used shell scripts to call Informatica workflows with input parameters (see the sketch at the end of this section).

Implemented the Oracle Exadata data compression feature for fact tables.

Wrote MINUS queries for data counts and implemented the same checks in DVO.

Implemented data cleansing for dimension tables using Informatica Data Quality (IDQ).

Built out best practices for data staging, data cleansing and data transformation routines within the Informatica MDM solution.

Defined and built best practices for creating business rules within the Informatica MDM solution.

Created data maps via PowerExchange to read VSAM files defined by COBOL copybooks.

Migrated code from version 8.x to 9.5.1.

Provided production support for existing applications.

Improved the performance of two projects.
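
Illustrative sketch (not the project's actual script; the service, domain, folder, workflow and path names are placeholders) of the kind of shell wrapper described above: archive the previous day's files, then start an Informatica workflow through pmcmd with a parameter file.

    #!/bin/sh
    # Hypothetical wrapper: archive processed files, then launch an Informatica
    # workflow via pmcmd. Service, domain, folder, workflow and paths are placeholders.

    SRC_DIR=/data/collections/inbound
    ARC_DIR=/data/collections/archive
    PARAM_FILE=/opt/infa/params/wf_collections.par

    # Move files older than one day into the archive before the new load starts
    find "$SRC_DIR" -type f -name '*.dat' -mtime +0 -exec mv {} "$ARC_DIR" \;

    # Start the workflow and wait for it to finish; pmcmd exits non-zero on failure
    pmcmd startworkflow -sv INT_SVC_DEV -d DOMAIN_DEV \
          -u "$INFA_USER" -p "$INFA_PWD" \
          -f FLD_COLLECTIONS -paramfile "$PARAM_FILE" \
          -wait wf_load_collections
    RC=$?

    if [ "$RC" -ne 0 ]; then
        echo "wf_load_collections failed with return code $RC" >&2
        exit "$RC"
    fi
    echo "wf_load_collections completed successfully"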

Client: Keenan - Phoenix, AZ

Duration: Feb 12 – Apr 15

Title: Sr. Informatica Consultant

Product: Javelina (Healthcare Claims Processing) - Javelina is a Java web-based product used as the claims processing system for healthcare insurance clients. Informatica Power Center tools are used for the data warehousing and conversion modules. We designed various kinds of reports using the Actuate, SSRS, Cognos and Tableau reporting tools. The product supports any RDBMS, such as Postgres, MS SQL Server, Teradata and Oracle.

Responsibilities

Worked as a Data migration consultant by converting various complex objects like Customer Master Data, Vendor Master Data, Joint Operations Agreements, Joint Ventures, and Division of Interests etc.

Extracted data from multiple databases and flat files, staged it in a single place and applied business logic to load it into the central Oracle database.

Created Source and Target column mapping sheet

Used Informatica Power Center for extraction, loading and transformation (ETL) of data in the data warehouse

Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans

Played team lead on this project with a 5-member team (offshore and onsite modules).

Analyzed requirements, assigned work to the team and tracked status reports daily.

Knowledge of HIPAA transactions and EDI 837 (I/P/D) and Eligibility 270 and 271.

Used Informatica repository manager to create folders and add users for the new developers.

Developed complex mappings in Informatica to load the data from various sources

Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator and Stored Procedure.

Extensively used Informatica debugger to figure out the problems in mappings. Also involved in troubleshooting existing ETL bugs.

Extensively used Toad for SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.

Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.

Worked on Teradata utilities such as FastLoad and MultiLoad for bulk data loading.

Used the Teradata client tools BTEQ and SQL Assistant to write and test SQL queries.

Implemented Exception Handling Mappings by using Data Quality, Data Profiling, Data cleansing and data validation.

Implemented Informatica MDM, where data cleansing, de-duping and address correction were performed.

Used Informatica DVO to test data validation by writing SQL queries against the source and target databases, scheduling tasks and reporting test results (a simplified example follows this list).

Implemented data cleansing for dimension tables using Informatica Data Quality (IDQ).

Used IDQ for address and customer-name matching and for data masking.

Used the JIRA system for tracking bugs and enhancements.

Designed custom error-control logic within the pipeline to capture and load bad records to a control table, and to recover the workflow in the event of failures.

Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings

Created procedures to truncate data in the target before the session as per requirement.

Created deployment groups, migrated the code into different environments

Wrote documentation to describe program development, logic, coding, testing, changes and corrections.

Followed Informatica recommendations, methodologies and best practices from Velocity documentation.

Implemented SOAP and REST web services for real-time processing.

Provided support during various phases of the project and planned the agile process.

Was involved in production support to make sure all issues are fixed in the respective turn-around times.
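
A simplified example of the DVO-style source-versus-target reconciliation mentioned above, wrapped in a shell call to SQL*Plus; the connect string and the table/column names are hypothetical, not the client's actual objects.

    #!/bin/sh
    # Hypothetical reconciliation check: source rows missing from the target
    # dimension, plus a count of extras in the target. Names are placeholders.

    sqlplus -s "$DB_USER/$DB_PWD@DWHDEV" <<'EOF'
    WHENEVER SQLERROR EXIT 1
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0

    -- Source rows not present in the target (should return no rows)
    SELECT claim_id, member_id, claim_amt
      FROM stg_claims
    MINUS
    SELECT claim_id, member_id, claim_amt
      FROM dw_claim_dim;

    -- Count of target rows with no matching source row
    SELECT 'TARGET_EXTRA ' || COUNT(*)
      FROM (SELECT claim_id FROM dw_claim_dim
            MINUS
            SELECT claim_id FROM stg_claims);
    EOF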

Environment: Java, Informatica Power Center 8.6, IDQ, Oracle 11g/10g, Actuate, Tableau, MS SQL Server, JIRA, TOAD, HP Quality Center, MS Office Suite

Clients: HCC, UHH, Well systems, Health first, BF&M, ICBL, EMIH

Location: Phoenix, AZ and India (offshore)

Role: Sr. Informatica Developer (duration: Sep 07 - Feb 12)

Product: Javelina Phase I - Javelina is a Java web-based product used as the claims processing system for healthcare insurance clients. Informatica Power Center tools are used for the data warehousing and conversion modules. We designed various kinds of reports using SSRS, BIRT and Tableau against MS SQL Server, Teradata and Oracle.

Responsibilities:

Responsible for gathering project requirements by interacting directly with the client and analyzing them accordingly.

Coordinated the work flow between onsite and offshore teams.

Defined various facts and Dimensions in the data mart including Fact tables, Aggregate and Summary facts.

Scheduled and monitored all Informatica jobs using the AutoSys GUI tool.

Implemented SOAP web service for Report server authentication.

Extracting, Scrubbing and Transforming data from Flat Files, Oracle, SQL Server, Teradata and then loading into Oracle database using Informatica and Pentaho

Worked on optimizing the ETL procedures in Informatica 9.1/8.6 version.

Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Implementing logical and physical data modeling with STAR and SNOWFLAKE techniques using Erwin in Data warehouse as well as in Data Mart.

Used Type 1 and Type 2 mappings to update Slowly Changing Dimension Tables.

Involved in the performance tuning process by identifying and optimizing source, target, and mapping and session bottlenecks.

Configured incremental Aggregator transformation functions to improve the performance of data loading. Worked on database-level tuning and SQL query tuning for the data warehouse and OLTP databases.

Used Informatica repository manager to create folders and add users for the new developers.

Maintained stored definitions, transformation rules and targets definitions using Informatica repository manager.

Configured the Informatica server to generate control and data files for loading data into the target database using the SQL*Loader utility (a sample invocation follows this list).

Used Active batch scheduling tool for scheduling jobs.

Checked sessions and error logs to troubleshoot problems, and used the debugger for complex troubleshooting.

Negotiated with superiors to acquire the resources necessary to deliver the project on time and within budget, bringing resources onsite when required to meet deadlines.

Delivered projects working in Onsite-Offshore model. Directly responsible for deliverables.

Developed UNIX Shell scripts for calling the Informatica mappings and running the tasks on a daily basis.

Wrote Oracle PL/SQL procedures and functions whenever needed.

Created & automated UNIX scripts to run sessions on desired date & time for imports.
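
A sketch of how such a SQL*Loader load could be driven from a shell script; the control file, data file, log paths and connect string shown are illustrative placeholders.

    #!/bin/sh
    # Hypothetical SQL*Loader call for one of the generated data files.
    # Control file, data file, log paths and connect string are placeholders.

    sqlldr userid="$DB_USER/$DB_PWD@DWHDEV" \
           control=ctl/load_policy.ctl \
           data=/data/exports/policy.dat \
           log=logs/load_policy.log \
           bad=logs/load_policy.bad \
           errors=50

    # sqlldr exits 0 only on a fully successful load; fail the batch otherwise
    if [ $? -ne 0 ]; then
        echo "SQL*Loader load of policy.dat did not complete cleanly" >&2
        exit 1
    fi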

Environment: Java, Kettle 4/5.2, Spoon, Carte, Informatica Power Center 8.6, PL/SQL, Oracle 10g, TOAD, Erwin, SSRS, UNIX, SQL Server 2008, QuerySurge, Windows XP, Visio 2003.

Syntel (India) Ltd, India - July 2006 – Aug 2007

Client: State Street Trustees Limited UK

Role: ETL Analyst

Project#1: UK Collective Fund Services (CFS)

This project helps State Street to report Breaches in the financial functioning of Clients to the Risk Management and Compliance (RMC). CFS (Collective Fund Services) has been appointed by a number of clients to perform the pricing administration function for regulated and non-regulated funds.

Responsibilities:

Participated in documenting the existing operational systems.

Involved in the requirements gathering for the warehouse. Presented the requirements and a design document to the client.

Created ETL jobs to load data from staging area into data warehouse.

Analyzed the requirements and framed the business logic for the ETL process.

Involved in the ETL design and its documentation

Designed and developed complex aggregate, join, lookup transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica Power center 8.6

Designed and developed mappings using Source qualifier, Aggregator, Joiner, Lookup, Sequence generator, stored procedure, Expression, Filter and Rank transformations

Developed pre-session, post-session and batch execution routines using the Informatica server to run Informatica sessions (a simplified pre-session check appears after this list).

Evaluated slowly changing dimension tables and its impact to the overall Data Warehouse including changes to Source-Target mapping, transformation process, database, etc.

Collected and linked metadata from diverse sources, including Oracle relational databases, XML and flat files.

Extensive experience with PL/SQL in designing, developing functions, procedures, triggers and packages.

Developed Informatica mappings, re-usable Sessions and Mapplets for data load to data warehouse.

Designed and developed Informatica mappings and workflows; Identify and Remove Bottlenecks in order to improve the performance of mappings and workflows and used Debugger to test the mappings and fix the bugs

Scheduled the sessions to extract, transform and load data into the warehouse database per business requirements using the ActiveBatch scheduling tool.

Using the Kettle tool, migrated two projects' legacy system databases and file systems to an Oracle RDBMS database.

Created different transformations for loading data into targets, such as Merge, Joiner, Dimension Lookup/Update, Lookup, Sort, JavaScript, Switch, Filter, Function and Sequence Generator transformations.

Have knowledge on Pentaho Report development.

Used Kettle (Spoon) for development and Carte for executing the scripts.

Created SHELL SCRIPTS for generic use.

Developed and maintained optimized SQL queries in the Data Warehouse.
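
A simplified, hypothetical example of such a pre-session shell routine: it fails fast if the day's source file has not arrived, so the Informatica session does not run against an empty input (directory and file names are placeholders).

    #!/bin/sh
    # Hypothetical pre-session check: stop the load if the expected daily breach
    # extract has not arrived. Directory and file-name pattern are placeholders.

    IN_DIR=/data/cfs/inbound
    TODAY=`date +%Y%m%d`
    SRC_FILE="$IN_DIR/breaches_${TODAY}.csv"

    if [ ! -s "$SRC_FILE" ]; then
        # A non-zero exit makes the session's pre-session command (and the session) fail
        echo "Pre-session check failed: $SRC_FILE is missing or empty" >&2
        exit 1
    fi

    # Matching post-session step (run after a successful load): archive the file
    # mv "$SRC_FILE" /data/cfs/processed/
    exit 0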

Environment: Windows XP/NT, Informatica Power Center 8.6, JIRA, UNIX, Oracle 10g, SQL, PL/SQL, Erwin, Actuate, SVN, TOAD, WebLogic Server

Project#2: DICE Project

Responsibilities:

Design and Construction of BIRT Reports (open source code provided by eclipse).

Write Stored Procedures for application support.

Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.

Support and testing of DICE Portal application for client demos.

Involved in creating Data Marts and altering and creating new dimensions and Facts

Design of new Cube reports using Actuate Analytics Cube designer.

Prepare project documents.

Built Excel reports using the Actuate eSpreadsheet reporting tool.

Migrated Actuate from an older version to the latest version.

Actuate iServer maintenance and backup.

Environment: Informatica, Actuate Reports, Oracle 9, Servlets, JSP, WebSphere Application Server 5.1, PVCS

Nelito System Ltd, India (worked as a TCS vendor), May 2004 – July 2006

Client: Merrill Lynch, U.K.

Projects: Merlin and Endeavour Project

Role: ETL Developer

Summary: Merlin is a project oriented towards generating online enquiries/reports for Merrill Lynch internal users (within the firewall but spread across the globe) over historic data (with daily uploads), with performance as the main criterion.

Enquiries (Online screen),

Reports (Generated using Actuate)

A client-server-based n-tier architecture is followed, where a thick client (Swing-based) forms the UI tier, EJB/Java on WebLogic constitutes the middle tier, Actuate Server provides the report service, and Oracle is the database tier.

Merlin application users and their credentials will be stored in the Microsoft Active Directory server.

Responsibilities:

Extensively worked with the data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.

Created and Modified T-SQL stored procedures for data retrieval from MS SQL Server database.

Automated mappings to run using UNIX shell scripts, which included Pre and Post-session jobs and extracted data from Transaction System into Staging Area.

Extensively used Informatica Power Center to extract data from various sources and load in to staging database.

Extensively worked with Informatica Tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Repository server and Informatica server to load data from flat files, legacy data.

Created mappings using the transformations like Source qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, Update Strategy, Joiner and stored procedure transformations.

Designed the mappings between sources (external files and databases) to operational staging targets.

Involved in data cleansing, mapping transformations and loading activities.

Developed Informatica mappings and Mapplets and also tuned them for Optimum performance, Dependencies and Batch Design.

Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.

Environment: Informatica, JIRA, SVN, Actuate Reports, WebLogic, PL/SQL, MS Access, SQL Server, Windows 2000, UNIX

Indus Business Systems Ltd, India Sept 2001 - May 2004

Project: MedPlexus

Role: ETL Engineer

Client - Key solutions Inc. (www.medplexus.com)

Description: MedPlexus is a healthcare provider system for the U.S. healthcare industry in general and for physicians' offices in particular. This information system helps physicians automate their claims, billing, payments and appointment processing requirements.

Responsibilities:

Involved in implementing the end-to-end ETL life cycle.

Translated requirements into design documents (HLD, LLD).

Involved in designing the dimensional model.

Prepared the DDL Scripts for creating the tables and supporting table records

Experienced in working with healthcare information, mapping data from legacy systems to target systems.

Extracted Data from legacy Sources by using Kettle.

Extensively used Pentaho tools such as Spoon, Carte and Kitchen, along with their transformations.

Cleansed the source data, standardized vendor addresses, extracted and transformed data with business rules, and built the data module using the Spoon designer.

Extracted data from different sources of databases. Created staging area to cleanse the data and validated the data.

Designed and developed complex Aggregate, Expression, Filter, Join, Switch, Lookup and Update transformation rules.

Developed schedules to automate the update processes and sessions and batches.

Analyzed, designed, constructed and implemented the ETL jobs using Kitchen (a sample command-line invocation follows).
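
An illustrative command-line invocation of a Kettle job through Kitchen; the installation path, job file, parameter name and log path are placeholders rather than the product's actual objects.

    #!/bin/sh
    # Hypothetical Kitchen invocation of a Kettle job; the installation path,
    # .kjb file, parameter name and log file are placeholders.

    PDI_HOME=/opt/pentaho/data-integration

    "$PDI_HOME/kitchen.sh" \
        -file=/etl/jobs/jb_load_claims.kjb \
        -param:LOAD_DATE="`date +%Y-%m-%d`" \
        -level=Basic \
        >> /var/log/etl/jb_load_claims.log 2>&1

    # Kitchen returns a non-zero code when the job fails; hand it back to the scheduler
    exit $?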

Environment: Pentaho 3.8, iReports, Actuate Reports, JBoss, Windows NT, PL/SQL, Excel, Oracle.


