Data Informatica

Location:
New York City, NY
Posted:
April 30, 2020


Ashok Vemuru

469-***-**** adc05h@r.postjobfree.com

Summary:

14+ years of experience in software design, development, maintenance, testing and troubleshooting of ETL/DWH applications.

Expertise in Requirement Analysis, Design, Coding, Testing and Implementation of ETL/DWH projects using Informatica Power Center 10.x/9.x/8.x/7.x, Informatica Power Exchange, Informatica BDE, Hive, Hadoop, Ambari, Salesforce, IDQ, SAP HANA, SAP FMS, PL/SQL, Oracle, SQL Server, IBM DB2, SSIS, Tidal and UNIX Shell Scripts.

Extensive experience with ETL tool Informatica in designing the Workflows, Worklets, Mappings, Configuring the Informatica Server and scheduling the Workflows and sessions using Informatica Power Center 10.x/ 9.x/8.x/7.x

Vast experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, XML, Web Services, HTTP, Java transformations, SAP RFC function calls and IDoc loads.

Experience in integration of various data sources like SQL Server, Oracle, XML, IBM Mainframe Flat files, DB2 using ETL Tools.

Well experienced in Error Handling and Troubleshooting using various log files.

Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Experienced in Performing Unit testing, system testing, regression testing, smoke testing and system integration testing.

Expert in designing Fact & Dimension Tables, Physical & Logical data models using ERWIN 4.0 and Erwin methodologies like Forward & Reverse Engineering.

Expert in SSIS package using Import/Export wizard, creating packages, connection manager, Control Flow, Data Flow, Event Handlers, scheduling and troubleshooting.

Proficient working with Unix/Linux environments and writing UNIX shell scripts and Perl scripts.

Expert in design and development of Business Intelligence reports using BI tools such as Business Objects, SSRS and Cognos, with knowledge of MicroStrategy.

Proven experience leading teams.

Worked as a Data Analyst, understanding the data requirements and updating the Mapping Documents.

Collaborated with the Informatica Admin during the Informatica upgrade from PowerCenter 8.6 to PowerCenter 9.1.

Experience includes thorough domain knowledge of business financial systems, banking, healthcare information technology, retail and insurance.

Have excellent written and oral communication skills with the ability to communicate appropriately in business and technical situations at all levels.

TECHNICAL EXPERTISE:

Data warehousing

Informatica Power Center 10.1/10.2/9.5/9.1/8.6/7.1/6.2, OWB, Informatica Power Exchange, Informatica Cloud, Salesforce, Metadata Reporter, Data Profiling, Data Cleansing, OLAP, OLTP, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, Data Stage 7.x/8.x, Erwin 4.0/4.1, Informatica BDE, Hive, Ambari, SAP HANA, SAP FMS, SAP ECC, SSIS, SSRS, SAS, Snowflake, AWS

BI Tools

Business Objects XIR2/6.0/5.1, Cognos 8

Databases

SQL Server 2008/2005, SSIS, Oracle 11g/10g/9i, Sybase, Teradata 6, MySQL, MS Access, DB2 8.0/7.0, Netezza 7.0

Languages

Java/J2EE, XML, COBOL, PL/SQL, UNIX Shell Scripting

Operating System

Windows XP/NT/2000, LINUX

Other Tools

Autosys, Maestro Scheduler (Tivoli), Mercury Quality Center, Lotus Notes, Tidal

DB Programming & Tools

RDBMS, Joins, Indexes, Views, Functions, Triggers, Clusters, Procedures, SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator

EXPERIENCE

Project: FMS, Snowflake Project Jan 2018-

Tapestry Inc, New York, NY

Role: Informatica ETL Developer

Description:

Tapestry, Inc. decided to move its current platform to SAP S4 HANA. This strategic initiative has three major projects: FMS, POS and HRIS.

The objective of FMS is to implement streamlined, standard global processes, provide greater financial visibility and enable the company to integrate future acquisitions more efficiently. The different phases of the project cover conversion from legacy SAP ECC to SAP S4, conversion from SAP S4 to A360, and ETL processes that send data to different outbound interfaces.

Roles/Responsibilities:

Working as a Data Warehouse Senior ETL Developer, handling ETL & BI Design, Development, Testing and Deployment.

Convert the existing SAP Legacy data to SAP S4 through ETL Jobs.

Extract data from SAP S4 and load data to SAP HANA Tables. The loaded data is used for reporting purposes.

Convert the existing Hive Tables to Snowflake Tables and test the current Jobs in A360

Load data from Hive Table to Snowflake Tables and check the performance.
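
For illustration, a minimal shell sketch of one such Hive-to-Snowflake load (host, account, credentials, table and file names are placeholders; the actual loads were implemented as ETL jobs):

#!/bin/bash
# Export a Hive table to a delimited file, then stage and bulk-load it into Snowflake.
# SNOWSQL_PWD is assumed to be exported by the calling environment.
beeline -u "jdbc:hive2://hive-host:10000/a360" --outputformat=csv2 --silent=true \
    -e "SELECT * FROM sales_fact" > /tmp/sales_fact.csv

snowsql -a my_account -u etl_user -d A360_DB -s PUBLIC <<'EOF'
-- Upload the file to the table stage, then copy it into the target table.
PUT file:///tmp/sales_fact.csv @%SALES_FACT AUTO_COMPRESS=TRUE;
COPY INTO SALES_FACT
  FROM @%SALES_FACT
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);
EOF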

Used Informatica to extract data from A360 Snowflake tables and used the data for outbound ETL extract processes.

Extensively used Web API calls to load A360 Hadoop data to the Visulon, JOOR and ECV interfaces.
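
As a rough illustration of such an outbound call (endpoint URL, token location and payload file are hypothetical; the real calls were made from within the ETL jobs):

#!/bin/bash
# Post a prepared JSON batch from the A360 outbound area to an interface endpoint.
API_TOKEN=$(cat /secure/tokens/interface.token)

curl -sS -X POST "https://interface.example.com/v1/styles" \
     -H "Authorization: Bearer ${API_TOKEN}" \
     -H "Content-Type: application/json" \
     -d @/data/a360/outbound/styles_batch.json \
     -o /data/a360/outbound/logs/styles_batch_response.json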

Used SAP S4 RFC Function calls to load data to S4.

Used Informatica BDE Developer to load data to the A360 Hadoop system.

Developing mappings using HTTP, Transaction Control, BAPI, XML transformations to load data to S4 and GUI Applications.

Involved in testing methodology and approach discussions.

Extensively used Informatica Developer to extract data from SQL Server tables and load it to SQL Server tables.

Migration of ETL Jobs to AWS Cloud.

Involved in Unit testing, Integration testing, UAT and deployment of workflows. Moved code across Development, Test and Production Repositories.

Created MFT and Tidal jobs for file transfer and scheduling of jobs.

Environment: Informatica Power Center 10.2, Informatica Developer 10.1, Hadoop, Hive, SSIS, SSRS, Ambari, Netezza 7.0, SQL Server 2005/2008, Oracle 11g, PL/SQL, SAP, Flat files, UNIX Shell Scripts, Tidal Scheduler, Snowflake, AWS.

Project: APTOS –POS System, PETL September 2015- December 2017

Coach Inc, New York, NY

Role: Informatica ETL Developer

Description:

COACH kept all of its planning data in the CPS database; since CPS was being retired, a new process was set up to pull the data from the new PETL database.

COACH implemented the new APTOS POS system, and as part of this project ETL mappings and workflows were set up to extract data from SAP ECC systems and load the data into the APTOS system.

Roles/Responsibilities:

Understanding requirements and preparing/reviewing the Mapping Document and technical specification documents.

Extract data from the PETL Database and load data to Netezza tables using the Informatica 9.6 ETL Tool.

Real-time ETL trickle jobs were set up to handle Customer Orders and send data to APTOS.

Extensively used Informatica Designer to design and develop ETL jobs for extracting, transforming and loading the data.

Worked extensively on UNIX scripts along with ETL mappings and workflows to send data to all stores in the US and Europe.

Worked extensively with Parameters and Variables, Pre-SQL/Post-SQL.
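
A session parameter file of the kind used here might be generated by a pre-run shell step along these lines (folder, workflow, session, parameter and connection names are hypothetical):

#!/bin/ksh
# Write a PowerCenter parameter file for tonight's run; the workflow picks it
# up via its parameter file setting.
PARM_FILE=/infa/parms/wf_petl_store_sales.parm

cat > "$PARM_FILE" <<EOF
[PETL_Folder.WF:wf_petl_store_sales.ST:s_m_load_store_sales]
\$\$LOAD_DATE=$(date +%Y-%m-%d)
\$\$REGION_CD=US
\$DBConnection_SRC=PETL_ORA
\$DBConnection_TGT=NZ_EDW
EOF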

Extensively used Informatica Developer to extract data from SQL Server tables and load it to SQL Server tables.

Creating reports on SSRS.

Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Involved in Unit testing, Integration testing, UAT and deployment of workflows. Moved code across Development, Test and Production Repositories.

Environment: Informatica Power Center 9.6, Netezza 7.0, SQL Server 2005/2008, Oracle 11g, PL/SQL, SAP ECC, Flat files, UNIX Shell Scripts, Tidal Scheduler, APTOS, MFT, Hive, SSIS, SSRS, DAX

Project: BI/DM-EDW, CDW Conversion Nov 2012- August 2015

Coach Inc, New York, NY

Role: Informatica ETL Developer

Description:

COACH has most of its data warehouse information in the DM and CDW applications, which are important applications that process huge amounts of data. To meet business needs and enable faster report generation, the current DM and CDW applications are being converted to the EDW data warehouse using the Informatica ETL Tool and Netezza. CDW applications developed in OWB were converted to the EDW data warehouse using the Informatica ETL Tool.

Roles/Responsibilities:

Understanding requirements and preparing/reviewing the Mapping Document and technical specification documents.

Extract the current DM and CDW application information and load the data to EDW Tables using the Informatica 9.1/9.6 ETL Tool.

Critical Customer Information data was read from Salesforce using Informatica cloud.

Remove the existing flat files as source, read the data directly from SAP Tables using Informatica 9.1 and load the data to EDW Tables.

Extensively used Informatica Designer to design and develop ETL jobs for extracting, transforming and loading the data.

Created new workflows, sessions, mappings to extract the information from SAP sources and load data into EDW and CDW Tables.

Worked extensively with Parameters and Variables, Pre-SQL/Post-SQL and Emails in converting the current applications.

Written Unix Shell Scripts to execute the existing and updated workflows.
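
A typical wrapper script looked roughly like the sketch below (service, domain, folder, workflow names and the support mailbox are placeholders; credentials are assumed to come from environment variables):

#!/bin/ksh
# Kick off a PowerCenter workflow and alert support if it fails.
# INFA_USER_VAR / INFA_PWD_VAR are environment variables holding the credentials.
pmcmd startworkflow -sv INT_SVC_EDW -d Domain_EDW \
    -uv INFA_USER_VAR -pv INFA_PWD_VAR \
    -f EDW_Folder -wait wf_edw_daily_load
rc=$?

if [ "$rc" -ne 0 ]; then
    echo "wf_edw_daily_load failed with return code $rc" |
        mailx -s "EDW daily load failure" etl_support@example.com
    exit "$rc"
fi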

Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Creating new workflows and sessions for EDW Conversion and executing the designed jobs using Tidal Scheduler.

Developed Test Cases according to Business and Technical Requirements and prepared SQL scripts to test data.

Involved in Unit testing, Integration testing, UAT and deployment of workflows. Moved code across Development, Test and Production Repositories.

Monitoring of application batch processes to ensure successful completion of jobs.

Environment: Informatica Power Center 9.1, Netezza 7.0, SQL Server 2005/2008, Oracle 11g, PL/SQL, SAP, Flat files, UNIX Shell Scripts, Tidal Scheduler.

NJM Insurance Group, Trenton, NJ Dec 2011- Oct 2012

Project – GC CC Conversion/QA Tracker Enhancement

Informatica Lead/Data Analyst

New Jersey Manufacturers Insurance (NJM) handles Workers' Compensation and General Claims (PIP) through a legacy system. NJM purchased the Guidewire ClaimCenter tool, a UI-based application used to maintain the Workers' Compensation and GC Claims.

NJM is converting the GC CC claims from the Legacy System to the Guidewire Claim Center Database using the Informatica ETL Tool. Claims initiated from 2007 onward are considered for the GC CC Conversion. The GC CC Conversion mainly consists of converting Auto Reserve, Subrogation and Salvage file data to the Claim Center Database.

Responsibilities:

Understanding the Mapping Documents and creating High Level/Low Level Design Documents for GC Auto Reserve, Subrogation and Salvage Files.

Created mappings and workflows for populating the Auto Reserve, Subrogation and Salvage Files data to Claim Center Tables using Informatica 9.1 ETL Tool.

Creating new workflows and sessions for GC Conversion and executing the designed jobs using Tivoli Scheduler.

Used Informatica Power Exchange 9.1 to load data from Legacy System to Staging Tables.

Extract the WC and PIP Claim Information for GUI using Informatica 9.1 ETL Tool and load into QI Tracker Tables.

Updating the existing ETL mappings and testing them for the QI Tracker Enhancement.

Written Unix Shell Scripts to execute the existing and updated workflows.

Documenting the Unit Test Cases and testing the new workflows in DEV, QA and UAT.

Migrated the Informatica Power Center mappings and performed Code/Folder migration from one environment to another as part of release management.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Monitoring of application batch processes to ensure successful completion of jobs.

Handled Informatica administrative tasks like moving and copying sessions and workflows from DEV to QA.

Interacted with Business Users/App Leads of different UI applications to help them understand the new address validation tool.

Environment: Informatica Power Center 9.1, SQL Server 2005/2008, DB2, PL/SQL, Oracle 10g, Tivoli Scheduler, Informatica Power Exchange 9.1, SOAP UI, XML, SSIS, Cognos 8.

CareFirst-BCBS, Owings Mills, MD Sep 2011- Nov 2011

Project – Data Informatica (EDW-CMDB)

Informatica Developer/Tester

CareFirst BlueCross BlueShield is a not-for-profit health care insurer and offers a comprehensive portfolio of products and administrative services to individuals and groups in Maryland, the District of Columbia and northern Virginia. It is the largest health care insurer in the Mid-Atlantic region and is nationally recognized as a ‘Best in Blue’ insurer.

CMDB is a combined member database which contains membership information from CareFirst legacy systems.

FACETS Tables information is extracted through Informatica and loaded into the CMDB Database. Informatica jobs are executed through UNIX scripts. The work involves enhancing the Daily and Historical HIPAA workflows and loading the data into CMDB Tables.

Roles/Responsibilities:

Understanding requirements and preparation/review of High level, Low level design documents and technical specification documents.

Extract the Membership information from FACETS Oracle Tables using Informatica 8.6 ETL Tool and load into CMDB DB2 Tables.

Developed ETL mappings using different transformations like Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.

Worked with Parameters and Variables, Pre SQL/Post SQL, Pre session and post session commands, Emails in CMDB Mappings and Sessions.

Wrote queries and procedures, created indexes and primary keys, and performed database testing.

Created sessions and workflows to sequentially execute the designed jobs.

Responsible for monitoring all the sessions that were running, scheduled, completed and failed; debugged the mappings of failed sessions to check the progress of the data load.

Worked with pmcmd commands like startworkflow, scheduleworkflow, stopworkflow, abortworkflow, etc.
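
Representative pmcmd calls (service, domain, folder and workflow names below are placeholders; credentials are read from environment variables via -uv/-pv):

# Start a workflow and wait for it to complete.
pmcmd startworkflow -sv INT_SVC -d Domain_CMDB -uv INFA_USER_VAR -pv INFA_PWD_VAR \
    -f CMDB_Folder -wait wf_cmdb_member_load

# Check run status and session details of the workflow.
pmcmd getworkflowdetails -sv INT_SVC -d Domain_CMDB -uv INFA_USER_VAR -pv INFA_PWD_VAR \
    -f CMDB_Folder wf_cmdb_member_load

# Schedule, stop or abort the same workflow.
pmcmd scheduleworkflow -sv INT_SVC -d Domain_CMDB -uv INFA_USER_VAR -pv INFA_PWD_VAR \
    -f CMDB_Folder wf_cmdb_member_load
pmcmd stopworkflow -sv INT_SVC -d Domain_CMDB -uv INFA_USER_VAR -pv INFA_PWD_VAR \
    -f CMDB_Folder wf_cmdb_member_load
pmcmd abortworkflow -sv INT_SVC -d Domain_CMDB -uv INFA_USER_VAR -pv INFA_PWD_VAR \
    -f CMDB_Folder wf_cmdb_member_load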

Migrated the ETL Informatica Power Center mappings and performed Code/Folder migration from one environment to another as part of release management.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Environment: Informatica Power Center 8.6, Oracle 10g, DB2, PL/SQL, Mainframes, Flat files, UNIX Shell Scripts, CA7 Scheduler, Reflection Client

Bank of America, Dallas, TX Sept 2009- Aug 2011

Project – Card Services (Statements-BACARDI)

ETL Developer

Bank of America CARD Information (BACARDI) is a central repository for card specific data. Consumer Credit data resides in Bacardi. The Bacardi data mart receives data from various sources and imports this data into tables. Bacardi data is mainly used for Marketing, Reporting and static analysis of data. Bacardi receives information from FM, CROL, Statements, Source and other different Card applications.

Statements and Access Check Application are the processes which format and create Statements for cardholders on a monthly basis. The Statements application uses the customer information provided by FM and generates Statements. Statement reports are sent to Source/Bacardi, which are the data warehouses for Bank of America. SOURCE is being retired, and Statement reports need to be loaded into Bacardi Tables.

Responsibilities:

Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the DataMart.

Extract the flat file information sent by Statements and Fiserv from mainframes, reformat it and load it into Bacardi Tables using Informatica 8.6 ETL Tool.

Developed data Mappings between source systems and warehouse components for converting SOURCE Reject reports to BACARDI application.

Created various transformations like Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure and Router, and fine-tuned the mappings for optimal performance.

Developed multiple jobs where the source was a COBOL file, performed a number of validations using hash files and moved the data to the Oracle environment.

Extensively defined & used Parameters and Variables in Mapping for many ETL Jobs.

Troubleshot databases, workflows, mappings, sources and targets to find bottlenecks and improve performance.

Implemented various Performance Tuning techniques on Sources, Targets, Mappings, Workflows and data base tuning.

Migrated the Informatica Power Center mappings and performed Code/Folder migration from one environment to another as part of release management.

Used the Informatica PowerCenter Data Masking option to protect confidential customer account information.

Extract the flat file account information sent by Statements from mainframes, reformat it and load it into Bacardi Tables using Informatica 8.6 ETL Tool.

Updated the mappings to accept a cents value (BAL_AM) in the VRU Tables; two Informatica workflows (wf_Load_VRU_PASS) were modified to accept a decimal value instead of an integer.

Involved in Production support by monitoring the batch cycle, accepting production tickets and resolving them.

Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.

Generated reports using Cognos.

Tested the ETL mappings and was involved in resolving the defects logged by QA in Team Tracker.

Analyzed the entire code of the application and developed an Application Information document describing the various business rules of the application.

Environment: Informatica Power Center 8.6, Data Stage, Informatica Power Exchange, Oracle, DB2, SQL, PL/SQL, UNIX Shell Scripts, Perl, Maestro Scheduler (Tivoli), Cognos

Bank of America, Dallas, TX May 2007- August 2009

Project – Card Services (Canada AML Enhancement)

ETL Developer

Bank of America CARD Information (BACARDI) is a central repository for card specific data. Consumer Credit data resides in Bacardi. The Bacardi data mart receives data from various sources and imports this data into tables. Bacardi data is mainly used for Marketing, Reporting and static analysis of data. Bacardi receives information from FM, CROL, Statements, Source and other different Card applications.

CROL application sends Canada customer information data which is updated in Bacardi Tables.

The LOB wanted new fields added to the existing functionality, and this had to be reflected in the Bacardi Tables: four new Bacardi tables and three new fields had to be added to the existing tables.

Roles/Responsibilities:

Understanding requirements and preparation/review of High level, Low level design documents and technical specification documents.

Extract the transmission file from CROL Unix system, reformat it and load it into Canada Bacardi Tables using Informatica 8.6 ETL Tool.

Worked on multiple job sequences using exec command, transformer, lookup, filter and join stages.

Developed various jobs to read VSAM files, flat files and database tables from source systems, transform the data and load it to target databases.

Implemented custom application support to quickly apply data masking algorithms to any sensitive data, in any format.

Developed common routine mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.

Written SQL Queries, Triggers, PL/SQL Procedures, Packages and UNIX Shell Scripts to apply and maintain the Business Rules.

Worked with the QC and Prod Support teams on bug fixes and followed up on QC tickets.

Created sessions and workflows to sequentially execute the designed jobs.

Written various Unix Shell Scripts for scheduling and formatting the files.
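
One of these formatting scripts, in outline (file paths and the record layout are hypothetical):

#!/bin/ksh
# Reformat a pipe-delimited CROL transmission file into the fixed-width layout
# expected by the downstream load, then archive the original file.
IN_FILE=/data/crol/incoming/crol_canada_$(date +%Y%m%d).dat
OUT_FILE=/data/bacardi/stage/canada_accounts.dat

awk -F'|' '{ printf "%-16s%-10s%12.2f\n", $1, $2, $3 }' "$IN_FILE" > "$OUT_FILE"
mv "$IN_FILE" /data/crol/archive/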

Responsible for monitoring all the sessions that were running, scheduled, completed and failed; debugged the mappings of failed sessions to check the progress of the data load.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Environment: Informatica PowerCenter 8.6, Informatica Power Exchange, Oracle, DB2, SQL, PL/SQL, Mainframes, VSAM, UNIX Shell Scripts, Maestro Scheduler, Cognos.

ING, Hartford, CT March 2005- April 2007

Project – Data Integrated Financial Systems

Informatica Developer

ING is a global financial institution of Dutch origin offering banking, insurance and asset management to over 85 million private, corporate and institutional clients in more than 50 countries. ING employs over 130,000 people in 50 countries, including more than 10,000 in the US, and has been operating in America for over 100 years.

The project involved developing a target Marketing Data Warehouse. The system involves building a data warehouse to improve the overall quality and productivity of existing processes and to deliver reliable and accurate information. This information is delivered in various data marts that are used to formulate analytical information and generate reports. Informatica Power Center is used for data collection and transformations from various sources and to create an integrated data warehouse for metrics reporting.

Role: ETL Developer

Responsibilities:

Extensively worked with business analysts for gathering the requirement of customers.

Responsible for converting functional requirements into technical requirements.

Extensively used the ETL client tools Designer, Workflow Manager, and Workflow Monitor.

Extensively used the ETL tool to support data extraction, transformation and loading processes in a corporate-wide ETL solution.

Extensively used the Informatica data warehousing tool to extract, transform and load data from Oracle and flat files to DB2.

Worked with UNIX scripts in Workflow manager for locating source and targets.

Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level, System Level and the Target Level.

Developed Test Cases according to Business and Technical Requirements and prepared SQL scripts to test data.
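
A typical data-validation script followed this pattern (schema, table and connection names are placeholders):

#!/bin/ksh
# Compare source and target row counts for a nightly load; ORA_PWD is assumed
# to be exported securely by the calling environment.
sqlplus -s "etl_test/${ORA_PWD}@DWH" <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF
SELECT 'SRC_COUNT=' || COUNT(*) FROM stg.customer_src;
SELECT 'TGT_COUNT=' || COUNT(*) FROM dwh.customer_dim WHERE load_dt = TRUNC(SYSDATE);
EXIT;
EOF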

Generated the mapping and workflow specifications using Crystal reports as a part of the post-code technical transformation document.

Involved in Unit testing, Integration testing, UAT and deployment of workflows. Moved code across Development, Test and Production Repositories.

Environment: Informatica Power Center 7.2, PL/SQL, TOAD, Oracle 8i, Flat Files, Batch Scripts, XML


