
Informatica Lead Developer

Location:
Malden, MA
Posted:
February 26, 2018


Email:

ac4mg9@r.postjobfree.com

Phone: +1-319-***-****

LinkedIn: www.linkedin.com/in/venkatesh-gilla-ramamoorthy-75255a17

SUMMARY

12+ years of experience in requirement gathering, gap analysis, design, coding, testing, implementation, production support, and resource management in data warehousing, with business knowledge of the Insurance, Pharmaceutical, Telecom, Banking, and Travel domains

Good working experience with SDLC methodologies such as Waterfall and Agile

Strong data warehouse ETL experience using Informatica PowerCenter 10.1/9.6/9.1/8.6.1/7.1 client tools: Mapping Designer, Repository Manager, and Workflow Manager/Monitor

Experienced in data Extraction, Transformation, and Loading (ETL) from disparate sources such as Oracle, SQL Server, flat files, CSV files, XSD, and XML files, loading into the target warehouse using various Informatica transformations

Worked with Informatica transformations such as Aggregator, Lookup, Joiner, Filter, Router, Update Strategy, Transaction Control, Union, Normalizer, and SQL in ETL development

Experienced in debugging and performance tuning of targets, sources, mappings, and sessions, and in optimizing mappings and implementing complex business rules by creating re-usable transformations and Mapplets

Worked on scalability and performance tuning of batch processing to handle large data volumes

Extensive experience in writing UNIX shell scripts and automating ETL processes with them

Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions

Good experience with scheduling tools such as Tidal, AutoSys, Control-M, and ActiveBatch

Working knowledge of Informatica Master Data Management (MDM)

Excellent interpersonal and communication skills, and experienced in working with senior level managers, business users, and developers across multiple disciplines

Effectively managed globally dispersed teams of up to 20 members

Technical Skills:

ETL Tools

Informatica PowerCenter 10.1/9.6/9.1/8.6.1/7.1

Data Modeling

Dimensional data modeling using Star Schema and Snowflake modeling, FACT and Dimension tables, physical and logical data modeling

Databases

Oracle 11g/10g/9i, SQL Server

Scripting

Unix Shell scripting, SQL stored procedures

Scheduler Tools

Control-M, AutoSys, Tidal, ActiveBatch

MDM Tool

Informatica/Siperian MDM, IDD, SIF

DB tools

SQL / PL SQL, TOAD, Oracle Developer

Operating System

UNIX (Putty, WinSCP), Windows

Versioning tools

TFS, SVN

Release Management tools

SWIM, IBM Urban Code (uDeploy)

Testing Tools

HP Quality Center, QTP

Academic Qualifications

Master of Financial Management (M.F.M) from Pondicherry University, India, 2010

Bachelor of Commerce (B.Com) from AVS College of Arts & Science, Periyar University, India, 2001

Employment Summary

From - To | Duration | Company Name | Designation | Role
Dec '16 - Present | | Horizon International | ETL Lead | ETL Lead
Sep '15 - Nov '16 | 1 year 3 months | Syntel Inc | Project Manager | ETL Lead
Dec '14 - Sep '15 | 9 months | Formac Inc | Senior System Analyst | Project Lead
Mar '08 - Dec '14 | 7 years 9 months | Cognizant Technology Solutions | Senior Associate | Project Lead
Apr '06 - Feb '08 | 1 year 11 months | Verizon Data Services India P Ltd | Analyst | Developer

PROJECT DETAILS

GCC, Boston, MA Feb 2017 – Jan 2018

Position: Informatica lead

GCC is a global enterprise comprised of a family of travel companies; its major divisions include River Cruise, Small Ship Cruise, and Overseas Adventure Travel. The Data Services (data warehouse) team plays a central role in maintaining data quality and data governance for data sourced from different systems (web, call center, etc.).

Responsibilities:

Led the design, development, and implementation of ETL projects end to end

Involved in the analysis of source-to-target mappings provided by data analysts and prepared functional and technical design documents

Responsible for ETL technical design discussions and prepared the ETL high-level technical design document

Extracted data from flat files, XML, Oracle, and SQL Server using Informatica ETL mappings and loaded it into the data mart

Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load data into the data mart area

Worked extensively on shell scripting for file management (a minimal sketch follows this list)

Created re-usable transformations/mapplets and used them across various mappings

Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager

Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica

Created and maintained Active Batch jobs to schedule Informatica Workflows

Performance tuned Informatica code using standard Informatica tuning steps

Worked closely with DBAs, application/database/ETL developers, and change control management to migrate developed mappings to PROD

Responsible for production support on a rotation basis among team members

Attended release meetings, clarification meetings with BAs/end users, and sprint retrospective meetings

Provided weekly status reports to the Project Manager
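
The file-management scripting above followed roughly this pattern; a minimal sketch, assuming hypothetical directories and a hypothetical booking_*.csv file pattern rather than the actual production names:

    #!/bin/sh
    # Illustrative file-management wrapper: pick up the day's extracts,
    # validate they are non-empty, and archive them with a timestamp.
    # SRC_DIR, ARCH_DIR, and the file pattern are hypothetical.
    SRC_DIR=/data/inbound
    ARCH_DIR=/data/archive
    STAMP=$(date +%Y%m%d%H%M%S)

    for f in "$SRC_DIR"/booking_*.csv; do
        [ -e "$f" ] || { echo "no files to process"; exit 0; }
        if [ ! -s "$f" ]; then
            echo "ERROR: empty file $f" >&2
            exit 1
        fi
        base=$(basename "$f" .csv)
        # Timestamped rename so reruns never collide with archived files
        mv "$f" "$ARCH_DIR/${base}_${STAMP}.csv"
    done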

Skills Used: Informatica 10.1, Oracle 12c, MS SQL Server 2008, Toad, UNIX, Putty, ActiveBatch Scheduler, Team management, Agile

Aegon Asset Management, Cedar Rapids, IA

Feb 2015 – Dec 2016

Position: Informatica lead

Aegon Asset Management (AAM) is a leading global investment manager with $375 billion in assets managed/advised by about 250 investment professionals in North America, Europe and Asia.

The Investment Data Warehouse (IDW) is AAM's centralized reporting repository. IDW extracts and holds financial information received from multiple investment source systems, as well as market data. Data is cleansed, transformed, and validated for accuracy before being stored in one central repository. The system provides the business units a consolidated view of data for query and reporting across all asset classes.

Responsibilities:

Involved in all phases of SDLC from requirement, design, coding, testing, deployment and production support

Worked extensively in Informatica Power Center Repository Manager, Designer, Workflow Manager/Monitor

Used techniques such as incremental aggregation, incremental load, and constraint-based loading for better performance

Created Reusable Transformations, Mapplets, Sessions and Worklets and made use of the Shared Folder concept using shortcuts wherever possible to avoid redundancy.

Expertise in designing and implementing Slowly Changing Dimensions (SCD Types 1, 2, and 3) and Change Data Capture (CDC); an illustrative SCD Type 2 sketch follows this list

Involved in tuning mappings by tracking the reader, writer, and transformation threads in the session logs; used verbose tracing only during development and only with very small data sets

Involved in DB performance tuning of tables holding huge data volumes by creating partitions and using stored procedures (see the partitioning sketch at the end of this section)

Used Oracle indexing and partitioning techniques to improve query performance

Used various debugging techniques and the Informatica Debugger tool to debug mappings

Created/modified Control-M jobs to schedule workflows

Provided estimates for ETL deliverables and oversaw their quality

Involved in unit and iterative testing to verify that data extracted from the different source systems and loaded into the target was accurate per user requirements

Supported code deployment in the QA and PROD environments and was responsible for code validation
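
For illustration, the SCD Type 2 pattern mentioned above can be sketched as a shell script driving SQL*Plus; dim_customer, stg_customer, and the tracked address column are assumed names, not the project's actual schema:

    #!/bin/sh
    # Hypothetical SCD Type 2 load: expire the current dimension row when
    # a tracked attribute changes, then insert the new version.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
    WHENEVER SQLERROR EXIT FAILURE
    -- Close out current rows whose attributes differ from staging
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_end = SYSDATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address    <> d.address);
    -- Insert a new current version for changed and brand-new customers
    INSERT INTO dim_customer
           (customer_id, address, effective_start, effective_end, current_flag)
    SELECT s.customer_id, s.address, SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');
    COMMIT;
    SQL

Because the update expires changed rows first, the insert's NOT EXISTS picks up both changed and new customers in one pass.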

Skills Used: Informatica 9.6, Oracle 11g, SQL Server 2008, SQL Developer, Toad, UNIX, Putty, WinSCP, Control-M Scheduler, Team Foundation Server (TFS), BizTalk, Team management, Agile
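
The partitioning approach called out in this section looked roughly like the following; the fact table, columns, and yearly ranges are assumptions for the sketch, not the real IDW design:

    #!/bin/sh
    # Hypothetical range-partitioned fact table with a local index, so
    # date-bounded queries prune down to a single partition.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
    CREATE TABLE fact_position (
        position_id   NUMBER,
        asset_id      NUMBER,
        load_date     DATE,
        market_value  NUMBER
    )
    PARTITION BY RANGE (load_date) (
        PARTITION p2015 VALUES LESS THAN (DATE '2016-01-01'),
        PARTITION p2016 VALUES LESS THAN (DATE '2017-01-01'),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );
    -- Local index keeps index maintenance per-partition
    CREATE INDEX ix_fact_position_asset ON fact_position (asset_id) LOCAL;
    SQL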

Sanofi - NA AMS, Bridgewater, NJ

May 2014 – Dec 2014

Position: Transition Lead

Sanofi is a diversified global healthcare leader focused on patients' needs. Sanofi operates in three primary business areas: Pharmaceuticals, Human Vaccines, and Animal Health.

The objective of Sanofi - North America Application Maintenance Services (NA AMS) is to maintain all Sanofi applications under one roof. The supported applications are CCT (Common Customer Table), which centralizes customer information across the system, and SRT (Spend Review Tool), which centralizes all expenses related to clinical trials, meetings, and product promotions.

Responsibilities:

Managed and distributed work to the offshore team and led it to timely, defect-free deliverables

Handled 8 offshore resources and individual task allocation

Led knowledge transition from the prime vendor to CTS for all client applications; ensured every KT session was documented and captured on video

Involved in gathering new requirements/enhancements; responsible for preparing and obtaining client sign-off on the requirement understanding document, design document, and unit test case scenarios

Worked extensively in Informatica Power Center Repository Manager, Designer, Workflow Manager and Workflow Monitor

Worked with flat file sources, CSV sources and Oracle sources.

Extensively used shell scripting for FTPing and renaming files, etc.; wrote scripts to run workflows using the pmcmd command from the UNIX server (a sketch of this wrapper pattern follows this list)

Extensively worked with Aggregator, Sorter, Lookup, Router, Expression, Filter, Update Strategy, and Stored Procedure transformations.

Created reusable transformations and mapplets for better re-usability.

Created reusable sessions and workflows and extensively used parameter files and variables.

After development, performed extensive unit testing to check the proper and timely flow of data; tuned sessions by monitoring the cache sizes of the transformations

Supported code deployment in the QA and PROD environments and was also responsible for code validation

Responsible for production support activities: monitoring jobs, logging load statistics, analyzing and resolving production issues, coordinating with the business to fix source file issues, and coordinating with DBAs to resolve tablespace or file system issues; also responsible for the deliverables of daily/weekly/monthly jobs

Responsible for resolving tickets raised by end users and ad hoc client requests, including data analysis, correcting production data to fix issues (with client approval), and closing the ticket with the resolution

Analyzed existing code/processes and applied performance tuning techniques

Coordinated with other application teams to avoid production job conflicts due to dependencies

Provided weekly status reports to management
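
The pmcmd wrapper pattern referenced above, as a minimal sketch; the folder, workflow, service, and variable names are placeholders, and in practice credentials would come from a secured file rather than plain environment variables:

    #!/bin/sh
    # Hypothetical wrapper: build the day's parameter file, then start
    # the workflow and wait so the scheduler sees the real exit code.
    PARAMFILE=/tmp/wf_daily_load.param

    cat > "$PARAMFILE" <<EOF
    [SANOFI_AMS.WF:wf_daily_load]
    \$\$LOAD_DATE=$(date +%Y-%m-%d)
    \$\$SRC_DIR=/data/inbound
    EOF

    pmcmd startworkflow \
        -sv INT_SVC -d DOMAIN_DEV -u "$PM_USER" -p "$PM_PASS" \
        -f SANOFI_AMS -paramfile "$PARAMFILE" \
        -wait wf_daily_load
    exit $?

Running with -wait makes the script's exit status follow the workflow's success or failure, which is what schedulers like Tidal key off.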

Skills Used: Informatica 9.1, Oracle, UNIX, Tidal Scheduler, Team management

ING Annuity - Risk Management (West Chester, PA / Chennai, India)

Jul 2009 – Apr 2014

Position: Informatica Lead

ING offers a comprehensive array of financial services to retail and institutional clients. ING Group offers banking, investments, life insurance, and retirement services to over 85 million private, corporate, and institutional clients in more than 40 countries. The objective of this project is to implement the Pegasus data warehouse system. The ETL process falls into three phases: source to staging, staging to the Pegasus data warehouse, and the Algo process feeding the Pegasus data warehouse. The staging area represents the assimilation of data from legacy systems: Market, Reference, Garwin, Hartford, and Bloomberg. This integrated data is loaded into the Pegasus data warehouse, where it is sourced and processed through the Algo process (calculations implemented in a C++ algorithm); the Algo output in turn acts as a source for the Pegasus system, and its files are loaded into Pegasus database tables.

Responsibilities:

Analyzed the requirements and arranged clarification sessions with the business to complete the requirement analysis

Identified and analyzed requirement gaps to gain a clear understanding before design and coding started, an exercise that minimizes requirement defects

Responsible for Requirement Traceability Matrix

Reviewed and finalized code and deliverables

Designed and created AutoSys jobs (JIL definitions) for tasks such as file-watcher jobs and executing Informatica workflows (an illustrative JIL sketch follows this list)

Involved in implementing ETL performance tuning techniques such as session partitioning, incremental aggregation, indexing lookup condition columns, and optimizing SQL queries

Point of contact for project audits, which study the project's issues and risks; a project health score is calculated from the monthly audit, and code is verified for adherence to industry coding standards

Prepared the QA/PROD implementation plan and supported QA/PROD code deployment

Responsible for post-deployment code validation

Studied existing Informatica and Oracle code and suggested best practices/enhancements to the team

Responsible for Level 2 production support, which includes analysis and resolution within the specified SLA

Responsible for updating production job run books and informing Level 1 production support of any job impact due to new requirements/enhancements

Responsible for task allocation and Project management

Carried out project quality assurance demonstrations for project stakeholders

Participated in project core team meetings to streamline requirements and processes
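
The JIL for a typical file-watcher-plus-load pair looked roughly like this; job names, machine, and paths are hypothetical, and the job_type keywords (FW/CMD) follow the newer AutoSys convention:

    #!/bin/sh
    # Hypothetical JIL, loaded through the AutoSys jil utility: a file
    # watcher triggers the Informatica load once the source file lands.
    jil <<'EOF'
    insert_job: PEG_FW_MARKET   job_type: FW
    machine: etl_host
    watch_file: /data/inbound/market_feed.dat
    watch_interval: 60

    insert_job: PEG_LOAD_MARKET   job_type: CMD
    machine: etl_host
    command: /app/scripts/run_wf.sh wf_market_load
    condition: s(PEG_FW_MARKET)
    EOF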

Skills Used: Informatica 9.1, Oracle 11g, Toad, UNIX, Putty, WinSCP, SVN, Autosys Scheduler

Bank of New York Mellon, Chennai, TN

Apr 2009 – Jun 2009

Position: Informatica Developer/ Lead

BNYM is a global financial services company operating in 34 countries and a leading provider of financial services for institutions, corporations, and high-net-worth individuals. BNYM is led by its own management team and focuses on fulfilling the product and service needs of the independent agents in each region. BNYM provides superior services in areas such as Asset Servicing, Asset and Wealth Management, Issuer Services, and Treasury Services.

The BNY Mellon Hub is the information backbone of the Bank of New York Mellon account. It serves as the focal point for day-to-day operations, which include maintaining associate information, work order management, knowledge sharing, reporting, and much more. The 'Fund system' sub-project is a leaf of the Asset and Wealth Management service. The project uses a two-tiered staging approach, loading data into the EDW in three basic steps: step one extracts atomic data from the application systems into an acquisition stage database; step two applies all the EDW logic to move the data into another stage database with the same overall structure as the EDW base tables; and step three moves the data from the base stage database into the EDW base tables. The main objective of this project is to reduce manual intervention, automate the existing process, and tune the existing Informatica ETL objects and Oracle objects to improve code execution performance.

Responsibilities:

Developed mappings, sessions, and workflows in Informatica PowerCenter

Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter

Identified performance issues in existing sources, targets, and mappings by analyzing the data flow and evaluating transformations, and tuned them accordingly for better performance (a small plan-check sketch follows this list)

Modified the shell scripts as per the business requirements

Involved in unit and iterative testing to verify that data extracted from the different source systems and loaded into the target was accurate per user requirements

Prepared and used test data/cases to verify accuracy and completeness of ETL process
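
A small sketch of the plan-check step used while tuning: explain a source-qualifier override query and display the optimizer plan to confirm an index is actually used. The table and columns are illustrative, not the account's actual schema:

    #!/bin/sh
    # Hypothetical tuning check: explain the source-qualifier SQL and
    # show the optimizer plan before and after an index change.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
    EXPLAIN PLAN FOR
    SELECT acct_id, balance
      FROM stg_accounts
     WHERE load_date = TRUNC(SYSDATE);

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    SQL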

Skills Used: Informatica 7.1, Oracle SQL/ PL/SQL, TOAD, MicroStrategy, UNIX, Team management

ING (Tokyo, Japan / Chennai, India)

Mar 2008 – Feb 2009

Position: Development Lead

ING Life is a Netherlands-based life insurance company that currently maintains two separate systems, Life-A and Life-J. This project implements a data integration solution to load data from the two existing source systems into three layers: Staging, Information HUB, and ODS.

Data analysis and data profiling are done on the source systems for data integration, and both BATCH and REAL-TIME processes are implemented for all layers. The staging area represents the assimilation of data from the legacy Life/J and Life/A sources. Data quality analysis and enrichment occur against this data before it migrates into the strategic enterprise information environment. The staging area is used only for loading source data; the goal is to open and close communication with the source system as efficiently as possible. The Information Hub facilitates data movement from the stage database into the ODS/DW.

Responsibilities:

Involved in gathering and analyzing the requirements and preparing business rules.

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic

Worked with the Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Developed and maintained ETL mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files and load it into Oracle.

Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager

Involved in creating new table structures and modifying existing tables to fit into Data Model

Extracted data from different databases like Oracle and external source systems like flat files using ETL tool

Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.

Developed Mappings, Mapplets, Reusable Transformations, Source and Target definitions

Generated SQL queries to check the consistency of the data in the tables and to update tables per business requirements (an example check follows this list)

Involved in Performance Tuning of mappings in Informatica.

Good understanding of source to target data mapping and Business rules associated with the ETL processes
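
One such consistency check, sketched as a shell-driven row-count reconciliation; stg_policy and ods_policy are assumed names, not the project's real tables:

    #!/bin/sh
    # Hypothetical reconciliation: compare source and target row counts
    # for the day's load and fail loudly on a mismatch.
    COUNTS=$(sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0
    SELECT (SELECT COUNT(*) FROM stg_policy) || ':' ||
           (SELECT COUNT(*) FROM ods_policy
             WHERE load_date = TRUNC(SYSDATE))
      FROM dual;
    SQL
    )
    SRC=${COUNTS%%:*}
    TGT=${COUNTS##*:}
    if [ "$SRC" -ne "$TGT" ]; then
        echo "RECON FAILED: source=$SRC target=$TGT" >&2
        exit 1
    fi
    echo "RECON OK: $SRC rows"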

Skills Used: Informatica 8.6, Informatica Versioning, Oracle (SQL, PL/SQL), UNIX, TOAD, Putty, Tortoise SVN, Team management

Verizon Data Services Pvt Ltd, Chennai, India

May 2007 – Feb 2008

Network Metric Protocol (NMP)

Position: Analyst

NMP was a complete solution for Verizon's landline telecom service. The project identified operational processes to automate and performance-tune, saving time, cost, and manual effort. My role was to lead the team in automating manual processes and delivering performance savings that reduced operational cost and steps.

Responsibilities:

Involved in analysis and development of the data warehouse.

Worked with Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer

Created various mappings using Aggregator, Filter, Joiner, Expression, Lookup, Update Strategy, and Router transformations.

Extensively used ETL to load data from different databases and flat files to Oracle.

Involved in the development of Informatica mappings and also tuned them for better performance.

Created and scheduled Sessions and Batches through the Informatica Server Manager.

Worked with sessions and batches using Server Manager to load data into the target database.

Tested for data integrity and consistency.

Skills Used: Informatica 7.2, Oracle (SQL, PL/SQL), Unix, Business Objects, Team management

Verizon Data Services Pvt Ltd, Chennai, India

Dec 2006 – Apr 2007

DBViews

Position: Analyst

As part of its security features, NMP's sensitive data, such as client master information, was masked from offshore users as a preventive measure against malpractice, ensuring that only onsite users had full privileges on the data (a sketch of the masking pattern appears at the end of this section).

Responsibilities:

Involved in development, testing, and operations as well as performance tuning

Reduced manual operational steps by 50% by identifying areas that could be automated and automating them

Designed and created different MicroStrategy objects such as advanced Metrics, Filters, and Prompts

Assisted with the creation of schema objects, facts, attributes, and set relationships

Implemented change control requests (CCRs) from the client on a need basis

Delivered reports to the business owners on a weekly/monthly basis
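
A minimal sketch of the masking pattern: a view exposes the masked column, and only the view is granted to the offshore role. The v_client_master, client_master, and offshore_ro names are hypothetical:

    #!/bin/sh
    # Hypothetical masked view: offshore users query the view (all but
    # the last 4 characters of the name masked), never the base table.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
    CREATE OR REPLACE VIEW v_client_master AS
    SELECT client_id,
           -- Mask all but the last 4 characters of the client name
           RPAD('*', GREATEST(LENGTH(client_name) - 4, 0), '*')
               || SUBSTR(client_name, -4) AS client_name,
           region
      FROM client_master;
    -- The offshore role sees only the view, never the base table
    GRANT SELECT ON v_client_master TO offshore_ro;
    SQL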

Skills Used: Informatica 7.2, Oracle, Unix, Business Objects, Team management

Verizon Data Services Pvt Ltd, Chennai, India

Apr 2006 – Nov 2006

NMP Upgrade to 10g

Position: Analyst

To stay technologically competitive and adopt the advanced features of Oracle 10g, Verizon decided to move its NMP project, developed on Oracle 8i, to Oracle 10g. My role was to ensure that the Oracle 8i code behaved the same in Oracle 10g.

Responsibilities:

Involved in development, testing, and operations as well as performance tuning

Reduced manual operational steps by 50% by identifying areas that could be automated and automating them

Assisted with the creation of schema objects, facts, attributes, and set relationships

Prepared Software Engineering Quality Process related documents

Implemented change control requests (CCRs) from the client on a need basis

Delivered reports to the business owners on a weekly/monthly basis

Skills Used: Informatica, Oracle, Unix


