SRINIVAS.G Mailto: *****.******@*****.***
Mobile: 609-***-****
Informatica ETL Tech Lead
Professional Summary
Over 10 years of total IT experience in data warehousing technologies (ETL), covering analysis, design, development, and implementation of business application systems in
various domains such as Insurance, Telecom, Sales and Distribution, Pharmaceuticals,
HR Payroll, and Energy.
Worked with ETL tools such as Informatica and Ab Initio, and reporting tools such as Cognos and OBIEE.
Worked across the complete Software Development Life Cycle (SDLC), from requirement
gathering, analysis, design, testing, and debugging through implementation,
post-implementation support, and maintenance.
Good at onsite-offshore work coordination.
Worked on Oracle Database supporting OLTP and Data Warehouses.
Extensively worked with Informatica to extract data from legacy systems into staging,
then cleanse, process, and load it.
Well versed in RDBMSs such as Oracle 9i; also able to work with DB2 and SQL Server.
Proficient in understanding business processes / requirements and translating them into
technical requirements.
Experience in writing test plans and in developing and maintaining test scripts.
Sound experience in performance tuning of ETL processes; reduced execution times for
huge data volumes on a company merger project.
Experience with Oracle Data Integrator (ODI).
Good at data warehousing techniques such as Star and Snowflake schemas.
Knowledge in working with UNIX and LINUX OS.
Knowledge of reporting tools such as SAP BO, OBIEE, and Cognos.
Hands-on experience in scheduling Informatica workflows using DAC (Data
Warehouse Administration Console) and Tidal.
Quick to learn.
Worked in Agile work environment.
Good team player; strong interpersonal and communication skills combined with
self-motivation, initiative, and a positive attitude.
Looking forward to a rewarding personal and professional career in a fair working
environment with opportunities to grow.
Technical Skills
Data Warehouse / ETL Tools: Informatica PowerCenter 9.x, 8.6.x, 8.x; trained on Ab Initio and DataStage.
Microsoft BI: Trained on SQL Server 2005 and 2008 Integration Services.
OLAP Tools: Knowledge of BO XI R2, Cognos 8, and MicroStrategy.
Languages: C, SQL, PL/SQL.
Databases: Oracle 9i, 10g, Netezza, and SQL Server 2005.
Front-End Tools: TOAD, Oracle SQL Developer, DB Comparator, Release Manager, VSTF, WinSCP, Aginity Workbench, and PuTTY.
Operating Systems: UNIX, Linux, and Windows.
Certifications: Microsoft 70-445 (SQL Server BI).
Educational Qualifications
Master of Science in Electronics from Osmania University (M.Sc., 2006).
Bachelor of Science in Computer Applications from Osmania University (B.Sc., 2002).
Project Profile
Title # 1 AMERICAN EXPRESS
Client Workforce Analytics & Spend Analytics
Location PHOENIX, AZ
Role ETL Informatica Tech Lead
Duration Nov 2014 to date
Team size 6
Environment Database: Oracle, flat files. Informatica 9.5
O/S: Windows, Unix.
Project Scope
Workforce Analytics: This module was developed for reporting across all divisions of
American Express, showing the expenditure incurred on resources, both permanent
employees and contractors, across all segments and locations.
Spend Analytics: This module was developed to help the business better understand spend
across all segments and divisions of American Express. These reports give a better view
of the amount spent over a period.
Role & Responsibilities:
Requirement gathering from the business and leading the project.
Designing the whole solution to integrate employee attributes from HR Analytics tables
with the timesheet data feed from the Clarity tool, creating a comprehensive data set
that can be analyzed in Tableau.
Designing the solution to integrate the existing Spend data warehouse with the GBS data
warehouse, likewise creating a comprehensive data set that can be analyzed in Tableau.
Developing Informatica ETL mappings to load data from flat files to stage and from
stage to the star schema, plus mappings to integrate attributes from the HR data
warehouse and load the combined data into three different summary tables (and similarly
for the Spend data warehouse).
Development of the ETL strategy to combine data from the existing HR data warehouse
with timesheet data from the Clarity tool, using Informatica Joiner, Expression,
Lookup, Router, Normalizer, and Sequence Generator transformations.
Creating a Unix shell script to automate the source file download from SharePoint and
push it to the SFT profile for the Informatica ETL load.
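The kind of wrapper involved can be sketched in shell; the directory, file-name pattern, and SFT target below are hypothetical placeholders, not the actual project values:

```shell
#!/bin/sh
# Sketch of a staging wrapper: verify that the date-stamped source file
# arrived, then hand it off to the SFT inbox watched by the Informatica
# session. Paths, file naming, and the sftp target are assumptions.

stage_source_file() {
    src_dir=$1
    file="timesheet_$(date +%Y%m%d).csv"    # daily extract, stamped with load date

    if [ ! -f "$src_dir/$file" ]; then
        echo "ERROR: $file not found in $src_dir" >&2
        return 1
    fi

    # In the real script this step would push the file, e.g.:
    #   sftp -b - sft_profile@edw-host <<EOF
    #   put $src_dir/$file /sft/inbox/
    #   EOF
    echo "staged $file"
}

# Runnable demo against a temporary inbox.
demo_dir=$(mktemp -d)
touch "$demo_dir/timesheet_$(date +%Y%m%d).csv"
stage_source_file "$demo_dir"
rm -rf "$demo_dir"
```

A missing file makes the wrapper exit non-zero, which lets the scheduler fail the load early instead of running the session against an empty stage.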
Creating a currency conversion table and loading currency conversion data from an RSS
feed; also loading fact data after converting it into US dollars using the currency
conversion table.
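The conversion join itself can be illustrated with a small awk sketch; the currencies, rates, and file layouts here are illustrative examples, not the actual feed format:

```shell
# Load the conversion table on the first pass (rates.txt), then convert
# each fact amount to US dollars on the second pass (amounts.txt).
cat > rates.txt <<'EOF'
EUR 1.08
GBP 1.27
EOF
cat > amounts.txt <<'EOF'
EUR 100
GBP 200
EOF

awk 'NR==FNR { rate[$1] = $2; next }
     { printf "%s %.2f\n", $1, $2 * rate[$1] }' rates.txt amounts.txt
# -> EUR 108.00
#    GBP 254.00

rm -f rates.txt amounts.txt
```

In the warehouse the same lookup-and-multiply happens in the mapping (a Lookup on the conversion table feeding an Expression), but the logic is the same two-step join.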
DAC configuration for full and incremental loads; creating DAC tasks and execution plans.
Creation of Design and Technical documents for development purpose.
Coordinating with offshore development team and on shore business team for development and
requirements gathering.
Creating the views required for Tableau reporting.
Handling release activities on a monthly basis.
Title # 2 Devon Energy
Client Devon-BIOM
Location OKLAHOMA CITY, OK
Role ETL Informatica Lead
Duration Oct 2013 to Nov 2014
Team size 5
Environment Database: Oracle, SAP BW, SQL Server
2008. Informatica 9.5 O/S: Windows, Unix.
Project Scope
Devon Energy Corporation is among the largest U.S.-based independent natural gas and oil
producers. Based in Oklahoma City, the company's operations are focused on North American
exploration and production.
The existing BIDW is built on the PPDM 3.7 model to support the BI analytics team in
making day-to-day decisions based on BI cubes and reports developed in SSAS and SSRS.
Role & Responsibilities:
Leading a team of four offshore team members and assisting them with overall delivery.
Analysis of existing data model to support ongoing changes to data model and BI analytics
team.
Actively involved in 24x7 support and implementing enhancements wherever required.
Analyzing and fixing code in production promptly, after quick testing in a
production-like environment, whenever a job fails due to a code issue.
AD scope includes estimating, work planning, and assignments.
AM includes managing the Informatica support team and production support SLAs.
Identifying the existing performance issues and tuning the Informatica workflows for optimal
performance.
Implementing the data model changes from PPDM 3.7 to 3.8 and supporting ongoing
changes.
Analyzing performance bottlenecks and improving performance at both the Informatica and
database levels.
Preparing the documentation for easy maintenance of support project.
Automated jobs, considering their internal dependencies and upstreams, using Tidal as
the scheduling tool.
Working closely with business users during month-end, quarter-end, and year-end support
activities, providing timely updates for financial closure, which are critical to the
business for reporting.
Implementing the JAWS analytics tool for various workload balancing needs.
Creating OBIEE reports based on ad hoc requirements from the business.
Providing necessary support for OBIEE queries whenever required.
Expertise with the Informatica Data Quality (IDQ) toolkit for analysis and data cleansing.
Used IDQ for data matching and data conversion.
Used IDQ to perform data profiling and create mappings.
Involved in migrating code from IDQ to PowerCenter.
Experience in creating mapplets in IDQ/Informatica Developer for data standardization.
Working on incidents/tickets created by different business users, prioritizing them
based on issue criticality.
Actively participating in the DR (disaster recovery) events that take place over a
weekend every quarter.
Updating the business whenever there is any delay in the batches (for any reason) and
planning accordingly so that there is minimal impact to the business.
Working along with the team on regular day-to-day activities in addition to the other
responsibilities.
Assigning work to team members based on their competencies.
Created database-level partitions to handle huge data volumes.
Tracking work status from time to time and providing timely updates to the team lead.
Providing prompt responses to queries posted by business users and supporting teams.
Handling escalations effectively whenever they arise.
Scheduling calls with business users to understand issues clearly and solve them
accordingly.
Prepared documents such as the technical design document, production support document,
and deployment document.
Leading and encouraging the team to participate in events conducted by the
organization.
Title # 3 LEVEL-ADBI
Client Level 3 Communications.
Location Broomfield, CO
Role ETL Informatica Lead
Duration March-2013 to September-2013
Team size 25
Environment Database: Oracle Exadata, SQL Server
2008. Informatica 8 and Informatica 9 O/S:
Windows, Unix.
Project Scope
Level 3 Communications is an American multinational telecommunications and Internet
service provider headquartered in Broomfield, Colorado.
It operates a Tier 1 network. The company provides core transport, IP, voice, video, and content
delivery for most of the medium to large Internet carriers in North America, Latin America, Europe, and
selected cities in Asia.
The project follows an agile implementation for work allocation.
Role & Responsibilities:
Participated in discussions with the senior management team to better understand the
business requirements, and analyzed already-developed Ab Initio jobs to produce mapping
documents for their conversion to Informatica.
Leading a team of three for project delivery.
Involved in the analysis phase of the business requirements and in the design of the
Informatica mappings using low-level design documents.
Designed the project in two phases, SDE and SIL, as per the BI standards at the
Informatica (9.1) level.
SDE (Source Dependent Extraction): data is extracted from the different source systems
into staging tables; this is the staging area.
SIL (Source Independent Loading): the transformed data is loaded into the final target
tables in the ODS.
Helping the team in resolving technical/non-technical issues.
Involved in the migration from Informatica 8 to Informatica 9.
Involved in the DB migration from Oracle to Oracle Exadata, and in implementing NRT
(Near Real Time) updates from the front-end UI to the ODS.
Participated in the DAC implementation.
Set up the Informatica migration from Oracle 10g to Oracle Exadata.
The EPs (execution plans) were scheduled at the DAC level as per client specifications,
after creating the tasks, assembling subject areas, and building the execution plans.
Prepared the roadmap for task completion and planned the division of work among the
team as per each individual's capability.
Involved in Performance tuning by determining bottlenecks in sources, mappings and sessions.
Created effective Test Cases and did Unit and Integration Testing to ensure the successful
execution of data loading process.
Troubleshot load failures, including database problems.
Analyzed session logs whenever a session failed, to resolve errors in mapping or
session configurations.
Used TOAD to run SQL queries and validate the data in warehouse and mart.
Involved in designing the ETL testing strategies for functional, integration and system testing
for Data warehouse implementation.
Created materialized views at the database level and refreshed them at the ETL level
for easy access to data from OBIEE.
Title # 4 TW Harris Payroll.
Client Time Warner
Location NYC, NY
Role ETL Informatica Lead
Duration April-2012 to March-2013
Team size 15
Environment Database: Netezza, Mainframe DB2, Workday.
Tools: Informatica 9.1.1 and VMware.
O/S: Windows, Unix.
Project Scope
Time Warner Inc., a global leader in media and entertainment with businesses in
television networks, film and TV entertainment, and publishing, uses its
industry-leading operating scale and brands to create, package, and deliver
high-quality content worldwide through multiple distribution outlets.
This project was done to integrate data across all of Time Warner's divisions into a
single EDW for the Workday HR payroll system, and to use the same data for various
reporting purposes.
Role & Responsibilities:
Involved in preparing the BRD (Business Requirements Document), functional specs, and
high-level design documents.
Involved in preparing the technical specification documents and ETL mapping documents
(detailed design documents).
Worked with relational tables, flat files, XML, and web services as sources and
targets.
Worked on the migration of data from PeopleSoft to Oracle and mainframe DB2.
Created PowerExchange registrations for CDC on DB2 mainframe tables, and created
DataMaps to extract data from files on the mainframe.
Created FTP scripts to transfer fixed-width flat files from Oracle to the mainframe or
to downstream systems that print checks for employees.
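Producing a fixed-width record from delimited data is the kind of step such scripts perform before transfer; the field widths and sample layout below are illustrative, not the actual check-printing spec:

```shell
# Pipe-delimited payroll extract -> 26-character fixed-width records
# (employee id padded to 6, name to 10, amount right-aligned in 10).
printf 'E1001|SMITH|2500.00\nE1002|LEE|980.50\n' |
awk -F'|' '{ printf "%-6s%-10s%10s\n", $1, $2, $3 }'
```

Mainframe consumers typically expect every record at a constant length, which is why padding is applied before the FTP step rather than on the receiving side.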
Created FTP jobs to transfer files to the TIBCO system and on to other third-party
systems such as OpenText.
Used Informatica to extract data from the Workday HR payroll system to staging and then
load it into the EDW and data mart; the EDW is built on the Netezza database.
Developed mappings using transformations such as Source Qualifier, Filter, Joiner,
Expression, Aggregator, Sequence Generator, Lookup, Union, Update Strategy, and XML
Parser.
Developed mapplets and reusable transformations.
Cleansed data at the point of entry into the warehouse and ODS, to ensure the warehouse
provides consistent and accurate data for business decision making.
Used Aginity Workbench for Netezza for faster application design and development.
Performed target load plan to load data into the various targets.
Responsible for Developing Transformation rules from source system to Data Warehouse.
Involved in handling slowly changing dimensions.
Performance tuning of the mappings; interacted with the client team and business users
to analyze business requirements and interpret transformation rules for all target
data objects.
Responsible for preparing unit test plans and unit test scripts for all phases of
testing (unit, integration, and system testing), and responsible for bug fixing after
user acceptance testing.
Title # 5 NSN H5
Client Nokia Siemens Networks (Finland)
Role Senior Software Engineer
Duration December-2009 to April-2012
Team size 7
Environment Database: Oracle 10g, Oracle Exadata.
Tools: Informatica 8.1.1 and 8.6.1, SQL Developer.
O/S: Windows Server 2003, Unix and Linux.
Project Scope
NSN Informatica: These systems take care of retail, sales, and distribution analysis of
the products. The company is seeking business enhancement and a competitive edge, using
information technology to make better business decisions. NSN H5 has branches across
Finland, Germany, Singapore, and Sweden, with some 17 clients and approximately 200 end
users, who all use one common system from the NSN warehouse. This gives them
intelligent information reports on their existing business situation for targeted analysis.
ISC Opera Reporting:
This project was developed for the analysis of SAP R/3 and SAP BW transactional and
logistics data from the Opera DW. It is a data warehouse solution, with Business
Objects as the front-end reporting tool, providing operative reports and business
measures for Nokia Siemens Networks. It includes actual order delivery data transferred
from SAP R/3, software delivery data from SWD, product-related data from MxPDM,
logistics data from MLS installations, and history data.
Role & Responsibilities:
Monitoring daily data loads and rectifying any problems that occur during loads.
Creating support process documentation for existing applications.
Uploading .csv files through the Informatica tool.
ABAP code generation against SAP program mappings.
Involved in creating mappings, sessions and workflows.
Scheduling and un-scheduling the workflows.
Working on RMT tickets.
Set up the Informatica migration from Oracle 10g to Oracle Exadata.
Fixing data issue in month end report (IOTDR) if problem occurs.
Worked with relational and flat-file sources. Used Informatica to extract data into
data marts built on Oracle and SQL Server.
Developed mappings using transformations such as Source Qualifier, Filter, Joiner,
Expression, Aggregator, Sequence Generator, Lookup, Union, and Update Strategy.
Developed mapplets and reusable transformations.
Identifying and rectifying performance bottlenecks.
Used TOAD for faster application design and development.
Resolve issues in Transformations and mappings.
Performed target load plan to load data into the various targets.
Involved in developing Informatica mappings and mapplets, and tuned them for optimum
performance and batch design.
Performance tuning of the mappings; interacted with the client team and business users
to analyze business requirements and interpret transformation rules for all target
data objects.
Involved in preparing unit test plans and unit test scripts for all phases of testing
(unit, integration, and system testing), and responsible for bug fixing after user
acceptance testing.
Effectively handling a team of four for production, platform, and application support
for NSN Informatica.
Responsible for providing user access privileges for user groups.
Involved in creating folders and groups and handling all platform-level tickets in the
production environment.
Involved in the server upgrade from 8.1.1 to 8.6.1.
Title # 6 GLOBAL BUSINESS INFORMATION (GBI)
Client JANSSEN PHARMACEUTICALS INC, USA
Role Senior Software Engineer
Duration May-2009 to December-2009
Team size 6
Environment Database: Oracle 10g.
Tools: Informatica 8.1.1, TOAD 7.1, Autosys.
O/S: Windows and Linux
Project Scope
GBI (Global Business Information): Janssen Pharmaceuticals, Inc., a pharmaceutical
company of Johnson & Johnson, manufactures and markets many first-in-class prescription
medications and is poised to serve the broad needs of the healthcare market, from
patients to practitioners and from clinics to hospitals. This project was developed to
maintain the details of prescribing doctors, market retail sales and distribution, and
the patients who use their medications.
All data is maintained in GBI for the sales force and call center in order to increase sales.
Role & Responsibilities:
Responsible for understanding the functional specs, and involved in preparing the
high-level design documents.
Involved in preparing the low-level design documents and ETL mapping documents
(detailed design documents).
Developed mappings using transformations such as Source Qualifier, Filter, Joiner,
Expression, Aggregator, Sequence Generator, Lookup, Union, and Update Strategy.
Developed mapplets and reusable transformations.
Cleansed data at the point of entry into the warehouse and ODS, to ensure the warehouse
provides consistent and accurate data for business decision making.
Identifying and rectifying performance bottlenecks.
Used TOAD for faster application design and development.
Created Autosys and Tivoli schedules as per the Informatica ETL requirements.
Managing the Autosys and UNIX development and testing process for ETL projects.
Migrating Autosys jobs from stage to production.
Dealing with abends and performing restart, force-complete, and cancellation of jobs in
Autosys.
Supporting various applications in production and all test environments and providing support
while running the batch cycles.
Analyzing new projects that have implemented, or are planning to implement, the Autosys
scheduler, and suggesting a scheduling model for Autosys job scheduling.
Analyzing the Autosys jobs of existing projects and suggesting scheduling improvements.
Worked with development teams to understand various projects and helped them implement
Autosys to reduce manual effort in testing batch cycles in lower environments.
Involved in building a high-speed testing environment where more than 250 applications
run batch cycles to perform integration and regression testing.
Resolve issues in Transformations and mappings.
Performed target load plan to load data into the various targets.
Involved in Preparing Unit Test Plan, Unit Test Scripts and Unit Test Results.
Title # 7 MMRP and MMBC
Client MICROSOFT, USA
Role Senior Software Engineer
Duration August-2008 to April-2009
Team size 7
Environment Databases: SQL Server 2008.
Tools: SQL Server Integration Services 2005 and 2008.
Project Scope
MMRP and MMBC (Mid Market Relationship Program and Mid Market Business Communication):
this project was done for Microsoft (USA) to target their mid-market customers,
maintaining customer details and licenses for validating and renewing their products
and providing better customer support.
Role & Responsibilities:
Working closely with the MSBI CoE for knowledge transfer.
Involved in the design phase.
Reviewing and preparing technical documentation.
Preparing the basic documentation for functional and technical specs.
Handling change requests to make changes to the ongoing project.
Validating these changes and preparing test cases for the change requests.
Involved in system integration testing and performed regression testing for the changes applied
in the existing code.
Handling the production issues.
Supporting the onshore team from offshore for code deployment.
Title # 8 MMSEA
Client AUTO CLUB GROUP, USA
Role Software Engineer.
Duration February-2007 to July-2008
Team size 4
Environment Database: Oracle 9i, Oracle 10g.
Tools: Informatica 8.1.1, TOAD 7.1.
O/S: Windows and Linux
Project Scope
MMSEA (Medicare, Medicaid, and SCHIP Extension Act of 2007): this insurance project was
done for the Auto Club Group (USA). The module helps ACG report claims to the CMS
government repository for claimants who have already claimed through any one of the
private insurance companies, and it does not allow a claimant to hold two claims. Auto
Club Group primarily provides auto and other general insurance.
Role & Responsibilities:
Responsible for understanding the functional specs, and involved in preparing the
high-level design documents.
Involved in preparing the low-level design documents and ETL mapping documents
(detailed design documents).
Developed mappings using transformations such as Source Qualifier, Filter, Joiner,
Expression, Aggregator, Sequence Generator, Lookup, Union, and Update Strategy.
Developed mapplets and used reusable transformations.
Cleansed data at the point of entry into the warehouse and ODS, to ensure the warehouse
provides consistent and accurate data for business decision making.
Identifying and rectifying performance bottlenecks.
Used TOAD for faster application design and development.
Resolve issues in transformations and mappings.
Performed target load plan to load data into the various targets.
Involved in preparing Unit Test Plan, Unit Test Scripts and Unit Test Results documents.
Title # 9 HCF
Client Hospitals Contribution Fund, Australia
Role Software Engineer.
Duration Nov-2005 to February-2007
Team size 5
Environment Database: Oracle 9i. Tools: Informatica 7.1.1.
O/S: Windows
Project Scope
HCF (Hospitals Contribution Fund of Australia) is one of the biggest insurance
organizations, providing different products such as life insurance, mutual funds, and
home and auto insurance. They provide insurance for groups and individuals. This
project was done for HCF to cover Group Benefits Division products such as life
insurance and disability. The case management mart was developed to store data from
those sources in a centralized database, so that business analysis can be done on this
integrated data.
Role & Responsibilities
Involved in preparing the Technical Documents.
Developed mappings to meet the requirements as per design document.
Used most of the common transformations, such as Source Qualifier, Filter, Joiner,
Expression, Aggregator, Sequence Generator, Lookup, and Update Strategy.
Taking care of issues and change request from onsite team.
Fixing the new changes in developed mappings.
Involving in code reviews.
Involved in unit testing to ensure the development met the business functional
requirements.
Validating the source and target data to find mismatches.
Title # 10 SMDM
Client Daiichi Sankyo, Europe.
Role Software Engineer.
Duration June 2004 to Nov 2005
Team size 6
Environment Database: Oracle 9i and DB2.
Tools: Informatica 7.1. O/S: Windows and Unix.
Project Scope
SMDS (Sales Management Data Source): Daiichi Sankyo is a leading global pharmaceutical
company and a member of the Daiichi Sankyo Group. The organization, which includes U.S.
commercial operations and global clinical development (Daiichi Sankyo Pharma
Development), is headquartered in New Jersey. This database is used to collect all of
its retail sales information for various departments for reporting.
Roles & Responsibilities
Developed mappings and workflows to meet the requirements as per the design document.
Taking care of issues and change requests from the onsite team.
Worked with Source Qualifier, Filter, and Expression transformations.
Extracted source data from flat files using Informatica and loaded it into Oracle
targets.
Involved in the preparation of production support documents.
Preparing unit test cases and validating the source data against the target data to
find mismatches.