Professional Summary
Extensive IT experience across the SDLC, including requirement gathering, design, implementation, and testing.
Over 8 years of technical and functional experience in Decision Support Systems and data warehousing, implementing ETL using Informatica PowerCenter 9.6/9.1/8.6 and Informatica PowerExchange (PWX).
Over 5 years of experience leading multiple ETL projects with teams of 5-10 members.
Worked in an onsite/offshore model and led the team.
Good working knowledge of Hive, Python, MongoDB, Informatica DT, and Data Validation Option.
Very good proficiency in Informatica Data Quality 9.6.1 and Informatica metadata and repository management.
Extensively worked on the Informatica Analyst tool 9.6.1 and Address Doctor during initial project phases.
Good knowledge of Master Data Management (MDM) concepts and methodologies, and able to apply this knowledge in building MDM solutions.
Experience in interacting with Business Managers, Product Owners, Analysts, and end users to correlate Business logic and specifications for ETL Development and documenting Source-to-Target Mappings, Unit Test cases and Deployment documents.
Worked on both waterfall and agile methodology projects.
Used tools such as HPQC for defect tracking.
Provided production support, resolving production job failures and interacting with the operations support group to resume failed jobs.
Worked with the team to ensure on-time deliverables.
Involved in unit testing, system testing, and UAT to verify that data extracted from different source systems per user requirements loaded accurately into targets.
Worked with the reporting team to help them understand the user requirements for reports and their measures, and helped them create canned reports.
Migrated repository objects, services and scripts from development environment to production environment. Extensive experience in troubleshooting and solving migration issues and production issues.
Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.
Modified the shell scripts as per the business requirements.
Good at collecting, analyzing, and interpreting data in an organized way.
Extensive production support experience across the disciplines of supporting IT systems/applications currently used by end users. Involved in performance tuning.
Certifications
Certified Scrum Master
Technical Skills
Databases
Oracle 10g/9i, SQL Server 2016, Snowflake, Teradata
Operating Systems
UNIX, Windows 2000/NT/XP
Languages
Unix Shell scripts, HTML
ETL Tools
Informatica PowerCenter, Informatica Data Quality, Informatica PowerExchange (CDC), DVO, Informatica Cloud, Informatica MDM
BI Tools
MicroStrategy
Tools/Technologies
Visual Source Safe, IIS, SVN
Web
ASP, XML, VBScript, JavaScript
Design
Visio
Cloud Tools
Amazon S3, Redshift, CloudWatch, IAM
Education Details
M.Sc (Computer Science) – Osmania University - 2003
B.Sc (Computer Science) – Osmania University - 2001
Professional Experience
Zimmer Biomet
Warsaw, IN April 2018—Present
Informatica Lead
Zimmer Biomet is a publicly traded medical device company that designs, manufactures, and markets effective, innovative solutions supporting orthopaedic surgeons and clinicians in alleviating pain and improving the quality of life for people around the world. Its musculoskeletal technologies and a wide range of related products and services make it a partner to healthcare providers in more than 100 countries.
Worked with Informatica 10 to create Source/Target connections, monitor, and synchronize the data with various systems.
Troubleshot existing code as well as new code as requirements changed.
Responsible for Testing and Validating the Informatica mappings against the pre-defined ETL design standards.
Involved in different Team review meetings.
Led the team in coding, testing, and performance tuning while developing Informatica mappings and workflows.
As the lead Data Quality developer on the team, initiated the data profiling process by profiling different formats of data from different sources.
Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.
Environment: Informatica 10, IDQ, Oracle 10g, SAP HANA, SAP BW, MS SQL Server 2012, flat files.
T. Rowe Price
Baltimore, MD November 2017—March 2018
Informatica Lead
T. Rowe Price is an American publicly owned investment firm. The company offers mutual funds, sub-advisory services, and separate account management for individuals, institutions, retirement plans, and financial intermediaries, along with investment planning and guidance tools. The EDS Enterprise Data Mart is designed to provide analytical data services to the US Intermediaries business unit. EDS ingests data from Enterprise Salesforce CRM, MDM, Product, and Assets into EDM for the business analytics and research used by leadership teams to design marketing strategies.
Worked with Informatica Cloud to create source/target connections and objects, develop source-to-target mappings, and monitor and synchronize data in SFDC.
Troubleshot existing code as well as new code as requirements changed.
Responsible for Testing and Validating the Informatica mappings against the pre-defined ETL design standards.
Involved in different Team review meetings.
Led the team in coding, testing, and performance tuning while developing Informatica mappings and workflows.
As the lead Data Quality developer on the team, initiated the data profiling process by profiling different formats of data from different sources.
Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.
Environment: Informatica 10, IDQ, Informatica Cloud, Snowflake, UNIX scripts, MS SQL Server 2012, flat files.
The Reinalt-Thomas Corporation
Scottsdale, AZ
Cognizant USA June 2015—October 2017
Informatica Lead
The aim of this project is to convert customer vehicle data from the old fitment guide to the new fitment guide. A customer entering the POS provides the make, model, year, and style of his vehicle; internally, a web method calls an Informatica web service that returns his customer information along with the vehicle's dimensions if he is an existing customer, or otherwise provides a new customer ID with the same dimension details. Address Doctor is used to validate his address and suggest nearby stores he can visit to change his tires. A product review (tires) catalog provides a suggested list of tires that fit his vehicle, along with their ratings. If he purchases online, a shipping module sends him tracking information. For customer and vehicle cleansing, data quality mappings are scheduled to run on a daily basis. SAP IDocs and BAPIs are used to load data into CAR (Customer Access Repository).
Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.
Extensively used Informatica client/PowerExchange tools to extract data from different sources such as SAP, SAP IDocs, SQL Server, and AS/400.
Troubleshot existing code as well as new code as requirements changed.
Responsible for Testing and Validating the Informatica mappings against the pre-defined ETL design standards.
Involved in different Team review meetings.
Developed ETL mappings as per the development standards.
Led the team in coding, testing, and performance tuning while developing Informatica mappings and workflows.
Extensively built IDQ mappings for customer and vehicle data cleansing, exported them to PowerCenter, and used them as standard transformations when building logic.
Implemented Real time Change Data Capture (CDC) using Informatica Power Exchange.
Used Informatica 9.6 to load data into the SAP CAR system using SAP IDocs and BAPIs.
Configured Address Doctor, which can cleanse address data worldwide, and enhanced it with modifications made during installation.
Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.
Involved in technical support and troubleshooting of the data warehouse application to meet Service Level Agreements (SLAs).
Handled CRs (Change Requests) and enhancements for the existing application, followed the change management process, and created deployment documents.
Able to create an ETL framework with reusable assets.
Distributed work to onsite and offshore teams and followed up on it.
Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
Used IDQ’s standardized plans for addresses and names clean ups.
Used IDQ to complete initial data profiling and removing duplicate data.
Environment: Informatica 9.6/10, Informatica PowerExchange (CDC), IDQ, SAP, AS/400, flat files, PowerShell scripts, MS SQL Server 2012.
Toyota Motor Sales
Torrance, CA
Cognizant USA January 2015—May 2015
Informatica Lead/Data Analyst
C360 NextGen is a project extended from the existing Endeca-based C360. The objective is to build a Big Data platform that provides a 360-degree view of data sourced from various systems, including unstructured data from social media; 15 systems have been identified as sources for this project. Hive/Impala is used for data storage, and Parquet is used for columnar data compression. The aim of this exercise is to understand data anomalies from various source systems together with business users from all Toyota brands and various functional departments, such as Public Relations, Marketing, and Customer Relations, and to provide data to the C360 Portal.
Read through the data, presented the findings, and translated them into an understandable document.
Worked with business SMEs on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.
Presented Data Cleansing Results and IDQ plans results to the OpCos SMEs.
Worked with Informatica to load data into various target tables after performing data quality checks.
Documented Cleansing Rules discovered from data cleansing and profiling.
Involved in fully understanding the business requirements and in analyzing the sources to be loaded into the Oracle warehouse.
Designed source-to-target mappings from SQL Server and Excel/flat files to Oracle using Informatica PowerCenter.
Responsible for working with DBAs and data modelers for project coordination.
Responsible for Detail design of each interface.
Responsible for developing complex Informatica mappings using different types of transformations like UNION transformation, Connected and Unconnected LOOKUP transformations, Router, Filter, Aggregator, Expression and Update strategy transformations for Large volumes of Data.
Developed SDE mappings to load large amounts of data from different sources into the staging area, and SID mappings to load the data from the staging area into the target system.
Implemented Slowly Changing Dimension Types 1 and 2 for change data capture using version control.
Responsible for creating Workflows and sessions using Informatica workflow manager and monitor the workflow run and statistic properties on Informatica Workflow Monitor.
Responsible for Defining Mapping parameters and variables and Session parameters according to the requirements and performance related issues.
Created various tasks such as Event Wait, Event Raise, and Email.
Created shell scripts to automate worklets, batch processes, and session scheduling using pmcmd.
Tuned the Informatica mappings to increase performance.
Wrote test plans and executed them during unit testing; also supported system testing, volume testing, and user testing, and provided production support by monitoring daily processes.
Designed reference data and data quality rules using IDQ and was involved in cleansing the data with IDQ.
Created deployment documents for migrating the code to higher environments.
Developed reusable code that can be used throughout the application.
As an Onsite Coordinator provided direction, feedback and coaching to build the capability of Off-shore project team and other onsite coordinators.
Environment: Informatica 9.1, Informatica PowerExchange (CDC), IDQ, Oracle 11g, Data Quality, Toad, PL/SQL, Erwin.
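The pmcmd-driven scheduling mentioned in the bullets above is typically a small wrapper script invoked from cron or a scheduler. This is a minimal illustrative sketch only; the integration service, domain, user, folder, and workflow names are hypothetical placeholders, not the actual project values.

```shell
#!/bin/sh
# Sketch of a wrapper around Informatica's pmcmd CLI for starting a
# workflow from a scheduler. All names below are assumed placeholders.

INFA_SERVICE="IS_DEV"        # integration service (assumed name)
INFA_DOMAIN="Domain_Dev"     # Informatica domain (assumed name)
INFA_USER="etl_batch"        # batch user (assumed name)
FOLDER="SALES_DM"            # repository folder (assumed name)
WORKFLOW="wf_load_sales"     # workflow to start (assumed name)

# Build the pmcmd command line; -pv names an environment variable
# holding the password, and -wait blocks until the workflow finishes
# so the scheduler sees the real exit status.
build_cmd() {
    printf 'pmcmd startworkflow -sv %s -d %s -u %s -pv INFA_PASSWD -f %s -wait %s\n' \
        "$INFA_SERVICE" "$INFA_DOMAIN" "$INFA_USER" "$FOLDER" "$WORKFLOW"
}

build_cmd
# In the real script the command would be executed and checked:
# $(build_cmd) || { echo "workflow failed" >&2; exit 1; }
```

Checking pmcmd's exit code (rather than just firing and forgetting) is what lets a cron or Tivoli job report a failed load to the operations group.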
Allstate Investment
NorthBrook, IL
Syntel Private Ltd March 2013—December 2014
Sr. Developer
The Cross Asset (XA) Data Warehouse gathers information from multiple source systems into a central data store for various investments such as taxable fixed income, equities, derivatives, and mortgage loans. XA is a data warehouse built on Oracle as the database, on a UNIX server, with Informatica as the ETL tool; all Informatica jobs are triggered via UNIX and Tivoli. It is used for reporting purposes and sometimes to disseminate data to downstream systems that run their own analytical processes to produce reports. There are about 60-70 users consuming 200 reports from BO and 2,200 from MicroStrategy (ad hoc reporting runs into the thousands), while IT-generated reports (daily, weekly, and monthly) number around 600-700.
Responsible for requirement gathering, data analysis, design, development, testing and systems implementation.
Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.
Extensively used Informatica client tools to extract data from different sources such as flat files and Oracle.
Troubleshot existing code as well as new code as requirements changed.
Performed Unit Testing and Created UNIX Shell scripts and provided on-call support during the off hours.
Did overnight schedule monitoring during critical month-end/quarter-end positions, source system upgrades and during integration/system tests.
Responsible for Testing and Validating the Informatica mappings against the pre-defined ETL design standards.
Involved in different Team review meetings.
Generated weekly and monthly report Status for the number of incidents handled by the support team.
Led the team in coding, testing, and performance tuning while developing Informatica mappings and workflows.
Verified data quality and data validation using the Informatica Data Validation Option (DVO) and Data Quality (IDQ) tools.
Created reusable transformations and mapplets based on business rules to ease the development process, and was responsible for documenting the changes.
Environment: Informatica 9.1, Oracle, DVO, Toad, PL/SQL, MS Visio, Windows 2000, UNIX.
Allstate Insurance
Chennai, India
Syntel Private Ltd January 2011—February 2013
Sr. Developer
Allstate sells Encompass Insurance branded property and casualty products exclusively through independent agents. In the current environment, the independent-agent-facing reports are obtained from a variety of source data stores; among them, the Independent Agent Enterprise Data Warehouse (IAEDW), Actuarial Data Mart (ACTM), and Agency Data Mart (ADM) are the primary sources for most of the information. There are also significant, localized reports maintained at the business unit level. The result is a disparate reporting environment supported by a variety of underlying technologies, leading to multiple versions of the truth. The recently concluded analytics and reporting strategy and data design project at Encompass laid out a roadmap to consolidate the reporting environment and improve access. The scope of this project is to build and implement the Agency Data Mart.
Provide profiling, cleansing and validation reports in preparation for a data migration.
Responsible for requirement gathering, design, development, reviewing, testing and systems implementation.
Leading the team in coding, testing and code migration.
Involved in migrating Informatica ETL application and Database objects through various environments such as Development, Testing, UAT and Production environments.
Assisted change management reviews and impact analysis with the Business team.
Identified stakeholders and prioritized the task items to be delivered in the next cycle as part of the engagement.
Involved in various HIPAA claims validation and verification processes to understand the source data.
Environment: Informatica 9.1, IDQ, SQL, UNIX Scripting, Flat Files, Windows NT/2000.
American Express
Chennai, India
Cognizant Technology Solutions India Pvt Ltd August 2010—December 2010
Sr. Informatica Developer
This project was to migrate the old legacy FIRS system to the new IRIS system and decommission FIRS; data would be fed to the new system, and key reports would be generated to support R&C activities for those accounts.
Responsible for Design and Development of ETL and the reporting application.
Communicated with the users to observe various business rules in implementing the data warehouse.
Modified existing mappings for enhancements of new business requirements.
Checked in/checked out code, test plans, and test cases, including unit test cases.
Creation, configuring, scheduling and running of Sessions, Worklets and Workflows.
Prepared unit test cases and performed unit testing and peer reviews at the end of each phase.
Troubleshot and debugged issues that arose during project execution, providing root cause analysis.
Coordinated with various levels of production support teams to provide quick analysis and problem resolution through data/code fixes.
Environment: Informatica 8.6, VBSCRIPT, Oracle9i, SQL, PL/SQL.
Kohls
Chennai, India
Cognizant Technology Solutions India Pvt Ltd October 2009—July 2010
Sr. Informatica Developer
This project was to migrate old legacy PL/SQL code to new Informatica code generating 29 incremental extracts (daily/weekly), mainly for order management/commerce-related data, from the newly built ATG-10 and Sterling databases for a third party of the client, for loading into their e-commerce data marts. Since the source system was being built in parallel, day-to-day requirement changes made the work more challenging. The source was Oracle 11g and the targets were flat files.
Responsible for Design and Development of ETL and the reporting application.
Modified existing mappings for enhancements of new business requirements.
Checked in/checked out code, test plans, and test cases, including unit test cases.
Creation, configuring, scheduling and running of Sessions, Worklets and Workflows.
Modified existing UNIX Shell Scripting and created new scripts.
Prepared unit test cases and performed unit testing and peer reviews at the end of each phase.
Troubleshot and debugged issues that arose during project execution, providing root cause analysis.
Environment: Informatica 8.6, SQL Server, UNIX script
Merrill Lynch
Hyderabad, India
Satyam Computer Services Ltd. November 2008—September 2009
Informatica Developer
Merrill Lynch is one of the world's leading financial management and advisory companies, with offices in 36 countries. Through Merrill Lynch Investment Managers, the company is one of the world's largest managers of financial assets. This warehouse project assists the Financial Advisory Center in categorizing its customers based on their portfolio of services, including various types of accounts, personal loans, and geographical area. Using different ad hoc analyses, the warehouse is intended to assist in defining a strategy for each customer category; it is also designed to provide detailed reporting on each lead/advisor/team and to collect and report on performance metrics. The project was executed in phases, building up from extraction and computation logic to dimension models and populating data using Informatica.
Analyzed the specifications and identified source data that needed to be moved to the data warehouse.
Participated in the Design Team and user requirement gathering meetings.
Implemented Slowly Changing Dimension Type 2 methodology for accessing the full history of accounts and transaction information.
Worked on PowerCenter client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
Used Debugger to check the data flow in the mapping and made appropriate changes in the mappings to generate the required results.
Developed mappings to load data in slowly changing dimensions.
Created Sessions in workflow manager and monitored it using workflow monitor.
Environment: Informatica 8.6, Oracle 9i, UNIX, Sybase, Windows 2000.
GE Healthcare
Hyderabad, India
Satyam Computer Services Ltd. September 2006—September 2008
Informatica Developer/Production Support
GE Healthcare (GEHC) is a member of the General Electric group of companies and one of the world's leading manufacturers of medical diagnostic imaging equipment, including conventional and digital X-ray, magnetic resonance, ultrasound, nuclear medicine, and other related areas. The business interests of the company are widely dispersed, with clients across the globe, making it necessary for the company to maintain information systems that provide reliable service to its clients with optimal response time and efficiency.
Designed ETL mapping based on Existing ETL logic.
Involved in the development of Informatica mappings.
Resolved tickets and built UNIX scripts for FTPing files.
Prepared problem resolution documents.
Provided third- and fourth-line support for the Enterprise Data Warehouse.
Environment: Informatica 8.1, Oracle 9i, UNIX Scripts.
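The UNIX FTP scripts mentioned above typically feed a command sequence into the ftp client and check the transcript for errors. A minimal sketch, with a placeholder host, directory, and file name (none taken from the actual project):

```shell
#!/bin/sh
# Sketch of a file-transfer script of the kind used to FTP extract
# files to a downstream host. All names below are assumed placeholders.

FTP_HOST="ftp.example.com"     # assumed host name
REMOTE_DIR="/inbound/extracts" # assumed remote directory
LOCAL_FILE="daily_extract.dat" # assumed extract file name

# Emit the ftp command sequence; the real job pipes this into
# `ftp -inv "$FTP_HOST"` and greps the transcript for errors.
ftp_commands() {
    printf 'cd %s\nput %s\nbye\n' "$REMOTE_DIR" "$LOCAL_FILE"
}

ftp_commands
# ftp_commands | ftp -inv "$FTP_HOST" > ftp.log 2>&1
# grep -q '226' ftp.log || { echo "transfer failed" >&2; exit 1; }
```

Because the ftp client does not reliably return a nonzero exit code on transfer failure, scanning the log for the FTP "226 transfer complete" reply is the usual way such scripts detect a failed put.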