Professional Summary:
Over *+ years of experience in Information Technology, including Data Warehouse/Data Mart development using ETL/Informatica Power Center, across various industries such as Customer Security, Pharmaceutical, Retail, Telecommunications, and Life Sciences.
Extensive experience using Informatica Power Center 9.x/8.x to carry out the Extraction, Transformation and Loading process, as well as administration tasks such as creating domains, repositories, and folders.
Extensive experience using Informatica to implement ETL methodology for data extraction, transformation, and loading.
Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
Extensively worked on Informatica IDQ for data profiling, data enrichment and standardization.
Hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport) and UNIX in developing tools and system applications.
Expertise in Extraction, Transformation & Loading of data using heterogeneous sources and targets.
Good understanding of, and hands-on experience with, Repository Manager, Designer, and the Informatica Admin Console.
Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources such as SAP R/3, Oracle, flat files, XML files, Teradata, Netezza, and MS SQL Server into Oracle, Teradata, XML, and SQL Server targets.
Hands-on experience building mappings with varied transformation logic such as Unconnected and Connected Lookups, Router, Aggregator, Joiner, Update Strategy, Java transformations, and reusable transformations.
Extracted data from multiple operational sources and loaded it into the staging area, data warehouse, and data marts using SCD (Type 1/Type 2/Type 3) loads.
Extensively created mapplets, reusable transformations and look-ups for better usability.
Good understanding of relational database management systems such as Oracle, Teradata, and SQL Server; worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems.
Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
Expert in Oracle 11g/10g, Netezza, SQL Server 2008/2005, SQL, and PL/SQL (stored procedures, functions, and exception handling) using Toad.
Experienced in database programming in PL/SQL (Stored Procedures, Triggers and Packages).
Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklets, Control).
Worked on ad hoc changes and released deliverables on time.
Experienced working in agile methodology, with the ability to manage change effectively.
Worked with JIRA/Rally for defect tracking.
Used Subversion to maintain mapping documents.
Responsible for Team Delivery and Participated in Design Reviews.
Expertise in defining and documenting ETL Process Flow, Job Execution Sequence, Job Scheduling and Alerting Mechanisms using command line utilities.
Excellent communication and interpersonal skills, strong analytical reasoning, and the ability to quickly learn and adopt new technologies, concepts, and tools.
TECHNICAL SKILLS:
ETL Tools : Informatica Power Center 9.x/8.x
Databases & Tools : MS SQL Server 2005/2008, Oracle 11.x/10.x, SQL, Views, Toad, Teradata and Netezza
Job Scheduler : Informatica Scheduler, Control-M 6.3 and Autosys
Operating Systems : Windows 95/98/NT/2000/2003/XP, UNIX.
Languages : C#, C++, HTML, Java
Other Utilities & Tools : SharePoint, Putty, WinSCP
Defect Tracking : JIRA, Rally
Version Control : Subversion
Database Development Tools : Toad, SQL Developer
Education Details:
MCA from Andhra University in 2006.
B.Sc. (M.P.C.) from Andhra University in 2002.
Professional Experience:
THERMOFISHER, Pittsburgh, PA (Sep’15 – Present)
Role: Sr. Informatica Power Center Developer/Lead
Project Description: DQA-RSD
Working on the DQA-RSD (Research and Safety Market Division) module and involved in developing the different components that are part of the EDW. The source data comes from SAP systems (NA & EMEA) and the target system is Oracle.
Responsibilities:
Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
Responsible for Data Warehouse Architecture, ETL and coding standards.
Used Informatica as ETL tool to pull data from source systems/ files, cleanse, transform and load data into databases.
Created detailed Technical specifications for Data Warehouse and ETL processes.
Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
Created dictionaries using Informatica Data Quality (IDQ) that were used to cleanse and standardize data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
Extracted data from SAP R/3 and loaded into Oracle Data Warehouse.
Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
Created Mapplets and used them in different Mappings.
Worked on ad hoc changes and released deliverables on time.
Experienced working in agile methodology, with the ability to manage change effectively.
Used Rally for defect tracking.
Developed a stored procedure to check source data against warehouse data; records not present were written to a spool table, which was then used as a lookup in the transformation (see the sketch at the end of this project).
Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
Documented Cleansing Rules discovered from data cleansing and profiling.
Managed scheduling of tasks to run at any time without operator intervention.
Leveraged workflow manager for session management, database connection management and scheduling of jobs.
Involved in cleansing and extraction of data and defined quality process for the warehouse.
Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions, index cache to manage very large volume of data.
Scheduling Informatica jobs and implementing dependencies as per the requirement.
Responsible for performing SQL query optimization using Hints, Indexes and Explain plan.
Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
Used Update Strategies for cleansing, updating and adding data to the existing processes in the warehouse.
Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
Experienced in Debugging and Performance tuning of targets, sources, mappings and sessions
Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations and mapplets.
Delivered all the projects/assignments within specified timelines.
Moved code from Dev to QA to PROD using the deployment process.
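Illustrative example – a minimal PL/SQL sketch of the spool-table check described above; the table and column names (SRC_STG_CUSTOMER, DW_CUSTOMER, SPOOL_CUSTOMER) are hypothetical:

CREATE OR REPLACE PROCEDURE load_spool_missing AS
BEGIN
  -- Write source records that are absent from the warehouse into the spool
  -- table, which the mapping then uses as a lookup source.
  INSERT INTO spool_customer (customer_id, customer_name, load_ts)
  SELECT s.customer_id, s.customer_name, SYSDATE
  FROM   src_stg_customer s
  WHERE  NOT EXISTS (SELECT 1
                     FROM   dw_customer d
                     WHERE  d.customer_id = s.customer_id);
  COMMIT;
END load_spool_missing;
/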
Environment: Informatica Power Center 9.5, SAP R/3, Flat files, Oracle, MS SQL Server 2008, Erwin, WinSCP.
Level3, Denver, CO (Feb’14 – Jul’15)
Role: Software Engineering Sr. Analyst
Project Description: JDEdwards Module
Implemented the DWH for the JD Edwards Finance module. Customer information is maintained by Level3 and the data is provided to the customers.
Responsibilities:
Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
Developed high-level and detailed technical and functional documents, including detailed design documentation, functional test specifications with use cases, and unit test documents.
Analyzed source systems and worked with business analysts to identify, study, and understand requirements and translate them into ETL code.
Handled technical and functional calls across the teams.
Responsible for the Extraction, Transformation and Loading (ETL) architecture and standards implementation.
Responsible for offshore Code delivery and review process.
Worked extensively on PL/SQL as part of the process to develop several scripts to handle scenarios.
Used Informatica to extract data from Oracle and Flat files to load the data into the Target Table.
Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database.
Worked on Informatica Power Center tool – Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor.
Worked on ad hoc changes and released deliverables on time.
Experienced working in agile methodology, with the ability to manage change effectively.
Used Rally for defect tracking.
Used Subversion to maintain mapping documents.
Involved in Design Review, code review, test review, and gave valuable suggestions.
Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
Created partitions for parallel processing of data and also worked with DBAs to enhance the data load during production.
Moved code from Dev to QA to PROD using the deployment process.
Implemented SCD (Slowly Changing Dimension) Type 1 and Type 2 loads (see the sketch below).
Took part in the migration of jobs from UIT to SIT and on to UAT.
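Illustrative example – a minimal SQL sketch of an SCD Type 2 load as referenced above; DIM_CUSTOMER and STG_CUSTOMER are hypothetical names:

-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
SET    d.current_flag = 'N',
       d.effective_end_dt = SYSDATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.customer_addr <> d.customer_addr);

-- Step 2: insert a fresh current row for new and changed customers.
INSERT INTO dim_customer (customer_id, customer_addr,
                          effective_start_dt, effective_end_dt, current_flag)
SELECT s.customer_id, s.customer_addr, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');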
Environment: Informatica Power Center 9.1, Oracle 9i, TOAD, Erwin, UNIX.
H&M, Stockholm, Sweden (Sep’12 – Jan’14)
Role: Data Specialist
Project Description: CDW-FLOW
Implemented the DWH for CDW-FLOW. Customer information is maintained by H&M and the data is provided to the customers.
Responsibilities:
Prepared the required application design documents based on functionality required
Designed the ETL processes using Informatica to load data from Netezza and Flat Files to staging database and from staging to the target database.
Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
Involved in migration of mappings and sessions from development repository to production repository
Extensively used Informatica and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
Designed and developed the logic for handling slowly changing dimension table loads by flagging records with an Update Strategy transformation to populate the desired target.
Involved in cleansing and extraction of data and defined quality process for the warehouse.
Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions, index cache to manage very large volume of data.
Used Update Strategies for cleansing, updating and adding data to the existing processes in the warehouse.
Worked extensively on PL/SQL as part of the process to develop several scripts to handle scenarios.
Logged defects and submitted change requests using the defects module of HP Quality Center (Test Director).
Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
Performed data validation, reconciliation, and error handling in the load process, and tested data to ensure it was masked (see the reconciliation sketch below).
Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
Worked on ad hoc changes and released deliverables on time.
Experienced working in agile methodology, with the ability to manage change effectively.
Used JIRA for defect tracking.
Used Subversion to maintain mapping documents.
Involved in migrating objects from DEV to QA and testing them and then promoting to Production.
Involved in production support, working on tickets created by users while retrieving data from the database.
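Illustrative example – a minimal SQL sketch of a source-to-target reconciliation check of the kind described above; STG_SALES and DW_SALES are hypothetical names:

-- Compare row counts and an amount total between staging and target;
-- any mismatch flags the load for error handling.
SELECT 'STG_SALES' AS table_name, COUNT(*) AS row_cnt, SUM(sale_amt) AS total_amt
FROM   stg_sales
UNION ALL
SELECT 'DW_SALES', COUNT(*), SUM(sale_amt)
FROM   dw_sales;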
Environment: Informatica 9.x, UNIX, Netezza, WinSCP, TOAD, PuTTY
Eli-Lilly, Indianapolis (Dec ’11 – Sep ’12)
Role: Data Specialist
Project Description: PAYER-RX
The main aim of this project is to identify HCPs from across the USA along with their Customer IDs. Data is received from flat files; HCPs are identified by their existing IDs, and a Customer ID is created for each HCP accordingly.
Responsibilities:
Created detailed Technical specifications for the ETL processes.
Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
Used Informatica as ETL tool to pull data from source systems/ files, cleanse, transform and load data into databases.
Worked on Informatica- Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformation Developer.
Developed Informatica mappings using various transformations, sessions, and workflows. SQL Server was the target database; the sources were a combination of flat files, Oracle tables, PeopleSoft, Excel files, CSV files, etc.
Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
Involved with the DBA in performance tuning of the Informatica sessions and workflows. Created the reusable transformations for better performance.
Optimizing mappings and implementing complex business rules by creating reusable transformations and mapplets.
Involved in writing UNIX shell scripts for the Informatica ETL tool to run the sessions.
Fixing and tracking mapping defects and implementing enhancements.
Managing post production issues and delivering task/projects within specific timeline.
Involved in the mirroring of the staging environment to production.
Collaborated with teams for migration and Production Deployment activities.
Scheduling Informatica jobs and implementing dependencies as per the requirement.
Responsible for performing SQL query optimization using hints, indexes, and explain plans (see the sketch below).
Played a vital role in requirements gathering and in the preparation of the requirements specification.
Managed production issues and delivered all assignments/projects within specified time lines.
Worked on all phases of multiple projects from initial concept through research and development, implementation, QA, to live production, by strict adherence to project timelines.
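Illustrative example – a minimal Oracle-style sketch of the hint/index/explain-plan workflow referenced above; the table, index, and bind names are hypothetical:

-- Inspect the optimizer's plan for a slow lookup query.
EXPLAIN PLAN FOR
SELECT /*+ INDEX(c idx_customer_id) */ c.customer_id, c.customer_name
FROM   customer c
WHERE  c.customer_id = :cust_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- If the plan shows an unwanted full table scan, add the supporting index.
CREATE INDEX idx_customer_id ON customer (customer_id);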
Environment: Informatica Power Center 8.1, Netezza.
Pfizer, Groton, CT (Sep’11 – Dec’11)
Role: Data Specialist
Project Description: Prism Phase 2
The main aim of this project is to develop Interface 1095 and implement enhancements to existing interfaces such as 1533 and 1474.
Interface 1095:
This is a new development: new tables are created in Oracle and the data is finally loaded into the Teradata target. Based on the data, we identify whether a particular employee is taking leave; we maintain records of when he/she takes leave, along with the data of those HCPs.
Interface 1533:
This is an existing interface that we enhanced: we created new tables, applied conditions such as the current indicator and delete indicator, and loaded the data into the target tables.
Interface 1474:
This is an existing interface; due to some client requirements, we added new columns to the tables and loaded the data from source to target.
Responsibilities:
Analyzed the business requirements, design documents & whole life cycle of the project.
ETL Developing in agile methodology environment with daily scrums and two week sprints.
Created procedures to truncate data in the target before the session run.
Prepared Unit test scenarios, test scripts for testing the data from source, staging, data warehouse and data mart and loaded them into Quality center.
Reviewed the Level 2 architecture, high-level designs, functional spec, assembly docs, and mapping spec provided by the project team to understand the scope.
Extensively worked with Teradata utilities like BTEQ, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
Created database objects such as tables, views, functions, stored procedures, indexes, triggers, and cursors in Teradata.
Designed and coded the ETL requirements for the Pfizer Prism Phase 2 project.
Worked with the BTEQ utility for inserts, updates, and deletes (see the sketch below).
Designed Star Schema with fact tables & dimension tables, developed Informatica Mappings to extract data from data mart.
Mapplets & Reusable Transformations were used to prevent redundancy of usage & maintainability.
Worked on performance issues at various levels (targets, sessions, mappings, and sources), identifying performance bottlenecks, troubleshooting issues, and resolving them in the fastest timeframe possible.
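Illustrative example – a minimal BTEQ sketch of the insert/update pattern referenced above; the logon placeholders and table names are hypothetical:

.LOGON tdpid/username,password

-- Update rows that already exist in the target (Teradata UPDATE ... FROM syntax).
UPDATE tgt_employee
FROM   stg_employee src
SET    dept_cd = src.dept_cd
WHERE  tgt_employee.emp_id = src.emp_id;

-- Insert rows that are new.
INSERT INTO tgt_employee (emp_id, dept_cd)
SELECT s.emp_id, s.dept_cd
FROM   stg_employee s
WHERE  NOT EXISTS (SELECT 1 FROM tgt_employee t WHERE t.emp_id = s.emp_id);

.IF ERRORCODE > 0 THEN .QUIT 8
.LOGOFF
.QUIT 0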
Environment: Informatica Power Center 8.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Teradata, Autosys
Pfizer, Groton, CT (Jun’11 – Sep’11)
Role: Data Specialist
Project Description: One world Data Services Automation
The main aim of this project is to identify HCPs from across the USA along with their Pfizer Customer IDs. Data is received from flat files; HCPs are identified by their existing IDs, and a Pfizer Customer ID is created for each HCP accordingly.
Responsibilities:
Involved in Requirement gathering, Data analysis, Design, and Implementation of the application.
Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
Extracted data from flat files and Oracle and loaded it into Teradata.
Extensively worked with Teradata utilities like BTEQ, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
Created database objects such as tables, views, functions, stored procedures, indexes, triggers, and cursors in Teradata.
Designed and coded the ETL requirements for the Pfizer Prism Phase 2 project.
Worked on BTEQ Utility for the insert/update/delete.
Implemented Slowly Changing Dimensions (Type 1 & 2).
Environment: Informatica Power Center 8.x, Teradata, Autosys
NETGEAR, San Jose (Nov’10 – May’11)
Role: Software Engineer Level 2
Project Description: NETGEAR
The main aim of this project is to get the updated NETGEAR records; for this, we use action flags (insert, update, delete) every time the mapping runs. The action flag will have one of the values “D” or “I”. “D” indicates that a previously sent record has to be deleted; the NETGEAR system looks it up by the unique identifier and deletes it from the database. “I” indicates a new record from Zyme, which is inserted. A sketch of this flag handling follows.
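Illustrative example – a minimal SQL sketch of how such action flags might be applied; the table and column names are hypothetical:

-- 'D': delete the previously sent record, located by its unique identifier.
DELETE FROM netgear_records t
WHERE  EXISTS (SELECT 1 FROM zyme_feed f
               WHERE  f.record_uid = t.record_uid
               AND    f.action_flag = 'D');

-- 'I': insert the new records arriving from Zyme.
INSERT INTO netgear_records (record_uid, product_cd, qty)
SELECT f.record_uid, f.product_cd, f.qty
FROM   zyme_feed f
WHERE  f.action_flag = 'I';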
Responsibilities:
Involved in requirement gathering and created requirement specification to ensure functional correctness, accuracy, functional completeness and usability.
Created, designed, and implemented specifications and developed ETL processes to support the development and implementation of data warehouse projects using Informatica.
Designed and developed Informatica Mappings to load data from Source systems to Data Mart.
Involved in research and design of the business.
Interacted with users to design the architecture of ETL mappings and developed them.
Extensively used Power Center to design multiple mappings with embedded business logic.
Worked extensively on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Rank, Update Strategy, and Sequence Generator and performed performance tuning of ETL process.
Environment: Informatica Power Center 8.x, Windows, Oracle
EMC2 (May’10 – Nov’10)
Role: Application Developer
Project Description: CSMO
The main aim of this project is to gather the worldwide employee information of EMC, maintain the data of all employees, and update the database whenever new employees are added or existing employee information changes.
Responsibilities:
Design, Development and Documentation of the ETL strategy to populate the Data Warehouse from the various source systems.
Involved in design and development of complex ETL mappings.
Implemented partitioning and bulk loads for loading large volume of data.
Developed Mapplets, Worklets and Reusable Transformations for reusability.
Identified performance bottlenecks and fine-tuned to optimize session performance.
Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures.
Implemented update strategies, incremental loads, data capture, and incremental aggregation (see the sketch below).
Extensively worked on various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
Developed workflow tasks like Email, Event wait, Event Raise, Timer, Command and Decision.
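Illustrative example – a minimal SQL sketch of a watermark-based incremental load of the kind referenced above; the control table and column names are hypothetical:

-- Pull only rows changed since the last successful run,
-- using a watermark stored in a load-control table.
INSERT INTO stg_employee (emp_id, emp_name, last_update_ts)
SELECT e.emp_id, e.emp_name, e.last_update_ts
FROM   src_employee e
WHERE  e.last_update_ts > (SELECT last_load_ts
                           FROM   etl_load_control
                           WHERE  job_name = 'EMP_LOAD');

-- Advance the watermark after a successful load.
UPDATE etl_load_control
SET    last_load_ts = SYSDATE
WHERE  job_name = 'EMP_LOAD';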
Environment: Informatica Power Center 8.x, Windows, Oracle
Deutsche Bank (Nov’09 – May’10)
Role: ETL Developer
Project Description: The Enterprise Data Warehouse (EDW) receives interfaces from Cards (ECS+, local systems), Banking and business uploads. These files are loaded in raw tables. After this process, a transformation process will run according to business rules and generate working tables, which will be the source for the process that generates the Star Schema.
Responsibilities:
Data Quality Analysis to determine exceptions.
End-to-end ETL development of the Data Mart from MS SQL Server, Excel spreadsheet, flat files.
Responsible for ETL process under development, test and production environments.
Environment: Informatica Power Center 8.x, Oracle, MS SQL Server.
Universal System Technologies Inc (July’07 – Nov’09)
Role: ETL Developer
Project Description: Dragon DW-QL
Data for the source DWH comes from Oracle Applications and legacy systems across different locations (North America, Europe, and Asia). Initially, 80 percent of the data came from legacy systems and 20 percent from Oracle Apps; as locations increased and migrated to Oracle Apps, 20 percent of the data now comes from legacy systems and 80 percent from Oracle. The data loading architecture was initially designed mostly with legacy data in mind, and the load took 12 days for all 60 locations. The objective of Quantum Leap is to reduce the loading cycle from the present 12 days to 2 days by redesigning the whole system. The major challenges were to make the system simpler, more efficient, and easy to maintain.
Responsibilities:
As a part of 3-member team, was involved in design, development and implementation of the module.
Provided 24x7 production support.
Environment: Informatica Power Center 8.x, Oracle