
Sr. Informatica ETL Developer

Location:
Pittsburgh, PA
Posted:
March 08, 2019


Resume:

Sreeram Reddy Bommu

Sr. Informatica ETL Developer

ac8pxv@r.postjobfree.com, 412-***-****

Professional Summary:

* ***** ** ** ********** in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing and Data Integration solutions using Informatica PowerCenter.

Performed all activities related to the development and implementation of Informatica code to load data into large scale databases and flat files.

Extensively worked on data extraction, transformation and loading from various sources like relational databases, flat files and XML files to various targets like flat files and databases.

Strong experience in ETL coding using Informatica PowerCenter 8.6, 9.5, 9.6 and 10.1.

Proficiency in developing SQL, PL/SQL code with various relational databases like Oracle, SQL Server and Teradata.

Worked on UNIX shell scripting to maintain files on the UNIX server.

Experience in creating unit test cases to ensure successful data processing.

Experience in code migration through Harvest and UDeploy.

Experience in Informatica ETL performance tuning (sources, mappings, targets and sessions) and tuning the SQL queries.

Excellent communication, documentation and presentation skills using tools like Microsoft Word, Visio, Excel and PowerPoint.

Hands-on experience in bug fixing, tuning mappings, and identifying and resolving performance issues at various levels: sources, targets, mappings, sessions and workflows.

Extensively worked on improving ETL code performance using session partitioning in Informatica.

Analyzed business rules in depth from high-level specification documents and created Source to Target specification and design documents.

8 years of experience in Data Warehousing with a strong understanding of OLTP and OLAP concepts.

Experience in implementing code migration to production through change management processes.

Experience in Data Modeling using Star Schema/Snowflake Schema, Fact and Dimension tables.

Experience working on HDFS, HiveQL and Sqoop.

Technical Skills:

Programming: Informatica PowerCenter, UNIX, Linux, Connect:Direct

Database Languages: Oracle 12c/11g/10g SQL, PL/SQL, HiveQL, PostgreSQL

Platforms: Windows XP/Vista/7

Tools: Informatica PowerCenter v8.6, v9.5, v9.6, v10.1; Toad for Oracle 11, 12; Microsoft SQL Server 2008; Teradata Studio 15; HP ALM; CA Harvest Change Manager; WinSCP; PuTTY; ServiceNow; Urban Code Deploy; uRelease; Tidal Scheduler; CA-7; AppDB; Hive; Sqoop

Trainings attended:

Informatica PowerCenter, Teradata SQL, Oracle SQL, PL/SQL, HiveQL, Sqoop

Certifications completed:

Oracle 10g: SQL Fundamentals

Team building activities:

Coordinated with team members to meet the project timelines.

Conducted sessions on Informatica PowerCenter for new team members.

Experience mentoring other team members.

Education:

Master of Science in Mathematics, Birla Institute of Technology & Science, Pilani – Goa, July 2005 – May 2010, GPA: 8.01/10

Bachelor of Engineering in Computer Science, Birla Institute of Technology & Science, Pilani – Goa, July 2005 – May 2010, GPA: 8.01/10

Professional Experience: Tata Consultancy Services Ltd.

Designation: I.T. Analyst

Client: Cisco Systems, Inc.

Jul 2017 – Present

Projects: Bookings Gross Margin, Cisco Customer Care Management, Digitized Finance Allocation

Location: San Jose, CA, USA

Role: Sr. Informatica ETL Developer

Environment: Informatica PowerCenter v10.1; Toad for Oracle 12; Oracle 11g; Teradata Studio 15; UNIX; HiveQL; Sqoop; PostgreSQL

Cisco Systems, Inc. (known as Cisco) is an American multinational technology company that develops, manufactures, and sells networking hardware, telecommunications equipment, and other high-technology services and products.

Project Description:

Bookings Gross Margin is a project designed to capture the cost factors associated with different combinations of Cisco products & services and sales territories (regions) on a yearly basis. Cisco Customer Care Management is a project designed to maintain the costs involved in technical support services for Cisco products and services; it calculates costs such as technical assistance center cost and supply service chain cost.

Digitized Finance Allocation is a project designed to calculate gross revenue, shipping revenue, distributed revenue and other costs related to Cisco products and services.

Responsibilities:

Gathered business requirements from business analysts.

Created Source to Target mapping and design documents based on business requirements.

Involved in ETL performance tuning for loading huge amounts of data into Oracle databases.

Developed ETL components such as sources, targets, mappings, mapplets, sessions and workflows.

Performed unit testing and created unit test case documents to ensure correct data processing.

Used various transformations such as Stored Procedure, Source Qualifier, Lookup, Filter, Expression, Rank, Union, Router, Aggregator, Joiner and Sorter to develop mappings.

Used mapplets, mapping parameters and variables, session and workflow variables to make ETL code reusable.

Designed and developed various PL/SQL stored procedures to load data to target tables.

Developed SQL queries for performing testing activities.

Used Informatica persistent lookup caches for proper data loads to meet business requirements.

Developed reusable code like mapplets, worklets, reusable transformations and reusable tasks.

Extensively worked on bug fixing of ETL code using the debugger.

Worked on code migration to higher environments using UDeploy, Urelease & AppDB for Oracle.

Worked on the Tidal scheduler for job scheduling.

Prepared and maintained project support documents.

Extensively worked on sourcing data from the Hadoop environment using HiveQL and Sqoop.

Worked extensively on root-cause analysis for data mismatch issues and took the necessary actions to correct the data in the production environment.

Worked as a lead in creating the task plan and assigning tasks to offshore developers and production support team.

Worked on migrating ETL logic from Oracle stored procedures and Informatica mappings to PostgreSQL queries.

Achievements:

Implemented a complex logic scenario using the Informatica PowerCenter persistent cache.

Professional Experience: Tata Consultancy Services Ltd.

Designation: I.T. Analyst

Client: PNC Financial Services Group

Feb 2016 – Jun 2017

Project: Consumer Data Mart

Location: Pittsburgh, PA, USA

Role: Sr. Informatica ETL Developer

Environment: Informatica PowerCenter v9.5, v9.6, v10.1; Toad for Oracle 11, 12; Oracle 11g, 12c; Teradata Studio 15; Microsoft SQL Server 2008; CA-7; HP ALM

PNC Financial Services Group, Inc. is an American financial services corporation that provides diversified financial services, including retail and business banking; residential mortgage banking; specialized services for corporations and government entities, such as corporate banking, real estate finance and asset-backed lending; wealth management; and asset management.

Project Description:

Consumer Data Mart is a data mart used to store PNC mortgage & consumer loan data.

Responsibilities:

Worked on creating Source to Target mapping and design documents based on business requirements.

Involved in ETL performance tuning for loading huge amounts of data from SQL Server & Teradata to Oracle databases.

Developed ETL components such as sources, targets, mappings, mapplets, sessions and workflows.

Worked on Excel report generation from Oracle views using Informatica ETL mappings.

Performed unit testing and prepared unit test case documents to ensure correct data processing.

Used various transformations such as Stored Procedure, Source Qualifier, Lookup, Filter, Expression, Rank, Union, Router, Aggregator, Update Strategy, Joiner, Normalizer and Sorter to develop mappings.

Used mapping parameters and variables, session and workflow variables to make ETL code reusable.

Designed and developed various PL/SQL stored procedures to perform various calculations related to fact measures.

Developed SQL queries for performing testing activities.

Worked on Informatica partitioning for efficient data processing.

Developed SCD Type 1 and Type 2 mappings for dimension table processing.
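For illustration, the Type 2 versioning logic behind such mappings (expire the current row, insert a new dated version) can be sketched outside Informatica as a small Python routine; the customer dimension and field names here are hypothetical, not from the actual project:

```python
from datetime import date

def scd_type2_upsert(dimension, incoming, business_key, today):
    """Expire the current version of a dimension row and append a new one (Type 2)."""
    current = next(
        (r for r in dimension
         if r[business_key] == incoming[business_key] and r["end_date"] is None),
        None,
    )
    if current is not None:
        if all(current[k] == v for k, v in incoming.items()):
            return  # no attribute changed, keep the current version
        current["end_date"] = today  # close out the old version
    dimension.append({**incoming, "start_date": today, "end_date": None})

# hypothetical customer dimension with one current row
dim = [{"cust_id": 1, "city": "Pune",
        "start_date": date(2015, 1, 1), "end_date": None}]
scd_type2_upsert(dim, {"cust_id": 1, "city": "Pittsburgh"}, "cust_id",
                 date(2016, 3, 1))
print(len(dim))  # 2: the expired Pune row plus the current Pittsburgh row
```

A Type 1 mapping, by contrast, simply overwrites the changed attributes in place and keeps no history.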

Developed reusable code like mapplets, worklets, reusable transformations and reusable tasks.

Extensively worked on bug fixing of ETL code using Informatica PowerCenter debugger.

Worked on creating UNIX scripts for proper maintenance of files on the UNIX server.

Developed Connect:Direct scripts for file transfer.

Worked on code migration to higher environments using Harvest and UDeploy.

Prepared and maintained project support documents.

Involved in implementation of code migration through change requests using ServiceNow.

Attended trainings on HDFS, HiveQL and Sqoop.

Worked on scheduling Informatica jobs using the CA-7 scheduler.

Used HP ALM for application functional testing and performance testing activities.

Achievements:

Extensively used Informatica session partitioning, such as database partitioning and pass-through partitioning, to load huge amounts of data from Microsoft SQL Server and Teradata sources to an Oracle target database.

Used FORALL and BULK COLLECT INTO with the LIMIT clause to achieve better performance when updating large tables.
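The PL/SQL pattern named above processes rows in fixed-size batches rather than one at a time: BULK COLLECT ... LIMIT fetches a chunk of keys, and FORALL applies one bulk DML statement per chunk. A rough Python/sqlite3 analogue of that batching idea, with a made-up loan table for illustration, looks like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loan (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO loan VALUES (?, 'OPEN')",
                 [(i,) for i in range(10)])

BATCH = 4  # plays the role of the LIMIT clause

# Collect the keys once, then update them chunk by chunk with a single
# bulk statement per chunk (the FORALL step), not one UPDATE per row.
ids = [row[0] for row in
       conn.execute("SELECT id FROM loan WHERE status = 'OPEN'")]
for i in range(0, len(ids), BATCH):
    chunk = ids[i:i + BATCH]
    conn.executemany("UPDATE loan SET status = 'CLOSED' WHERE id = ?",
                     [(x,) for x in chunk])
conn.commit()

closed = conn.execute(
    "SELECT COUNT(*) FROM loan WHERE status = 'CLOSED'").fetchone()[0]
print(closed)  # 10
```

In the real PL/SQL version the LIMIT-sized chunk bounds memory use while still cutting the number of context switches between the SQL and PL/SQL engines.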

Professional Experience: Tata Consultancy Services Ltd.

Designation: I.T. Analyst

Client: PNC Financial Services Group

Apr 2013 – Jan 2016

Project: Consumer Data Mart

Location: Bangalore, India, Apr 2013 – Jan 2016

Role: Informatica ETL Developer

Environment: Informatica PowerCenter v9.5, v9.6; Toad for Oracle 11, 12; Oracle 11g, 12c; Teradata Studio 15; Microsoft SQL Server 2008; CA-7

Project Description:

Consumer Data Mart is a data mart used to store PNC mortgage & consumer loan data.

Responsibilities:

Prepared Source to Target mapping and design documents based on business requirements.

Worked on ETL performance tuning for loading huge amounts of data from SQL Server & Teradata to Oracle databases.

Worked on Excel report generation from Teradata & Oracle views.

Developed ETL components such as sources, targets, mappings, sessions and workflows.

Worked on unit testing and system testing and created unit test case documents to ensure correct data processing.

Used various transformations such as Stored Procedure, Source Qualifier, Lookup, Filter, Expression, Rank, Union, Router, Aggregator, Update Strategy, Joiner, Normalizer and Sorter to develop efficient mappings.

Used mapping parameters and variables, session and workflow variables to make ETL code reusable.

Worked on Informatica partitioning for efficient data processing.

Developed various PL/SQL functions and stored procedures to perform calculations related to the data loading of fact tables.

Worked on Slowly Changing Dimension Type 1 and Type 2 mappings for dimension table processing.

Created reusable code like mapplets, worklets, reusable transformations and reusable tasks.

Worked on creating UNIX/Linux scripts for file management on the UNIX server.

Worked on automating the validation of data loaded into the staging area of the database.

Involved in code migration to higher environments using Harvest and UDeploy.

Involved in implementation of code migration through change requests.

Scheduled Informatica jobs using the CA-7 scheduler.

Professional Experience: Tata Consultancy Services Ltd.

Client: PNC Financial Services Group

Apr 2011 – Mar 2013

Project: Market Risk Management

Location: Pune, India, Aug 2011 – Mar 2013

Hyderabad, India, Apr 2011 – July 2011

Role: Informatica ETL Developer

Environment: Informatica PowerCenter v8.6, v9.5; Toad for Oracle 11; Oracle 11g

Project Description:

Market risk is the risk of losses in positions arising from movements in market prices. The project calculates risk factors daily for different portfolios such as Equity Derivatives, Foreign Exchange, Secondary Loan Trading, and Interest Rate Derivatives.

Responsibilities:

Responsible for Business Analysis and Requirements Collection.

Involved in all phases of SDLC from requirement gathering, design, development, testing, code migration and support for production environment.

Worked on Informatica PowerCenter tools – Designer, Repository Manager, Workflow Manager and Workflow Monitor.

Created Source to Target mapping documents and design documents to outline the data flow from sources to targets.

Developed ETL components such as sources, targets, mapplets, mappings, sessions, worklets and workflows.

Worked on loading data from flat files and XML files to the database based on dimensional modeling (Snowflake Schema) through ETL code.

Performed unit testing and created unit test case documents.

Used various transformations such as Source Qualifier, Lookup, Filter, Expression, Sequence Generator, Rank, Union, Router, Aggregator, Update Strategy, Joiner, Sorter and Stored Procedure to develop robust mappings.

Developed complex mappings that involved implementation of business logic to load data from staging area to fact tables.

Developed mapping parameters and variables to make code reusable.

Developed reusable code like mapplets, worklets, reusable transformations, reusable tasks, workflow variables.

Worked on different tasks in workflows such as Session, Event Raise, Event Wait, Decision, Email, Assignment, Timer and Command.

Involved in bug fixing of ETL code and performance tuning at the source, target, mapping and session levels.

Worked on complex SQL queries, views, functions and stored procedures.

Worked on creating UNIX scripts for managing files in UNIX server.

Worked on creating Connect:Direct scripts for file transfer.

Performed code migration to higher environments using Harvest and prepared project support documents.

Participated in weekly status meetings and conducted sessions on Informatica PowerCenter, SQL and PL/SQL for new team members.

Worked on creating analyses and dashboards using the reporting tool Oracle BI EE 12c.

Professional Experience: Symphony Services Ltd.

Duration: Jun 2010 – Mar 2011

Project: Globally Optimized for Logistics and Delivery

Location: Bangalore, India

Role: Informatica ETL Developer

Environment: Informatica PowerCenter v8.6, Oracle SQL, UNIX

Project Description:

Globally Optimized for Logistics and Delivery is a retail application used by Symphony Services Ltd. for the ordering and delivery of goods and products.

Responsibilities:

Responsible for ETL code development and bug fixing for Globally Optimized for Logistics and Delivery application.

Involved in all phases of SDLC from requirement gathering, design, development, testing, code migration and support for production environment.

Worked on developing ETL code using Informatica PowerCenter, Oracle SQL and Unix.

Created and maintained high level design documents and project documents.
