
Informatica ETL Developer


SARIKONDA BHANU SIVA KUMAR

+1-714-***-****

ad5nux@r.postjobfree.com

LinkedIn: https://www.linkedin.com/in/bhanu-siva-kumar-s-1b9899bb

Summary

** ***** ** ** ******** experience as an ETL Lead and Senior ETL Developer, with domain experience in Healthcare, Insurance, Telecom, and Banking.

Extensive experience with Informatica IICS Cloud Data Integration (CDI), IICS Cloud Application Integration (CAI), Informatica PowerCenter, and Microsoft SSIS as ETL tools.

Experience writing complex SQL queries, stored procedures, and functions in PL/SQL, as well as SOQL queries.

Experience with data sources including Salesforce, Oracle, Teradata, DB2, MS SQL Server, and text files.

Experience extracting data from web services and transforming it.

Experienced in handling SCDs (Slowly Changing Dimensions) using ETL tools.

Experience in change data capture (CDC).

Extensive experience in performance tuning: identifying and fixing bottlenecks and tuning complex ETL loads for better performance.

Experience in scheduling jobs using Tidal Scheduler, ESP Workload Automation, and Postman collection runs.

Knowledge of AWS cloud database and storage services – RDS, DynamoDB, Redshift, S3.

Knowledge of using Snowflake in ETL development.

Knowledge of using Qlik Replicate to replicate data between different database servers.

Developed end-to-end data migration jobs covering cleansing, data validation, transformation, and loading into base tables.

Experience in analyzing data models and developing ETL mappings (OLTP, file feeds).

Troubleshot reporting issues, tracing data and logic to report back on root cause.

Experience in understanding the requirements and preparing data mapping documents.

Experience working in an onsite/offshore model and as an onsite ETL lead.

Knowledge of Artificial Intelligence and Machine Learning technologies, Python, and machine learning libraries.

Knowledge of the System Development Life Cycle and Agile/Scrum methodologies.

Academics

Post Graduate Diploma in Machine Learning and Artificial Intelligence from the International Institute of Information Technology, Bangalore, India.

M.Tech in Power Systems from the National Institute of Engineering, Mysore, India.

B.Tech from SK University, Anantapur, India.

Accreditations

AWS Certified Cloud Practitioner.

AWS Certified Cloud Architect.

Advanced certification in Artificial Intelligence & Machine Learning from IIIT, Hyderabad

Healthcare – AHM 250 certified

Technical Skills

ETL

IICS Cloud Data Integration (CDI), IICS Cloud Application Integration (CAI), Informatica PowerCenter, Microsoft SSIS

Databases & Tools

Salesforce, Oracle, DB2, Teradata, MS SQL Server, Snowflake, RDS, DynamoDB, Redshift, S3, delimited and fixed-width text files, web services, Salesforce Workbench, Teradata SQL Assistant, DbVisualizer, SQL Developer, Toad, MS SQL Server Management Studio, PuTTY, Tectia SSH terminal, SQL Developer Data Modeler.

Deployment and FTP Tools

Perforce, IBM UrbanCode Deploy, Microsoft TFS, Appdb, WinSCP, FileZilla

Scheduling tools

ESP Workload Automation, Control-M, Tidal Scheduler, Postman collection run

Programming Languages

SQL, SOQL, XQuery, PL/SQL, Python, Unix commands/shell scripting.

Other tools

Quality Center, TestTrack, TeamTrack, Team Foundation Server, CA Rally

Professional Experience:

Client: CareMore Health, Cerritos, CA Feb 2020 – Present

Role: ETL Lead

Project: EDH

Description: The main aim of this project is to develop ETL code in Informatica Cloud CDI (IICS) to load data from the Salesforce objects Opportunity and Opportunity Products into Oracle stage tables, and from the stage tables into reporting tables with additional business logic used for reporting. These Oracle reporting tables are consumed by the Tableau reporting team. A second requirement is to extract data from web service applications, transform it, and send the transformed data to other web service applications in real time, based on events on Salesforce objects.

Responsibilities:

•Worked with functional analysts and business users to understand the information needs of the business and develop a new data warehouse in an Agile environment.

•Developed mappings, mapping tasks, and task flows in IICS CDI to load data from Salesforce objects into Oracle database tables.

•Implemented mappings with web service transformations to extract data from web services and load it into Oracle tables using the REST API V2 connector.

•Developed processes in IICS CAI for real-time data processing from web services.

•Developed SOQL queries to extract data from Salesforce objects.

•Responsible for implementing incremental-loading mappings using mapping variables and parameter files in IICS (see the SQL sketch after this list).

•Performed code reviews and unit testing.

•Implemented ETL performance tuning techniques.

•Created job schedules in IICS and ran the jobs.

•Migrated task flows to UAT and production.
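
A minimal sketch of the kind of source filter such an incremental-loading mapping might generate; the table name and the $$LastRunTime variable are illustrative placeholders, not the project's actual objects.

    -- Illustrative incremental filter; an in/out mapping variable (here $$LastRunTime)
    -- would be advanced by the mapping task and persisted between runs via the parameter file.
    SELECT o.opportunity_id,
           o.stage_name,
           o.amount,
           o.last_modified_date
      FROM opportunity_stg o
     WHERE o.last_modified_date > TO_TIMESTAMP('$$LastRunTime', 'YYYY-MM-DD HH24:MI:SS');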

Environment: Salesforce, Salesforce Workbench, IICS Cloud Data Integration (CDI) and Cloud Application Integration (CAI), Postman, Oracle DB, SQL developer, SQL, SOQL, Jira.

Client: CareMore Health, Cerritos, CA Apr 2018 – Jan 2020

Senior ETL developer

Project: RDP

Description: The main aim of this project is to extract, transform, and load data from source systems such as Facets, CareEvolution, and Pharmacy_analytics into the HealthyStart, HealthyJourney, and StarsOutreach tables in the ADP database. This database is an enterprise analytical data platform that hosts aggregate data to serve self-service BI and other downstream data needs such as Navigator reporting.

Responsibilities:

Understanding the requirements and data mapping documents.

Preparing DDL scripts to create dimension tables.

Prepared complex SQL queries and used them in ETLs.

Implemented change data capture (CDC) using control tables (see the SQL sketch after this list).

Created Informatica mappings using SQL Transformation, Joiner, Lookup, Expression, Aggregator, Router, Update Strategy, and Sequence Generator transformations.

Worked with various lookups: connected and unconnected lookups, and static- and dynamic-cache lookups.

Parameterized mappings using mapping parameters, session parameters, and workflow variables defined in parameter files.

Analysed session logs, session run properties, Workflow run properties.

Analysed rejected rows using bad files.

Used UNIX shell scripting and PMCMD commands to run Informatica workflows.

Performed code reviews and unit testing.

Deployed Informatica workflows to test, stage, and production environments.

Implemented ETL performance tuning techniques.

Performed data analysis on large data volumes.

Worked closely with the data modeler and business analyst.
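
As a rough illustration of the control-table approach referenced above, the pattern is a high-water-mark lookup followed by an update of the control row after a successful load; the table and column names below are hypothetical.

    -- Hypothetical control table: one row per source feed holding the last load point.
    SELECT last_loaded_ts
      FROM etl_control
     WHERE feed_name = 'FACETS_MEMBER';

    -- Extract only rows changed since that point.
    SELECT m.*
      FROM facets_member m
     WHERE m.updated_ts > :last_loaded_ts;

    -- After a successful load, advance the high-water mark.
    UPDATE etl_control
       SET last_loaded_ts = :max_updated_ts
     WHERE feed_name = 'FACETS_MEMBER';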

Environment: Informatica Power Center 10.0, Oracle, CSV files, Unix Shell Scripting, SQL, PL/SQL, SQL developer, Putty, winscp.

Client: Allergan, Irvine, CA

Employer: Apps solutions Inc. Sep 2017 – Mar 2018

Senior ETL developer

Project: Eyecare

Description: The main aim of this project is to process daily files received from the vendors PRIA and TRIPLEFIN Healthcare and integrate them with the Allergan database tables accessed by the reporting system. Developed ETLs to load data from the files into the staging area and from the staging area into dimension and fact tables.

Responsibilities:

Understanding the requirements and data mapping documents.

Preparing DDL scripts to create dimension tables.

Prepared complex SQL queries and used them in ETLs.

Implemented change data capture (CDC) using control tables.

Created Informatica mappings using SQL Transformation, Joiner, Lookup, Expression, Aggregator, Router, Update Strategy, and Sequence Generator transformations.

Worked with various lookups: connected and unconnected lookups, and static- and dynamic-cache lookups.

Parameterized mappings using mapping parameters, session parameters, and workflow variables defined in parameter files.

Analysed session logs, session run properties, Workflow run properties.

Analysed rejected rows using bad files.

Used UNIX shell scripting and PMCMD commands to run Informatica workflows.

Performed code reviews and unit testing.

Deployed Informatica workflows to test, stage, and production environments.

Implemented ETL performance tuning techniques.

Performed data analysis on large data volumes.

Worked closely with the data modeler and business analyst.

Environment: Informatica Power Center 10.0, Oracle, CSV files, Unix Shell Scripting, SQL, PL/SQL, SQL developer, Putty, winscp.

ThermoFisher Scientific, Carlsbad, CA Mar 2017 – Sep 2017

Senior Informatica ETL Developer

EDW DSR

Description: The main aim of this project is to develop ETL code to replace existing business logic built in the BusinessObjects reporting tool, which contains a large number of complex case statements that the reporting tool can no longer extend. Developed ETL mappings to load large volumes of data into dimension and fact tables.

Responsibilities

Extracted the data from Oracle, Flat files and Access into Data warehouse.

Developed mappings in Informatica to load data from various sources into the data warehouse, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, and Joiner. Used connected and unconnected lookups with static and dynamic caches.

Developed mappings using parameters and variables.

Analysed session logs, session run properties, Workflow run properties, rejected records.

Involved in identifying the bottlenecks and tuning to improve the Performance.

Performed code reviews and unit testing.

Scheduled Informatica workflows using Control-M.

Worked with the testing team to resolve issues during system integration testing and UAT.

Developed stored procedures and functions with PL/SQL programming.
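
A minimal PL/SQL sketch of the kind of procedure referred to above; the object and column names are placeholders, not the actual EDW code.

    -- Placeholder example: rebuild one day's summary rows for reporting.
    CREATE OR REPLACE PROCEDURE refresh_dsr_summary (p_load_date IN DATE) AS
    BEGIN
      DELETE FROM dsr_summary_fact WHERE load_date = p_load_date;

      INSERT INTO dsr_summary_fact (load_date, product_key, order_cnt, net_revenue)
      SELECT p_load_date, product_key, COUNT(*), SUM(net_amount)
        FROM dsr_order_detail
       WHERE order_date = p_load_date
       GROUP BY product_key;

      COMMIT;
    END refresh_dsr_summary;
    /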

Environment: Informatica Power Center 10.0, Oracle, CSV files, Unix Shell Scripting, SQL, PL/SQL, SQL developer, Putty, winscp.

Client: Cisco, San Jose, CA

Employer: Apps solutions Inc. Nov 2016 – Feb 2017

Senior ETL Developer

Project: Team space & DRF

Description: This project develops an approach that addresses data-downtime issues during data refresh in the publish layer used by the MicroStrategy reporting system. Developed ETL code that loads data into publish-layer tables based on synonym definitions and swaps the synonym at the end of the load. The project includes ETL code that loads data into 90 different publish-layer tables based on the synonym definitions.
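
The synonym-swap approach described above generally reduces to a near-instant metadata change at the end of the load; the object names below are illustrative, assuming two physical copies of each publish table.

    -- Reporting queries always read through the synonym, currently pointing at copy A.
    CREATE OR REPLACE SYNONYM sales_summary FOR sales_summary_a;

    -- The ETL refresh fully loads the offline copy (here SALES_SUMMARY_B),
    -- then repoints the synonym so report users see no downtime.
    CREATE OR REPLACE SYNONYM sales_summary FOR sales_summary_b;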

Responsibilities:

Preparing DDL scripts to create tables and synonyms in Publish layer database.

Created Logical, Relational and Physical data models using SQL developer data modeler.

Created mappings using SQL Transformation, Joiner, Lookup, Expression, Aggregator, Router, Update Strategy, and Sequence Generator transformations.

Worked with various lookups: connected and unconnected lookups, and static- and dynamic-cache lookups.

Parameterized mappings using mapping parameters, session parameters, and workflow variables defined in parameter files.

Analysed session logs, session run properties, Workflow run properties.

Analysed rejected rows using bad files.

Used UNIX shell scripting and PMCMD commands to run Informatica workflows.

Performed code reviews and unit testing.

Performed deployment of informatica code by using IBM Urban Code deploy.

Migrated database tables to the next environment using the Appdb tool.

Ran the Informatica jobs using Tidal Scheduler.

Implemented performance tuning techniques.

Environment: Informatica PowerCenter 10.0, Oracle, Unix Shell Scripting, SQL, PL/SQL, Toad, SQL Developer Data Modeler, Tectia SSH terminal, IBM UrbanCode Deploy, Appdb, Tidal Scheduler, Tidal Transporter, WinSCP, Rally.

Liberty Mutual, Dover, NH Feb 2016 – Nov 2016

Senior Informatica ETL Developer

CCO ADW

Description: The main objective of this project is to create a central policyholder data warehouse that facilitates accurate and efficient analysis of customer data located in the CCO application. Developed ETL mappings for stage table loads, ODS table loads, and base table loads.

Responsibilities:

Understanding the requirements and data mapping documents.

Prepared high level mapping design documents from requirements.

Developed mappings for audit reconciliation of daily loads.

Implemented change data capture using control tables.

Implemented SCD Type 1 for ODS table loads (see the SQL sketch after this list).

Created mappings using Joiner, Lookup, Expression, Aggregator, Data Masking, Router, SQL Transformation, Update Strategy, and Sequence Generator transformations.

Worked with various lookups: connected and unconnected lookups, and static- and dynamic-cache lookups.

Parameterized mappings using mapping parameters, session parameters, and workflow variables defined in parameter files.

Analysed session logs, session run properties, Workflow run properties.

Analysed rejected rows using bad files.

Performed code reviews and unit testing.

Performed deployment of informatica code.

Ran the Informatica jobs using Tidal Scheduler.

Implemented performance tuning techniques.

Led a team of offshore ETL developers.
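
A simplified sketch of the SCD Type 1 pattern mentioned above (overwrite in place), expressed as an Oracle MERGE; the ODS table and keys are placeholders.

    -- SCD Type 1: update attributes in place, insert rows for new business keys.
    MERGE INTO ods_policy_holder tgt
    USING stg_policy_holder src
       ON (tgt.policy_holder_id = src.policy_holder_id)
    WHEN MATCHED THEN UPDATE
         SET tgt.holder_name   = src.holder_name,
             tgt.address_line1 = src.address_line1,
             tgt.updated_ts    = SYSTIMESTAMP
    WHEN NOT MATCHED THEN INSERT
         (policy_holder_id, holder_name, address_line1, updated_ts)
         VALUES (src.policy_holder_id, src.holder_name, src.address_line1, SYSTIMESTAMP);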

Environment: Informatica PowerCenter, Teradata, Unix Shell Scripting, Teradata SQL Assistant, Putty, Tidal Scheduler, Tidal Transporter, WinSCP, FileZilla.

Client: Salesforce, San Francisco, CA

Employer: Apps solutions Inc. Jan 2015 - Jan 2016

Senior ETL Developer

Project: SFDC core Services

Description: This project extracts data from seven data centers into core tables. Used existing logic in stored procedures of different packages to implement the ETL code in Informatica PowerCenter. Developed Informatica code to load data in three stages: stage tables, ODS tables, and finally core tables. The core tables hold aggregated data used for analyzing the revenues generated.

Responsibilities:

Understanding the requirements.

Understanding the existing logic in PL/SQL stored procedures of different packages to implement the ETL code in Informatica.

Created Informatica mappings for stage table, ODS table, and core table loads.

Extensively worked with Informatica PowerCenter.

Created mappings by using Joiner, Expression, Aggregator, Router, Lookup, SQL transformation, Update Strategy and Sequence Generator.

Parameterized ETL mappings by defining parameters and declaring them in parameter files.

Used UNIX shell scripting and PMCMD commands to run Informatica workflows.

Analysed session logs, session run properties, Workflow run properties.

Analysed rejected rows using bad files.

Performed code reviews and unit testing.

Implemented performance tuning techniques.

Used the mapping debugger on data and error conditions to obtain troubleshooting information.

Environment: Informatica PowerCenter, Oracle 11g, SQL Developer, Tidal, Unix Shell Scripting, Putty, SQL, PL/SQL.

Terex - Genie, Redmond, WA Feb 2014 – Jan 2015

Senior ETL Developer/Onsite lead

Terex AWP

Description: Terex is a global company. As part of the acquisition of the Genie aerial work platform business, this project covers two major parts: migration and integration of data. The migration part migrates data from old legacy systems to the Terex Management System (TMS) through ETL mappings with transformation rules that meet TMS requirements.

The second part integrates the TMS data with different Terex applications (ECOM, Seva, and Winscope). Developed ETL mappings with different transformation rules to load data into 23 interfaces such as Customer-billto, Customer-shipto, Item Master, Item comments, Sales orders, Warranty, Payment terms, Ship method, and Invoice.

Responsibilities:

Understanding the requirements and data mapping documents.

Created Informatica mappings for data migration from DB2 to Oracle.

Created Informatica mappings for data integration with different Terex applications in MS SQL Server.

Extensively worked with Informatica PowerCenter.

Created mappings using Joiner, Expression, Aggregator, Router, Lookup, SQL Transformation, Update Strategy, and Sequence Generator transformations.

Worked with various lookups: connected and unconnected lookups, and static- and dynamic-cache lookups.

Parameterized mappings using mapping parameters, session parameters, and workflow variables defined in parameter files.

Implemented SCD Type 2 in the integrations (see the SQL sketch after this list).

Analysed session logs, session run properties, Workflow run properties.

Analysed rejected rows using bad files.

Performed code reviews and unit testing.

Implemented performance tuning techniques.

Used the mapping debugger on data and error conditions to obtain troubleshooting information.

Created data visualization reports and dashboards using Tableau Desktop.

Led a team of 3 offshore ETL developers.
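
For the SCD Type 2 integrations noted above, the general pattern is to expire the current dimension row and insert a new version; the customer dimension below is hypothetical, not the actual TMS model.

    -- Step 1: close out current rows whose tracked attributes have changed.
    UPDATE dim_customer d
       SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.ship_to_addr <> d.ship_to_addr
                           OR s.payment_terms <> d.payment_terms));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
          (customer_id, ship_to_addr, payment_terms,
           effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.ship_to_addr, s.payment_terms,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');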

Environment: Informatica PowerCenter, DB2, Oracle 11g, MS SQL Server, SQL Developer, DbVisualizer, SQL Server Management Studio, Tableau Desktop, Putty, SQL, Unix shell scripting, Windows 8.

Tata Consultancy Services, Hyderabad, INDIA Nov 2012 - Jan 2014

Senior ETL Developer

Project: Quadramed, Enterprise Scheduling and Data warehouse

Description: The data warehouse is developed to analyze process, outcome, and utilization measures for each facility, department, and resource of the healthcare providers. Process measures include the number of patients treated in the hospital; outcome measures include the number of patients with a change in their health status; and utilization measures include the maximum, minimum, and average length of stay and the bed occupancy rate. Quality and performance excellence departments on the provider side can use the EDW to identify opportunities for improvement.
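
As a rough illustration of the utilization measures described above, a query over a hypothetical encounter fact table could compute length-of-stay statistics per facility as follows; all names are placeholders.

    -- Length of stay derived from admit/discharge dates (Oracle DATE arithmetic yields days).
    SELECT f.facility_key,
           MIN(e.discharge_dt - e.admit_dt) AS min_los_days,
           MAX(e.discharge_dt - e.admit_dt) AS max_los_days,
           AVG(e.discharge_dt - e.admit_dt) AS avg_los_days
      FROM fact_inpatient_encounter e
      JOIN dim_facility f
        ON f.facility_key = e.facility_key
     GROUP BY f.facility_key;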

Responsibilities:

Understanding the healthcare business requirements and mapping them to the technical requirements. Developed ETL design and process documents and the mapping document.

Created several Informatica mappings to populate data into dimension and fact tables.

Extensively used ETL to load data from multiple sources into the staging area and implemented transformation logic to load data into base tables.

Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy and Sequence Generator.

Analyzed session logs, session run properties, Workflow run properties, rejected records.

Involved in identifying the bottlenecks and tuning to improve the Performance.

Parameterized using mapping and session parameters.

Partitioned Sessions for concurrent loading of data into the target tables.

Performed code reviews and unit testing.

Environment: Informatica Power Center, Oracle, SQL, SQL developer, Windows XP, Putty, UNIX.

Tata Consultancy Services, Hyderabad, INDIA June 2008 – Oct 2012

Senior ETL Developer

Project: British Telecom, Consumer data warehouse

Description: The main objective of the project is to extract data and load it into a data mart. The idea is to build a decision support system for executives. The project's initial goal is to build the capability to better analyze product performance and the revenues generated from various customer segments.

Responsibilities

Extracted the data from Oracle, Flat files and Access into Data warehouse.

Developed mappings in Informatica to load data from various sources into the data warehouse, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, and Joiner. Used connected and unconnected lookups with static and dynamic caches.

Developed mappings using parameters and variables.

Analysed session logs, session run properties, Workflow run properties, rejected records.

Involved in identifying the bottlenecks and tuning to improve the Performance.

Performed code reviews and unit testing.

Scheduled Informatica workflows using Control-M.

Worked with the testing team to resolve issues during system integration testing and UAT.

Developed stored procedures and functions with PL/SQL programming.

Environment: Informatica Power Center, Oracle, SQL, PL/SQL, SQL developer, Putty, UNIX.

Employer: QuinStreet, Pune, INDIA Apr 2006 – May 2008

ETL developer

Project: Guide to Lenders

Description: The main objective of this project is to collect data from different sources such as Oracle, SQL Server, and flat files, and to generate reports for performance analysis using Excel pivot tables.

The Guide to Lenders site offers an online consumer service that links borrowers to a network of qualified mortgage brokers and lenders throughout the US. It has a wide array of programs ranging from mortgage refinance and home equity loans to debt consolidation mortgages and new home loans. It enables consumers to locate qualified lenders in different areas of the country, review loan features and comparison-shop for competitive rates, search thousands of loan programs to find the one that works, estimate monthly payments, calculate loan amounts, compare loan programs, and submit a profile to contact the clients that match their needs.

Based on the information the consumer provides, it searches Guide to Lenders' extensive database to identify the mortgage broker or banker best suited to meet the consumer's financing needs at the most competitive rates. The consumer's information is validated and sent to the matching client using functionality such as auto email and client integration.

Responsibilities:

Interacted with the client on a regular basis for requirements collation.

Extensively involved in reviewing and analyzing the business and functional requirements.

Performed data cleaning by eliminating null values.

Collected data from different sources such as Oracle, SQL Server, and flat files.

Formed the final data into a common format per the business rules.

Created pivot reports from the source data.

Environment: Oracle, SQL Server, Microsoft SSIS, SQL, TOAD, SQL Server Management Studio.


