Rathaiah Kommineni
*********@*****.***
Summary:
** ***** ** ********* ********** in Data Warehousing ETL using Matillion, Informatica 10.5.5/10.2.0/9.6.1/9.5.1/9.1.0, Informatica Cloud Services (IICS), Informatica Data Quality 9.6.1/9.5.1, and Ab Initio GDE.
Performed data analysis, data profiling, data quality checks, and data ingestion across layers using big data/Hadoop/Hive/Impala queries and UNIX shell scripts.
Followed the organization's coding standards document; created jobs, mappings, sessions, and workflows per the mapping specification document.
Created and updated design documents, providing detailed descriptions of workflows after every production release.
Performance-tuned long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.
Performed quality assurance checks and post-load reconciliation, and communicated with the client to obtain corrected data.
Participated in ETL/ELT code reviews and designed reusable frameworks.
Created Remedy/ServiceNow tickets to fix production issues.
Created support requests to deploy database, Hadoop, Hive, Impala, UNIX, and ETL/ELT code to the UAT environment.
Participated in meetings to continuously upgrade functional and technical expertise.
Experienced with Teradata tools and utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump); a sample BTEQ invocation is sketched below.
Strong in data warehousing concepts and dimensional star schema and snowflake schema design; performed data integration with SQL Server and Oracle using Informatica Cloud.
Troubleshot and maintained ETL/ELT jobs running in Matillion.
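A minimal sketch of a BTEQ invocation from a UNIX shell, of the kind used with the utilities above; the host, credentials, and table names are hypothetical placeholders:

    #!/bin/sh
    # Run an ad-hoc Teradata query through BTEQ.
    # tdprod, etl_user, and stg.sales_daily are placeholders, not client objects.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_password;
    SELECT load_dt, COUNT(*) AS row_cnt
    FROM stg.sales_daily
    GROUP BY load_dt
    ORDER BY load_dt;
    .LOGOFF;
    .QUIT;
    EOF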
Client: ACG-AAA Nov 2021 – May 2024
Role: IICS ETL Developer
Project: SFDC Migration from GCP
Description:
This project will implement a new policy system that will enable a tiered Auto insurance product in our ACG footprint, Joint Venture states, and Florida, and a By-Peril Tiered Home product in our ACG footprint and Joint Venture states. The functionality delivered with this project will support the quote, new business, amendment, out-of-sequence, inquiry, and renewal functions in these markets. This includes proper generation of all customer documents and all statutory, financial, sales compensation, and operational reporting. FACTS functionality to support claims on these policies is in scope. These products will be consistent across markets, with differences allowed for regulatory reasons and market competitiveness. The product lines to be supported on this platform include Auto, Home, PUP, and Specialty Lines.
Responsibilities:
Supported the introduction of a new policy system (Guidewire) to provide additional functionality and features for new and existing products.
Designed, developed, and implemented ETL processes using IICS Cloud Data Integration (CDI).
Created IICS connections using various cloud connectors in IICS Administrator.
Extensively used cloud transformations: Aggregator, Joiner, Lookup (connected and unconnected), Router, Sorter, Filter, and Expression.
Wrote BigQuery SQL scripts to extract data for Informatica Cloud (IICS) mappings; a sample extraction script is sketched after this list.
Performed hands-on development in the implementation of Informatica Intelligent Cloud Services (IICS) and PowerCenter.
Interacted with the source team and the business to validate the data.
Created mapping tasks and mappings to load data into SFDC through Informatica Cloud (IICS).
Created synchronization tasks.
Resolved issue tickets related to data issues, ETL issues, performance issues, etc.
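A minimal sketch of the BigQuery extraction scripting referenced above, using the bq command-line tool; the project, dataset, and table names are hypothetical placeholders:

    #!/bin/sh
    # Pull policy rows from BigQuery for downstream IICS mappings.
    # my-gcp-project.policy_ds.policy_master is a placeholder object.
    bq query --use_legacy_sql=false \
      'SELECT policy_id, policy_status, effective_dt
       FROM `my-gcp-project.policy_ds.policy_master`
       WHERE load_dt = CURRENT_DATE()'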
Environment: Informatica PowerCenter 10.2.0, Informatica Intelligent Cloud Services (IICS), GCP BigQuery, Salesforce, Oracle, Linux 2.6.32, BODS, Flat File, UNIX
Client: Kaiser Permanente June 2020 – Oct 2021
Role: Sr. Data Warehouse ETL Lead
Responsibilities:
Designed and implemented Informatica mappings to process data from flat files to generic tables, from generic tables to stage tables, and then to reporting tables.
Created mock-up data, performed unit testing, and captured result sets for the jobs developed in lower environments.
Updated the production support runbook schedule document with each production release.
Created and updated design documents, providing detailed descriptions of workflows after every production release.
Created ServiceNow tickets to fix production issues.
Created support requests to deploy database, UNIX, and ETL code to the UAT environment.
Created a reusable Audit Balance Control framework to capture reconciliation results, mapping parameters, and variables, serving as a single point of reference for workflows.
Implemented a soft-delete process for the existing stage and report tables whenever data was deleted in the source.
Participated in meetings to continuously upgrade functional and technical expertise.
Interacted with the source team and the business to validate the data.
Created PL/SQL procedures to process data from flat files into Oracle; a sample load sketch appears after this list.
Used IBM Rational Team Concert to maintain code versions.
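A minimal sketch of the flat-file-to-Oracle pattern above: stage the file with SQL*Loader, then call the PL/SQL procedure that populates the reporting tables. The connection string, control file, and procedure name are hypothetical placeholders:

    #!/bin/sh
    # Stage the flat file, then invoke the PL/SQL load procedure.
    # etl_user/ORADB, load_stage.ctl, and stg.prc_load_report are placeholders.
    sqlldr userid=etl_user/etl_password@ORADB \
           control=load_stage.ctl log=load_stage.log bad=load_stage.bad
    sqlplus -s etl_user/etl_password@ORADB <<'EOF'
    EXEC stg.prc_load_report;
    EXIT;
    EOF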
Environment: Informatica PowerCenter 10.2.0, Oracle 12.1.0, Linux 2.6.32, ServiceNow, IBM Rational Team Concert, EPIC
Client: IHG/TCS June 2019 – May 2020
Role: Sr. ETL Lead/Admin
Responsibilities:
Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
Built the ETL architecture and source-to-target mappings to load data into the data warehouse.
Wrote SQL scripts to extract data from the database and for testing purposes.
Interacted with the source team and the business to validate the data.
Resolved issue tickets related to data issues, ETL issues, performance issues, etc.
Used UNIX shell scripting to automate several ETL processes.
Created shell scripts to execute Informatica workflows; a sample sketch appears after this list.
Promoted scripts and code to upper environments (test, prod).
Fulfilled incident and change requests in ServiceNow for code migration.
Enhanced the code before promoting it to the prod environment.
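A minimal sketch of the workflow-execution shell scripts above, using pmcmd, PowerCenter's standard command-line client; the domain, service, folder, and workflow names are hypothetical placeholders:

    #!/bin/sh
    # Start a PowerCenter workflow and block until it completes.
    # Domain, service, folder, and workflow names are placeholders;
    # credentials are read from environment variables.
    pmcmd startworkflow \
      -sv INT_SVC_PRD -d Domain_PRD \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FLD_DWH -wait wf_load_daily
    rc=$?
    if [ $rc -ne 0 ]; then
      echo "wf_load_daily failed with return code $rc" >&2
      exit $rc
    fi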
Environment: Informatica PowerCenter 10.1.1, Oracle 12.1.0, Linux 2.6.32, Teradata 16.10.02.08, Teradata Utilities
Client: SGWS April 2018 – May 2019
Role: Sr. ETL Developer
Responsibilities:
Created concurrent workflows to process data from flat files to Hadoop and from Hadoop to Redshift.
Created COPY commands to load data into S3 and Redshift tables; a Redshift COPY/UNLOAD sketch appears after this list.
Created UNLOAD commands to export data from Redshift to third-party vendors.
Wrote Redshift scripts to move data from Redshift to S3 and then to Hadoop.
Created shell scripts to execute Impala scripts; an HDFS/Impala sketch also appears after this list.
Created shell scripts to execute Informatica workflows.
Promoted scripts to upper environments (test, prod).
Transformed and loaded data into Microsoft SQL Server.
Created external tables and views in SQL Server to access data from HDFS.
Used Impala scripts to process data from Hadoop to outbound systems.
Created Hadoop scripts to copy files from the local file system to HDFS.
Created Hadoop partitions in Impala.
Loaded data into S3 through Informatica PowerCenter.
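A minimal sketch of the Redshift COPY and UNLOAD commands above, issued through psql; the cluster endpoint, IAM role, bucket, and table names are hypothetical placeholders:

    #!/bin/sh
    # Load an S3 extract into Redshift, then unload a vendor feed back to S3.
    # Endpoint, database, IAM role ARN, bucket, and tables are placeholders.
    psql -h my-cluster.abc123.us-east-1.redshift.amazonaws.com \
         -p 5439 -d dwh -U etl_user <<'EOF'
    COPY stg.sales_daily
    FROM 's3://my-etl-bucket/inbound/sales_daily/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoadRole'
    DELIMITER '|' GZIP IGNOREHEADER 1;

    UNLOAD ('SELECT * FROM rpt.vendor_feed')
    TO 's3://my-etl-bucket/outbound/vendor_feed_'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoadRole'
    DELIMITER '|' GZIP ALLOWOVERWRITE;
    EOF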
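Likewise, a sketch of the local-to-HDFS copy and Impala execution steps; the paths, daemon host, and script name are hypothetical placeholders:

    #!/bin/sh
    # Copy the day's extract from the local landing zone into HDFS,
    # then process it with an Impala SQL script.
    # Paths, hostname, and script name are placeholders.
    hdfs dfs -mkdir -p /data/inbound/sales_daily
    hdfs dfs -put -f /landing/sales_daily.txt /data/inbound/sales_daily/
    impala-shell -i impalad01:21050 -f process_sales_daily.sql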
Environment: Informatica PowerCenter 10.0.0, Microsoft SQL Server Management Studio 13.0.15000.23, Red Hat Enterprise Linux Server 7.1, AWS Redshift, Impala, Hive, Hadoop, Microsoft T-SQL, S3
Client: Ryder Systems Inc., FL Sep 2015 – Mar 2018
Role: Sr. ETL Developer
Responsibilities:
Used Informatica PowerCenter 9.6.1 for extraction, transformation, and loading (ETL) of data into Netezza.
Used Informatica Cloud Services (ICS) to populate data into Redshift and Netezza; a sample Netezza load sketch appears after this list.
Used Informatica PowerCenter Workflow Manager to create sessions and workflows.
Developed and improved standards and procedures to support quality development, testing, production, and operational oversight of the data warehouse processes and the data flows into Netezza and Amazon Redshift.
Developed detailed architecture and technical work plans to ensure successful development and implementation of the applications.
Implemented performance-tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
Extensively read data from SQL Server and loaded it into Netezza.
Loaded data into S3 through Informatica PowerCenter and Informatica Cloud (ICS).
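A sketch of the kind of bulk Netezza load behind these flows, using the nzload utility; the host, credentials, database, and table names are hypothetical placeholders:

    #!/bin/sh
    # Bulk-load a pipe-delimited extract into a Netezza stage table.
    # Host, credentials, database, table, and file path are placeholders.
    nzload -host nzprod -u etl_user -pw etl_password \
           -db DWH -t STG_SALES_DAILY \
           -df /landing/sales_daily.txt -delim '|'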
Environment: Informatica PowerCenter 9.6.1, Netezza 7.2.0, AWS Redshift, SQL Server 2014, Redwood 9.0, T-SQL, S3
Client: CBSi, FL Jul 2012 – Aug 2015
Role: Sr. ETL Developer
Responsibilities:
Extracted, cleansed, and transformed data to ensure it was accurate, reliable, and consistent.
Used Informatica PowerCenter 9.1 for extraction, transformation, and loading (ETL) of data into the data warehouse.
Used Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic embedded in the mappings.
Developed and improved standards and procedures to support quality development, testing, production, and operational oversight of the data warehouse processes.
Developed detailed architecture and technical work plans to ensure successful development and implementation of the applications.
Extensively used transformations such as Router, Aggregator, Sorter, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and SQL.
Implemented performance-tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
Created procedures to truncate data in the target before the session run; a sample sketch appears after this list.
Extensively used the Toad utility to execute SQL scripts and tuned SQL to improve the performance of the conversion mappings.
Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
Created ETL exception reports and validation reports after the data was loaded into the warehouse database.
Extracted change data capture (CDC) data and loaded it into data marts (DM) for reporting.
Created and addressed JIRA tickets for the Operative-to-Financial supported application systems.
Created technical specifications, process/workflow designs, user acceptance test plans, and cut-over plans.
Provided inputs for the most optimal and feasible design in terms of performance and maintenance after thorough analysis of the requirements.
Provided production support by documenting tickets and communicating with customers and vendors.
Analyzed production-support issues, inquiries, and requests.
Provided concrete explanations of resolutions to issues, inquiries, and requests.
Created dimensional data models using Erwin for reporting purposes.
Created data sheets (PDF) using the Erwin tool.
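A minimal sketch of the pre-session truncate pattern described above; the schema, procedure, and table names are hypothetical placeholders:

    #!/bin/sh
    # Create a generic truncate procedure that sessions call before a load,
    # then invoke it for one target. Names and credentials are placeholders.
    sqlplus -s etl_user/etl_password@ORADB <<'EOF'
    CREATE OR REPLACE PROCEDURE etl.prc_truncate_target(p_table IN VARCHAR2) AS
    BEGIN
      -- DBMS_ASSERT rejects malformed or non-existent object names.
      EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || DBMS_ASSERT.SQL_OBJECT_NAME(p_table);
    END;
    /
    EXEC etl.prc_truncate_target('RPT.SALES_FACT');
    EXIT;
    EOF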
Environment: Informatica PowerCenter 9.1, Python, Hadoop, UC-4, JIRA, MySQL, Oracle 11gR1, Toad, Teradata SQL Assistant, UNIX, XML, Teradata V13.10, Erwin.