RAKESH PATEL
SQL Developer / Informatica Developer
Phone: +1-469-***-**** Email: ************@*****.*** US Citizen http://www.linkedin.com/in/rakesh-patel-1a385536
PROFESSIONAL SUMMARY
With over 7 years of experience in the IT industry, I have a wide range of skills in data warehouse applications, with a primary focus on analysis, design, development, implementation, and troubleshooting. I have honed my skills in Informatica Cloud (IICS/IDMC) and Informatica PowerCenter, where I have designed and developed ETL processes and created mappings, sessions, and workflows. I have been responsible for ensuring data quality and integrity and for resolving data integration issues in an Agile environment, and I have successfully delivered two data warehouse projects. I have optimized and fine-tuned SQL queries for enhanced performance while ensuring data quality and consistency throughout the ETL process. I bring a strong ETL background for data warehousing, with expertise in Oracle and SQL Server databases and tools such as SQL Developer. My proficiency in writing and interpreting complex SQL and PL/SQL enables me to handle complex data transformations effectively. I also have valuable experience in SQL optimization and performance tuning; I am adept at identifying performance bottlenecks and employing techniques to optimize Informatica loads for better efficiency and overall performance. I am well versed in Unix commands and shell scripting.
Good experience in Informatica Cloud (IICS/IDMC) and Informatica PowerCenter 10.5 and 9.6.1, along with a strong ETL background.
Good with Informatica Cloud features such as agent configuration, task automation, and deployment of taskflows across different environments.
Proficient in using Informatica Intelligent Cloud Services (IICS) for data integration and transformation.
Good experience in developing ETL applications and performing statistical analysis of data on Oracle 11g/12c and SQL Server 2017 databases.
Strong experience in implementing CDC using Informatica Power Center.
Solid understanding of REST APIs and Swagger, enabling seamless integration and API management within IICS.
Comprehensive experience of all major transformations within IICS, ensuring efficient data processing and manipulation.
Accessed the Informatica Cloud REST API from the command line using the curl utility.
Skilled in creating and managing mappings, mapping tasks, and task flows, optimizing workflow automation.
Managing hybrid integrations between IICS and on-premises systems, ensuring seamless data flow and connectivity.
Extensively worked on Informatica Power center transformations as well like Expression, Joiner, Sorter, Filter, Router, and other transformations as required.
Good with data warehouse concepts such as dimension tables, fact tables, Slowly Changing Dimensions, data marts, and dimensional modeling schemas.
Experience in data modeling, including dimensional modeling and E-R modeling, and in OLTP and OLAP data analysis.
Good understanding of snowflake and star schemas.
Created PL/SQL stored procedures, functions, cursors, triggers, and packages for moving data from staging to the data mart.
Involved in data loading using PL/SQL and SQL*Loader to download and manipulate files.
Created database objects such as tables, views, materialized views, and procedures using Oracle tools like PL/SQL Developer and SQL*Plus.
Strong experience in developing sessions/tasks, worklets, and workflows using Workflow Manager tools: Task Developer, Workflow Designer, and Worklet Designer.
Identifying and resolving performance bottlenecks in IICS environment, optimizing resource usage and processing times.
Experience in debugging mappings, identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
Experience in tuning and scaling SQL for better performance by running explain plans and using approaches such as BULK COLLECT and bulk loading.
Experience in Performance tuning of ETL process using pushdown optimization.
Good experience in automating ETL processes, including error handling and auditing.
Skilled in creating and maintaining ETL specification documents, use cases, source-to-target mappings, and requirement traceability matrices, and in providing effort estimates and deployment artifacts.
Assisted other ETL developers in solving complex scenarios and coordinated with source-system owners on day-to-day ETL progress monitoring.
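The curl-based access to the Informatica Cloud REST API mentioned above can be sketched as follows. This is a minimal illustration, not production code: the pod host, username, and password are placeholders, and the actual login call is shown commented out.

```shell
# Build the JSON login payload expected by the IICS v2 REST API
# (credentials below are placeholders, not real values).
PAYLOAD='{"@type":"login","username":"user@example.com","password":"secret"}'

# Hypothetical login call; the pod host varies by organization.
# curl -s -X POST "https://dm-us.informaticacloud.com/ma/api/v2/user/login" \
#   -H "Content-Type: application/json" -d "$PAYLOAD"

echo "$PAYLOAD"
```

The session id returned by a successful login is then passed in a header on subsequent REST calls.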
EDUCATION
Bachelor of Electronics and Communications from The Institute of Engineering, India
TECHNICAL SKILLS
ETL: Informatica PowerCenter 10.5/10.2/9.6.1, Informatica Cloud (IICS/IDMC)
RDBMS: Oracle 18c/12c, SQL Server 2019/2017, MySQL, PL/SQL
Other Tools: Notepad++, Toad, SQL Navigator, JIRA, Rally, AutoSys, Control-M, Postman
WORK EXPERIENCE
HUMANA, IRVING, TX
DURATION: JAN 2022 TO PRESENT
ROLE: INFORMATICA DEVELOPER
HUM EDW
Humana operates as a health and well-being company in the United States. The company offers medical and supplemental benefit plans to individuals. It also has a contract with the Centers for Medicare and Medicaid Services to administer the Limited Income Newly Eligible Transition prescription drug plan program, and contracts with various states to provide Medicaid, dual-eligible, and long-term support services benefits.
Responsibilities:
Worked with BA on requirement gathering, understanding current business flow and scenarios.
Created Technical Design Document from BDR and analyzed system requirements to identify system impacts.
Performed data analysis and data profiling, created a data dictionary, and wrote hundreds of SQL queries to support these activities.
Prepared source-to-target mappings and conducted meetings with the business to understand data transformation logic.
Designed, built, and maintained mappings, sessions, and workflows for the target load process.
Designed mappings for Slowly Changing Dimensions (Type 1, Type 2, and more); used Lookup (connected and unconnected), Update Strategy, and Filter transformations for loading source data.
Extensively used transformations such as Source Qualifier, Router, Filter, Sorter, Aggregator, Lookup, Joiner, Expression, Sequence Generator, Stored Procedure, and XML transformations to extract data in compliance with the business logic.
Wrote SQL overrides in source qualifier to filter data according to business requirements.
Analyzed the existing mapping logic to determine the reusability of the code.
Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
Performed extensive performance tuning by identifying bottlenecks at various points such as targets, sources, mappings, sessions, and the system, leading to better session performance.
Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test and supported the UAT for the client.
Performed ETL and database code migrations across environments using deployment groups.
Populated target tables by implementing business rules in mappings.
Created different parameter files and started sessions with them using the pmcmd command to change session parameters, mapping parameters, and variables at runtime.
Triggered IICS tasks externally using the RunAJobCli utility and checked the status of running tasks.
Investigated slow-running SQL in the production environment; rewrote the queries and fixed the issues.
Tuned the mappings by removing the Source/Target bottlenecks to improve the throughput of the data loads.
Integrated data from Oracle to Snowflake using the IICS/IDMC ETL tool.
Created batch scripts for automated database build deployment.
Worked on the ETL code migration process from DEV to QA and UAT using a CI/CD pipeline.
Worked with the offshore team on a daily basis to ensure commitments were met.
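The parameter-file-driven runs described above can be sketched as follows. All names here (folder, workflow, session, parameters, service, domain, credentials) are illustrative placeholders, and the pmcmd invocation itself is shown commented out.

```shell
# Illustrative PowerCenter parameter file; section header format is
# [Folder.WF:workflow.ST:session], names below are made up.
cat > /tmp/wf_daily_load.par <<'EOF'
[EDW.WF:wf_daily_load.ST:s_m_stage_load]
$$LOAD_DATE=2022-01-15
$InputFile_customers=/data/in/customers.csv
EOF

# Hypothetical pmcmd call passing the file at runtime (service, domain,
# and credential values are placeholders):
# pmcmd startworkflow -sv INT_SVC -d DOM_EDW -u "$PC_USER" -p "$PC_PWD" \
#   -f EDW -paramfile /tmp/wf_daily_load.par wf_daily_load
```

Switching the `-paramfile` argument is what allows the same workflow to run with different session and mapping parameters per environment or per load date.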
Environment: IICS /IDMC, Informatica PowerCenter 10.5, Oracle 18c, Snowflake Cloud DB, SQL Developer, SQL Server, Unix, AutoSys, Putty, WinSCP, Confluence, GitHub, Jenkins
CLIENT: METLIFE, INC., LONGWOOD, FL
DURATION: MAR 2019 TO DEC 2021
ROLE: INFORMATICA DEVELOPER
MEDW RELEASE 1.7
MetLife, Inc. engages in the insurance, annuities, employee benefits, and asset management businesses worldwide. The company offers life, dental, group short- and long-term disability, individual disability, accidental death and dismemberment, vision, and accident and health coverages. As part of this project, data is collected from the application systems of all servicing organizations, including Oracle databases, flat files, and mainframe files. Data is migrated from these sources into ingested staging tables, cleansed and transformed based on business rules through Informatica ETL, and then pushed to the target system.
Responsibilities:
Worked with the business team to gather requirements for projects and created strategies to manage the requirements.
Documented Data Mappings/ Transformations as per the business requirement.
Worked on project documentation which included the Technical and ETL Specification documents.
Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
Involved in extracting the data from the Flat Files and Relational databases into staging area.
Migrated mappings, sessions, and workflows from Development to Test and then to the UAT environment.
Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and Router transformations to manage data flow into multiple targets.
Created Sessions and extracted data from various sources, transformed data according to the requirement and loaded it into data warehouse.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
Imported various heterogeneous files using Informatica Power Center 9.x Source Analyzer.
Developed several reusable transformations and mapplets that were used in other mappings.
Prepared Technical Design documents and Test cases.
Extensively worked on Unit testing for the Informatica code using SQL Queries.
Implemented various Performance Tuning techniques.
Designed and developed complex ETL mappings making use of Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator, and Sequence Generator transformations.
Wrote numerous SQL queries to view and validate the data loaded into the warehouse.
Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the process.
Improved performance at the mapping and session levels.
Developed parameter files for passing values to the mappings for each type of client.
Scheduled batches and sessions within Informatica using the Informatica scheduler.
Worked with UNIX shell scripts for job execution and automation.
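The pmcmd-based job automation described above can be sketched as a small shell wrapper. This is a sketch under assumptions: the Integration Service, domain, folder names, and credential variables are placeholders, not values from any actual environment.

```shell
#!/bin/sh
# Sketch: ping the PowerCenter Integration Service first, then start the
# requested workflow; service (INT_SVC), domain (DOM_EDW), folder (MEDW),
# and the PC_USER/PC_PWD variables are all illustrative placeholders.
run_workflow() {
  wf_name="$1"
  if pmcmd pingservice -sv INT_SVC -d DOM_EDW >/dev/null 2>&1; then
    pmcmd startworkflow -sv INT_SVC -d DOM_EDW \
      -u "$PC_USER" -p "$PC_PWD" -f MEDW "$wf_name"
  else
    echo "Integration Service not reachable; skipping $wf_name" >&2
    return 1
  fi
}
```

A scheduler such as Control-M or AutoSys would then call `run_workflow wf_daily_load` instead of invoking pmcmd directly, keeping connection details in one place.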
Environment: Informatica PowerCenter 10.2, SQL Server 2017, Oracle 12c, SQL, Unix, Putty, WinSCP, Confluence, GitHub, Control-M, JIRA, Power BI
TD BANK, MT LAUREL TOWNSHIP, NJ
DURATION: JAN 2018 TO MAR 2019
ROLE: INFORMATICA DEVELOPER
TD EDW
TD Bank is an American national bank and the United States subsidiary of the multinational TD Bank Group. This project integrates data from multiple applications into the enterprise data warehouse to build an analytics solution for better reporting.
Responsibilities:
Coordinated with business users, stakeholders, and SMEs for functional expertise, design and business test-scenario reviews, UAT participation, and validation of data from multiple sources.
Analyzed source data and gathered business requirements from the business users and the source data team.
Analyzed business and system requirements to identify system impacts.
Created the Detail Technical Design Documents which have the ETL technical specifications for the given functionality.
Prepared source to target mapping and conducted meetings with the business to understand data transformation logic.
Worked on complex mappings which involved slowly changing dimensions.
Optimized the mappings and implemented complex business rules by creating reusable transformations and mapplets.
Debugged and performance-tuned targets, sources, mappings, and sessions.
Analyzed the existing mapping logic to determine the reusability of the code.
Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
Performed extensive performance tuning by identifying bottlenecks at various points such as targets, sources, mappings, sessions, and the system, leading to better session performance.
Created Unit test plans and did unit testing using different scenarios separately for every process.
Involved in System test, Regression test and supported the UAT for the client.
Performed ETL and database code migrations across environments using deployment groups.
Populated target tables by implementing business rules in mappings.
Involved in end-to-end system testing, performance testing and data validations.
Managed performance and tuning of SQL queries and fixed the slow running queries in production.
Improved ETL mapping and workflow performance through tuning.
Created different parameter files and started sessions with them using the pmcmd command to change session parameters, mapping parameters, and variables at runtime.
Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations, and fixing bugs.
Involved in testing the database using complex SQL and handled the performance issues effectively.
Conducted code walkthroughs with team members.
Environment: Informatica PowerCenter 9.6, Oracle 12c, MySQL, UNIX, Control-M, JIRA, CI/CD