SHYAM KAKADIA
Sr. Data Engineer
ETL Informatica / IICS-IDMC
Phone – 469-***-****
Email – ***.****@*****.***
PROFESSIONAL SUMMARY
Results-oriented Senior Data Engineer / ETL Developer with over 10 years of experience delivering complex, scalable, high-performance data solutions across the banking, insurance, finance, and manufacturing industries. Demonstrated expertise in Informatica PowerCenter (9.x – 10.x), IICS/IDMC, data warehousing, ETL pipelines, data migration, and cloud data integration on Google Cloud Platform (BigQuery), Microsoft Azure, and AWS. Strong command of SQL, PL/SQL, and scripting languages such as Python, Java, and Unix shell for automation and optimization. Proficient in data profiling, data quality checks (IDQ), SCD Type I/II, CDC, performance tuning, and job orchestration with Control-M, Autosys, and ServiceNow. Adept in CI/CD pipelines, Docker, Kubernetes, Git, and Agile/Scrum methodologies, with experience in version control, incident management, and Agile collaboration using tools such as Jira, ADO Boards, and ClearCase. Passionate about leveraging technology to enable data-driven decision-making, improve efficiency, and support enterprise-scale transformation projects.
TECHNICAL SKILLS
ETL Tools:
Informatica PowerCenter (9.x – 10.x), Informatica IICS, IDMC, Informatica Cloud Data Integration (CDI), IDQ, MDM
Databases & Data Warehousing:
Oracle, SQL Server, DB2, MySQL, PostgreSQL, VSAM, Flat Files, XML, JSON, BigQuery, Azure SQL, S3
Cloud Platforms:
Snowflake, Google Cloud Platform (GCP – BigQuery), Microsoft Azure, Amazon Web Services (AWS)
Programming & Scripting:
SQL, PL/SQL, Python, Java, Unix Shell Scripting, Bash
Data Integration & Modeling:
Data Warehousing Concepts, Data Migration, SCD Type I/II, CDC, Staging/ODS/Data Marts, Star/Snowflake Schema
Scheduling & Automation:
Control-M, Autosys, Shell Scripts, Batch Processing, ServiceNow
Containers & DevOps:
Docker, Kubernetes, Git, CI/CD Pipelines, GitHub, GitLab, Jenkins
Data Quality & Profiling:
Informatica Data Quality (IDQ), Reusable Mapplets, Transformation Rules
Version Control & SDLC Tools:
Git, ClearCase, Bitbucket, Jira, ServiceNow, ADO Boards, Agile, Scrum, PI Planning
Visualization & Reporting:
Tableau, Cognos, Excel Dashboards
PROFESSIONAL EXPERIENCE
MetLife Inc., Cary, NC (Remote)
Sr. ETL Informatica Developer / IICS-IDMC July 2022 – April 2025
MetLife is a leading global provider of life insurance, annuities, and employee benefits, serving 90 million customers in over 60 countries, and holds an A+ (Superior) rating from AM Best.
●Supported production jobs and schedules using the Informatica PowerCenter ETL tool.
●Implemented new taskflows and converted PowerCenter workflows to the IICS/IDMC cloud environment.
●Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
●Analyzed data in the GCP BigQuery cloud environment.
●Created new job schedules and monitored Informatica jobs using the PowerCenter Monitor and the Maestro/Control-M scheduler; monitored and reported issues for daily, weekly, and monthly jobs, resolved them on a priority basis, and reported status to management.
●Extensively used pre-SQL and post-SQL scripts to load data into targets according to requirements.
●Developed mappings for Type 1 and Type 2 dimensions and incremental loading, and unit tested the mappings.
●Experienced with data extraction, transformation, and loading (ETL) from disparate sources, including relational databases (Oracle, DB2, SQL Server), VSAM, XML, and flat files.
●Created UNIX shell scripts to automate ETL processes.
●Utilized Docker and Kubernetes for containerized deployment of integration components to higher environments, and actively debugged failures in deployment and runtime execution of ETL workflows.
●Coordinated and developed all documentation related to ETL design and development.
●Analyzed business requirements, technical specifications, source repositories, and data models for ETL mapping and process flow.
●Built Oracle database mappings using Expression, Aggregator, Filter, Lookup, Joiner, Update Strategy, and Stored Procedure transformations.
●Raised production tickets and fixed bugs in existing mappings in the common folder, delivering urgent fixes for new files through versioning (check-in/check-out) and supporting QA in component unit testing and validation.
●Skilled in defining standards and methodologies and performing technical design reviews; excellent communication and interpersonal skills; self-motivated, quick learner, and team player.
USAA San Antonio, TX Sep 2019 – July 2022
Sr. ETL Informatica Developer/ Data Analyst
●Gathered requirements and implemented them as technical source-to-target mappings.
●Worked closely with business and technical leads; supported the entire Client Management application and led the team.
●Integrated relational sources such as SQL Server and Oracle and non-relational sources such as flat files into the staging area.
●Set up new job flows in the Control-M scheduling tool; scheduled, ordered, and monitored jobs and troubleshot failures.
●Experienced in the financial, insurance, and banking data domains.
●Extensively used pre-SQL and post-SQL scripts to load data into targets according to requirements.
●Experienced with Java and Python scripting.
●Worked on Informatica IDQ performance tuning and optimization.
●Used ServiceNow (SNOW) to raise and track tickets for production support.
●Experienced with the IICS cloud-based tool and with cloud platforms such as MS Azure and AWS.
●Created mappings in the Informatica Data Quality (IDQ) tool and imported them into Informatica PowerCenter as mappings and mapplets.
●Optimized source queries to control temp space usage and added delay intervals per business requirements to improve performance.
●Designed and developed Technical and Business Data Quality rules in IDQ (Informatica Developer).
●Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
●Created new job schedules and monitored Informatica jobs using the Control-M scheduler.
●Deployed and managed ETL code using Docker and Kubernetes containers across higher environments, including debugging deployment issues and ensuring stable CI/CD releases.
●Fixed bugs and production tickets for existing mappings in the common folder, delivering urgent fixes for new files through versioning (check-in/check-out) and supporting QA in component unit testing and validation.
●Connected with various teams during PI Planning to discuss future releases and features and to plan iteration work.
Yum! Brands, Inc., Plano, TX Nov 2016 – Aug 2019
Sr. Informatica Developer
Description: Yum! Brands, Inc. is an American fast-food company that operates the Taco Bell, KFC, Pizza Hut, and WingStreet brands worldwide; in China the brands are operated by a separate company, Yum China. The project was to load data into the Global Data Platform (GDP).
Responsibilities:
●Analyzed business requirements, technical specifications, source repositories, and data models for ETL mapping and process flow.
●Built Oracle database mappings using Expression, Aggregator, Filter, Lookup, Joiner, Update Strategy, and Stored Procedure transformations.
●Extensively used pre-SQL and post-SQL scripts to load data into targets according to requirements.
●Developed mappings for Type 1 and Type 2 dimensions and incremental loading, and unit tested the mappings.
●Coordinated and developed all documentation related to ETL design and development.
●Used Repository Manager to create repositories and user groups, and managed users by setting up privileges and profiles.
●Used the Debugger to troubleshoot and correct mappings.
●Performed Database tasks such as creating database objects (tables, views, procedures, functions).
●Responsible for debugging and performance tuning of targets, sources, mappings, and sessions.
●Optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
●Optimized source queries to control temp space usage and added delay intervals per business requirements to improve performance.
●Used Informatica Workflow Manager to create, run, and schedule batches and sessions at specified times.
●Executed sessions and sequential and concurrent batches to ensure proper execution of mappings, and set up email notification after execution.
●Improved ETL performance through indexing and caching.
●Created Workflows, tasks, database connections, FTP connections using workflow manager.
●Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations, and fixing bugs.
●Created UNIX shell scripts to automate ETL processes.
●Checked workflows and config files in and out of ClearCase.
●Monitored and reported issues for daily, weekly, and monthly processes; resolved issues on a priority basis and reported them to management.
Environment: Informatica PowerCenter 9.6.1/10.2, Oracle 11g, MS SQL Server 2012, PuTTY, Shell Scripting, WinSCP, Notepad++, JIRA.
Cadence Design Systems, Inc., San Jose, California Feb 2015 – Nov 2016
Informatica Developer
Description: Cadence Design Systems, Inc. (a Fortune 500 company) is an American multinational electronic design automation (EDA) and engineering services company.
Responsibilities:
●Gathered requirements and implemented them as source-to-target mappings.
●Integrated relational sources such as SQL Server and Oracle and non-relational sources such as flat files into the staging area.
●Designed and developed Technical and Business Data Quality rules in IDQ and worked with MDM.
●Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
●Fixed bugs and production tickets for existing mappings in the common folder, delivering urgent fixes for new files through versioning (check-in/check-out) and supporting QA in component unit testing and validation.
●Used shortcuts for sources, targets, transformations, mapplets, and sessions to reuse repository objects without duplication and to inherit source changes automatically.
●Integrated Informatica PowerCenter and Created mappings using Informatica Data Quality tool and imported them into Informatica PowerCenter as Mappings, Mapplets.
●Analyzed session error logs and used the Debugger to test mappings and fix bugs in DEV, following change procedures and validation.
●Scheduled daily, weekly, and monthly loads through Control-M, running each workflow in sequence with command and event tasks.
●Used transformations such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and connected and unconnected Lookups, along with mapping parameters, session parameters, mapping variables, and session variables.
●Fine-tuned ETL processes by considering mapping and session performance issues.
●Created workflows and worklets, including Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
●Maintained clear communication between other teams and the client.
●Coordinated onsite and offshore teams to ensure completeness of deliverables.
Environment: Informatica PowerCenter 8.1, SQL, Shell Scripting, SQL Server 2008, Jira, Oracle 11g, Autosys.
Huntington National Bank, Columbus, OH June 2013 – Feb 2015
SQL Developer
Project Description: Worked on a retail collections system, GUI-based client/server bill-collection software that assists brokers in carrying out their daily transactions and manages the bank's property and equipment.
Responsibilities:
●Created new tables, stored procedures for application developers, and user-defined functions.
●Created SQL scripts for tuning and scheduling.
●Performed data conversions from flat files into a normalized database structure.
●Created and managed users, roles, and groups, and handled database security.
●Created triggers to enforce data and referential integrity.
●Defined check constraints, business rules, indexes, and views.
●Configured the server to send automatic emails to the appropriate people on DTS process failure or success.
●Worked on DTS packages and DTS Import/Export to transfer data from heterogeneous sources (Oracle and text-format data) to SQL Server.
●Tuned SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard.
●Performed development support, document reviews, test planning, and system integration.
●Created and maintained indexes for fast, efficient reporting.
●Analyzed database growth and space requirements; handled users, logins, and user rights.
●Created and deleted linked servers.
●Managed historical data from heterogeneous data sources (e.g., Excel, Access).
Environment: SQL Server 2012, T-SQL, PL/SQL, TOAD.
Education
Dr. Babasaheb Ambedkar Technological University
Bachelor's degree in Computer Science