Somasekhar Gundavarapu
Email:********@*****.*** Phone No.: 551-***-****
Experience Summary
14+ years of experience across the entire life cycle of Data Warehousing and Data Integration projects using industry-accepted methodologies and procedures.
14+ years of overall experience with ETL tools including Informatica PowerCenter, IDMC/IICS, Talend, SSIS and Ab Initio.
14+ years of experience with relational databases such as Oracle, Teradata, DB2 and SQL Server.
3 years of experience in Snowflake.
5 years of experience in Informatica IICS/IDMC development and migration.
Good exposure to Talend, SSIS and Ab Initio.
Good knowledge of Azure.
Worked across multiple domains: Healthcare, Finance, Pharma, Insurance, Sales and Telecom.
Good working exposure to reverse engineering analysis and implementation.
Good working experience in Agile and Waterfall methodologies.
Good working exposure to Teradata SQL, BTEQ scripts, performance tuning and Viewpoint monitoring.
Experienced in activities like estimation, planning, resource loading, client interactions, new project discussions, scheduling, tracking and analysis.
Experienced in coordinating with stakeholders to resolve production job issues: understanding the problem, engaging the relevant parties, implementing temporary workarounds and raising problem tickets for permanent fixes.
Extensively worked on ETL (Extract, Transform and Load) strategies to populate data from various source systems (flat files, Oracle, Teradata, Salesforce, APIs, COBOL, SAP BW and DB2) using Informatica PowerCenter/IDMC.
Developed Informatica mappings using transformations such as Update Strategy, Lookup, Aggregator, Rank, Router, Joiner, Sequence Generator and Expression to move data from source to target.
Created mappings, sessions and workflows using Informatica PowerCenter and IICS/IDMC.
Performed performance tuning of Informatica workflows using pushdown optimization, partitioning and parallel processing, and tuned complex Teradata and SQL queries using explain plans, indexes, partitions, collect statistics and query rewriting.
Used mapping tools within IICS CAI to define how data flows between applications, ensuring proper transformation of data (e.g., XML to JSON) as it moves across systems.
Good experience migrating projects from Informatica PowerCenter to IICS/Snowflake.
Experience working with Informatica Global Support to resolve new issues encountered in IICS.
Experience using File Listener to trigger task flows in Informatica Cloud.
Experience with IICS components such as CDI, Mass Ingestion, Administrator and Monitor services.
Strong understanding of data modeling (relational, dimensional, star and snowflake schemas).
6+ years of experience on Informatica support projects: tracking incidents, problem tickets, change tickets and service requests in ServiceNow in alignment with the change management process.
Maintained history in history tables by implementing CDC and SCD Type 1/Type 2 using Informatica mappings & Teradata BTEQ scripts (a minimal SQL sketch follows this summary list).
Worked extensively on SQL scripts using local and global temporary tables, variables, table variables and common table expressions (CTEs) as required.
Worked effectively in version-controlled Informatica environments and used deployment groups to migrate objects.
Experience with shell scripting and Python scripting.
Worked with Control-M and Autosys scheduling tools.
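A minimal sketch of the CDC / SCD Type 2 pattern referenced above, in Teradata-style SQL; the table and column names (stg_customer, dim_customer, customer_name, customer_status) are hypothetical placeholders, not taken from any client engagement:

    /* Step 1: expire the current dimension row when a tracked attribute changed (CDC by comparison). */
    UPDATE dim_customer
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = dim_customer.customer_id
                     AND (s.customer_name   <> dim_customer.customer_name
                      OR  s.customer_status <> dim_customer.customer_status));

    /* Step 2: insert a new current version for changed keys (closed in Step 1) and brand-new keys. */
    INSERT INTO dim_customer
           (customer_id, customer_name, customer_status, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.customer_name, s.customer_status,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id  = s.customer_id
          AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;  /* no open row means the key is new or was just expired above */

SCD Type 1 reduces to a single UPDATE (or MERGE) that overwrites the changed attributes in place, with no end-dating of history.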
Technical Skills :
ETL Tools : Informatica 10.5/10.2/9.6/8.6, IICS/IDMC, Talend, SSIS, Ab Initio
Cloud Technologies : Snowflake & IDMC
Reporting tools : MicroStrategy
RDBMS : DB2, Oracle, Teradata & SQL Server
Scheduling tools : Control-M, Autosys
Languages : SQL, PL/SQL, UNIX Shell & Python scripting
Others : HP Quality Center, Jira, Confluence, ServiceNow
IT Professional Experience:
Worked as Sr. ETL Developer at MSR Technology Group, MI, USA from Nov 2022 to Feb 2025.
Worked as Software Developer at Compunnel Software Group, NJ, USA from Jun 2022 to Nov 2022.
Worked as Tech Lead at Legato Health Technologies, India from Aug 2021 to Apr 2022.
Worked as Sr. Informatica-Teradata Consultant at Infoline LLC, Muscat, Oman from Feb 2019 to Jul 2021.
Worked as System Analyst with UST-Global, India from May 2018 to Feb 2019.
Worked as Senior Software Analyst with Wissen Infotech Pvt Ltd, India from May 2016 to Apr 2018.
Worked as S/W Engineering Sr. Analyst with Accenture Services Pvt Ltd, India from May 2013 to Dec 2015.
Worked as Associate with Cognizant Technology Solutions, India from Nov 2010 to May 2013.
Worked as Technical Consultant-1 with Hewlett-Packard Globalsoft Ltd, India from May 2010 to Nov 2010.
Educational Details:
Bachelor of Engineering from University of Madras, Chennai, India.
Project Details
Client : Mercedes-Benz Financial Services, MI
Project Name : Dealer Credit Mart (Nov 2022 to Feb 2025)
Role : Informatica & Snowflake Developer (Support and Development)
Tools : Informatica PowerCenter/IDMC/IICS, DB2, Snowflake, Python, Unix, Oracle, Autosys, Jira
Responsibilities:
Coordinated with stakeholders to resolve production job issues: understanding the problem, engaging the relevant parties, implementing temporary workarounds and raising problem tickets for permanent fixes.
Interacted with business teams whenever source data was unavailable.
Performed requirements gathering and analysis, estimated the time required for project completion, prepared schedules, design documentation and design reviews, and handled development, testing and deployment of application enhancements.
Participated in Scrum calls to discuss current sprint stories.
Designed, developed, and optimized ETL pipelines for data ingestion, transformation, and loading into Snowflake.
Worked on migrating Informatica PowerCenter/IICS jobs to Snowflake.
Automated data ingestion using Snowpipe to load structured and semi-structured data from AWS S3.
Involved in performance tuning of ETL jobs and SQL scripts.
Developed IICS mappings and created task flows.
Built IICS components such as mappings, mapplets, mapping tasks, task flows, business services, data replication tasks, data synchronization tasks, file listeners and hierarchical schemas.
Developed and maintained data transformations using Snowflake SQL, Streams, Tasks, and Stored Procedures.
Implemented incremental data processing using Streams and MERGE logic for efficient updates (see the Snowflake sketch after this list).
Optimized query performance using clustering, temporary tables, and materialized views.
Created external tables to access data stored in AWS S3 without loading it into Snowflake.
Developed error-handling and logging mechanisms for ETL jobs to ensure data integrity.
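A minimal Snowflake SQL sketch of the Snowpipe and Streams/MERGE bullets above; the stage, schema, table, column and warehouse names (raw.orders_s3_stage, raw.orders_raw, curated.orders, etl_wh) are hypothetical, and the external stage plus its S3 event notification are assumed to be configured separately:

    /* Continuous ingestion: Snowpipe copies JSON files landed in the S3 stage into a raw table
       (raw.orders_raw is assumed to have a single VARIANT column named payload). */
    CREATE OR REPLACE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.orders_raw
      FROM @raw.orders_s3_stage
      FILE_FORMAT = (TYPE = 'JSON');

    /* Change capture: the stream exposes only rows added since its last consumption. */
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders_raw;

    /* Incremental processing: a scheduled task merges just the changed rows into the curated table. */
    CREATE OR REPLACE TASK curated.merge_orders
      WAREHOUSE = etl_wh
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      MERGE INTO curated.orders t
      USING (SELECT payload:order_id::NUMBER       AS order_id,
                    payload:status::STRING         AS status,
                    payload:amount::NUMBER(12,2)   AS amount
             FROM   raw.orders_stream) s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.status = s.status, t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, status, amount)
                            VALUES (s.order_id, s.status, s.amount);

    ALTER TASK curated.merge_orders RESUME;  /* tasks are created suspended */

Because the raw table only receives inserts from the pipe, the stream here carries insert rows only, so the MERGE needs no METADATA$ACTION handling.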
Client : Johnson & Johnson
Project Name : Johnson & Johnson Supply Chain Application (Jun 2022 to Nov 2022)
Role : Support/Enhancements
Tools : Informatica IICS, Talend, Teradata, Unix & Oracle
Responsibilities:
Involved in operations and development.
Debugged data issues and fixed technical issues during job loads.
Worked on migrating Informatica PowerCenter jobs to IICS.
Prepared BTEQ scripts and used Teradata utilities such as FastLoad and MultiLoad.
Involved in migration activities.
Involved in support activities.
Created midstream web service calls in mapping tasks using the Web Services transformation in IICS.
Developed Informatica Cloud (IICS) mappings and tasks and performed test case validation.
Used the IICS Data Integration console to create mapping templates to bring data into the Snowflake database from Oracle and flat files.
Client : Legato Health Technologies
Project Name : MRDM STARS
Role : Tech Lead (Informatica & Teradata) (Aug 2021 to Apr 2022)
Tools : Informatica, Teradata, Unix
Responsibilities:
Involved in all phases of the SDLC and in sprint calls.
Coordinated the team at offshore.
Coordinated with onsite team members for requirement clarifications.
Interfaced with the client for requirement clarifications.
Reviewed the code, test cases and results created by the team.
Performed work estimation and ensured proper resource utilization.
Participated in organizational development initiatives such as training and process improvement.
Performed requirements gathering and analysis, estimated the time required for project completion, prepared schedules, design documentation and design reviews, and handled development, testing and deployment of application enhancements.
Client : Ooredoo
Project Name : ITDWH Support & Report Rationalization #1 Development
Role : Senior Informatica Teradata Developer (Feb 2019 to Jul 2021)
Tools : Informatica, IICS/IDMC, Talend, Teradata, MicroStrategy, IFRS, Unix & Oracle
Responsibilities:
Involved in L1 & L2 support and job monitoring.
Involved in change requests and estimation of timelines for enhancements.
Coordinated with different teams to support the application.
Analyzed data issues and fixed bugs.
Developed Teradata BTEQ scripts and Informatica mappings using transformations such as Source Qualifier, Filter, Expression, Normalizer, Lookup, Sequence Generator, Update Strategy, Joiner, Router, Sorter and Aggregator.
Gathered business requirements and implemented them in Informatica PowerCenter 10.5 and/or Informatica Intelligent Cloud Services (IICS/IDMC).
Performed performance tuning of Informatica and Teradata jobs.
Involved in Informatica & Teradata migration activities.
Trained on Talend and involved in the development of ETL jobs.
Analyzed existing code and new requirements as part of IFRS engine issue fixes.
Involved in LLD design based on an understanding of the BRD for the new Report Rationalization (RR) development project.
Analyzed the existing data model to rebuild the new data model as part of the new project.
Client : Anthem, American Health Insurance Company
Project Name : EDS
Role : Senior Informatica & Teradata Developer (May 2018 to Jan 2019)
Tools : Informatica, Teradata & Unix
Responsibilities:
Involved in FTPing EDI files (ACES, CS90, FEP, FACETS, NASCO, WGS, WDS & VA) to the BDF server through Unix scripts.
Involved in Scrum calls and estimation of story timelines.
Worked in an Agile process with code deliverables every one to two weeks.
Performed reverse engineering analysis and development for Edward tables (MBR, MBR_LANG, MBR_CNTCT & CNTCT_PREF) to get the complete history of source files.
Developed Teradata BTEQ scripts and Informatica mappings using transformations such as Source Qualifier, Filter, Expression, Normalizer, Lookup, Sequence Generator, Update Strategy, Joiner, Router, Sorter and Aggregator.
Performed lead activities at offshore to complete the assigned user stories within the sprint period without issues.
Worked on the Control-M scheduling tool.
Worked on Teradata utilities and performed SQL tuning as well.
Analyzed existing source systems and provided the required documents to the BDF team for development.
Designed and developed jobs in BDM. Worked on Hive tables.
Client : CA Technologies
Project Name : CDL Services (May 2016 to Apr 2018)
Role : Sr. Informatica Developer
Tools : Informatica, Teradata, Oracle
Responsibilities:
Understood the business requirements and was involved in the design phase.
Prepared design specs and mapping spreadsheets.
Participated in daily Scrum calls.
Worked in an Agile process with code deliverables every one to two weeks.
Developed Informatica mappings using transformations such as Source Qualifier, Filter, Expression, Normalizer, Lookup, Sequence Generator, Update Strategy, Joiner, Router, Sorter and Aggregator.
Prepared Teradata BTEQ scripts, worked on Teradata utilities and performed SQL tuning as well.
Created and maintained many BTEQ, FastLoad and MultiLoad scripts as part of data loading (a BTEQ sketch follows this list).
Performed data reconciliation between source and target systems.
Created tables, views and macros using Teradata SQL Assistant.
Developed numerous FastLoad and BTEQ scripts.
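A minimal BTEQ sketch illustrating the load-plus-error-check shape of the scripts described above; the logon, database, table and column names (edw.sales_fct, stg.sales_stg) are placeholders only, not from any client system:

    .LOGON tdpid/etl_user,etl_password;
    /* Placeholder logon; in practice credentials come from a secured logon file. */

    /* Insert only the new keys from the staging table into the target (names are illustrative). */
    INSERT INTO edw.sales_fct (sale_id, sale_dt, amount)
    SELECT s.sale_id, s.sale_dt, s.amount
    FROM   stg.sales_stg s
    WHERE  NOT EXISTS (SELECT 1 FROM edw.sales_fct t WHERE t.sale_id = s.sale_id);

    /* A non-zero return code lets the scheduler (Control-M/Autosys) flag the failure. */
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    COLLECT STATISTICS ON edw.sales_fct COLUMN (sale_dt);

    .LOGOFF;
    .QUIT 0;

FastLoad and MultiLoad follow the same overall pattern but bulk-load flat files into empty or staging tables before BTEQ applies the set-based SQL.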
Client : EMC Corporation (EMC)
Project Name : Propel Release 2 (May 2013 to Dec 2015)
Tools : SAP BW, Informatica 9.5.1, Teradata, Oracle and UNIX
Role : Developer
Responsibilities:
Involved in requirement analysis and understanding the functionality.
Prepared technical design specs.
Developed Informatica mappings using transformations such as Application Multi-Group Source Qualifier, Normalizer, Expression, Lookup, Sequence Generator, Update Strategy, Joiner, Router, Filter, Sorter and Aggregator.
Prepared test scripts for unit testing.
Worked on Teradata utilities and SQL tuning.
Fixed defects during the FAT phase and used HPQC to update defect status.
Client : Amex
Project Name : Amex DWBI-Basel (Sep 2011 to May 2013)
Tools : Informatica 8.6, DB2 and UNIX
Role : Developer
Responsibilities:
Involved in LLD document preparation.
Prepared mapping spreadsheets as per the PDM.
Developed Informatica mappings using transformations such as Source Qualifier, Normalizer, Expression, Lookup, Sequence Generator, Update Strategy, Joiner, Router, Filter, Sorter and Aggregator.
Prepared test scripts and was involved in the UT and SIT phases.
Prepared Teradata BTEQ scripts and worked on Teradata utilities.
Took responsibility as an Informatica mentor, conducting Informatica training sessions for new recruits.
Took responsibility as a DP (defect prevention) coordinator and conducted DP meetings.
Discussed with project team members the defects found in the project at each phase.
Identified the root causes in DP meetings and put action plans in place to prevent those defects from recurring.
Client : Amex
Project Name : IDN Application (May 2011 to Aug 2011)
Tools : Ab Initio, DB2 and UNIX
Role : Developer
Responsibilities:
Prepared technical design specs.
Developed the BTEQ scripts used to load data from staging to dimension tables through Informatica.
Performed T-SQL performance tuning.
Prepared test scripts for unit testing.
Fixed defects during the FAT phase and used HPQC to update defect status.
Client : Fox
Project Name : E1 Application (Dec 2010 to Apr 2011)
Tools : Informatica 8.6, Oracle 10g and Teradata
Role : Enhancement and Production Support
Responsibilities:
Prepared BTEQ scripts as per the given design documents.
Involved in SQL tuning.
Monitored the jobs.
Involved in change requests and estimation of timelines.
Coordinated with different teams to support the application.
Analyzed issues and fixed bugs.
Client : Caremark
Project Name : Pharmacogenomics (May 2010 to Nov 2010)
Tools : Informatica 8.6, Oracle 10g and UNIX
Role : Support
Responsibilities:
Monitored the jobs.
Analyzed SQL queries to identify performance issues.
Analyzed issues and fixed bugs.
Coordinated with different teams to support the application.