
Data Integration Business Systems

Location:
Ashburn, VA, 20147
Posted:
July 30, 2025

Resume:

SUMMARY

Seasoned IDMC/IICS/ETL Architect with over 14 years of experience designing and implementing data integration solutions in cloud and on-premises environments. Seeking to leverage expertise in IDMC and IICS to drive business efficiencies, enhance data quality, and streamline processes for organizations adopting cloud technologies. Experienced leader in establishing standards, delivering business solutions, and building solid working relationships with clients, staff, and senior management.

PROFESSIONAL SUMMARY

12 years of experience in the development, leadership, administration, and implementation of Data Warehouse, Data Mart, and database business systems with Informatica Data Management Cloud and Informatica Intelligent Cloud Services (IDMC/IICS).

Strong experience in all phases of development, including Extraction, Transformation, and Loading (ETL) of data from various sources, with expertise in data analysis, data migration, data validation, data cleansing, data integration, and data masking using the ETL tools IICS/IDMC and Informatica PowerCenter.

Experience with Cloud Data Integration in the creation of mappings, tasks, and task flows, with good knowledge of schedules, administration, deployments, permissions, and monitoring.

Experience in real-time integration processes that interact with REST APIs and web services and interface with various applications on the cloud.

Experience with the Snowflake cloud data platform, including Zero-Copy Cloning, Time Travel, Streams, table types, and the SnowSQL CLI for connecting to Snowflake and loading data.

Strong experience with the Ralph Kimball and Inmon data modeling methodologies. Good understanding of data modeling concepts such as star and snowflake schema modeling, change data capture (CDC), and Slowly Changing Dimensions for data warehousing/OLTP systems.

Developed, managed, and deployed complex data integration workflows using Informatica IDMC to streamline ETL processes across cloud-based and on-premises environments, ensuring data consistency and integrity.

Good understanding of Python and PySpark.

Hands-on experience loading data from S3 to Redshift using the AWS Glue process (see the sketch at the end of this summary). Knowledge of clusters, crawlers, IAM roles, and Athena for running interactive ad hoc SQL queries.

Good understanding of various AWS services, including S3, Athena, and Redshift, and of Azure services such as Blob Storage, Azure SQL, Data Lake, and Databricks, used to read and write data to cloud applications.

Utilized GitHub and Bitbucket for version control and source code management across multiple development projects.

Automated deployment pipelines with Git and CI/CD tools such as Jenkins and GitHub Actions, ensuring reliable code delivery.

Experience with Unix shell scripting and good knowledge of Control-M and AutoSys for scheduling jobs.

Identify application bottlenecks and opportunities to optimize performance by tuning ETL sources, mappings, targets, and sessions.

Knowledge and experience in Informatica administration. Highly proficient in processing tasks, scheduling sessions, importing/exporting repositories, and managing users, groups, associated privileges, and folders in Informatica.

Experience with the Agile approach; actively participated in all Agile ceremonies, including planning, retrospectives, and daily scrums.

Work closely with cross-functional teams, including business stakeholders, IT teams, and data engineers. Collaborate on data-related projects and initiatives, providing guidance and expertise.
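
To illustrate the S3-to-Redshift Glue pattern mentioned above, a minimal PySpark sketch follows; the Glue Catalog database "sales_db", table "orders_raw", connection "redshift-conn", and bucket name are hypothetical placeholders, not taken from any project below.

    # Minimal AWS Glue (PySpark) sketch: S3 -> Redshift. All names are illustrative.
    import sys
    from awsglue.transforms import ApplyMapping
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read the raw files that a crawler has already cataloged from S3
    source = glue_context.create_dynamic_frame.from_catalog(
        database="sales_db", table_name="orders_raw")

    # Rename and cast columns on the way through
    mapped = ApplyMapping.apply(
        frame=source,
        mappings=[("order_id", "string", "order_id", "string"),
                  ("amount", "string", "amount", "double")])

    # Write to Redshift through the cataloged JDBC connection, staging via S3
    glue_context.write_dynamic_frame.from_jdbc_conf(
        frame=mapped,
        catalog_connection="redshift-conn",
        connection_options={"dbtable": "public.orders", "database": "dw"},
        redshift_tmp_dir="s3://example-temp-bucket/redshift-staging/")

    job.commit()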

EDUCATION:

Master of Computer Applications from Jawaharlal Nehru Technological University, India

CERTIFICATION:

AZ-900: Microsoft Azure Fundamentals

TECHNICAL SKILLS:

Databases

Oracle Database, SQL Server, Netezza, Teradata, Snowflake, Redshift, Azure Synapse, Azure SQL DB, Vertica, MongoDB

Tools & Utilities

Informatica PowerCenter 10.1.4, IBM DataStage, Control-M Scheduler, PuTTY, WinSCP, PL/SQL Developer, TOAD, Jenkins, GitHub, Jaspersoft, Qlik, SQL Workbench, Jira, ServiceNow, XML Notepad, Postman, S3 buckets

Cloud Platforms

AWS, Azure (ADF, SQL DB), Informatica Cloud (IDMC/IICS)

Languages

PL/SQL, SQL, Python, PySpark, Databricks

Operating Systems

Windows/Linux/Unix

Product Management Methodology

Agile Methodology, Waterfall Methodology

EXPERIENCE SUMMARY

Project: Freddie Mac - McLean, US 01/2025 – Present

IICS Integration Consultant

Project Description

Freddie Mac operates through two primary business segments: Single-Family and Multifamily. "eSafe", one of the Single-Family data portfolios, takes its name from the idea that its products serve as a "safe" for the data coming through Loan Advisor. When lenders submit loans, each unique loan is given a submission link ID (SLID). Products such as SLS, PML APP, DOCSS, and others use this SLID to accurately link loans across the Loan Advisor applications and to properly store, centralize, and provide data in a consumable format for analytical and data science needs.

The objective of this project is to enhance the existing FDP (Freddie Data Platform) by integrating a new stage that supports both ETL applications (IICS/IDMC) and Python. The technical solution involves IICS services such as Cloud Data Integration (CDI), Data Ingestion and Replication, and Cloud Application Integration (CAI).

RESPONSIBILITIES:

As an IICS Developer, provided estimates for development and migration activities and communicated with project and technical teams to identify, define, and translate business requirements into technology specifications.

Prepared the release-wise breakup of the Informatica assets along with the Freddie Mac SME.

Created Data Ingestion and Replication pipelines to load data from AWS S3 to the Snowflake database and transfer large numbers of files between on-premises and cloud repositories.

Understood the existing ETL frameworks and modified the mapping configuration tasks to set source and target connections and definitions, standardize cache directories, update $PMRootDir variables, etc.

Created new mapping parameters and modified existing parameters and variables in DEV and SIT to test the ETL framework.

Created new connections for MongoDB Atlas, AWS S3, Snowflake, and flat files to read data and load it into cloud storage.

Configured and rewired the existing MongoDB brownfield environment to the new greenfield environment to process the data collections as-is.

Designed and developed mappings, Mapping Configuration Tasks (MCTs), and task flows in IICS Data Integration, and enhanced the existing task flows.

Designed and developed ETL processes to read and write data to cloud applications, including AWS and the Snowflake cloud database.

Created Web Services transformations in Informatica Intelligent Cloud Services (IICS) to integrate applications and access, transform, or deliver data through web services.

Knowledge of PySpark and Step Functions for analyzing EC2 and EMR logs, as most of the existing frameworks are built as PySpark/Python automations.

Built and executed ETL pipelines for batch processing of incremental and full loads from flat-file sources to S3 (see the sketch at the end of this project).

Analyzed and converted SQL and created DDLs in Snowflake.

Published DMC reports on a timely basis, along with the UTC reports and all the snapshots.

Orchestrated jobs using Control-M.

Worked on code migration using CI/CD through Jenkins pipelines and created release notes for the deployment activities.

Merged code using the Bitbucket process to ensure the latest version is in master whenever there are code changes.

Applied performance-tuning techniques at the mapping and task levels while loading data to and from targets using IICS/IDMC.

Introduced GitHub as a version control plugin for IICS.

Developed configuration files, environment files, and YAML files for pipeline execution and code migration.

Participated in Agile product backlog grooming and sprint planning, and maintained the JIRA board for the tasks.

Involved in the Unit Testing, SIT, and UAT phases of the projects.

Assisted customers with Go-Live activities.

Environment: ETL - Informatica Intelligent Cloud Services (Cloud Data Integration, Cloud Application Integration), Snowflake Cloud Database, AWS Cloud, Dremio, WinSCP, Control-M
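
A minimal sketch of the flat-file-to-S3 batch step referenced above; the bucket name, prefix, and local paths are hypothetical placeholders, not actual Freddie Mac resources.

    # Illustrative batch upload of flat-file extracts to S3 using boto3.
    import pathlib
    import boto3

    s3 = boto3.client("s3")
    LANDING_BUCKET = "fdp-landing-example"  # hypothetical bucket name

    def upload_batch(extract_dir: str, prefix: str) -> int:
        """Upload every flat file in extract_dir under the given S3 prefix."""
        count = 0
        for path in pathlib.Path(extract_dir).glob("*.csv"):
            s3.upload_file(str(path), LANDING_BUCKET, f"{prefix}/{path.name}")
            count += 1
        return count

    # A full load uploads the whole directory; an incremental run would point
    # at only the files produced since the last watermark.
    print(upload_batch("/data/extracts/full", "esafe/full/2025-01-31"))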

Project: MassMutual Life Insurance, Springfield, US 05/2024–01/2025

Data Solutions Lead Analyst

Project Description

MassMutual Enterprise Technology ("ETX"), along with the MassMutual Global Competency Center, engaged the supplier to migrate their current landscape from an on-premises setup to an AWS-based ecosystem. As part of this migration, the following technology/tool migrations are in scope:

Informix DWH to Vertica DWH (Tier 1): Informix-to-Vertica migration.

Informatica PowerCenter to IDMC (Tier 2) and Data Extracts: PowerCenter-to-IDMC migration.

Responsibilities:

Worked closely with the business to provide estimates for development and migration activities, communicating with project and technical teams to identify, define, and translate business requirements into technology specifications.

Assessed and planned the existing PowerCenter environment, including workflows, mappings, and dependencies; identified the scope of migration, data volumes, and data validation needs.

During the PC-to-IDMC migration, prioritized critical data elements and identified the data transformations needed.

Migrated Informatica on-premises assets to the IDMC cloud using RPA, updating them with the right configurations and standards.

Designed and developed mappings, Mapping Configuration Tasks (MCTs), and task flows in IICS Data Integration.

Executed Airflow DAG pipelines for high-volume tables to replicate data from PROD to the lower environments, DEV and QA (see the sketch at the end of this project).

Converted Informatica PowerCenter assets to IDMC, including encryption/decryption UDFs as applicable.

Updated source/target definitions from Informix to ODBC, standardized cache directories, updated $PMRootDir variables, changed sources from flat files to stage tables, etc.

Loaded data from initial staging into the Vertica data mart through IDMC transformations.

Created Voltage Functions/Voltage Protect to protect sensitive columns by masking data, and Voltage Access to view the data.

Compared data between Informix PROD and the Vertica DB using the Data Contrast tool to ensure column-level agreement between the systems; similarly compared the other sources against the Vertica DB.

Published Data Contrast reports on a timely basis, along with the UTC reports and all the snapshots.

Orchestrated jobs using Maestro.

Made configuration changes in Unix scripts for the generation of data extracts from the Vertica data mart, including the migration of other Unix scripts unrelated to data extracts.

Performance-tuned the Informatica ETL code at the mapping and task levels.

Actively involved in code migration and created release notes for the deployment activities.

Coordinated UAT with the business and assisted customers with Go-Live activities.

Environment: Informatica Cloud IDMC/IICS, Informatica PowerCenter 10.5.2, Vertica, DBeaver, WinSCP, ServiceNow, Jira, Maestro, Informatica RPA (Robotic Process Automation) Tool, Data Contrast, IBM Informix
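
The PROD-to-lower-environment replication noted above can be expressed as an Airflow DAG along these lines; the DAG id, table list, and copy step are illustrative assumptions, not the actual MassMutual pipeline.

    # Illustrative Airflow DAG: replicate high-volume tables from PROD to DEV/QA.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    TABLES = ["policy", "claims", "billing"]  # hypothetical high-volume tables

    def replicate(table_name: str) -> None:
        # Placeholder for the real copy step (e.g., export from PROD Vertica,
        # load into DEV/QA); intentionally simplified for illustration.
        print(f"replicating {table_name} from PROD to DEV/QA")

    with DAG(
        dag_id="prod_to_lower_env_replication",
        start_date=datetime(2024, 5, 1),
        schedule=None,  # triggered on demand
        catchup=False,
    ) as dag:
        for table in TABLES:
            PythonOperator(
                task_id=f"replicate_{table}",
                python_callable=replicate,
                op_args=[table],
            )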

Project: State Of New York, IES Department, Albany, US 07/2022 – 05/2024

IDMC Integration Lead

Project Description:

A comprehensive data management, conversion, and synchronization strategy that lays the foundation for the phased transition from multiple legacy systems to New York's new IES solution, which supports four different systems: WMSU for Upstate, WMSN for New York, BICS, and CBIC. This project consists of the following primary components:

Migration of As-Is data into the IES To-Be system.

Design, develop, and implement a data bridging solution to synchronize the data in real-time and batch modes.

RESPONSIBILITIES:

●Worked with the architect and program managers on requirements understanding, analysis, and project coordination for the PowerCenter-to-IDMC cloud migration.

●As the Integration Lead, provided estimates for migration activities and development efforts, handled task alignment and resource allocation, and prepared a process overview for the offshore team.

●Assessed and planned the existing PowerCenter environment, including workflows, mappings, and dependencies; identified the scope of migration, data volumes, and data validation needs.

●Prioritized critical data elements and identified the data transformations needed during the PC-to-IDMC/IICS migration.

●Worked closely with Informatica Support during the migration of PowerCenter to the IDMC cloud.

●In PowerCenter, exported the relevant workflows, sessions, and mappings in XML format using the Repository Manager.

●Executed the IDMC/IICS task to migrate the PowerCenter workflows to IICS; validated data integrity and performed the necessary checks during the migration phase.

●Worked on Snowflake TASK creation to schedule and automate Snowflake jobs.

●Hands-on experience bulk loading and unloading data into and out of Snowflake tables using the COPY command (see the sketch at the end of this project).

●Extensively used IDMC cloud connectors for Oracle, flat files, Amazon S3, Snowflake, REST V2, etc.

●Worked on advanced transformations such as Hierarchy Builder, Hierarchy Parser, Web Services, etc.

●Performance-tuned the Informatica ETL code at the mapping and task levels.

●Created IDMC/IICS API processes to call APIs using service connectors through an API connection.

●Involved in IICS Application Integration: tested endpoint URLs by passing input parameters to verify that responses matched the expected results.

●Read data from AWS S3, performed ETL in Glue to transform and clean the data, and stored the output in an S3 location for querying and metrics with Athena.

●Involved in development and unit testing, and helped the team mock up test data for testing.

●Actively involved in code migration and created release notes for the deployment activities.

Environment: Informatica Cloud IDMC/IICS, Informatica PowerCenter 10.5.2, Oracle, PL/SQL Developer, PuTTY, WinSCP, Snowflake, AWS S3, Redshift.
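
For reference, bulk loading a Snowflake table with the COPY command, as noted above, looks roughly like this; the stage, table, and connection parameters are hypothetical placeholders.

    # Illustrative Snowflake bulk load via the Python connector and COPY INTO.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="LOAD_WH", database="IES_DB", schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Load all staged CSV files into the target table
        cur.execute("""
            COPY INTO clients
            FROM @raw_stage/clients/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        print(cur.fetchall())  # per-file load results
    finally:
        conn.close()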

Project: NASBA-TN, US 07/2020 – 06/2022

Lead IICS Developer/CGI, India

RESPONSIBILITIES:

●Worked as the team lead for the ETL and Reporting teams.

●Understood the business rules and sourced data from multiple source systems using IDMC/IICS.

●Created new IDMC/IICS Data Integration jobs based on the BRD documents, and modified/enhanced existing jobs.

●Created backup jobs to a separate schema/DB for offline reporting without impact on the current ETL load.

●Created dependencies between Task Flows using File Listener components.

●Worked on IDMC Data Integration components – File Listener, Hierarchical Schema, Saved Query.

●Extensively used the cloud connectors S3, Azure SQL DB, Data Lake, Azure Blob, Snowflake, Oracle, etc.

●Extracted data from Snowflake and pushed it into the Azure warehouse instance to support reporting requirements.

●Extensively used Informatica Cloud Data Synchronization, Data Replication, and mapping configuration tasks during integration of cloud-based applications.

●Debugged mappings and used session log files to trace errors that occurred while loading.

●Worked on the IICS Data Integration transformations Hierarchy Builder, Hierarchy Parser, and Web Services to process XML/JSON/Swagger files.

●Worked with SnowSQL to connect to Snowflake, execute SQL queries, and perform all DDL and DML operations, including loading data into and unloading data out of database tables.

●Worked on data pipelines to process data from Blob Storage containers into the Data Lake using Azure Data Factory ETL processes with various activities, and queried data using Azure SQL DB.

●Involved in IICS Application Integration: tested endpoint URLs by passing input parameters to verify that responses matched the expected results (see the sketch at the end of this project).

●Involved in the Development, Unit Testing, SIT, and UAT phases of the project; scheduled jobs through AutoSys.

●Understood UNIX scripting for validations.

●Actively involved in code migration and created release notes for the deployment activities.

●Involved in agile product backlog grooming and sprint planning and maintained JIRA board for the tasks.

Environment: Informatica Cloud IDMC/IICS, Oracle, TOAD, PuTTY, WinSCP, Data Lake, Confluence, Stash, S3, Snowflake, ADF, Python, Databricks, GitHub, JIRA
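
Endpoint testing of a published IICS Application Integration process, as described above, can be sketched as follows; the URL, payload, and expected fields are hypothetical placeholders, not the real NASBA service.

    # Illustrative endpoint test: POST input parameters, check the response.
    import requests

    ENDPOINT = "https://example.informaticacloud.com/active-bpel/rt/LookupLicense"
    payload = {"licenseId": "12345", "state": "TN"}  # hypothetical inputs

    response = requests.post(ENDPOINT, json=payload, timeout=30)
    response.raise_for_status()

    body = response.json()
    # Assert the response carries the fields the process is expected to return
    assert body.get("status") == "ACTIVE", body
    print("endpoint returned the expected result:", body)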

Project: iDATA Warehouse, IKANO-Sweden 07/2019 – 06/2020

Lead Analyst/ CGI, India

RESPONSIBILITIES:

●Actively engaged with the onshore team, Business Analysts, and SMEs to go through the requirements/user stories.

●Understood the ETL functionality and documented it by preparing HLD and LLD documents that suit the business requirements.

●Involved in Agile product backlog grooming and sprint planning and maintained JIRA board for the tasks.

●Led the offshore team, working closely with teams to help them understand the requirements, assign tasks, and provide status updates.

●Developed new mappings and modified existing mappings to meet the business logic for loading the target system.

●Actively involved in code reviews with peer members of the project.

●Participated in providing the estimates for development activity efforts.

●Maintained mapping parameters to make the mappings more flexible.

●Hands-on Informatica PowerCenter administration, granting appropriate access to users/developers across the DEV, QA, and PROD environments.

●Involved in code migration activities: pushed the code to SVN once the DEV process was completed, then deployed to other environments through Jenkins build parameters.

●Using the Informatica PowerCenter Repository Manager, created deployment groups to migrate code to higher environments.

●Reviewed the DDL and DML created by the offshore team and worked with DBAs to execute them in the Dev, QA, and Prod environments after sign-off from the QA team.

●Worked with the testing team during UAT and SIT to fix bugs and redeploy the code.

●Tested the developed functionality to validate the code, preparing test cases for the various environments.

●Prepared deployment checklists and operational manuals.

●Coordinated and monitored project progress to ensure timely Jira updates and complete delivery of the tasks without issues.

Environment: PowerCenter 10.1.4, TOAD, Jenkins, UNIX, WinSCP, JIRA, SVN, Oracle.

Project: TalkTalk, UK 06/2017 – 06/2019

Senior Informatica Developer/ CGI, India

RESPONSIBILITIES:

●Worked closely with the Project Manager to develop and update the task plan for ETL work and to keep the manager aware of any critical task issues and dependencies on other teams.

●Led the offshore team and coordinated with onsite and business teams on requirements understanding and ad hoc tasks.

●Led the deployment activities for production releases.

●Created new users and assigned privileges, providing access based on roles and adding users to the right groups.

●Constantly monitored the infrastructure to ensure it was readily available for developers and support team members to perform their tasks.

●Ensured the delivered ETL code runs and conforms to specifications and design guidelines.

●Performed root cause analysis on all processes, resolved production issues, validated data, performed routine tests on the databases (Netezza, SQL DB, Oracle), and provided support for all ETL applications.

●Experience in debugging, error handling and performance tuning of sources, targets, mappings, and sessions with the help of error logs generated by Informatica server.

●Identify application bottlenecks and opportunities to optimize performance.

●Provide resolution to an extensive range of complicated ETL-related problems, proactively and as issues surface.

●Performed file-level verification tasks via UNIX shell scripts and command-line utilities (see the sketch at the end of this project).

●Met service level agreements for production support response and resolution.

●Experience with change management tools and processes, including source code control, versioning, branching, defect tracking and release management.

●Actively involved in priority issues (P1, P2), coordinating with upstream and downstream systems on calls to provide fixes.

●Ability to identify system impact for small and large-scale initiatives.

●Provided training and mentoring for junior developers to support steady-state operations.

Environment: Informatica 10.1.1, PL/SQL Developer, Netezza, UNIX, WinSCP, PuTTY, JIRA, QlikView
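
The file-level verification above was done with UNIX shell scripts; a Python analogue of the same idea (a row count and checksum to compare against the supplier's control totals) might look like this, with hypothetical paths.

    # Illustrative file-level verification: row count plus checksum.
    import hashlib
    import pathlib

    def file_summary(path: str) -> tuple[int, str]:
        """Return (line_count, md5) for a delivered feed file."""
        data = pathlib.Path(path).read_bytes()
        return data.count(b"\n"), hashlib.md5(data).hexdigest()

    rows, digest = file_summary("/inbound/usage_feed_20190601.dat")
    print(f"rows={rows} md5={digest}")
    # Compare against the counts and checksum in the supplier's control file.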

Project: BELL-Network Data Warehouse, CANADA 12/2014 – 05/2017

Senior Informatica Developer/ CGI, India

RESPONSIBILITIES:

●Developed and maintained ETL (data extraction, transformation, and loading) mappings using Informatica Designer to extract data from multiple source systems, comprising databases such as Oracle 10g and SQL Server, into the staging area and the data marts.

●Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL design specifications.

●Understood the ETL specification documents for mapping requirements and created mappings using transformations such as Aggregator, Lookup, Router, Joiner, Union, Sorter, and Update Strategy.

●Designed new mappings and modified the existing ETL as required; performed impact analysis of changes made to existing mappings and provided feedback.

●Prepared unit test scripts and performed unit testing by running the required queries in TOAD per the SQ Override query, with the required screenshots.

●Involved in the preparation of Deployment plan/Method of Procedure and Operational Manual documents during implementation phases of the projects.

●Met service level agreements for production support response and resolution.

●Prepared deployment checklists and reviewed them with the onsite team; created/updated the technical specifications.

●Understood UNIX scripting for validations and FTP transfer of source files.

●Participated in providing project estimates for the offshore development team's efforts.

Environment: Informatica, Oracle, TOAD, Linux, PuTTY, WinSCP, JIRA.

Project: Marketing Sales, Supply Chain Management - Michelin, NA 03/2011 – 12/2013

DataStage Developer/ CGI, India

RESPONSIBILITIES:

●Modified existing parallel jobs and developed new parallel jobs.

●Prepared technical design documents based on business requirements.

●Primarily provided L3 support.

●Ran jobs on demand when requested.

●Prepared batch plans for scheduling in Maestro.

●Monitored applications module-wise, performed health checks, and sent status reports.

●Monitored the incident queue.

●Created migration and instruction manuals for ETL jobs during deployment.

●Prepared unit test cases and unit test results documents.

Environment: DataStage 7.5, Oracle, TOAD, PuTTY, WinSCP, JIRA, Message Broker, Manage Now

Project: Financial Data Warehouse, AON HEWITT 03/2010 – 03/2011

ETL Tester/AON Hewitt, India

RESPONSIBILITIES:

●Understood the ETL process and functionality that fetches data from different sources, such as Oracle, DB2, and flat files, and loads it into the target warehouse (Oracle).

●Extensively worked with the Informatica components Mapping Designer, Workflow Manager, and Workflow Monitor.

●Used Informatica Workflow Manager and Workflow Monitor to monitor sessions, check session status, and read the session logs.

●Developed comprehensive test plans and test cases covering all aspects of the ETL process.

●Executed test cases to validate the data extraction, transformation, and loading processes.

●Verified that data was accurately extracted from source systems, correctly transformed, and loaded into the target system without any loss or corruption.

●Developed SQL queries to validate data transformations and ensure that data integrity was maintained (see the sketch at the end of this project).

●Verified that the ETL process accurately moved and transformed data according to the defined rules.

●Tested for primary keys, default values, and other integrity constraints.

●Gained exposure to HP Quality Center (HPQC), assigning the testing BRD document and test cases for the corresponding project.

●Performed unit testing, validated the data in PL/SQL Developer, and documented the unit test results in DEV and QA.

●Involved in end-to-end development and unit testing to validate the flow of data; documented the test results.

●Data quality validation included creating test cases for testing and helping with UAT.

●Understood UNIX scripts for job scheduling in Control-M.

●Actively participated in Disaster Recovery tests and CMRs.

Environment: Informatica 8.5, WinSCP, UNIX, PL/SQL Developer, Control-M
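
A sketch of the kind of source-versus-target validation query used in this testing role; the table names and connect string are hypothetical placeholders.

    # Illustrative ETL validation: compare staging and warehouse row counts.
    import cx_Oracle

    conn = cx_Oracle.connect("user/password@dwh")  # placeholder connect string
    cur = conn.cursor()

    cur.execute("SELECT COUNT(*) FROM stg_accounts")  # source/staging count
    source_count = cur.fetchone()[0]

    cur.execute("SELECT COUNT(*) FROM dw_accounts")   # target count
    target_count = cur.fetchone()[0]

    assert source_count == target_count, (source_count, target_count)
    print("row counts match:", source_count)
    conn.close()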


