
Senior Data Integration and Quality Engineer

Location:
Frisco, TX
Posted:
July 19, 2025


Venkat Chalamalasetty ***************@*****.***

Senior Data Integration and Quality Engineer
Ph #: 214-***-****

SUMMARY

Senior Data Integration Engineer with 8+ years of experience delivering robust ETL pipelines and data quality frameworks using Informatica IICS and Oracle PL/SQL across the Financial Services, Retail, Health Care, Government Welfare Services, and Property & Casualty domains. Demonstrated success in cloud migration, real-time integration, CI/CD enablement, and compliance-driven reporting. Eager to contribute to cross-functional data platforms in fast-paced environments.

TECHNICAL SKILLS

• ETL & Data Integration Tools: SSIS, IDMC/IICS, Informatica PowerCenter, Informatica CDQ, Informatica Data Quality (IDQ), AXON, EDC

• Master Data Management (MDM): Informatica C360, TIBCO CIM

• Cloud Platforms & Services: Microsoft Fabric, Azure (Data Factory, Databricks, Functions, Synapse Analytics, Key Vault, Logic Apps, Monitor, Blob Storage), AWS (S3), GCP

• Data Engineering & Processing: Delta Live Tables, Unity Catalog, Data Lakehouse, ADLS Gen2, Snowflake, Auto Loader (Databricks), Change Data Capture (CDC), PySpark, Python, SQL, T-SQL, Java, J2EE, Hibernate

• Databases & Query Languages: Oracle DB, SQL Server, PostgreSQL, Teradata, KQL DB, DynamoDB, Cosmos DB, OLTP, OLAP

• Data Modeling & Visualization: Erwin, ER Studio, Microsoft Visio, Oracle SQL Developer Data Modeler, Power BI, Tableau, Qlik

• DevOps & Workflow Automation: Azure DevOps, GitHub, Jenkins, Jira, Cron Jobs

• Messaging & Application Servers: MQ, JMS, IBM WebSphere Application Server

• Standards & Protocols: HL7 Standards

• Operating Systems: Linux, Unix, Windows

CERTIFICATIONS

Neo4j Certified Professional

Informatica MDM Multidomain Developer Professional
Informatica Cloud Data Warehouse & Cloud Data Lake

PROFESSIONAL EXPERIENCE

Sr. Data Integration Engineer | Regal Rexnord, Milwaukee | September 2024 – June 2025

• Designed and implemented data integration pipelines delivering cleansed, standardized data for a single version of the customer record across Regal Rexnord, moving data from sources such as flat files, Oracle DB, and Salesforce into MDM, Snowflake, and the EDW using Informatica Intelligent Cloud Services (IICS/IDMC).

• Built real-time integration tasks using the RestV2 Connector, Swagger files, and REST APIs in IICS CDI.
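
Real-time integration tasks like these typically pull nested JSON from a REST endpoint and flatten it into relational rows before loading. A minimal Python sketch of that flattening step (the payload and field names are illustrative, not from any actual system):

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested JSON object into a single-level dict,
    the shape a relational staging table expects."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# Simulated REST response (illustrative field names, not a real payload)
payload = json.loads(
    '{"id": 42, "name": "Acme", "address": {"city": "Frisco", "zip": "75034"}}'
)
row = flatten(payload)
print(row)  # {'id': 42, 'name': 'Acme', 'address_city': 'Frisco', 'address_zip': '75034'}
```

In a real IICS CDI task this mapping is configured visually, but the underlying transformation is the same shape.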

• Handled environment and project setup: folder creation, connection creation, and the creation and assignment of users, roles, and permissions.

• Improved the data quality score by 25% across customer records through CDQ rule enforcement.

• Migrated code assets between organizations; implemented and maintained multiple CI/CD pipelines across several Orgs and Sub-Orgs.

• Researched and resolved operational and performance issues on production systems; reduced ETL runtime by 30% by applying Pushdown Optimization to key workflows.

• Collaborated with cross-functional teams to gather MDM and data integration requirements from business units and stakeholders across the organization.

• Worked closely with database administrators and system architects to tune ETL processes and data workflows, ensuring efficient data processing and storage through partitioning and Pushdown Optimization.

Lead Integration (IICS) Developer | Travelers Insurance, Hartford | November 2022 – July 2024

• Designed and implemented data quality rules and data integration pipelines delivering cleansed, standardized data for a single customer record across the Travelers NA Consumer domain, moving data from Oracle DB, flat files, and Salesforce into MDM, Snowflake, Azure Blob Storage, and the EDW using Informatica Intelligent Cloud Services (IDMC/IICS) and Azure Data Factory.

• Profiled data from various source systems to identify data quality issues, created Data Quality Scorecards, reusable data quality rules and used these rules in the ETL pipelines to cleanse/standardize the data before moving it to the target systems. Validated/enriched address data using the Address Verifier asset in CDQ.
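
Reusable standardization rules and a scorecard metric of the kind described here can be sketched in plain Python (the rules and field names are illustrative stand-ins for the actual CDQ assets):

```python
def standardize_address(rec):
    """Apply simple, reusable standardization rules to an address record.
    Rules (illustrative): collapse whitespace, uppercase the state code,
    keep only the 5-digit ZIP prefix."""
    out = dict(rec)
    out["street"] = " ".join(rec["street"].split())
    out["state"] = rec["state"].strip().upper()
    out["zip"] = rec["zip"].strip()[:5]
    return out

def quality_score(records, required=("street", "state", "zip")):
    """Percentage of records whose required fields are all non-empty --
    a crude stand-in for a data-quality scorecard metric."""
    if not records:
        return 0.0
    ok = sum(all(r.get(f, "").strip() for f in required) for r in records)
    return round(100.0 * ok / len(records), 1)

recs = [
    {"street": "  123  Main St ", "state": "tx", "zip": "75034-1234"},
    {"street": "", "state": "TX", "zip": "75001"},
]
clean = [standardize_address(r) for r in recs]
score = quality_score(clean)
print(score)  # 50.0 -- one record still has an empty street
```

In CDQ these would be reusable rule assets applied inside mappings and mapplets rather than code, but the cleanse-then-measure flow is the same.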

• Developed complex ETL mappings using IDMC/IICS components to extract, transform, and load data from source systems into target systems or data warehouses while ensuring high levels of accuracy.

• Wrote PL/SQL override queries using database joins and stored procedures to extract data for the ETL processes.

• Built real time integration tasks using Rest APIs, RestV2 Connector, Swagger files in IDMC/IICS CDI.

• Collaborated with cross-functional teams to gather MDM and data integration requirements from various business units and stakeholders within the organization.

• Worked closely with database administrators and system architects to tune ETL processes and data workflows, ensuring efficient data processing and storage.

• Enabled seamless cloud migration by transitioning legacy PowerCenter code to IICS across environments. In-depth knowledge of migration strategies, tools, and best practices for transitioning from PowerCenter to IICS/IDMC.

• Performed CI/CD using GitHub to promote code from Dev to higher environments.

• Monitored production job runs through Autosys, troubleshooting any issues that may arise during runtime, proactively addressing bottlenecks resulting from infrastructure limitations on both cloud-based environments & on-premises resources.

Lead Integration / Data Quality Developer | Capgemini / CVS Health, Raleigh | September 2019 – October 2022

• Designed, developed, tested, and deployed end-to-end data integration solutions using Informatica IDMC (CDI, CAI) with flat files, Oracle, Salesforce, Snowflake, and Amazon S3.

• Ingested flat-file data from an Amazon S3 bucket and Oracle into a Snowflake database using IICS Data Integration services.

• Developed Real time integration pipelines using IDMC CAI and APIs to ingest data from various source systems into Snowflake.

• Collaborated with business stakeholders to define and document the implementation and execution of data quality rules.

• Defined data quality and data standardization rules in CDQ and applied these rules in mappings and mapplets in IICS to migrate cleansed data from source to target systems.

• Developed highly optimized stored procedures, functions, and database views to implement the business logic.

• Performed data management activities such as data model management, data quality/cleansing activities, data retention activities, data integration, and enforcing data security.

• Performed auto-scheduling of IICS tasks using Autosys. Successfully delivered a significant number of integration components.

MDM / ETL (IICS) Developer | Capgemini / Mitsubishi UFJ Financial Group (MUFG), Jersey City | October 2017 – August 2019

• Designed and implemented MDM processes to ensure a single, accurate view of critical business data across the organization using Informatica MDM and IDMC/IICS.

• Developed, tested, and maintained ETL processes using Informatica Intelligent Cloud Services (IDMC/IICS) to extract, transform, and load data from sources such as Oracle, the EDW, Snowflake, and Azure.

• Implemented data quality standards and metrics, conducting regular audits to ensure data accuracy and integrity.

• Integrated data from diverse sources, ensuring seamless data flow and consistency across systems.

• Worked with business analysts, data architects, and IT teams to gather requirements and ensure alignment with business objectives.

• Created and maintained detailed documentation for MDM processes, ETL workflows, data mappings, and best practices.

• Monitored and optimized the performance of ETL processes and MDM workflows to ensure efficiency and speed.

• Developed and executed test plans to validate data integrity and quality after ETL processes were implemented.

• Provided training and support to team members and stakeholders on MDM and ETL best practices.

• Ensured adherence to data governance policies and regulatory requirements, implementing the necessary controls and audits.

Informatica MDM Consultant | The JM Smucker Company, Orrville | January 2017 – October 2017

• Designed, configured, developed, tested, and implemented a Master Data Management solution (EDI Trade Partner and Item domains) using Informatica MDM 10.1 HF2, Java, Oracle 11g, Oracle WebLogic Application Server, and SOA.

• Architected and developed the MDM solution to integrate a variety of downstream systems using SIF API framework and Service Oriented Architecture (SOA).

• Worked closely with business and data steward team to get a good understanding of the data and to capture business requirements.

• Built the data model, configured mappings, cleanse rules, match and merge rules, trust configuration, queries and packages, batch groups in the MDM Hub.
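
The match-and-merge configuration mentioned above follows the classic weighted-score, three-band pattern. A toy Python sketch of the idea (weights, thresholds, and field names are illustrative, not the actual MDM Hub rules):

```python
from difflib import SequenceMatcher

# Illustrative weights -- real MDM Hub match rules tune these per column
WEIGHTS = {"name": 0.6, "city": 0.4}

def match_score(a, b):
    """Weighted fuzzy similarity between two candidate master records."""
    return round(sum(w * SequenceMatcher(None,
                                         a.get(f, "").lower(),
                                         b.get(f, "").lower()).ratio()
                     for f, w in WEIGHTS.items()), 3)

def decide(a, b, auto_merge=0.9, review=0.7):
    """Three-band decision: auto-merge, queue for data-steward review,
    or keep as distinct records (thresholds are illustrative)."""
    s = match_score(a, b)
    if s >= auto_merge:
        return "merge"
    if s >= review:
        return "review"
    return "distinct"

r1 = {"name": "Acme Industrial Corp", "city": "Cincinnati"}
r2 = {"name": "ACME Industrial Corporation", "city": "Cincinnati"}
print(decide(r1, r2))  # merge
```

The Hub evaluates such rules in batch over match-token ranges; the sketch only shows the scoring and banding logic for a single pair.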

• Developed and implemented the security model for the MDM solution.

• Worked with data governance platforms such as Collibra and with metadata management tools.

• Developed integration pipelines to move data from source to Informatica landing layer using Dell Boomi.

• Developed ETL mappings and data quality rules using Informatica PowerCenter, Data Quality, and Data Analyst.

• Developed IDD user exits to implement custom business rules and workflows.

MDM Developer (Lead) / Solution Architect | TCS / The Kroger Co., Cincinnati | May 2012 – November 2016

• Designed, developed, tested, and implemented the MDM solution for the Item domain using TIBCO CIM, SSIS, Java, and DB2.

• Accountable for implementing and maintaining master data hubs and applications supporting Kroger’s Master Data Management initiatives.

• Designed and developed workflows with Business Studio to manage data governance and ensure data quality within the MDM system using SSIS and Web Services.

• Performed data cleansing, transformation, and ETL using SQL Server Integration Services (SSIS).

• Designed and configured data cleansing, validation, and standardization rules.

• Developed Cron jobs and used the TIBCO File Watcher utility to automate scheduling for various import/export and synchronization jobs.
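
A cron schedule is just five time fields matched against the clock. The toy Python sketch below shows that matching logic for a tiny subset of cron syntax (literal values and comma lists only; real cron also supports ranges, steps, and Sunday-as-0 day-of-week numbering):

```python
from datetime import datetime

def field_matches(expr, value):
    """Match one cron field: '*' or a comma-separated list of integers."""
    if expr == "*":
        return True
    return value in {int(p) for p in expr.split(",")}

def cron_due(schedule, when):
    """True if a 5-field cron expression (minute hour day month weekday)
    fires at `when`. Note: real cron counts Sunday as 0 while Python's
    weekday() counts Monday as 0 -- this toy ignores that mismatch."""
    minute, hour, dom, mon, dow = schedule.split()
    return (field_matches(minute, when.minute)
            and field_matches(hour, when.hour)
            and field_matches(dom, when.day)
            and field_matches(mon, when.month)
            and field_matches(dow, when.weekday()))

# "Run the nightly import at 02:30" -- an illustrative schedule
print(cron_due("30 2 * * *", datetime(2025, 7, 19, 2, 30)))  # True
```

In practice the crontab entry invokes the import/export script directly; the sketch only illustrates how the schedule fields gate execution.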

Sr. MDM Developer / Data Modeler | Rent-A-Center, Plano | July 2011 – April 2012

• Designed, developed, tested, and implemented MDM solutions for Customer and Store domains using TIBCO CIM, SSIS, Java, and Oracle.

• Responsible for designing, developing, implementing, and maintaining master data hubs and applications supporting Rent-A-Center’s Master Data Management efforts.

• Developed an email notification system to trigger alerts based on application events, such as entering the Store Number or setting the Actual Opening Date, using SSIS and Web Services.

• Performed data cleansing, transformation, and ETL operations with SQL Server Integration Services (SSIS).

• Wrote Cron jobs for scheduling nightly data imports into and data exports from the MDM system using the TIBCO File Watcher utility.

Sr. Software Engineer / Data Modeler | Dept. of Children and Families (DCF), Tallahassee | September 2000 – June 2011

• Developed the FSFN Application Framework, a non-proprietary framework based on Apache Jakarta Struts implementing the Model-View-Controller (MVC) design pattern, using Java 2, J2EE, Hibernate, WebLogic 9.2, DB2 v8.1.11, and BusinessObjects XI Release 2.

• Collaborated with data analysts, data architects, and business stakeholders to design and develop the data model for the data warehouse.

• Created summary reports using Crystal Reports and listing reports with Desktop Intelligence.

• Developed and maintained business logic (entity and session beans) for assigned application modules.

• Ensured compliance with healthcare standards by applying HL7 in building the Phoenix system.

• Engaged with end users for requirement gathering.

• Mentored and coordinated team members.

• Provided production support for the application.

Consultant | Simtech Systems Ltd, India | June 1996 – July 2000

• Served on the architecture team for the design and implementation of systems. Developed a monolithic application, generated UI screens using JSP, HTML, CSS, JavaScript, Servlets, and the Struts framework, and handled production support issues for existing applications.

• Implemented the Struts DispatchAction class and form bean classes using the Struts framework. Handled client- and server-side validations using the Struts Validator Framework validate plug-in.

• Implemented multithreading to handle multiple requests with high performance. Developed model components using session beans and used local message-driven beans (MDBs) to interact with sessions via EJB.

• Created Servlets that route submissions to the appropriate Enterprise JavaBean (EJB) components and render the retrieved information, and EJB session beans to process requests from the user interface using OSS.

• Used the Java Message Service (JMS) for exchanging information and messaging. Designed and developed message-driven beans (MDBs) that consumed messages from JMS queues.

• Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.

• Developed a business continuity plan for the SQL Server databases accessed via JDBC drivers. Created many stored procedures and scheduled jobs to support the applications and produce customer reports.

• Developed entity classes and established relationships between them using annotations.

• Implemented the DAO pattern to retrieve data from the database. Built SOAP web services (JAX-WS) for sending and receiving data between applications, generated the WSDL, exposed the services on the server side, and tested them using JUnit.

• Used the Ant build tool for compiling and generating WAR files.

EDUCATION

Master of Business Administration, Osmania University


