
Data Integration Governance

Location:
Gilbert, AZ
Posted:
August 11, 2025


Summary

Results-driven Informatica ETL Developer with 8 years of experience in ETL design, development, and optimization, with deep expertise in data integration, migration, transformation, and cloud migration using Informatica PowerCenter, IDMC, and PowerExchange.

Proficient in Informatica PowerCenter, IDMC (IICS), PowerExchange, and IDQ 10.0 for building enterprise-scale ETL and data quality solutions.

Built energy data pipelines for Xcel Energy using Informatica and PowerExchange to process smart meter data on Exadata.

Designed and developed ETL pipelines using Azure Data Factory (ADF) to orchestrate and automate data movement and transformation across cloud and on-premises sources.

Configured Databricks Linked Service in ADF to securely connect and trigger Databricks notebooks with parameterized inputs for dynamic ETL processing.

Optimized performance of ADF pipelines by implementing partitioning, parallelism, and data flow transformations for efficient data processing.
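
The partition-and-parallelize pattern described in that bullet can be sketched in plain Python. This is an illustrative sketch only, not code from the actual pipelines; the record shape, field names (`meter_id`, `kwh`), and transformation are hypothetical stand-ins for what ADF does with key-based partitioning and parallel data flows:

```python
from concurrent.futures import ThreadPoolExecutor

def transform(record):
    # Hypothetical per-record transformation: normalize a meter reading.
    return {"meter_id": record["meter_id"], "kwh": round(record["kwh"], 2)}

def partition(records, n):
    # Hash-partition records by key, analogous to ADF's key-based partitioning.
    buckets = [[] for _ in range(n)]
    for r in records:
        buckets[hash(r["meter_id"]) % n].append(r)
    return buckets

def run_parallel(records, n=4):
    # Transform each partition concurrently, then merge the results.
    with ThreadPoolExecutor(max_workers=n) as pool:
        results = pool.map(lambda part: [transform(r) for r in part],
                           partition(records, n))
    return [row for part in results for row in part]
```

Rows within a partition keep their order, but partitions complete independently, so the merged output is not globally ordered; downstream steps that need ordering must sort on a key, just as in a partitioned ADF data flow.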

Scheduled batch jobs in ADF that trigger Databricks notebooks for automated nightly data processing and transformation workflows.

Developed SnapLogic pipelines and APIs in the Xcel Energy project for real-time and batch data exchange across enterprise systems.

Loaded curated financial and operational data into Snowflake for reporting and analytics, and orchestrated batch processing in Databricks notebooks at BNY Mellon and Xcel Energy.

Expertise in Informatica MDM, IDQ, and various data integration platforms to support enterprise-wide data governance and analytics initiatives.

Worked closely with business stakeholders to define MDM data governance policies and data stewardship workflows.

Skilled in performance tuning, data governance, and real-time data processing across diverse industries, including pharma, energy, finance, and healthcare.

Proven ability to work with large-scale data warehouses and integrate various databases, cloud platforms, and real-time data sources.

Expertise in PL/SQL development, writing complex stored procedures, functions, triggers, and performance tuning.

Strong background in Data Warehousing, ETL processes, and data modeling (Star Schema, Snowflake Schema).

Applied DevOps practices including version control (Git), CI/CD automation, and deployment coordination, along with robust error handling, data validation, and unit testing, to ensure consistent, high-quality ETL pipeline delivery across all projects.
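
The validation and error-handling practice referred to above typically routes bad records to a reject set with reasons rather than aborting the batch. A minimal Python sketch of that pattern, with hypothetical field names (`id`, `amount`) not taken from any of the actual projects:

```python
def validate_row(row, required=("id", "amount")):
    """Return a list of validation errors for one source row (empty if clean)."""
    errors = []
    for field in required:
        if row.get(field) in (None, ""):
            errors.append(f"missing {field}")
    amount = row.get("amount")
    if amount not in (None, ""):
        try:
            float(amount)
        except (TypeError, ValueError):
            errors.append("amount not numeric")
    return errors

def load_batch(rows):
    """Split rows into a clean list and a reject list with reasons,
    so one bad record is quarantined instead of failing the whole load."""
    clean, rejects = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            rejects.append({"row": row, "errors": errors})
        else:
            clean.append(row)
    return clean, rejects
```

The same split lends itself to unit testing: each rule in `validate_row` can be asserted against known-good and known-bad rows before the pipeline ever touches production data.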

Delivered ETL solutions across healthcare (Highmark), finance (BNY Mellon), education (Access Info Sources), and energy (Xcel Energy) domains.

Skills

ETL Tools: Informatica PowerCenter, IDMC (IICS), PowerExchange, SnapLogic

Cloud Platforms: Microsoft Azure (ADF, Azure Synapse, Azure SQL, Azure Blob Storage)

Databases: SQL Server, Exadata

Programming: PL/SQL, SQL, Python

Big Data: Snowflake, Databricks

Data Governance: Informatica IDQ, Informatica MDM

Reporting Tools: Power BI, Tableau, Cognos

DevOps & CI/CD: Azure DevOps, Git, Jenkins

Experience

02/2023 - Current

Sr. Informatica Developer, Xcel Energy, Minneapolis, MN

Designed a cloud-based data pipeline for Xcel Energy to process real-time energy consumption data.

Designed, developed, and optimized end-to-end ETL workflows using Informatica PowerCenter and IICS to integrate and efficiently process large-scale energy consumption and operational data.

Developed and optimized Azure Data Factory (ADF) pipelines for data ingestion, transformation, and orchestration at Xcel Energy, ensuring efficient and scalable ETL processes.

Designed and implemented ETL workflows using Informatica and SnapLogic.

Integrated IDMC Data Quality in IICS pipelines to apply profiling, cleansing, and validation rules for accurate energy usage analytics.

Developed and optimized ETL pipelines using SnapLogic to integrate data from diverse sources, ensuring efficient data transformation and loading.

Designed and implemented MDM solutions using Informatica MDM to create a single, reliable source of truth for enterprise data.

Developed and maintained SQL scripts to extract, transform, and load data into MDM Hub.

Integrated ADF with Azure services such as Azure Synapse, Azure SQL Database, and Blob Storage to streamline data workflows.

Optimized pipeline performance by leveraging ADF’s Data Flows, parameterization, and pipeline parallelism for improved efficiency.

Wrote and optimized PL/SQL procedures, functions, and scripts to transform, validate, and cleanse source data prior to ETL ingestion.

Optimized Informatica PowerCenter mappings, sessions, and workflows, ensuring high-performance ETL loads.

Collaborated with data engineers to parameterize and test Databricks notebooks used for batch data processing.

Enabled real-time energy consumption analytics by integrating IoT smart meter data using Informatica and Azure pipelines to feed Power BI dashboards.

Migrated legacy ETL jobs from on-premises Oracle Exadata to Azure Synapse and Snowflake, reducing costs by 25%.

Converted legacy Informatica PowerCenter mappings into IICS and Azure Data Factory pipelines.

Designed and developed Informatica PowerCenter ETL pipelines for data migration and transformation in Azure Cloud.

Built reusable Databricks notebooks and parameterized workflows to support multiple data processing scenarios and dynamic pipeline execution.

Designed and implemented Azure Data Lake Storage Gen2 structures to support scalable, secure, and high-performance data storage for both raw and curated datasets across the enterprise.

11/2021 - 01/2023

ETL Informatica Developer, BNY Mellon, New York

Migrated legacy ETL workflows from Informatica PowerCenter to cloud-native pipelines using Informatica IDMC, enhancing performance and scalability.

Designed and developed scalable ETL pipelines in IDMC to support BNY Mellon's financial and supply chain applications.

Integrated data from on-premises Oracle databases, AWS S3, and Snowflake into IDMC pipelines for unified data processing.

Developed and maintained hybrid ETL workflows in Informatica PowerCenter and Informatica Cloud during the transition phase.

Created PL/SQL packages and stored procedures to validate and transform data before and after ETL loads.

Enabled Tableau reporting by designing ETL workflows in IDMC to load curated financial and operational data into Snowflake for executive dashboards.

Optimized IDMC workflows, resulting in a 40% reduction in overall ETL processing time.

Collaborated with team members to establish IDMC best practices and supported peers in troubleshooting and development.

09/2019 - 11/2021

Informatica Developer, Highmark, Pittsburgh, PA

Worked extensively with Informatica IDQ 10.0 and data governance frameworks to support healthcare data quality initiatives at Highmark.

Designed and implemented data quality rules and workflows using Informatica IDQ 10.0 to enhance data profiling, cleansing, and standardization.

Worked closely with the Azure Data Factory (ADF) team to integrate IDQ processes into cloud-based data pipelines.

Automated data cleansing workflows and integrated them with ETL processes in Informatica PowerCenter and IDQ.

Integrated real-time and batch data from Oracle EBS into the Enterprise Data Warehouse using Informatica PowerCenter.

Created and maintained PL/SQL code to support various data processing workflows for the finance and supply chain domains.

Built data integration solutions using Informatica PowerCenter and IDQ to consolidate healthcare claims and provider data for reporting and analytics.

Developed address validation, duplicate detection, and data enrichment processes to ensure high-quality data for analytics and reporting.

Developed and maintained ETL processes to supply high-quality healthcare claims and provider data to Cognos reporting systems.

05/2017 - 08/2019

ETL Developer, Access Info Sources, Hyderabad

Developed and optimized ETL mappings, workflows, and sessions in Informatica PowerCenter for student information systems and academic performance data.

Utilized PowerExchange for seamless integration of real-time and CDC data from mainframe, relational, and non-relational sources.

Extracted and transformed data from various source systems (DB2, Oracle, SQL Server, Flat Files, XML) into a centralized data warehouse to support academic reporting.

Migrated legacy ETL processes into Informatica with embedded PL/SQL procedures, improving maintainability and performance.

Wrote and maintained PL/SQL procedures and scripts for data validation and transformation before loading into the data warehouse.

Performed data validation, reconciliation, and performance tuning to ensure efficiency and accuracy of ETL workflows.

Automated CDC (Change Data Capture) processes using PowerExchange to facilitate real-time data ingestion.
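
At its core, applying a CDC feed like the one above means replaying an ordered stream of change records against a keyed target. A minimal Python sketch of that apply step; the `I`/`U`/`D` operation codes and the record shape are illustrative assumptions, not the actual PowerExchange interface:

```python
def apply_cdc(target, changes):
    """Apply an ordered stream of CDC change records to a target table
    modeled as a dict keyed by primary key.

    Each change is a dict with 'op' ('I' = insert, 'U' = update,
    'D' = delete), 'key', and 'data' (None for deletes).
    """
    for change in changes:
        op, key = change["op"], change["key"]
        if op in ("I", "U"):
            target[key] = change["data"]   # upsert: insert or overwrite
        elif op == "D":
            target.pop(key, None)          # delete if present; ignore if gone
    return target
```

Treating inserts and updates both as upserts makes the apply step idempotent under replay, which matters when a real-time CDC consumer restarts and re-reads part of the change stream.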

Designed on-premise data pipelines using Informatica PowerCenter, with future planning for cloud readiness.

Developed ETL workflows to supply clean and validated data for reporting and business analysis.

Worked with business users and data developers to gather requirements, validate mappings, deliver performant ETL workflows using Informatica PowerCenter, and provide production support to ensure timely and accurate data loads.

Niveditha Konduru

+1-201-***-**** • ******************@*****.*** • United States


