
Data Quality Technical Lead

Location:
Pittsburgh, PA
Posted:
February 06, 2025


Resume:

PRANITHA VODNALA

Technical Lead (ETL Informatica, IICS)

***************@*****.***

+1-216-***-****

Experience Summary:

Over 15 years of experience in data warehousing and ETL processes using Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), and Teradata. Proficient in Google Cloud Platform (GCP), Snowflake, and Oracle ORMB.

Experienced Tech Lead managing development teams across multiple projects, providing technical leadership throughout the Software Development Life Cycle (SDLC), including requirements analysis, design, development, testing, and maintenance.

Designed and implemented complex ETL processes using Informatica IDMC, automating data workflows for high-volume, enterprise-level applications. Optimized data pipelines to ensure real-time data processing and integration.

Configured and managed cloud-based data integrations within the Informatica IDMC platform, enhancing data quality, governance, and master data management.

Developed and optimized complex PL/SQL scripts for extracting, transforming, and loading (ETL) data into Oracle databases. Integrated PL/SQL with Informatica PowerCenter for automated ETL workflows, improving performance and handling large datasets efficiently.

Regularly interfaced with Product Managers, Analysts, Dev Managers, Business Teams, and IT Ops, ensuring alignment with business objectives and data governance standards.

Managed work backlogs using tools like JIRA and provided accurate reporting on team velocity, sprint burndown rates, defect metrics, and lessons learned. Coordinated with cross-functional teams during project phases to resolve issues.

Developed and maintained complex ETL processes using Informatica PowerCenter on a Linux environment, ensuring data accuracy and performance.

Designed and implemented scalable cloud-based ETL workflows using Informatica Intelligent Cloud Services (IICS) to streamline data integration.

Served as a key implementer in three successful Informatica IDMC SaaS projects, delivering scalable data management solutions for enterprise clients.

Utilized Informatica Enterprise Data Catalog (EDC) to create and manage metadata repositories, providing comprehensive data lineage and governance.

Implemented data profiling and data cleansing processes using Informatica Data Quality (IDQ) to ensure high data quality standards.

Recommended changes to existing applications and identified impacted interfaces. Developed technical solutions and created BSA documentation, including source-to-target mappings and data models.

Managed upgrades for Oracle Revenue Management and Billing (ORMB) applications, troubleshooting server issues and debugging WebSphere and ORMB application servers.

Utilized automation scheduling tools like ESP Scheduler, Control-M, and Autosys. Developed UNIX shell scripts and PowerShell scripts to automate ETL processes.

Created complex mappings using Informatica for data extraction, transformation, and loading from various sources to target systems. Migrated ETL processes across environments and conducted production release validations.

Designed and implemented batch processes to integrate external workflow management systems with Informatica IDMC, ensuring seamless data transfers.

Led the technical architecture and design for IDMC migration projects, ensuring alignment with best practices and client requirements.

Designed and implemented data ingestion pipelines using Informatica IDMC for large-scale projects.

Experience with Teradata utilities (FastLoad, MultiLoad, TPT, TPump) and relational databases including Oracle, SQL Server, DB2, MS Access, and HANA.

Prepared project documentation, including ETL specifications, source-to-target mappings, test cases, and deployment plans. Managed code deployments across DEV, TEST, and UAT environments using tools like Terraform and Liquibase.

Worked closely with business users, functional design teams, testing teams, and offshore teams to ensure successful project delivery and issue resolution.

Conducted technical data quality checks, validated data files, and certified source file usage for ETL processes.

Provided recommendations for process improvements and system enhancements, focusing on optimizing performance and reducing manual interventions.
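The shell-script automation of ETL processes described above typically wraps Informatica's `pmcmd` command-line utility. Below is a minimal Python sketch of such a wrapper; the service, domain, folder, and workflow names are illustrative placeholders, not values from any actual project:

```python
import subprocess

def build_pmcmd_command(service, domain, user, password, folder, workflow):
    """Assemble a pmcmd startworkflow command line (flags per Informatica's pmcmd docs)."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,      # Integration Service name
        "-d", domain,        # Informatica domain
        "-u", user, "-p", password,
        "-f", folder,        # repository folder
        "-wait",             # block until the workflow completes
        workflow,
    ]

def run_workflow(cmd):
    """Run the workflow and surface a non-zero exit code as an error."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"pmcmd failed: {result.stderr}")
    return result.stdout

if __name__ == "__main__":
    cmd = build_pmcmd_command("IS_DEV", "Domain_Dev", "etl_user", "secret",
                              "F_SALES", "wf_load_sales")
    print(" ".join(cmd))
```

A scheduler such as Control-M or ESP would invoke a script like this, with the exit code driving downstream job dependencies.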

Technical Skills:

Lead Roles: Technical Lead, Business Analyst, Scrum Master

ETL Tools: IICS, IDMC, GCP, BigQuery, Informatica 10.2, Oracle ORMB, SSIS.

Cloud Platforms: Snowflake, AWS (S3, ECS), Azure Data Factory, GCP

Databases: SQL Server, Oracle, DB2, Oracle ORMB, Teradata, Salesforce, Snowflake, BigQuery, SAP HANA.

Web Servers & APIs: WebSphere, REST and SOAP API connectors

Education:

Bachelor of Technology (B.Tech) in Computer Science, Hyderabad, India

Professional Experience:

Client: Amneal Pharmaceuticals (CA)

Role: Lead ETL Developer

Duration: Aug 2024 – Present

Tools/Environment: Power BI, REST API, FLEX, JD Edwards, Informatica, IICS, GCP, CDI, CAI, Python, ServiceNow, Tableau, Salesforce, SQL Server, Remedy Tool.

Responsibilities:

•Collaborating with cross-functional teams, including finance, supply chain, and IT, to gather and analyze requirements for automating returns handling, reconciliation, and reporting processes.

•Designing and documenting detailed workflows to enhance the accuracy and efficiency of pricing models, product classifications, and historical data analysis.

•Conducting gap analyses to identify inefficiencies in existing systems and proposing innovative solutions to streamline return processes and improve operational efficiency.

•Facilitating workshops and stakeholder meetings to validate requirements, clarify ambiguities, and ensure alignment with organizational objectives.

•Developing comprehensive process flow diagrams, data mapping documents, and use cases to support system development and testing phases.

•Coordinating with development and QA teams to ensure successful implementation of system functionalities, meeting business and compliance requirements.

•Monitoring project progress and providing regular updates to senior management, ensuring transparency and alignment with project goals.

•Leading efforts to ensure compliance with pharmaceutical industry regulations and internal business rules throughout the system lifecycle.

•Mentoring team members on technical aspects of the RMS, fostering a deeper understanding of processes and systems within the organization.

•Spearheaded the creation of User Requirement Specifications (URS) for the Return Management System (RMS), aligning system capabilities with Amneal's operational and compliance standards.

•Ensured compliance with pharmaceutical industry standards and internal business rules in system design and functionality.

Client: Charles Schwab, Westlake, TX (Wipro)

Role: Technical Lead (ETL Informatica), Business Analyst

Duration: Oct 2021 – Jun 2024

Tools/Environment: IICS, GCP, Informatica, AWS, Avro Ingestion, BigQuery, Python, Databricks, Power BI, Tableau, SPOS, Salesforce, Control-M Scheduler, ServiceNow, Remedy Tool, Teradata, Nexus, SQL Server.

Responsibilities:

Led technical reviews for ETL projects, overseeing solution design, code review, and performance tuning, and provided UAT testing support.

Managed TDA Conversion and data migration from legacy systems to modern platforms using PL/SQL and Informatica PowerCenter, achieving zero data loss.

Utilized Informatica IDMC for integrating data from multiple sources into cloud and on-premises platforms.

Designed and implemented complex ETL processes with Informatica Cloud Application Integration (CAI) for seamless integration of cloud and on-premises applications, including Salesforce and SAP.

Utilized Informatica Intelligent Data Management Cloud (IDMC) command line utility to automate, manage, and schedule data workflows, enhancing operational efficiency and reducing manual intervention.

Managed ETL jobs, executed batch processes, and monitored job status using IDMC command line tools, improving system reliability and uptime.

Developed custom scripts leveraging IDMC CLI to streamline the integration process, enabling faster data ingestion and transformation.

Configured job schedules in Control-M and Autosys, automating data integration tasks across multiple environments to support business-critical operations.

Integrated ETL processes with Control-M scheduling for efficient task automation, ensuring data processing jobs are completed according to business timelines.

Enhanced data management and performance by migrating ETL processes to Snowflake, utilizing advanced features like micro-partitioning and data sharing.

Developed ETL processes and data integration workflows in multiple Informatica MDM SaaS implementations, focusing on master data management across cloud platforms.

Collaborated with cross-functional teams to configure and deploy Informatica MDM SaaS solutions, ensuring seamless data migration and cloud integration.

Optimized PowerCenter workflows and mappings to enhance data processing efficiency and reduce execution time.

Migrated on-premises ETL jobs to IICS for seamless cloud integration, improving scalability and reducing infrastructure costs.

Developed custom API integrations to automate workflows between Informatica IDMC and third-party applications, enhancing system interoperability.

Utilized REST and SOAP APIs to orchestrate real-time data exchanges between Informatica IDMC and external workflow systems, optimizing operational efficiency.

Collaborated with client management to define migration goals, business objectives, and technical specifications for Informatica solutions.

Provided end-to-end guidance on data migration strategies, including data extraction, transformation, and loading (ETL) processes within IDMC.

Implemented data mapping techniques to transform and load data between systems.

Optimized data ingestion and ETL processes in Informatica IDMC to improve performance and scalability.

Migrated on-premises ETL processes to Informatica IDMC, improving performance and scalability in the cloud and implementing data governance frameworks.

Developed ETL pipelines with Informatica IDMC and created reusable integration templates to streamline development.

Executed data integration workflows with Informatica Intelligent Cloud Services (IICS), including cloud data integration and synchronization with Snowflake, Redshift, and BigQuery.

Managed task flows in IICS for Salesforce data extraction, transformation, and loading using Salesforce Connector, Bulk API, and Standard API.

Automated file processes with PowerShell scripts in CAI and Cloud Data Integration, enhancing efficiency.

Coordinated with cross-functional teams to gather requirements, design solutions, and address issues; facilitated Scrum meetings and sprint planning.

Conducted data validation, UAT sign-offs, and mock testing for data migration projects, ensuring accuracy and system integration.

Educated the team on methods beyond Scrum, such as Team Kanban, to enhance process efficiency.

Involved in customer and account conversion activities, generating data extracts and ensuring successful data integration and synchronization.
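The REST-based orchestration of IICS/IDMC jobs described above generally follows the platform's v2 REST API pattern: log in to obtain a session ID, then submit a job request with that session ID in a header. The sketch below only builds the request payloads and headers; the login host shown is region-specific and the task ID and task-type code are illustrative assumptions, with the actual HTTP calls left to the caller:

```python
import json

# Region-specific login host (assumption; varies by IICS org region)
LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def build_login_payload(username, password):
    """v2 login body; the response carries icSessionId and serverUrl."""
    return {"@type": "login", "username": username, "password": password}

def build_job_payload(task_id, task_type):
    """Body for POST {serverUrl}/api/v2/job; taskType code depends on the task kind."""
    return {"@type": "job", "taskId": task_id, "taskType": task_type}

def session_headers(ic_session_id):
    """Subsequent v2 calls authenticate via the icSessionId header."""
    return {"icSessionId": ic_session_id, "Content-Type": "application/json"}

if __name__ == "__main__":
    print(json.dumps(build_job_payload("0001ABC", "MTT")))
```

A wrapper script would POST the login payload, extract `icSessionId` and `serverUrl` from the response, and then POST the job payload to `{serverUrl}/api/v2/job`.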

Client: BNY Mellon, Pittsburgh, PA (Wipro)

Role: Technical Lead

Duration: Dec 2019 – Oct 2021

Environment: EDS, WebSphere, Oracle ORMB, App Engine, Unix, Shell Scripting, ESP Scheduler, ServiceNow, Control-M, MS SQL Server, Java, GitLab, Nexus

Responsibilities:

Developed batch jobs for loading data into Oracle tables, supporting critical business report generation.

Upgraded ORMB applications from version 2.8 to 3.0, managing deployment and validation.

Troubleshot and debugged server start-up issues, ensuring reliable system operations.

Enhanced ETL performance by tuning PL/SQL queries and database indexes, reducing processing time by 40%.

Led the design of ETL processes for effective data flow into data warehouses.

Provided production support for PL/SQL and ETL processes in a Unix environment, resolving data loading and transformation issues.

Configured and monitored message queues (such as Apache Kafka, RabbitMQ) to manage asynchronous communication between IDMC and external applications, ensuring data consistency.

Collaborated with workflow management teams to design batch job scheduling and automation scripts for efficient data processing in IDMC.

Automated ETL processes using Azure DevOps and other cloud-native tools.

Automated job scheduling and monitoring with Unix shell scripts, integrating with Informatica workflows for increased efficiency.

Led the development and testing of data migration strategies in three IDMC SaaS projects, ensuring accurate and timely data transfers to cloud environments.

Tested and validated data quality and master data management rules in three IDMC and MDM SaaS implementations, ensuring data accuracy and governance compliance.

Participated in the end-to-end implementation of Informatica MDM SaaS solutions, including configuring workflows, developing API integrations, and testing data processing pipelines.

Developed scalable and efficient data architecture designs to optimize data integration and migration workflows.

Served as a technical advisor to both the IDMC migration team and client stakeholders, ensuring seamless integration with existing systems.

Reviewed and approved solution designs to ensure compliance with client standards, security protocols, and performance requirements.

Integrated EDC with PowerCenter to provide metadata-driven insights, supporting impact analysis and compliance initiatives.

Performed performance tuning and troubleshooting for PowerCenter sessions and mappings in a high-volume data environment.

Conducted data validation and quality checks to ensure consistency across data models.

Implemented data mapping and transformation workflows in Informatica IDMC.

Collaborated on code releases and deployments across DEV, TEST, QA, and PROD environments, ensuring smooth transitions.

Managed API integrations with REST and SOAP web services through Informatica Cloud Application Integration (CAI).

Ensured data security and compliance with Azure security best practices.

Ensured compliance with data governance and security policies using Informatica IDMC.

Implemented data quality and governance in Informatica Cloud Data Integration projects to maintain data accuracy and compliance.

Designed scalable data integration solutions in Informatica Intelligent Cloud Services (IICS), leveraging advanced techniques for optimization.

Utilized Informatica Process Designer for managing and monitoring complex data integrations.

Tuned CAI processes for high availability and performance, minimizing downtime.

Managed user access, created accounts, and migrated WebSphere application logins to LDAP.

Deployed code and managed WebSphere application servers and ORMB Database, ensuring stability.

Enabled App Viewer in ORMB Application Servers and handled EAR, WAR, and iHelp file deployments.

Leveraged Informatica IDMC’s AI-powered tools for data discovery and visibility.

Configured secure data connections and managed user access within Informatica IDMC.

Optimized Informatica IDMC workflows, reducing processing times for large-scale transformations.

Worked with EDS ETL tools, developing mappings and managing code migrations across environments.

Created workflows, tasks, and SQL scripts for data deployment and validations.

Developed shell scripts for file format updates and performed production data validations.
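Post-load validation of the kind described above often amounts to a row-count and aggregate reconciliation between staging and target tables. Below is a self-contained sketch using SQLite stand-ins for the actual Oracle tables; the table and column names are illustrative only:

```python
import sqlite3

def reconcile(conn, source_table, target_table, amount_col):
    """Compare row counts and an amount-column checksum between two tables."""
    cur = conn.cursor()
    checks = {}
    for label, table in (("source", source_table), ("target", target_table)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        checks[label] = cur.fetchone()
    return {
        "rowcount_match": checks["source"][0] == checks["target"][0],
        "sum_match": checks["source"][1] == checks["target"][1],
    }

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_billing (id INTEGER, amount REAL);
        CREATE TABLE dw_billing  (id INTEGER, amount REAL);
        INSERT INTO stg_billing VALUES (1, 10.0), (2, 25.5);
        INSERT INTO dw_billing  VALUES (1, 10.0), (2, 25.5);
    """)
    print(reconcile(conn, "stg_billing", "dw_billing", "amount"))
```

In production the same two queries would run against the source and target databases, with any mismatch failing the batch before downstream reports are generated.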

Company: DXC Technology (CSC Services)

Client: BB&T, Dallas, TX

Role: Informatica Developer

Duration: Dec 2014 – Jun 2019

Environment: Informatica 9.1.6, UNIX, HANA, ESP Scheduler, Mainframes, DB2, Web Service Integration (SOAP and REST API protocols)

Responsibilities:

Used Informatica PowerCenter to develop mappings and workflows for extracting, transforming, and loading data.

Coordinated with the offshore team.

Loaded data from various source systems (Oracle, flat files, mainframes, SQL Server) into staging tables and then into the target HANA database.

Created low-level and high-level design documents.

Ensured error handling and retry mechanisms for API-driven integrations to maintain robust and reliable data workflows across platforms.

Streamlined batch processing workflows in a Linux environment to facilitate large-scale data integration tasks with minimal downtime.

Worked closely with application architects to align batch and API integration processes with business objectives and compliance requirements.

Worked closely with clients during three IDMC projects to gather requirements, design data management architectures, and implement solutions that met business objectives.

Supported the testing and troubleshooting of cloud-based data integration workflows in three Informatica MDM SaaS deployments, ensuring high availability and reliability.

Contributed to performance tuning and optimization efforts across three IDMC and MDM SaaS projects, improving data processing speed and efficiency.

Participated in Agile Scrum meetings for project status updates and progress.

Worked extensively on transformations such as Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence Generator, and Joiner.

Thoroughly tested web services to ensure they performed as expected under different conditions.

Implemented monitoring and logging to track performance metrics, detect errors, and troubleshoot issues in real time.

Collaborated with data architects to design robust ETL solutions that integrate with other enterprise data management tools.

Managed and monitored IICS workflows and orchestrations on Linux servers to ensure reliable and secure data operations.

Performed metadata extraction, cataloging, and reporting using EDC, enabling improved visibility into data assets across the organization.

Performed data quality assessments and root cause analysis using IDQ to identify and resolve data quality issues in large datasets.

Conducted workshops and training sessions for the migration team and client staff on best practices for using Informatica tools within IDMC.

Provided ongoing technical guidance to troubleshoot migration challenges, ensuring smooth project execution and delivery.

Ensured data quality, governance, and security compliance throughout the migration process, working closely with client data stewards.

Extracted data from flat files.

Created workflows, tasks, database connections, and FTP connections using Workflow Manager.

Scheduled loads using ESP Scheduler.

Created technical specification documents.

Developed ETL jobs and created unit test scripts.

Developed UNIX shell scripts for data loading and automation.

Tested and validated data with SQL queries against the developed ETL.

Reviewed tasks completed by other team members.
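Flat-file intake of the kind listed above usually begins with a structural check before any load runs. Below is a sketch that validates a pipe-delimited file whose trailer row carries the expected detail count; the layout convention (header `H`, detail `D`, trailer `T|count`) is illustrative, not the format of any actual feed:

```python
def validate_flat_file(lines):
    """Check header/trailer framing and that the trailer count matches detail rows."""
    if not lines or not lines[0].startswith("H|"):
        return False, "missing header"
    if not lines[-1].startswith("T|"):
        return False, "missing trailer"
    details = [l for l in lines[1:-1] if l.startswith("D|")]
    expected = int(lines[-1].split("|")[1])
    if len(details) != expected:
        return False, f"trailer count {expected} != detail rows {len(details)}"
    return True, "ok"

if __name__ == "__main__":
    sample = ["H|20190601", "D|1|100.00", "D|2|250.00", "T|2"]
    print(validate_flat_file(sample))
```

A pre-load check like this lets the batch reject a truncated or partially transferred file before the staging load starts.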

Company: Accenture Services, India

Client: Liberty Mutual (Indianapolis), Verizon Data Services, India

Role: Development, Production Support and Maintenance

Duration: Nov 2006 – Nov 2011

Environment: Informatica 8.6.1, DB2, Teradata, UNIX, ESP Scheduler, Oracle PL/SQL

Responsibilities:

Used Informatica PowerCenter to develop mappings and workflows for extracting, transforming, and loading data into the warehouse database.

Performed requirements gathering, business analysis, design, development, testing, and implementation of business rules.

Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.

Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Created complex mappings using various transformations and developed extraction, transformation, and loading (ETL) strategies using Informatica.

Translated customer requirements into formal requirements and design documents.

Worked extensively on transformations such as Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence Generator, and Joiner.

Wrote scripts for data cleansing, data validation, and data transformation for data coming from different source systems.

Reviewed detailed design documents and technical specification documents for the end-to-end ETL process flow for each source system.

Reviewed required documents before release as part of the automation process.

Ensured deliverables adhered to Accenture processes and policies.

Performed unit testing, prepared test cases, and participated in peer reviews.


Worked on service change requests, delivering them on time for each release.

Developed Informatica mappings and workflows to extract data and load it into the Teradata staging area using FastLoad/TPump utilities.

Monitored and reported issues for daily, weekly, and monthly processes; resolved issues on a priority basis and reported them to management.


Worked on trouble tickets; monitored and validated loads.

Used the Teradata FastLoad/MultiLoad utilities to load data into tables.
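The slowly changing dimension handling mentioned above can be illustrated with a minimal Type 2 sketch: when a tracked attribute changes, the current dimension row is end-dated and a new current row is appended. The column names and the use of plain dicts in place of dimension-table rows are illustrative:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """Type 2 merge: expire the changed current row and append a new version.

    dim_rows: list of dicts with keys key, attr, eff_from, eff_to, current
    incoming: dict with keys key, attr
    """
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attr"] == incoming["attr"]:
                return dim_rows  # no change; keep history as-is
            row["eff_to"] = today      # end-date the old version
            row["current"] = False
            break
    dim_rows.append({"key": incoming["key"], "attr": incoming["attr"],
                     "eff_from": today, "eff_to": None, "current": True})
    return dim_rows

if __name__ == "__main__":
    dim = [{"key": 1, "attr": "Bronze", "eff_from": date(2010, 1, 1),
            "eff_to": None, "current": True}]
    apply_scd2(dim, {"key": 1, "attr": "Gold"}, date(2011, 6, 1))
    print([(r["attr"], r["current"]) for r in dim])  # [('Bronze', False), ('Gold', True)]
```

Type 1 would simply overwrite `attr` in place, and Type 3 would keep the prior value in an extra `prev_attr` column rather than adding a row.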


