Sai Pranav
Email: *****************@*****.*** Ph: 445-***-****
Professional Summary:
Senior ETL Developer with 7+ years of experience in Informatica PowerCenter, Oracle Data Integrator (ODI), IBM DataStage, and OBIEE, delivering enterprise-grade Data Integration and Business Intelligence solutions.
Expertise in end-to-end ETL development – requirement gathering, design, data mapping, transformation, and load processes for large-scale Data Warehousing projects.
Hands-on experience in Informatica PowerCenter (10.x/9.x), creating complex mappings, workflows, reusable transformations, and performance-tuned ETL pipelines.
Strong working knowledge of ODI (11g/12c) for building data integration interfaces, packages, procedures, and knowledge modules to support heterogeneous source-to-target systems.
Experienced in IBM DataStage (8.x/9.x/11.x) – parallel jobs, job sequencing, data transformations, and performance optimization in high-volume ETL environments.
Proficient in OBIEE (11g/12c) – designing RPD (Repository), creating metadata layers, building dashboards, and delivering ad-hoc reporting solutions.
Expertise in Dimensional Modeling (Star Schema, Snowflake Schema) and Data Warehouse design for OLAP and reporting systems.
Strong experience with Relational Databases (Oracle, SQL Server, DB2, Teradata, PostgreSQL), including SQL, PL/SQL, T-SQL scripting for ETL support and performance tuning.
Skilled in ETL performance optimization – partitioning, pushdown optimization, indexing, caching strategies, and parallel job execution.
Involved in data migration projects from legacy systems (Oracle, flat files, DB2, Sybase) into modern Data Warehouses using Informatica, ODI, and DataStage.
Expertise in ETL scheduling and automation using Control-M, Autosys, and ODI scheduling agents.
Experienced in Data Quality Management – cleansing, deduplication, and validation using Informatica IDQ and custom PL/SQL routines.
Strong understanding of ETL best practices – reusable components, error handling, auditing, logging, and recovery mechanisms.
Developed OBIEE dashboards and reports with drill-down, drill-through, and KPI visualizations for financial and operational analytics.
Skilled in metadata management and lineage documentation – Source-to-Target Mapping (STTM), data dictionaries, and ETL design documents.
Hands-on with Cloud Data Integration – ODI and Informatica IICS pipelines for Snowflake, AWS S3/Redshift, Azure SQL Database.
Experienced in ETL testing, defect resolution, and production support, ensuring SLA adherence and high data availability.
Collaborated in Agile/Scrum environments, actively participating in sprint planning, backlog grooming, and mentoring junior developers.
Proven ability to work across Finance, Telecom, Insurance, and Healthcare domains, translating business requirements into scalable ETL/BI solutions.
Recognized for delivering end-to-end Data Integration & BI solutions by combining Informatica, ODI, DataStage, and OBIEE expertise with strong database and cloud knowledge.
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter (10.x/9.x/8.x/7.x), Informatica Cloud (IICS), Informatica BDM, Informatica Data Quality (IDQ), Informatica Analyst, Informatica Developer, Informatica MDM (E360, C360, Supplier 360, P360), Informatica Data Director (IDD), Informatica PowerExchange, IBM DataStage, Microsoft SSIS, Pentaho
Data Quality / Governance: Informatica Data Quality (IDQ), Informatica Master Data Management (MDM), Informatica Catalog
Databases & Cloud DW: Oracle (11g/12c/21c), Microsoft SQL Server, Teradata, DB2, Snowflake, Google BigQuery, Azure SQL Database, PostgreSQL, MySQL
Programming / Scripting: SQL, PL/SQL, T-SQL, Shell Scripting, Python
Scheduling Tools: Control-M, Autosys, Informatica Scheduler, TWS
BI & Reporting Tools: Tableau, Power BI, SSRS, Cognos
Data Modeling Tools: Erwin Data Modeler, ER/Studio, Microsoft Visio
Other Tools / Platforms: SQL Developer, TOAD, DBeaver, SQL*Loader, Jira, Confluence, Git, Jenkins, ServiceNow, Office 365
Professional Experience:
PNC Bank – East Brunswick, NJ May 2024 - Present
Role: Sr. SQL/ETL Developer
Summary:
Data professional adept in developing logical and physical data models using ERWIN, with expertise in ETL processes through Azure Data Factory. Proficient in Azure DevOps and familiar with Azure Storage, Databricks Delta, and Azure Catalog. Skilled in C#, VB.NET, and ASP.NET for web and Windows application development. Proven track record of crafting complex SQL queries, stored procedures, and views to meet business requirements. Proficient in PowerShell scripting, SQL analysis, and Autosys scheduling. Strong background in database optimization and version control using TFS and GIT. Effective communicator with a talent for designing and deploying reports using SSIS and SSRS for various stakeholders.
Responsibilities:
Developed logical and physical data models using Erwin and mapped the data into database objects. Performed ETL from various data sources using Azure Data Factory.
Experienced in Azure DevOps.
Worked on Azure Storage (Blob/Data Lake Store), Azure Catalog, Databricks Delta.
Extensive work experience on web-based and Windows applications using C#, VB.NET, and ASP.NET.
Developed and deployed web modules using ASP.NET.
Experienced in developing data governance policies and procedures to maintain data integrity within the MDM system.
Based on business requirements, developed complex SQL queries with joins, T-SQL, stored procedures, views, and triggers to implement business rules and transformations.
Wrote stored procedures, user-defined functions, views, and T-SQL scripts for complex business logic.
Created reports from complex SQL queries and MDX queries.
Extensively used Joins and sub-queries for complex queries involving multiple tables from different databases.
Wrote PowerShell commands and scripts with batch processing, performed SQL data analysis, and scheduled jobs with Autosys.
Optimized the database by creating clustered and non-clustered indexes and indexed views.
Improved statistical reporting performance by 25% through performance monitoring, tuning, and index optimization.
Knowledgeable in configuring data matching and merging rules in MDM tools to eliminate duplicates and ensure data accuracy.
Used Team Foundation Server (TFS) and GIT for version control.
Created Alerts for successful or unsuccessful completion of Scheduled Jobs.
Used SSIS tasks such as Conditional Split and Derived Column for data scrubbing and data validation checks during staging, before loading the data into the data warehouse.
Used the SSIS Slowly Changing Dimension transformation to maintain historical data in the data warehouse.
Created SSIS Packages to export and import data from CSV files, Text files and Excel Spreadsheets with different extensions.
Designed and developed matrix and tabular reports with drill-down, drill-through, and drop-down menu options using SSRS.
Created Ad-Hoc Reports, Summary Reports, Sub Reports, and Drill-down Reports using SSRS.
Scheduled daily, weekly, and monthly reports for executives, business analysts, and customer representatives across categories and regions based on business needs, using SQL Server Reporting Services (SSRS).
Ability to effectively communicate with all levels of users and team members.
Scheduled jobs using AutoSys, SQL Server Agent, and Windows Task Scheduler.
Tools & Technologies: MS SQL Server 2019/2017/2016/2014/2012, Visual Studio 2017/2016/2010/2008 (SSIS/SSRS/SSAS), ETL, Oracle, MS Access, Excel, Teradata, Windows Server 2008, Netezza, Visual SourceSafe, .NET Framework, ADO.NET, VB.NET, ASP.NET, XML parsing, VB6, VBScript, C#, SAP, Workday, RESTful APIs, Git, SVN, Maven, Mule 3.9.x/4.1.x, Anypoint Studio 6.5/7.5, Java 7, DataWeave, CloudHub, Salesforce, Siebel, Jenkins, Anypoint Platform, TIBCO.
Saint Anthony Hospital – Chicago, IL. Jan 2023 – May 2024
Role: Sr. SQL/ETL Developer
Summary:
Experienced professional adept at orchestrating diverse healthcare data projects, from HEDIS quality reviews to EDI transactions and HL7 messaging. Proficient in PowerShell scripting for automation, alongside crafting intricate SQL and PL/SQL procedures. Skilled in designing and implementing complex data pipelines utilizing Azure Data Factory and SSIS, ensuring seamless extraction, transformation, and loading of data from various sources. Demonstrated expertise in data modeling, ETL development, and metadata management, with a keen focus on HR/payroll systems integration using tools like Workday Prism. Dedicated to leveraging technology, including TFS and GIT, for efficient version control and collaboration.
Responsibilities:
Experience with HEDIS and other quality medical record review projects.
Designed and built PowerShell scripts to automate jobs and tasks.
Writing complex SQL queries and PL/SQL procedures, functions, and triggers.
Responsible for designing and implementing complex data pipelines that extract data from various sources, transform it, and load it into the desired destinations, writing data transformation logic using Azure Data Factory's data transformation activities.
Worked on healthcare EDI transactions (837, 835, 820, 834, 277, 997, 999) and the EDI testing process.
Skilled in integrating MDM systems with ETL processes to streamline data extraction, transformation, and loading operations.
Developed various T-SQL stored procedures, triggers, views and adding/changing tables for data load, transformation and extraction.
Developing and maintaining SSIS packages to extract data from various healthcare systems and transform it into a format that conforms to the HL7 standard.
Working with HL7 messages to ensure they are being sent and received correctly between healthcare systems.
Proficient in designing and implementing Master Data Management (MDM) solutions to ensure data quality and consistency across the organization.
Designing, developing, and maintaining ETL processes using SSIS to extract, transform, and load data from HR/payroll systems, such as Workday, into a data warehouse.
Utilizing Workday Prism to analyze and report on HR/payroll data in the data warehouse.
Understanding the HR/payroll data domain and using this knowledge to design effective data models, data dictionaries, and metadata repositories that support the data warehouse.
Wrote standard & complex T-SQL Queries to perform data validation and graph validation to make sure test results matched back to expected results based on business requirements.
Created ETL packages with different data sources (SQL Server, Flat files, Excel source files, Oracle) and loaded the data into target tables by performing different kinds of transformations using SSIS.
Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
Used Team Foundation Server (TFS) and GIT for version control.
Designing and developing data models for the data warehouse, including conceptual, logical, and physical data models.
Building and maintaining the data warehouse infrastructure, including database servers, storage, and backup and recovery systems.
Tools & Technologies: MS SQL Server 2019/2017/2014/2012, Visual Studio 2019/2017/2014/2012, T-SQL, SSIS, SSRS, TFS, Advanced Serve, Erwin v7.2, ASP.NET, C#, Netezza, Oracle Enterprise Manager (OEM), Oracle Application Express (APEX), Oracle Data Integrator (ODI), Oracle Business Intelligence Enterprise Edition (OBIEE), Oracle Identity and Access Management (IAM).
American Express – India Aug 2021 – Aug 2022
Role: ETL Informatica Developer
Responsibilities:
Designed and developed ETL workflows and mappings using Informatica PowerCenter to process large-scale financial and transactional data from multiple sources (Oracle, flat files, XML).
Led cloud migration initiative by re-engineering on-prem PowerCenter workflows into Informatica IDMC pipelines for Azure cloud.
Worked on cloning, data retention periods, and the fail-safe mechanism in Snowflake.
Worked closely with DBA teams to optimize PL/SQL procedures, improving query performance by 30%.
Processed XML messages in real time and pushed them to an AWS S3 bucket and Snowflake.
Developed data integration components with Snowflake, Azure Blob Storage, and AWS S3.
Automated workflow monitoring and job alerts via Unix shell scripting, reducing manual interventions by 25%.
Generated flat files using Informatica and loaded them into Snowflake for DB2.
Implemented data masking and security policies using Informatica DDM to ensure compliance with GDPR and HIPAA.
Developed Custom Data Cleansing, Dynamic Schema Validation using Talend.
Developed mappings to load into staging tables and then to Dimensions and Facts.
Used existing ETL standards to develop these mappings.
Coordinated with business stakeholders to gather requirements and create detailed design documents and mapping specifications.
Used Control-M for job scheduling and monitoring production loads with minimal SLA breaches.
Developed and scheduled the different jobs using Autosys.
Developed shell scripts to perform FTP transfers, clean up older files, run ETL workflows using pmcmd, download and split JSON files from web URLs, back up and archive files, call procedures, and maintain parameter tables.
As an Agile team member, created tasks for stories in Jira and attended planning, backlog, and stand-up meetings.
Loaded JSON files using the Java transformation in Informatica.
After the cloud migration, developed Python scripts for downloading JSON data files from web URLs, processing CSV and XML format files, calling Postgres procedures and SQL, splitting, backing up, and archiving files, and applying data cleaning, business validation, and transformation logic (a minimal sketch of this pattern follows this list).
Involved in Production deployments, and in data collections, analysis, and management.
As onsite lead consultant, was responsible for all technical deliverables, resolving issues and responding to client queries on site.
Gathered requirements, transferred knowledge to the offshore team, tracked the status of offshore tasks, and reported progress to the customer.
Created and estimated stories and tasks and attended backlog and retrospective meetings using hybrid Agile on the Azure platform.
Used GSNOW to create change requests for monthly production-release deployments of different applications.
Worked in a challenging environment, supported multiple applications, and developed reusable mechanisms for data cleaning and business rules.
Worked on a data control platform (DCP) for level 1 to 3 rule validations covering API, flat file, JSON, XML, COBOL, and MFT sources and targets.
The DCP platform was used to define source, target, and business-rule metadata; Tableau was used for reporting.
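The Python utilities referenced above follow a download/split/archive pattern; the sketch below is a minimal, hedged illustration of that pattern only. The URL, directory paths, and chunk size are hypothetical placeholders, not values from the actual project, and the real jobs also handled CSV/XML processing and Postgres calls that are omitted here.
```python
import json
import shutil
import urllib.request
from pathlib import Path

# Hypothetical locations; the real jobs read these from parameter tables.
FEED_URL = "https://example.com/data/feed.json"   # placeholder URL
LANDING_DIR = Path("landing")
ARCHIVE_DIR = Path("archive")
CHUNK_SIZE = 1000  # records per split file (illustrative)

def download_feed(url: str, target: Path) -> Path:
    """Download a JSON feed from a web URL into the landing directory."""
    target.mkdir(parents=True, exist_ok=True)
    out_file = target / "feed.json"
    with urllib.request.urlopen(url) as resp, open(out_file, "wb") as fh:
        shutil.copyfileobj(resp, fh)
    return out_file

def split_feed(feed_file: Path, chunk_size: int) -> list:
    """Split a large JSON array into smaller files for downstream ETL loads."""
    records = json.loads(feed_file.read_text())
    parts = []
    for i in range(0, len(records), chunk_size):
        part = feed_file.with_name(f"{feed_file.stem}_part{i // chunk_size:03d}.json")
        part.write_text(json.dumps(records[i:i + chunk_size]))
        parts.append(part)
    return parts

def archive_older_files(source: Path, archive: Path) -> None:
    """Move previously processed files out of the landing area."""
    archive.mkdir(parents=True, exist_ok=True)
    for old in source.glob("*.json"):
        shutil.move(str(old), archive / old.name)

if __name__ == "__main__":
    archive_older_files(LANDING_DIR, ARCHIVE_DIR)   # back up last run's files
    feed = download_feed(FEED_URL, LANDING_DIR)
    print([p.name for p in split_feed(feed, CHUNK_SIZE)])
```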
Environment: JSON, XML, COBOL, API, flat files, pmcmd, CSV, PL/SQL, Jira, MS SQL Server 2014, SQL, SSIS, SSRS, BIRST
American International Group, India Jun 2019 – Aug 2021
Role: ETL Developer
Roles & Responsibilities:
Designing and developing high-performance data pipelines using Informatica IICS and Azure Data Factory. Implementing data ingestion processes from various sources, including structured, semi-structured, and unstructured data. Collaborating with data stewards to resolve data quality issues and maintain data lineage.
Responsible for designing, developing, and maintaining data pipelines and data warehousing solutions using Snowflake SQL. Configured monitoring and alerting mechanisms to ensure data pipeline health and performance.
Implementing data partitioning and indexing strategies for optimal performance and developing and executing data warehouse maintenance procedures, including data refreshes, backups, and recovery plans.
Working with data analysts and business users to understand their data needs and requirements and creating and maintaining data marts and data cubes for specific analytical purposes.
Conducting data validation, reconciliation, and integrity checks in the ETL process, identifying and troubleshooting data quality issues, creating and executing test cases, and collaborating with ETL developers to ensure the accuracy of data transformations.
Utilizing workflow management tools like AutoSys or Apache Airflow to automate data pipeline execution.
Integrating data pipelines with scheduling tools for regular data updates and processing, and developing Python scripts to automate data transformation and validation tasks (a minimal sketch follows this list).
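As a hedged illustration of the validation and reconciliation automation described above, the sketch below shows the general shape of such a Python check. The file names, required columns, and rules are hypothetical; the actual checks were driven by project metadata rather than hard-coded values.
```python
import csv
from pathlib import Path

# Hypothetical files and rules for illustration only.
SOURCE_FILE = Path("source_extract.csv")
TARGET_FILE = Path("target_load.csv")
REQUIRED_COLUMNS = ["customer_id", "txn_date", "amount"]

def read_rows(path: Path) -> list:
    """Read a delimited extract into a list of dictionaries."""
    with path.open(newline="") as fh:
        return list(csv.DictReader(fh))

def validate(rows: list) -> list:
    """Run simple validation rules and return a list of issues found."""
    issues = []
    if rows and [c for c in REQUIRED_COLUMNS if c not in rows[0]]:
        missing = [c for c in REQUIRED_COLUMNS if c not in rows[0]]
        issues.append(f"missing columns: {missing}")
    for i, row in enumerate(rows, start=1):
        if not row.get("customer_id"):
            issues.append(f"row {i}: empty customer_id")
    return issues

def reconcile(source: list, target: list) -> list:
    """Compare row counts between source and target as a basic ETL check."""
    if len(source) != len(target):
        return [f"row count mismatch: source={len(source)} target={len(target)}"]
    return []

if __name__ == "__main__":
    src, tgt = read_rows(SOURCE_FILE), read_rows(TARGET_FILE)
    for issue in validate(tgt) + reconcile(src, tgt):
        print("VALIDATION:", issue)
```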
Environment: Autosys, Apache Airflow, Python, ETL, SQL, JSON, XML, COBOL, API, PL/SQL, T-SQL, Shell Scripting
Standard Chartered Bank, Bangalore, India May 2018 - May 2019
Role: ETL Developer
Roles & Responsibilities:
Was responsible for data integration, cleansing, validation, and data migration from the SAP system.
Performed detailed analysis for building the data warehouse, designed and developed the ETL life cycle, and performed the needed optimization and tuning.
Analyzed source data from the SAP Sales and Finance domains.
Worked with Data Warehouse team in developing Dimensional Model.
Designed Star Schema and created Fact and Dimension Tables for the Warehouse.
Designed ETL specification documents to load the data in target using various transformations according to the business requirements.
Implemented Slowly Changing Dimension (SCD) Type 2 to maintain historical data (an illustrative sketch of the pattern follows this list).
Documented all packages and applied naming standards.
Used transformations such as Lookup and Fuzzy Lookup to implement the transformation logic in the packages.
Created complex ETL packages using SSIS to extract data from staging tables to partitioned tables with incremental load.
Created reusable SSIS packages to extract data from multi-format flat files, Excel, and XML files into the UL database and DB2 billing systems.
Developed, deployed, and monitored SSIS packages.
Created SSIS packages to load data from interfaces such as OMS, Orders, Adjustments, and Objectives, using multiple SSIS transformations to collect data from various sources.
Created SSRS reports for Franchise Health Reports.
Extracted data from flat files and from RDBMS sources such as Oracle and SQL Server.
Created Jobs and scheduled Packages using SQL Server Management Studio for the Daily Load.
Created maintenance plans and set up log files to monitor jobs.
Involved in designing and creating BIRST data model and reports.
Scheduled and controlled access by creating groups and roles using the BIRST Connect connection.
Attended functional testing and user acceptance sessions and addressed the feedback provided.
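The SCD Type 2 loads above were implemented with SSIS's Slowly Changing Dimension transformation; purely as an illustration of the pattern, the Python sketch below shows the same expire-and-insert logic on an in-memory dimension with hypothetical column names.
```python
from datetime import date

# Minimal in-memory illustration of the SCD Type 2 pattern: when a tracked
# attribute changes, the current dimension row is end-dated and a new
# current row is inserted. Column names here are hypothetical.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension: list, incoming: dict, today: date) -> None:
    """Apply one incoming source record to a Type 2 dimension table."""
    current = next(
        (r for r in dimension
         if r["customer_id"] == incoming["customer_id"] and r["is_current"]),
        None,
    )
    if current and current["address"] == incoming["address"]:
        return  # no change in tracked attributes; nothing to do
    if current:
        # Expire the existing version.
        current["is_current"] = False
        current["end_date"] = today
    # Insert the new current version.
    dimension.append({
        "customer_id": incoming["customer_id"],
        "address": incoming["address"],
        "start_date": today,
        "end_date": HIGH_DATE,
        "is_current": True,
    })

if __name__ == "__main__":
    dim = []
    apply_scd2(dim, {"customer_id": 1, "address": "12 Old St"}, date(2018, 6, 1))
    apply_scd2(dim, {"customer_id": 1, "address": "98 New Ave"}, date(2018, 9, 1))
    for row in dim:
        print(row)
```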
Environment: MS SQL Server 2014, SQL, SSIS, SSRS, PL/SQL, BIRST
Education:
• Master of Business Administration in Information Technology Management from Rivier University, Nashua, New Hampshire, USA, 2024
• Bachelor of Business Administration (BBA) from Amity University, Hyderabad, India, 2019
Certification:
• Salesforce Administrator
• Salesforce Platform Developer I