Ab Initio Data Warehousing

Location:
Irving, TX
Posted:
May 20, 2025

Resume:

Ponmuthu Subbiah

+1-469-***-****

**********@*****.*** LinkedIn

SUMMARY

16+ years of experience in techno-functional/business analysis, requirement gathering and analysis, development, testing, and implementation phases of projects using ETL tools (Informatica, DataStage, Ab Initio), Oracle, Teradata, DB2, and UNIX shell scripting in both Windows and Unix environments

2 years of experience in Informatica Intelligent Cloud Services (IICS)

Specialized in data warehousing and decision support systems, with extensive experience implementing full-lifecycle data warehousing projects.

Expert in data integration, data quality, data migration, and test data management

Good knowledge of the banking, telecom, and manufacturing domains

Experienced in analyzing issues raised by the business or client, including translating technical details into functional terms the business can understand.

Expertise in UNIX shell scripting, including wrapper scripts and configuration scripts

Experience leading diverse, globally distributed onsite/offshore teams as the single point of contact (SPOC)

Hands-on experience with Metadata Hub (MDH) administration tools and utilities for creating Metadata Hub data stores

Handled all UAT and production inquiries that came in as BAU activities from the entire region for all end-of-day and intraday processes.

Managed and troubleshot PROD and UAT environments, provided L3 support, and assisted in the investigation and resolution of production issues.

Trained and mentored team members on the functional and technical aspects of all applications under my track and coordinated work between offshore and onsite.

Extensively worked with Informatica, DataStage, Ab Initio GDE 3.x/4.0, Co>Operating System, and Data Manipulation Language (DML)

Worked extensively with scheduling tools such as cron, Control-M, Ab Initio Control Center, Ab Initio Operational Console (OC), and TWS.

Designed, developed, tested, and implemented Ab Initio graphs for large decision support systems (DSS)

Subject matter expert in architecture, design, development, and configuration of ETL applications

Technology skills: hands-on experience with data warehousing concepts such as OLAP, OLTP, star schema, snowflake schema, fact tables, dimension tables, logical data modeling, physical data modeling, and dimensional data modeling.

Generic Skills: requirements analysis, software estimation, software quality assurance, software quality control, and software configuration management

Source Side Skills: source data analysis (statistics, unique values, and outliers), source-to-target mapping, requirements gathering and analysis, project definition, and resource planning.

Management Proficiency

Personal skills include strong communication, requirements gathering, high degree of initiative, planning, and excellent customer relations

Excel at communicating with stakeholders to provide accurate reporting and information regarding the ongoing projects and initiatives

Ensured project deliverables adhered to the appropriate levels of quality, on time and within budget, and in accordance with the plan and governance models

Effective issue and risk management, with status tracking and compliance with quality tools and standards

Managed a data migration project for a confidential client and successfully met an aggressive deadline.

Migrated a large proprietary database running in an Oracle environment to a vendor-specific database running on SQL Server

Led initiatives aimed at business process re-engineering to bring in improvements and productivity enhancements from the delivery perspective

Team management includes conflict resolution, issue management, individual appraisals and performance analysis, and reviews

Prepare baseline project plans, including schedule management plan, change management plan, stakeholder management plan, quality management plan, and configuration management plan

Estimate budgets for small and large project proposals, perform cost control activities, and monitor costs in a timely manner, including monthly billing activities

Report exact and precise information to customers during monthly review meetings

TECHNICAL PROFICIENCY

ETL Tools: Informatica 7.x, DataStage 3.x, Ab Initio 3.x/4.0
Databases: DB2, Oracle 9i/10g, PL/SQL programming
Scripting: UNIX shell scripting
Tools: Control Center (Co>Op 3.x), Control-M
Development Languages: SQL, PL/SQL, UNIX shell scripting

EDUCATION

Bachelor of Engineering, University of Madras, 2003

EXPERIENCE

BROWSE PROSPECTS, LLC Tech Lead Mar 2025 – Till date

Inspirix Technologies LLC Team Lead Mar 2024 – Feb 2025

TM Floyd & Company (Companion Professional Services LLC) Team Lead Dec 2022 – Feb 2024

Tata Consultancy Services Project Lead Aug 2006 – Dec 2022

SVSS Developer Apr 2004 – May 2006

RELEVANT PROJECT EXPERIENCE

1. Project: Data Quality Automation
Client: BROWSE PROSPECTS LLC, USA
Technology: ETL Tools – Informatica PowerCenter, UNIX, Oracle, Control-M
Period: Mar 2025 – Till date
Role: Technical Lead

Responsibilities

Built the ETL architecture and source-to-target mappings for seamless data loading into the data warehouse.

Created PL/SQL elements: stored procedures, functions, packages, and triggers.

Involved in performance tuning at the source, target, mapping, session, and system levels.

Wrote UNIX shell scripts for Informatica pre-session and post-session tasks and AutoSys scripts to schedule jobs (workflows).
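For illustration, a minimal sketch of the kind of pre-session wrapper script described above, assuming a hypothetical source file, parameter file, and log location (all names below are placeholders, not the client's actual environment):

#!/bin/ksh
# Hypothetical Informatica pre-session wrapper (illustrative only):
# verify that the expected source extract has arrived, then refresh the
# session parameter file before the workflow starts.

SRC_DIR=/data/inbound                              # placeholder landing area
PARAM_FILE=/informatica/params/wf_daily_load.par   # placeholder parameter file
LOG=/var/log/etl/pre_session_$(date +%Y%m%d).log   # placeholder log file

echo "$(date '+%Y-%m-%d %H:%M:%S') pre-session check started" >> "$LOG"

# Fail fast if the source extract is missing or empty, so the session
# does not run against stale or incomplete data.
if [ ! -s "$SRC_DIR/customer_extract.dat" ]; then
    echo "ERROR: customer_extract.dat missing or empty" >> "$LOG"
    exit 1
fi

# Rebuild the parameter file with today's run date for the mapping parameters.
{
    echo "[Global]"
    echo "\$\$RUN_DATE=$(date +%Y-%m-%d)"
    echo "\$\$SRC_FILE=$SRC_DIR/customer_extract.dat"
} > "$PARAM_FILE"

echo "pre-session check completed" >> "$LOG"
exit 0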

2. Project: DEL
Client: Inspirix Technologies LLC, USA
Technology: ETL Tools – Informatica PowerCenter, UNIX, Oracle, Auto-Sys
Period: Mar 2024 – Feb 2025
Role: Technical Lead

Responsibilities

Developed ETL pipelines using Informatica PowerCenter to transform and enrich incoming data from Oracle DB before loading it into Azure Data Warehouse (Azure DW).

Conducted data analysis and contributed to data modeling efforts, improving data quality and accuracy.

Configured monitoring and alerting tools to ensure system health and respond to ETL failures promptly, collaborating with cross-functional teams for issue resolution.

Led the migration and modernization of an existing on-premises data lake to Azure Data Lake, enhancing data scalability and accessibility.

Implemented automated monitoring and alerting mechanisms to proactively identify and address performance bottlenecks and ETL failures.
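A minimal sketch of the kind of automated monitoring-and-alerting check described above; the workflow name, log directory, and distribution list are hypothetical placeholders:

#!/bin/ksh
# Hypothetical ETL failure monitor (illustrative only): scan the most recent
# workflow log for errors and mail an alert if any are found.

WF_NAME=wf_ora_to_azure_dw              # placeholder workflow name
LOG_DIR=/informatica/logs               # placeholder log directory
ALERT_TO=etl-oncall@example.com         # placeholder distribution list

LATEST_LOG=$(ls -t "$LOG_DIR"/${WF_NAME}*.log 2>/dev/null | head -1)

if [ -z "$LATEST_LOG" ]; then
    echo "No log found for $WF_NAME" | mailx -s "ETL ALERT: $WF_NAME log missing" "$ALERT_TO"
    exit 1
fi

# Alert when the log contains session failures or fatal errors.
if grep -Eq "ERROR|FATAL" "$LATEST_LOG"; then
    grep -E "ERROR|FATAL" "$LATEST_LOG" | tail -20 | \
        mailx -s "ETL ALERT: $WF_NAME reported failures" "$ALERT_TO"
    exit 1
fi

echo "$WF_NAME: no failures found in $LATEST_LOG"
exit 0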

3. Project: CCTI Process
Client: BCBSA, USA
Technology: ETL Tools – Ab Initio 3.x/4.x, EME, UNIX, DB2, SQL, Co>Op 3.x, Control-M
Period: Dec 2022 – Feb 2024
Role: Technical Lead

The Blue Cross and Blue Shield Association (BCBSA) serves as the cohesive force that brings the 36 independent Blue Cross and Blue Shield companies into a national system. Care Coordination (CCTI) is the process of planning, arranging, and coordinating member care to bring together healthcare resources from across the continuum of care, enhancing members’ health and optimizing healthcare. It links members’ healthcare needs to Plan services and resources (such as case management and disease management). Downstream applications create reports for customers. This application processes data and stores it in the data warehouse through daily, weekly, and quarterly job runs.

Responsibilities

Applied advanced knowledge of Ab Initio Graphical Development Environment (GDE), Ab Initio Data Profiler, Ab Initio EME, DB2, SQL, and UNIX

Deployed Ab Initio Graphs and scripts between environments

Involved in Code reviews, deployment, and PIR review calls.

Worked on conditional components, transform functions, and subgraphs; used most standard components (datasets, Filter, Join, Sort, Partition, etc.) as well as lookups.

Created and updated Control-M schedules, working with the application team to design complex batch automation

Extensive use of the multifile system, where data is partitioned into four partitions for parallel processing

Wide use of lookup files when pulling data from multiple sources where the data volume is limited

Developed generic graphs for data cleaning, data validation and data transformations.

Responsible for cleansing data from the source system using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression, and Rollup

Worked extensively on ETL/data pipelines to transform data and load it from AWS S3 into Snowflake (a sketch of this pattern appears after this list).

Creating and maintaining the test environment

Involved in migrating code from the dev environment to the test environment through the CI/CD process for all projects

Provided release notes for every release in CCTI.

Documented updates to CCTI architecture documentation

Involved in producing the ad hoc tracking reports.

Creating & uploading the MDH data stores using utilities.

Involved in effective communication between the team and customer to ensure all the deliverables were met on time.
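As referenced above, a minimal sketch of an S3-to-Snowflake load of the kind described; the named connection, stage, target table, and file format are hypothetical placeholders, not the client's actual objects:

#!/bin/ksh
# Hypothetical S3-to-Snowflake load (illustrative only): assumes an external
# stage is already defined over the S3 bucket and a named snowsql connection
# exists in ~/.snowsql/config. All object names are placeholders.

SNOWSQL_CONN=my_named_connection
TARGET_TABLE=CCTI_DB.STG.MEMBER_CARE_EVENTS
STAGE_PATH=@CCTI_DB.STG.S3_INBOUND_STAGE/member_care/$(date +%Y%m%d)/

snowsql -c "$SNOWSQL_CONN" -o exit_on_error=true -q "
COPY INTO $TARGET_TABLE
FROM $STAGE_PATH
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1)
ON_ERROR = 'ABORT_STATEMENT';
"

if [ $? -ne 0 ]; then
    echo "Snowflake COPY failed for $TARGET_TABLE" >&2
    exit 1
fi
echo "Load completed for $TARGET_TABLE"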

4. Project: CBNA (Citibank North America – Production Support Lead)
Client: Citigroup, Texas, USA
Technology: Ab Initio, Teradata, Unix Shell Script
Period: Nov 2020 – Dec 2022
Role: Technical Lead

Citigroup Inc. (branded Citi) is a major American financial services company based in New York City. Citigroup Inc. has the world’s largest financial services network, spanning 140 countries with approximately 16,000 offices worldwide. Citigroup’s Global Consumer Group-North America business delivers innovative, best-in-class products to customers across the U.S. and Canada under a single, industry-leading brand. Component businesses comprise the financial services sector’s most diverse product offerings, including banking services, credit cards, loans, and insurance. Global Consumer Group-North America consists of four primary product businesses.

Responsibilities

Interacted with clients and other stakeholders to gather requirements and address the daily issues.

Facilitated meetings with different internal and external teams to understand the architecture and scope of the project.

Worked on enhancement projects for creating graphs, data cleansing, data validation and transformation.

Worked with Partition and Departition components (Concatenate, Gather, Interleave)

Provided daily status updates to manager and coordinated with offshore development team for technical/business clarification and knowledge sharing.

Created SQL scripts to pull and integrate data from the source

Tested the product in controlled, real situations before going live and assessed production readiness of new changes/projects.

Coordinated the installation and configuration of the Control-M environment

Worked on performance tuning of long-running jobs, both from the ETL side and from the database (Teradata) side

Produced the daily, weekly, and monthly Teradata system health-check reports (see the sketch after this list)

Worked closely with the Teradata vendor on database performance issues

Created and reviewed all the changes going into production from a capacity and performance standpoint to ensure service continuity (non-functional requirements); attended change and release meetings and prepared for new releases and technical checkout of applications.

Worked on performance tuning of long-running SQL scripts

Responsible for problem management, capacity management, and the design and execution of service improvement and service readiness plans, which brought dollar savings for Citi; also handled routine resilience testing, expansion and tuning of monitoring, and continued development of routine health checks and knowledge objects.

Prepared application recovery plans and coordinated with Enterprise Business Continuity team for conducting recovery exercise (COB) for applications.
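As noted in the list above, a minimal sketch of a Teradata health-check report of this kind; the logon, report path, and view usage are illustrative placeholders (DBC space views and column availability may differ by site and version):

#!/bin/ksh
# Hypothetical Teradata health-check report (illustrative only): pulls
# per-database space usage through BTEQ into a dated report file.
# Logon details and paths are placeholders.

REPORT=/reports/teradata_healthcheck_$(date +%Y%m%d).txt

bteq <<EOF > "$REPORT" 2>&1
.LOGON tdprod/healthcheck_user,********;

/* Space usage per database, highest percentage used first */
SELECT DatabaseName
     , SUM(MaxPerm)     AS MaxPermBytes
     , SUM(CurrentPerm) AS CurrentPermBytes
     , CAST(SUM(CurrentPerm) * 100.0 / NULLIFZERO(SUM(MaxPerm)) AS DECIMAL(5,2)) AS PctUsed
FROM DBC.DiskSpaceV
GROUP BY 1
ORDER BY PctUsed DESC;

.LOGOFF;
.QUIT;
EOF

echo "Health-check report written to $REPORT"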

5. Project: DQIP – Digital DQ (Data Quality and Integrated Platform – Digital Data Quality)
Client: Citigroup, Texas, USA
Technology: Ab Initio, Oracle, Unix Shell Script
Period: Nov 2018 – Nov 2020
Role: Technical Lead


Responsibilities

Interacted with clients and other stakeholders to gather requirements and address the daily issues.

Facilitated meetings with different internal and external teams to understand the architecture and scope of the project.

Prepared design documents based on the requirements, developed code for new enhancements, constructed the data model, and maintained data as per business standards and integrations as per business requirements.

Designed and created Data Quality baseline flow diagrams, which include error handling and test plan flow data

Performed requirements gathering, process mapping, business process re-engineering, release management, testing, and end-user support

Provided daily status updates to manager and coordinated with offshore development team for technical/business clarification and knowledge sharing.

Tested the product in controlled, real situations before going live and assessed production readiness of new changes/projects.

Created and reviewed all the changes going into production from a capacity and performance standpoint to ensure service continuity (non-functional requirements); attended change and release meetings and prepared for new releases and technical checkout of applications.

Triaged, escalated, resolved, communicated, and tracked to resolution all incidents, including management of all tickets (an “incident” being an unplanned interruption to an IT service, a bug, or an error in any application, whether or not logged as a ticket) and all other support requirements, including support related to system health and other server inquiries identified by Proactive Services

Executed data modelling and designed schematic data models with SQL, enhancing data consistency.

Utilized Power BI to create daily, weekly, and monthly dashboards for executives.

Identified and fixed data quality issues (a sample check of this kind is sketched after this list)

Developed and maintained database systems.
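As referenced above, a minimal sketch of the sort of data-quality check involved; the Oracle connection string, table, and column names are hypothetical placeholders:

#!/bin/ksh
# Hypothetical data-quality baseline check (illustrative only): count NULL
# surrogate keys and duplicate business keys in a target table and fail the
# run if either is non-zero. Connection and object names are placeholders.

ORA_CONN='dq_user/********@DQIPDB'
TABLE=DQIP_OWNER.CUSTOMER_DIM

RESULT=$(sqlplus -s "$ORA_CONN" <<EOF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT COUNT(CASE WHEN customer_key IS NULL THEN 1 END) || '|' ||
       (COUNT(*) - COUNT(DISTINCT customer_src_id))
FROM   $TABLE;
EXIT;
EOF
)

NULL_KEYS=$(echo "$RESULT" | tr -d ' \n' | cut -d'|' -f1)
DUP_KEYS=$(echo "$RESULT" | tr -d ' \n' | cut -d'|' -f2)

if [ "$NULL_KEYS" -gt 0 ] || [ "$DUP_KEYS" -gt 0 ]; then
    echo "DQ FAIL: $TABLE null_keys=$NULL_KEYS duplicate_keys=$DUP_KEYS" >&2
    exit 1
fi
echo "DQ PASS: $TABLE"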

6. Project: DLE – Data Lineage Explorer
Client: Citigroup, Texas, USA
Technology: Ab Initio, Oracle, Unix Shell Script
Period: Oct 2017 – Oct 2018
Role: Technical Lead

Responsibilities

Traced and catalogued data processes, transformation logic, and manual adjustments to identify data governance issues.

Highlighted Data Quality issues and areas of system control gaps

Linked data lineage to data quality and business glossary work within the overall data governance program.

Implemented Data Governance using Collibra.

Oversaw the configuration of Collibra for reference and master data domains.

Worked with the Finance and Risk teams to create the data governance glossary, data governance framework, and process flows

7. Project: Enterprise Data Warehouse – Enhancement & Support
Client: Avery Dennison, Mentor, OH (Offshore)
Technology: IBM DataStage, DB2, Oracle, SQL, UNIX
Period: May 2015 – Sep 2017
Role: Sr. Developer

The objective of the project was to bring the new site information from the legacy system into the existing data warehouse system, which required a lot of changes in the existing ETL system.

Responsibilities

Understood and analyzed requirements from the BAs to implement them in the existing CRI system without impacting existing functionality.

Analyzed the application and prepared documentation with detailed design specifications.

Prepared high-level and low-level design documentation appropriate to accommodate the changes in the applications.

Created a working prototype to consolidate applications with similar functionality.

Developed the ETL, prepared unit test cases, and executed the same to validate data and performance to align with requirements.

Analyzed and prepared documentation.

Created design and developed and modified the existing ETL.

Prepared and executed test cases and validated the data.

Prepared deployment steps

Migrated the ETL to the existing CRI production system with the help of Release Management team

8. Project: New TEF (Electronic Fund Transfer)
Client: Banco de Chile, Santiago, Chile, SA (Onshore)
Technology: IBM DataStage, Oracle, DB2, Unix Shell Script, Control-M
Period: Aug 2013 – Apr 2015
Role: Developer/Module Lead

The objective of the project was to bring all the data from the legacy system into the data warehouse and to perform the scheduled reconciliation process. The reconciliation had previously been done manually, which was time consuming and inaccurate; it was replaced with an automated process.

Responsibilities

Developed ETL programs using DataStage to implement the business requirements

Built DataStage jobs to read files from multiple vendors using Complex Flat File and Sequential File stages

Developed ETL jobs to perform the reconciliation process and purge historical transactions from the primary table

Tuned mapping performance by following best practices and applied several methods to reduce workflow runtimes

Designed and built a shell script utility to automate the tasks of running DataStage batches (a sketch appears after this list)

Used Control-M Scheduler to monitor the DataStage batches

Involved in customer meetings to understand the business requirements and prepared high-level design based on requirements

Prepared test case and test plan

Resolved production/QA-related issues

Involved in production support and resolved the critical/high issues post deployment
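As mentioned in the list above, a minimal sketch of the kind of shell utility used to run and check a DataStage batch; the project name, job name, and install path are placeholders, and exit-code/status conventions can vary by DataStage version:

#!/bin/ksh
# Hypothetical DataStage batch runner (illustrative only): start a job with
# dsjob, wait for it to finish, then check the finishing status.
# Project/job names and the install path are placeholders.

PROJECT=TEF_RECON
JOB=j_daily_reconciliation
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}

. "$DSHOME/dsenv"                 # load the DataStage engine environment

# Start the job and wait for it to complete.
"$DSHOME/bin/dsjob" -run -wait "$PROJECT" "$JOB"

# Read back the finishing status reported by the engine.
STATUS=$("$DSHOME/bin/dsjob" -jobinfo "$PROJECT" "$JOB" | grep "Job Status")
echo "$JOB -> $STATUS"

case "$STATUS" in
    *"RUN OK"*|*"RUN with WARNINGS"*) exit 0 ;;   # treat warnings as success here
    *) echo "ERROR: $JOB did not finish OK" >&2; exit 1 ;;
esac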

9. Project: Engineering Data Warehouse
Client: Motorola Solutions Inc. (Offshore)
Technology: Informatica PowerCenter, DB2, Oracle
Period: Apr 2011 – Jun 2013
Role: Sr. Developer

The objective of this project was to bring all the legacy data to the engineering data warehouse system.

Responsibilities

Involved in identifying the requirements and analysis to provide solution with various possible options

Involved in rationalization of components without business impacts

Identified and developed common components and coordinated for their project-wise reuse

Involved in design, coding, and testing phases

Involved in implementing the application into production

Tested the entire application and checked for inconsistencies/bugs caused during functionality test done by business users

Interacted with onsite coordinator for technical aspects of the project and participated in technical review meetings with onsite coordinator as well as user

Performed quality assurance activities using TCS proprietary tools like IPMS

Documented the impact analysis and component design and got sign off on the same

Developed interfaces as per the proposed/agreed upon approach; developed code was reviewed to align with coding standards

Developed various codes based on the requirement of the reports

Created deployable objects and deployed them on different testing environments

10. Project: Enterprise Data Warehouse – Production Support & Small
Client: Motorola Inc. (Offshore)
Technology: Informatica PowerCenter, Oracle, BMC Remedy
Period: Aug 2008 – Feb 2011
Role: Technical Lead

The main objective of this project was to support and enhance the enterprise data warehouse system. Whenever there was an issue in the production system (data issues, application inaccessibility), the user would log a ticket, and the issue had to be resolved within the SLA.

Responsibilities

Involved in enhancements and in monitoring issues raised by customers, addressing them within the SLA

Worked on support requests/tickets regarding the applications; this involved investigation and enhancements/modifications to the code

Involved in client calls to get the requirement for the enhancement project and created the technical design for implementation

Enforced code standard using TCS Assent

Logged defects in portal and did root cause analysis whenever necessary

Interacted with onsite coordinator for technical aspects of the project

Fixed various data and application issues using PL/SQL, Informatica, and Business Objects

Prepared weekly status report and presented metrics to the customer

Involved in preparing test case, test plan, and customer UAT

Actively participated in major/small/emergency projects that came in as enhancement or project requests

Involved in enhancements and maintenance activities of the data warehouse, including performance tuning

Provided production support to resolve the ongoing issues and troubleshoot the problems

11. Project: RCP Customer Conversion
Client: Motorola Inc. (Offshore)
Technology: Oracle Application 11i (AOL, AR), Oracle DB, UNIX
Period: Sep 2006 – Jul 2008
Role: Technical Lead

The main objective of this project was to migrate all customers from the legacy system into the Oracle ERP system.

Responsibilities

Prepared a technical design document based on the functional requirement

Participated in technical review meetings with onsite coordinator as well as user to get the design signoff

Created a custom program to load all existing customers from the legacy system into Oracle with proper validations

Enforced code standard using TCS Assent

Logged defects in portal and did root cause analysis whenever necessary

Created test cases and a test plan to cover all test scenarios, tested the entire application, and checked for inconsistencies/bugs caused by the integrated module

Involved in customer UAT and code deployment into production

Provided post-deployment support for the application

Prepared technical specifications from the functional specifications documents

Converted the technical specification into technical design documents

Developed conversion programs that were registered through AOL as concurrent programs and run from SRS form

Performed unit testing and logged the results

Prepared test case and test data

Executed test and logged the test results

12. Project: AR Customer Conversion
Client: TRU Global Inc. (Offshore)
Technology: Oracle Application 11i (AOL, AR), Oracle DB, UNIX
Period: Apr 2004 – May 2006
Role: Developer

All customers from the legacy system were migrated to Oracle ERP modules.

Responsibilities

Involved in customer/contact conversion

Participated in CRP meetings with onsite coordinator as well as user to get the design sign off

Developed the parts usage report

Responsible for email alerting system for the concurrent programs

Ran multiple concurrent programs depending on various conditions in a programmatic manner

Created test cases and a test plan to cover all test scenarios, tested the entire application, and checked for inconsistencies/bugs caused by the integrated module

Involved in customer UAT and code deployment into production

Provided post-deployment support for the application


