
Data Warehouse Ab Initio

Location:
Allen, TX
Posted:
May 26, 2025

Resume:

Sai Sathish Nimmalagadda

Dallas, TX

Professional Summary:

Extensive experience in Master Data Management, Data Warehousing, Business Intelligence, Data Modelling, and application development across front-end, back-end, and middle-tier technologies.

Extensive experience in Data Warehousing, with hands-on use of ETL tools such as Ab Initio and Informatica 9.x.

Experience using Informatica BDE and Hadoop/Hive to provide Big Data solutions.

Strong experience in PL/SQL, Shell Scripting, and other programming languages.

Expert at using Ab Initio ETL tools for extracting data from multiple sources, transforming data, and loading the refined data into ODS, Data Mart, and/or Data Warehouse repositories.

Experience in full lifecycle implementation of Data Warehouse projects.

Experience extracting data from mainframe systems and transforming and loading it into data repositories.

Adept at using Data Warehouse concepts and Dimensional Data Modelling (Star and Snowflake Schemas) and conceptual/logical/physical data modelling using Erwin.

Expertise in designing and performance-tuning complex mappings and SQL queries.

Extensive experience developing RDBMS applications on Oracle 11g, IBM DB2, Sybase 12.5, MS SQL Server 2005, and MS Access using SQL, PL/SQL, T-SQL procedures, SQL*Loader, functions, packages, and triggers.

Significant experience working with business end users; able to understand and implement projects from a management perspective.

Excellent team-building skills; led and managed teams responsible for delivering and maintaining critical projects.

Experience working as a Developer, Designer, Team Lead, and Manager.

Good team player; hardworking, self-motivated self-learner with strong interpersonal and communication skills.

Skills Matrix and Training:

15+ years ETL development

15+ years with Oracle

15+ years with PL/SQL

Strong Unix/Linux shell scripting

Experience with Perl scripting

Attended Training Session on Ab Initio (Intro Course), conducted by Ab Initio.

Informatica Power Center 7.1 Training by Informatica

Attended Informatica Power Center 9 training by Informatica.

Attended IBM Initiate MDM 9.7 training by IBM.

Attended Agile Training

Technical Skills:

Operating Systems: Windows 95/98/NT/2000, UNIX (Sun Solaris 2.6/2.8, Linux 2.4, HP-UX)

Master Data Management: IBM Initiate 9.7, Initiate 10.1, Infosphere V11.x

ETL Tools: Ab Initio (GDE 3.x, Co>Operating System 3.15), Informatica 10.1, Informatica BDE 9.1, SAP BODS

Languages: SQL, PL/SQL, Unix Korn Shell, C, C++, JavaScript, VBScript, HTML, XML, ASP, and .NET Framework

RDBMS: Teradata V2R5, Oracle 11g, SQL Server 7.0/2000/2005/2008, MS Access 97/2000, NoSQL

Scheduling Tools: Autosys, Control M, Tidal

Database Utilities: Teradata SQL Assistant 6.1, Toad 7.5, SQL*Loader, BTEQ Win, SQL Server Management Studio, HUE

DSS Tools: Crystal Reports 8.0/8.5, Business Objects XI R3

CM Tools: Ab Initio EME Environment, PVCS, Visual SourceSafe 6.0, Harvest, Subversion

Testing Tools: Test Director 7.6, Clear Quest

Education:

Bachelor’s degree – Osmania University, 1996

Master's in Computer Applications, Osmania University, 1999

Professional Experience:

UST Global, BCBS – Dallas, TX Dec 2024 – Present

Lead ETL Developer/ Architect

Performed source system analysis, profiling, data mapping, ETL and flow development, and data modelling as needed for data migration.

Developed ETL scripts using best practices to migrate data from legacy systems to the target system, considering efficiency, performance, and compliance.

Periodically walked through the data model and ETL processes with the application team and business stakeholders.

Maintained in-depth and up-to-date knowledge of standards, guidelines, and industry trends for Data Migration and Architecture.

Designed, implemented, and maintained security during data migration for PHI/PII data.

Followed production support best practices, writing Production Incident reports that provided root cause and solutions for P1 incidents.

Created a framework to ingest data from new sources into the data warehouse (an illustrative sketch follows this section).

Performed tests and evaluations regularly to ensure the design, data security, and data integrity were maintained.

Environment: Ab Initio, Oracle, PL/SQL, Unix/Linux, Shell Scripting, Production Support.
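
The ingestion framework at this client was actually built with Ab Initio, Oracle PL/SQL, and shell scripts; the following is only a minimal, hypothetical Python sketch of the same config-driven idea, with the source names, file paths, and staging table names invented for illustration.

# Illustrative only: a minimal, config-driven ingestion skeleton in Python.
# Source names, file paths, and staging table names are hypothetical; the
# real framework used Ab Initio, Oracle, and shell scripts.
import csv
import sqlite3

SOURCES = [
    {"name": "claims_feed", "path": "claims.csv", "target": "stg_claims"},
    {"name": "member_feed", "path": "members.csv", "target": "stg_members"},
]

def ingest(source, conn):
    """Load one delimited feed into its staging table; column names come from the header row."""
    with open(source["path"], newline="") as fh:
        reader = csv.reader(fh)
        header = next(reader)
        cols = ", ".join(header)
        placeholders = ", ".join("?" for _ in header)
        conn.execute(f"CREATE TABLE IF NOT EXISTS {source['target']} ({cols})")
        conn.executemany(
            f"INSERT INTO {source['target']} ({cols}) VALUES ({placeholders})", reader
        )
    conn.commit()

if __name__ == "__main__":
    db = sqlite3.connect("staging.db")  # stand-in for the warehouse staging area
    for src in SOURCES:
        ingest(src, db)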

Express Scripts - Remote Jun 2023 – Sep 2024

DATA/ETL Architect

Designed and developed consolidated, conformed Enterprise Data Warehouses and Data Lakes storing critical Customer, Provider, Claims, Client, and Benefits data.

Developed the framework to ingest data from various sources for both initial and incremental loads from the Teradata DW to Big Data and cloud platforms.

Worked with various Hadoop file formats, including Parquet, ORC, and Avro.

Designed and developed ETL integration patterns using Python on Spark (an illustrative sketch follows this section).

Using a variety of technologies, including Ab Initio, BTEQ, and PySpark, developed code in adherence to best practices to acquire data from a variety of structured and unstructured data sources.

Developed generic graphs and used psets to execute them for loading and unloading data.

Used Metaprogramming to generate DML dynamically.

The targets include Teradata, Hive, HDFS, AWS S3, and Snowflake.

Built routines to load data into AWS S3 buckets using Ab Initio.

Loaded the data into AWS Redshift DB tables.

Created and configured the S3 buckets along with access permissions.

Used Snowpipe to automate the load of data from S3 buckets to Snowflake tables.

Responsible for maintaining and upgrading the EC2 instances.

Built and deployed Python-based ETL (extract, transform, load) pipelines to integrate data from multiple sources and prepare it for analysis.

Used the Jira Scrum board to create Epics, User Stories, and Tasks and align them to Sprints in the Agile process.

Developed a matrix of best practices, identifying gaps in processing or design quality in the legacy data warehouse to target improvements for the new database.

Environment: Ab Initio, GDE 4.x, Teradata, BTEQ, Teradata SQL, Unix/Linux, Shell Scripting, Perl, PySpark, Production Support, Control-M scheduling tool, AWS S3, and Snowflake
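
A minimal PySpark sketch of the kind of watermark-based incremental load described above; the bucket names, paths, and the updated_ts/load_date columns are illustrative assumptions rather than the actual project objects.

# Illustrative sketch of a watermark-style incremental load in PySpark.
# Bucket names, paths, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("incremental_claims_load").getOrCreate()

# Read the full extract (Parquet here; ORC/Avro sources would use spark.read.orc / .format("avro"))
claims = spark.read.parquet("s3a://example-landing-bucket/claims/")

# Keep only rows newer than the last successful load (simple watermark increment)
last_load_ts = "2024-01-01 00:00:00"  # in practice this would come from a load-control table
delta = claims.filter(col("updated_ts") > last_load_ts)

# Land the increment in the curated zone; a Snowpipe or COPY job can then pick it up from S3
delta.write.mode("append").partitionBy("load_date").parquet("s3a://example-curated-bucket/claims/")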

Mphasis - Remote Dec 2021 – May 2023

Client 1) Charles Schwab Jan 2023 – May 2023

Testing Lead - CDW Migration

Worked on testing newly built workflows that load into GCP using BigQuery.

Compared the results of workflows run on-premises against GCP using MDCERT (an illustrative sketch follows this section).

Documented the differences and reported them in Jira for the development team to work on the defects.

Validated the results of the workflows for three successive runs before promoting them to higher environments.

Acted as a leader in the testing community within Mphasis and conducted demos and knowledge-sharing sessions.

Environment: Informatica PowerCenter, IICS, Teradata, Google Cloud Platform, BigQuery, Control-M, Jira
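
MDCERT was the comparison tool actually used; purely as a generic illustration of the same idea, a small Python sketch that compares row counts and an order-insensitive content checksum between an on-premises extract and a BigQuery extract (file names are hypothetical) might look like this:

# Illustrative only: compare two extracts of the same workflow output
# (on-premises vs. BigQuery) by row count and an order-insensitive checksum.
import csv
import hashlib

def summarize(path):
    """Return (row_count, checksum) for a delimited extract."""
    count, digest = 0, 0
    with open(path, newline="") as fh:
        for row in csv.reader(fh):
            count += 1
            # XOR of per-row hashes makes the checksum independent of row order
            digest ^= int(hashlib.md5("|".join(row).encode()).hexdigest(), 16)
    return count, digest

onprem = summarize("onprem_extract.csv")      # hypothetical file names
gcp = summarize("bigquery_extract.csv")

if onprem == gcp:
    print("MATCH: row counts and content agree")
else:
    print(f"MISMATCH: on-prem={onprem[0]} rows, GCP={gcp[0]} rows")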

Client 2) FedEx Dec 2021 – Dec 2022

Delivery Project Lead

As part of the FedEx Condor Wave 2 project, the Mphasis team helped FedEx modernize its applications by migrating them to the cloud on the Microsoft Azure platform. As Delivery Project Lead, I was responsible for the migration of applications into the cloud, from inception through final onboarding of each application onto the cloud infrastructure.

Assessed the current architecture and documented it in the Application Intake Form.

Responsible for preparing the Technical Migration Document mapping the existing on-premises architecture and components in the FedEx ecosystem to the future state in the cloud.

Responsible for creating the Application Design form to include future state Servers and other software components.

Responsible for designing the final state architecture for applications using COTS products like Ab Initio and Tibco

Created Virtual Machines in the cloud and took care of provisioning various software packages needed for the application.

Provisioned databases in lower regions such as Development and SIT as ephemeral and persistent DBs.

Attended design sessions with clients and helped drive towards the final architecture state for the applications in the cloud.

Attended client VP-level meetings to report progress on application migration.

Used Azure DevOps (ADO) as the project tracking tool to report the progress of application migration.

Acted as a leader in the developer community within Mphasis and conducted demos and knowledge-sharing sessions.

Environment: Ab Initio, Oracle, Teradata, Unix/Linux, Azure Cloud Infrastructure, Azure DevOps, One Automation (Scheduling tool), Production Support

Magellan Health – Richmond, VA Nov 2019 – Nov 2021

Senior ETL Integration Engineer

Developed ETL solutions using Ab Initio and SnapLogic to extract and transform highly sensitive pharmacy data from the core claims adjudication and enrollment systems and deliver it to various internal and external clients.

Functioned as the primary practitioner coach to grow the capabilities of other engineers on the team.

Extracted data from legacy systems, then transformed and delivered that data in a usable format for the new system.

Integrated systems with databases and other applications using middleware such as SnapLogic and REST-based services (an illustrative sketch follows this section).

Developed Ab Initio graphs and high-level design and detail design document for data movement.

Developed parameterized Ab Initio Graphs to load the data into Oracle Staging Tables.

Performed transformations of source data with transform components like Write Multiple Files, Run Program, Join, Dedup Sorted, Reformat, Filter-by-Expression and Rollup, etc.

Unit testing of new and existing Graphs.

Used Automated Test Framework to test the Ab Initio graphs.

Wrote and modified several application-specific config scripts in UNIX to pass environment variables.

Wrote Korn shell scripts on AIX UNIX to automate the ETL process.

Daily routine included project initiation, requirements gathering, understanding business requirements, interacting with business users, writing functional specification documentation, data analysis, planning, data mapping, data modelling, process architecture, writing detailed design documents, ETL development, system integration, and deployment.

Acted as a leader in the developer community within Magellan and conducted demos and knowledge-sharing sessions.

Environment: Ab Initio, Oracle, PL/SQL, Unix/Linux, Shell Scripting, SnapLogic, Perl, Production Support
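
The REST integrations above were implemented with SnapLogic and Ab Initio; purely to illustrate the general pattern, a hypothetical Python sketch of pulling records from a REST endpoint and writing them to a delimited staging file (the endpoint URL and field names are invented) could look like this:

# Illustrative only: generic REST-to-staging-file extraction pattern.
# The endpoint URL and field names are hypothetical placeholders.
import csv
import requests

ENDPOINT = "https://api.example.com/v1/members"   # placeholder service

def extract_to_staging(out_path="members_staging.csv"):
    resp = requests.get(ENDPOINT, params={"status": "active"}, timeout=30)
    resp.raise_for_status()
    members = resp.json()                          # assume a JSON array of member records

    with open(out_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["member_id", "first_name", "last_name", "plan"])
        writer.writeheader()
        for m in members:
            writer.writerow({k: m.get(k, "") for k in writer.fieldnames})

if __name__ == "__main__":
    extract_to_staging()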

Blue Cross Blue Shield – Washington DC Mar 2017 – Oct 2019

Senior ETL Developer

The FEPOC program administers healthcare for Federal employees and reports to the Office of Personnel Management (OPM). As part of the introduction of a new PPO product to members, the entire enrollment system is being migrated from the mainframe/DB2 environment to the Next Generation Platform hosted in a Big Data environment on Hadoop/Hive. Data is forklifted from the mainframe using an IBM replication process to the acquisition layer, then to the trusted zone, and finally to the publish zone. Reports are generated from the publish zone and sent to users for validation. I also worked on the TDM project to create generic graphs that identify and mask PHI when moving data from production to lower environments for testing.

Developed the framework to extract and transform the data using Ab Initio/Hadoop.

Attended the meetings with Business SMEs to understand the requirements.

Developed efficient, high quality data integration code leveraging the most relevant features of Ab Initio big data components like read hive table/write hive table.

Used ETL to generate complex reports using different dimensions like age, sex, options, and member types.

Developed parameterized Ab Initio Graphs to load the data into Mainframe/DB2 tables.

Create cfd, pset, data, and ctrl files.

Performed transformations of source data with transform components like Write Multiple Files, Run Program, Join, Dedup Sorted, Reformat, Filter-by-Expression and Rollup, etc.

Unit testing of new and existing Graphs.

Used Automated Test Framework to test the Ab Initio graphs.

Wrote and modified several application-specific config scripts in UNIX to pass environment variables.

Wrote Korn shell scripts on AIX UNIX to automate the ETL process.

Used Control-M to schedule the jobs.

Created system documentation for ETL/Data Integration processes to be developed.

Worked with testing teams to test the code in SIT and higher environments before moving to production.

Developed documentation for production support team and conducted handover meetings for smooth transition.

Performed tuning of Ab Initio graphs to ensure optimized use of resources.

Created graphs as part of the TDM project to mask PHI and load the masked data into lower environments (an illustrative sketch follows this section).

Environment: Hadoop, Hive, Ab Initio GDE 3.3.3.1, Ab Initio EME, Unix, Control-M, Mainframe, DB2, HP ALM
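
The TDM masking itself was built as Ab Initio graphs; the PySpark sketch below only illustrates the masking idea, and the table and column names are hypothetical.

# Illustrative only: masking PHI columns before publishing data to lower environments.
# The real implementation used Ab Initio graphs; names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import sha2, col, lit

spark = SparkSession.builder.appName("tdm_phi_mask").getOrCreate()

members = spark.read.table("trusted.member_enrollment")     # hypothetical Hive table

masked = (
    members
    .withColumn("ssn", sha2(col("ssn").cast("string"), 256))   # irreversible hash
    .withColumn("first_name", lit("TESTFIRST"))                 # static substitution
    .withColumn("last_name", lit("TESTLAST"))
    .withColumn("dob", lit("1900-01-01"))                       # neutral date
)

# Publish the masked copy for lower (test) environments
masked.write.mode("overwrite").saveAsTable("test_zone.member_enrollment_masked")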

WellCare Insurance – Tampa FL May 2016 – Feb 2017

Lead ETL Developer

WellCare Health Plans, Inc. focuses exclusively on providing government-sponsored managed care services, primarily through Medicaid, Medicare Advantage and Medicare Prescription Drug Plans, to families, children, seniors, and individuals with complex medical needs. The company served approximately 3.8 million members nationwide as of December 15, 2015. WellCare is a FORTUNE 500 company based in Tampa, Fla. that employs more than 6,900 associates nationwide. WellCare has implemented a physical MDM solution for providers and a virtual MDM solution for members.

Interacted with the business to collect information for data conversion.

Created Technical specifications based on the Business Requirements.

Used flat files and SQL Server as sources and Oracle as the target.

Developed shell scripts and Autosys JIL files for running and aborting Informatica jobs.

Worked extensively in Informatica Designer, Workflow Manager, Workflow Monitor.

Extensively worked in the performance tuning of the programs, ETL procedures and processes.

Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.

Brought together data from multiple systems and built whole records and data sets using match & merge (an illustrative sketch follows this section).

Tuned the PME engine to increase the match percentage for Provider Physical MDM.

Resolved many issues in the Member Virtual MDM, including data cleanup and algorithm tuning.

Worked on real-time services, both SOAP and REST, using customized member segments.

Enhanced the member and provider data with new address and phone numbers from external sources.

Pre-cleanse and standardize data (e.g., gender, address, and names)

Provide visibility across multiple data domains (e.g., providers that are also members)

Re-architected the entire MDM/ETL solution from the previous implementation.

Provided oversight of ETL using Ab Initio graphs for adding new source data and performing CDC.

Built the ETL framework to load the golden view of Member and Provider into the Greenplum environment using Ab Initio.

Worked on Reports to generate the crosswalk between existing and new members.

Worked with Release Management Team on the schedule of releases.

Worked with the Defect Review Board and the Testing Team to review and resolve defects.

Worked on party data model to load the new source data into MDM.

Environment: Oracle 11g, IBM Infosphere 11.3, Ab Initio, Autosys, IBM WebSphere 8.5, Greenplum, Hue, NoSQL
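
The match & merge above ran inside IBM Infosphere MDM (PME, probabilistic matching); purely to illustrate the concept, a small pandas sketch of deterministic matching on a key with a most-recent-wins survivorship rule (the sample records are invented) might look like this:

# Illustrative concept sketch: deterministic match on SSN plus simple survivorship.
# The real solution used IBM Initiate/Infosphere probabilistic matching (PME);
# the sample records below are invented.
import pandas as pd

members = pd.DataFrame([
    {"source": "claims",     "ssn": "111-22-3333", "name": "J. Smith",   "phone": None,           "updated": "2016-05-01"},
    {"source": "enrollment", "ssn": "111-22-3333", "name": "John Smith", "phone": "813-555-0100", "updated": "2016-09-15"},
    {"source": "portal",     "ssn": "222-33-4444", "name": "Ana Lopez",  "phone": "305-555-0188", "updated": "2016-08-20"},
])

def survivorship(group):
    """Most recently updated record wins; missing attributes are filled from other sources."""
    group = group.sort_values("updated", ascending=False)
    golden = group.iloc[0].copy()
    for attr in ["name", "phone"]:
        if pd.isna(golden[attr]):
            fallback = group[attr].dropna()
            if not fallback.empty:
                golden[attr] = fallback.iloc[0]
    return golden

golden_records = members.groupby("ssn").apply(survivorship)
print(golden_records[["ssn", "name", "phone"]])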

Hartford Insurance – Hartford CT Apr 2015 – Apr 2016

Lead ETL Developer

As an ETL Developer, I worked at Hartford Insurance, a major commercial insurance provider for small business. As part of the ComPAS program (Commercial Markets Policy Administration System), Hartford Insurance wanted to modernize its aging policy administration process using IBM Infosphere MDM V11.3 to create a golden record for their small business customers. Led design, build, and test activities on the IBM MDM platform for the Insurance Commercial Lines Customer Account implementation. Worked with cross-workstream technical leads on integration with the Operational Data Store and the new Policy Administration system. Led conversion runs and production deployments from Siebel legacy systems. The role required familiarity with MDM concepts, probabilistic search, Infosphere, Initiate, XML, transformations (XSLT), and infrastructure paradigms. Participated in performance engineering efforts and was also involved in Data Governance using the IBM Information Governance tool.

MDM - Designed MDM data model, algorithm, weight tuning, composite views, relationships & hierarchies, inbound and outbound event publishing.

Informatica – Designed Informatica mappings for initial and CDC delta detection of source systems data.

Created Technical Specifications based on the Business Requirements

Worked extensively in Informatica Designer, Workflow Manager, Workflow Monitor.

Developed PL/SQL procedures for processing business logic in the database.

Used SQL tools like TOAD to run SQL queries to validate the data in the warehouse.

Tested all the mappings and sessions in Development, UAT environments.

Infrastructure setup – Designed DEV, TEST and PROD environments in terms of Linux Operating System packages, firewall/network rules/ports enablement/communications, DMZ internet and intranet, DNS, High Availability and Load Balancing.

Installed and configured MDM Standard Edition v11.3 along with its bundled suite of products, such as WAS 8.5, IBM Workbench, and Oracle RAC.

Developed reports using Tableau.

Took ownership of the project and successfully delivered it, working with onsite and offshore teams under aggressive deadlines.

Environment: Oracle 11g, IBM Infosphere 11.3, Informatica 9.5 PowerCenter, Autosys, IBM WebSphere 8.5, Tableau

First Midwest Bank (Independent Consultant) Oct 2014 – Mar 2015

Senior ETL Developer

As MDM Architect, worked on two installations: a financial bank in the Midwest and a healthcare client in North Carolina. This included mastering data from varied sources such as banking, credit cards, mortgage, trust, and online banking products. Collaborated with various teams to gather requirements and arrive at the final design of the MDM system implemented at both clients.

As Technical Architect, worked with different source SMEs to understand the data captured for the Customer.

Created a Data Extract Strategy to capture all the relevant customer information.

Worked with the SQL Server ETL Team to create the Change Data Capture process.

Created the algorithms for both Party and Organization Domains

Created the process for Initial Load of data into MDM Server

Installed Initiate MDS v11.3 on Windows 2012 Server.

Designed the bucketing algorithm and configured PME to populate the weights table.

Defined data extract elements and configured Workbench MPXDATA jobs for the bulk cross match (BXM) process.

Configured Inbound Brokers for the ingestion of daily delta changes.

Created Composite View of the entity EMCA, MMCA and golden record in Enterprise Viewer

Installed Workbench, Enterprise Viewer, and Inspector utilities.

Ran the analytics profiling tool prior to redistributing buckets.

Developed standards and methodologies for benchmarking, performance, evaluation, testing, data security and data privacy.

Environment: SQL Server 2005, IBM Infosphere 11.3

Computer Sciences Corporation – Baltimore MD May 2014 – Sept 2014

Senior ETL Developer

As ETL/MDM Architect, was responsible for leading a team of Informatica developers; leading ETL design, architecture, and development activities; administering and optimizing the pre-existing Informatica ETL implementation; and providing guidance and operational support. Developed reliable and complex ETL processes to support new development of the Enterprise Master Data Management projects.

As Technical Architect, worked with prime vendor Accenture to review the existing data model for the Federally Facilitated Marketplace.

Designed a scalable Java EE SOA architecture for MDM Cloud Services.

Implemented the MDM governance process and whiteboarded solutions for Y2015 Small Business Healthcare Enrollment.

Planned multiple work streams and delineated tasks for building the MDM Identity Hub.

Consulted with the Department of Health and Human Services on COTS product selections, Data Center requirements, and the strategic road map.

For the business interfaces, presented the ‘Initial Operating Capability’ to the CMS Technical Review Board and ultimately presented the ‘Final Operating Capability’ for the U.S. Senate.

Installed Initiate MDS v10.1 on Red Hat Linux in Data Center VMs.

Designed the bucketing algorithm and configured PME to populate the weights table.

Defined data extract elements and configured Workbench MPXDATA jobs for the bulk cross match (BXM) process.

Configured Inbound Brokers for the ingestion of daily delta changes.

Created Composite View of the entity EMCA, MMCA and golden record in Enterprise Viewer

Installed Workbench, Enterprise Viewer, and Inspector utilities.

Ran the analytics profiling tool prior to redistributing buckets.

Used the FPF filter to adjust the auto-link threshold.

Configured LDAP security and Data Steward grouping hierarchies.

Managed technical resources in a daily JAD format.

Reviewed test plans and ETL architecture. Prepared the Data Model (in Erwin) and Virtual Data Dictionary.

Developed standards and methodologies for benchmarking, performance, evaluation, testing, data security and data privacy.

Environment: Informatica PowerCenter 9.5, Informatica Data Quality Tool, Designer, Workflow Manager, Workflow Monitor, and Repository Manager, IBM Infosphere 10.1, RHEL, Oracle 11g

SunTrust Bank – Richmond, VA Jan 2014 – Apr 2014

ETL Developer

Created a formal description of a system and detailed plan of the system at component level to guide its implementation.

Documented the structure of components, their inter-relationships, and the principles and guidelines governing their design and evolution over time.

Created and reviewed the process maps of AS-IS state and future state.

Discussed with Business to make sure all the functionality is covered in the future state.

Developed the standards and methodologies for benchmarking, performance, evaluation, testing, data security and data privacy.

Environment: Ab Initio, Mainframe, SQL Server, Pega, Empower and Lending Space, Medb, MLCS and DIME

Capital One – Richmond, VA Apr 2012 – Jan 2014

ETL Developer

Capital One is a diversified financial services company offering a broad array of credit, savings, banking, and loan products to customers in the United States, UK, and Canada. The EDS Customer Data Management team manages customer information across all lines of business, such as Capital One Credit Cards, Capital One Bank, home loans, auto loans, and brokerage.

Initiate Systems Master Data Services

Implementation and Management of CDI/MDM Platform, Management of Enterprise DWH

Successfully managed a team of 15-20 resources (On-Site and Off-Shore) in CDI/MDM and ETL

Successfully integrated ~150 M customer account records across 32 sources.

Successfully implemented High Availability with 99.999% availability

CDI Platform Infrastructure/MDM – IBM Initiate implementation and upgrade from version 8.5 to 9.7. Enhanced Initiate algorithms to support larger datasets and optimized existing algorithms. Upgraded infrastructure to support larger-volume data sets by leveraging software and hardware capabilities.

HSBC Branded Book of Business – Integrating HSBC Customer data in CDI/MDM platform.

Adding New Source in CDI Platform.

Big Bucket Analysis – updated the String Frequency Table and rematched the impacted population.

Nickname Analysis and False Positive Filter – anonymized attributes, ran a POC, and published results.

HH Algorithm Tuning – Custom Address Standardization to cleanse the fields prior to match.

Customer 360 view – Enhanced CDI platform to support Real Time (Customer ADD/Modify/Delete).

ING, HBC, SMR, COAF Customer Match Solution using Search Harness Process.

Inbound Batch Process – Implemented Batch Solution process to perform mandatory checks.

Outbound Process – Implemented Customer Key Splits and Merges.

Bulk Cross Match – Implemented Solution for HSBC and HBC integrations.

TXM – Implemented Solution to rematch impacted customer data.

Successfully used BXM (Bulk Cross Match) solution to identify the common customers between Capital One and partnership customer data

Successfully used IXM (Incremental Cross Match) to integrate the 8.9M customer account records within 10 hours into CDI.

Successfully designed and implemented Initiate Algorithms for Person, Business and Household

Successfully implemented in-house built Match Quality Operational Manual Review process

Responsible for the management and implementation of the EDW and CDI/MDM solution

Worked with Initiate Support, DBA, UNIX, and Storage teams to tune and improve performance.

Successfully designed and deployed Inbound, Entity Management, Outbound broker processes

Successfully led several system conversions and delivered complex new business capabilities.

Strong Vendor Management skills: IBM/Initiate and On-Site/Off-Shore Resourcing partners

Environment: IBM Initiate MDS CDI/MDM Version 9.7, Ab Initio GDE 3.1, Co-op 3.1, Ab Initio ACE, UNIX/Linux, Oracle, Teradata

Fannie Mae – Washington DC Dec 2006 – Apr 2012

Senior ETL Developer

Involved in design and architecture of the whole application.

Designed various IR2 modules for Loan Setup and Payment processing modules for various programs like 1MP, 2MP, HAFA, FHA and RDHAMP.

Involved in various High-level design and detail level design review meetings.

Involved in designing the workflow for the whole project.

FTP the new and existing Ab Initio applications to the local host.

Involved in various handshake meetings between Fannie Mae and external partners.

Supported testing efforts in both system test and acceptance environments.

Supported various releases in Production as well as acceptance environments.

Designed the file transfer B2B protocols with different recipients using Flex File Exchange

Involved and supported Ad hoc Database Modification and File Modification processes.

Developed process for Cleaning up the ETL UNIX file system and BOXI Wintel file system.

Developed Data Quality mappings using Informatica Data Quality tool.

Scheduled the calendar jobs using the Autosys Scheduler

Involved in process design as per Risk and Compliance Standards

Environment: ETL, Ab Initio (GDE 1.15, Co>Op 2.15), Ab Initio Enterprise Meta Environment, Mainframe, Oracle 10g, Clear Case, Clear Quest, Remedy, Autosys, UNIX, Business Objects XI R3, Informatica Power Center 9.1, Designer, Workflow Manager, Workflow Monitor, and Repository Manager

Wachovia – Charlotte, NC Sept 2005 – Dec 2006

Senior ETL Technical Lead

With the growth of Wachovia, a critical need arose to monitor internally held securities at an enterprise level. This project provides an automated, proactive approach to managing the processes, controls, and reporting associated with security collateral management.

Responsible for gathering business and functional requirements.

Involved as Designer and Developer for Securities Collateral Tracking (SCT).

Development of source data profiling and analysis – review of data content and metadata will facilitate data mapping and validate assumptions that were made in the business requirements.

System extract processes – CDMG will develop the data requirements for each system of record extract and will work with business and IT support staff to create and/or modify extract processes.

Testing and data validation – each extract process will be validated, and data elements within each file will be reviewed to ensure data quality prior to passing the file to Risk Management.

Involved in various EME data store operations like creating sandbox, code check-in, and code check-out according to the environment settings for the Securities Collateral Tracking (SCT) application.

Developed Ab Initio graphs and high-level design and detail design document for data movement.

Developed Ab Initio Graphs to load the data into Loan, Loan-Coll, and Acct-Address MVS data files for RDW Group.

FTP'd the new and existing Ab Initio applications to the local host; involved in the universal cleansing process for COBOL copybooks and MVS data files.

Create cfd, pset, data, and ctrl files.

Writing functions for Date/Time list formats.

Ran development scripts to convert the COBOL copybooks into clean EBCDIC and ASCII DMLs.

Scheduling the Graphs using Autosys Scheduler.

Environment: ETL, Ab Initio (GDE 1.13.3.2, Co>Op 2.13.1), Ab Initio Enterprise Meta Environment, Mainframe, COBOL Copybooks, MVS, Harvest, Autosys, UNIX

Previous experience will be provided upon request.


