
ETL Developer

Plano, TX, 75093
Around $55K per annum; negotiable
September 16, 2008





• * years of experience in the IT industry, in the credit card and media sectors

• Experience in software design, development, analysis, testing, documentation and reporting

• 2 years of experience in Ab Initio, SQL and Unix shell scripting

• Hands-on experience with ETL (extraction, transformation and loading) strategies for high-volume Data Warehousing projects

• Experience working with heterogeneous source systems such as Oracle and flat files

• Extensive experience with EME for version control, impact analysis and dependency analysis

• Extensive experience in Korn shell scripting to maximize Ab Initio data parallelism using Multi File System (MFS) techniques

• Excellent programming and problem-solving skills, with experience in C and C++

• Good experience in Scheduling, Production Support and Troubleshooting


Industries: Financial Services (Credit Cards), Media, Contact Center

Clients: Discover Financial, Chicago Tribune

Operating Systems: Microsoft Windows, MS-DOS, Unix, Linux

Languages: C, C++ (multi-threaded programming), SQL, Unix shell scripting, COBOL, Visual Basic 6.0

Tools: Ab Initio GDE 1.13 and 1.14 (PDL), Tivoli Systems – JS Console (scheduling tool), Business Objects, EME 2.1x

Databases: Oracle 9i


• Strong analytical, problem solving, learning and team skills

• Track record of systematic, productive and high quality delivery

• Excellent communication skills


Discover Financial Services (Credit Card Division), Chicago, IL Oct ’07 – present

ETL Developer

Development of tracking application for IVR (Interactive Voice Response) call routing

The client is one of the largest credit card companies in the US, both in terms of credit card transactions as well as card issuances.

The objective of the project was to improve customer satisfaction by tracking customer phone calls and thereby gradually raising customer service quality. Call-parameter information was extracted and reported on a daily, weekly and monthly basis. The application took data from 3 IVR servers containing call details such as customer information, location, wait time, talk time, hold time, start time, end time, call route, department, server name and destination. After processing, the application loaded the data into the production Oracle tables, where it was used for aggregation, analysis and the generation of MIS reports for further business analysis.


• Developed tracking and reporting application in Ab Initio

• Analyzed the specification provided by the client

• Extensively worked in UNIX environment using UNIX Shell Scripts to catalog the daily and monthly files

• Populated graph variables in configuration files using UNIX shell scripts

• Used UNIX Shell Scripts to write wrappers


• Involved in designing fact, dimension and aggregate tables

• Created required documents like Mapping Documents

• Made extensive use of lookup files when combining data from multiple sources

• Created and maintained Ab Initio graphs using components such as Transform, Partition, De-partition, Dataset, Database, Reformat, Rollup, Scan, Join, Sort, Normalize, Fuse, Input Table, Output Table, Validate and FTP

• Created Ab Initio Multi File Systems (MFS), using the m_* shell commands, so that multiple partitions of data could be processed in parallel

• Worked extensively on ORACLE tables.

• Modified the existing table structures by adding, deleting and modifying several columns for better reporting

• Wrote the .dbc files for the Development, Testing and Production Environments.

• Involved in mapping data in ORACLE tables by executing complex SQL queries and subqueries

• Handled database issues while loading data into the Oracle tables

• Involved in removing duplicates from tables to eliminate primary key violation and unusable indexes

• Involved in Unit testing and integration testing.

• Wrote test scripts using UNIX Shell Scripts

• Led the team to complete the assigned Modules

• Promoted code to UAT and Production environments

• Verified release manuals and other related documents

• Communicated directly with client

• Involved in Change Management, Production Support, and Troubleshooting.

• Scheduled Ab Initio jobs on JS Console after each successful release

• Involved in maintenance of the jobs
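The configuration-file and wrapper-script bullets above describe a standard pattern: a config file populates the variables a graph reads, and a wrapper sources it and logs the run. A minimal sketch follows; every file name and variable (ivr_tracking.cfg, RUN_DATE, IVR_SERVERS) is illustrative rather than taken from the actual project.

```shell
#!/bin/sh
# Sketch of a graph wrapper (ksh-style, POSIX-sh compatible).
# The config file holds the variables the graphs read.
cat > /tmp/ivr_tracking.cfg <<'EOF'
export RUN_DATE=20080916
export IVR_SERVERS="ivr01 ivr02 ivr03"
EOF

. /tmp/ivr_tracking.cfg           # populate graph variables from config

log=/tmp/ivr_tracking.log
{
    echo "START $(date) run_date=$RUN_DATE"
    for srv in $IVR_SERVERS; do
        # placeholder for the actual pull/catalog of each server's daily file
        echo "cataloging daily file from $srv"
    done
    echo "END $(date) rc=0"
} > "$log" 2>&1
cat "$log"
```

In the real application, the placeholder loop would invoke the deployed graph script; the wrapper's job is only to set up the environment and capture status and logs.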

Environment: Unix, AB Initio GDE 1.13, SQL, Oracle
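One recurring task above was removing duplicate rows that caused primary-key violations and unusable indexes. A common Oracle pattern keeps the row with the lowest ROWID per key value; the table and column names here (call_fact, call_id) are hypothetical.

```shell
#!/bin/sh
# Write the dedup SQL to a script file; in the project it would be run
# via sqlplus, e.g.:  sqlplus user/pass@db @/tmp/dedup.sql
cat > /tmp/dedup.sql <<'EOF'
DELETE FROM call_fact a
 WHERE a.rowid > (SELECT MIN(b.rowid)
                    FROM call_fact b
                   WHERE b.call_id = a.call_id);
COMMIT;
EOF
cat /tmp/dedup.sql
```

The correlated subquery finds, for each key value, the smallest ROWID; every other row with that key is deleted, after which the primary key or unique index can be rebuilt.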

Discover Financial Services (Credit Card Division), Chicago, IL Apr’07 – Oct’07

AB Initio Developer

Modification of existing applications to enable Enterprise Metadata Environment control

The objective of the project was to use Ab Initio features to the fullest instead of depending on the host Unix environment. Existing Ab Initio applications were converted to their EME equivalents by rewriting them in PDL (Parameter Definition Language) rather than relying on configuration files written in Unix shell scripts. Previously, the configuration files held the variable definitions used in the Ab Initio applications as well as the input/output parameter details. After the conversion, each graph obtains these details from PDL instead of from the initiating Unix configuration script.

This project helped the institution in the following ways:

• Version Control: keeping track of older versions of a graph checked in to production, so that an earlier version can be restored if a faulty change is released; this capability did not previously exist.

• Source Code Control: to keep the source code intact, an application being changed is locked, leaving other users read-only access for the duration.

• Dependency Analysis: tracking the flow of data across applications to support impact analysis of changes to the overall process


• Initially worked as a team member and developer, growing into a lead developer role guiding and mentoring junior team members

• Wrote and modified several application-specific config scripts in UNIX shell in order to pass environment variables to Ab Initio graphs

• Involved in writing wrapper scripts using UNIX Shell scripts

• Involved in writing trigger files in UNIX Shell scripts

• Converted the existing applications into the EME

• Developed new applications in EME

• Created Graph level and Project level parameters according to the requirements

• Involved in check-in and check-out of EME Projects.

• Tested the actual flow to ensure that all conditions were populated correctly and that no incorrect data was delivered

• Communicated with the client to better understand production issues and help resolve them

• Created sandboxes and projects using air utility commands

• Monitored and provided production support of all converted applications

• Provided QA and bug status reporting and removed defects

• Provided UNIX training to junior resources

Environment: Unix, Oracle, Ab Initio 1.14 PDL, GDE connectivity between EME and development server.
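The trigger-file bullets above describe a standard hand-off pattern between jobs: an upstream job drops an empty .trg file once its data file is complete, and the downstream wrapper waits for it before starting the graph. A minimal sh sketch, with an illustrative file name:

```shell
#!/bin/sh
trg=/tmp/daily_extract.trg
touch "$trg"                      # simulate the upstream job for this demo

# Poll for the trigger, giving up after 10 attempts.
tries=0
until [ -f "$trg" ]; do
    tries=$((tries + 1))
    [ "$tries" -gt 10 ] && { echo "trigger never arrived"; exit 1; }
    sleep 1
done
echo "trigger found; starting graph"
rm -f "$trg"                      # consume the trigger so reruns wait again
```

Consuming the trigger after a successful start is what makes the pattern rerun-safe: the next run blocks until the upstream job signals again.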

Discover Financial Services, Chicago, IL Oct’06 – Mar’07

Data Warehouse Developer

Enterprise Data Store System (EDS)

The Enterprise Data Store (EDS) is part of Discover's card processing system, running on AIX-based midrange servers. Transactions to be processed by EDS are downloaded from mainframes and other servers to Unix; EDS handles these data transfers between the mainframes and the AIX machines. It validates the downloaded transactions and groups them by transaction type for each account. The extracts are then uploaded back to the mainframes or other servers for processing by other systems, and are stored in the EDS Data Store.

This system consists of applications developed using Ab Initio which run on the client's AIX boxes. They are scheduled and run from the Tivoli Job Scheduler. The maintenance activity involves creating, developing and running graphs on receipt of ad-hoc requests from the client to supplement Business Intelligence. Data is pulled from a variety of sources, including mainframes, AIX boxes and the EDS data warehouse itself.


• Extensively worked in UNIX environment using Unix Shell scripts to catalog the files

• Worked on Unix Shell Scripts to develop config scripts to define the variables of the Ab-initio graphs

• Wrote run scripts in Unix shell

• Worked with COBOL copybook creating EBCDIC and ASCII DML formats

• Extensively involved in mapping of DMLs both in ASCII and EBCDIC forms.

• Handled database issues while loading data into the Oracle tables

• Performed transformations of source data with Transform Components like Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter-by-Expression, Rollup

• Worked with Departition Components like Concatenate, Gather and Merge in order to departition and repartition data from Multifiles accordingly

• Created Summary tables using Rollup, Scan and Aggregate.

• Used various SQL Queries for extracting data from Oracle tables.


• Scheduled the ETL jobs using JS Console and loaded data into the target tables from the staging area using the exchange-partition process

• Mapped DMLs in order to generate the end results

• Created Ab Initio multifile systems and shell scripts on different nodes that could process multiple partitions of data at the same time, using Ab Initio's m_* shell commands

• Used Ab Initio for Error Handling by attaching error and rejecting files to each transformation and making provision for capturing and analyzing the message and data separately

• Fixed identified bugs

• Involved in production support, identifying and fixing issues

• Provided Knowledge Transfer sessions to other Team members.

• Provided training to junior resources
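The copybook work above hinges on code-page translation between the mainframe's EBCDIC and Unix ASCII. In the project that mapping was expressed in Ab Initio DML record formats, but the same idea can be demonstrated with the standard dd(1) conversions:

```shell
#!/bin/sh
# Round-trip a small ASCII sample through EBCDIC and back using dd's
# built-in conversion tables (conv=ebcdic: ASCII->EBCDIC,
# conv=ascii: EBCDIC->ASCII).
printf 'ABC' > /tmp/ascii.txt
dd conv=ebcdic if=/tmp/ascii.txt of=/tmp/ebcdic.bin  2>/dev/null
dd conv=ascii  if=/tmp/ebcdic.bin of=/tmp/roundtrip.txt 2>/dev/null
cat /tmp/roundtrip.txt
```

In the EBCDIC file, 'A' is byte 0xC1 rather than ASCII 0x41; a DML field declared with an EBCDIC character set performs the equivalent translation record by record, guided by the COBOL copybook layout.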

Chicago Tribune, Chicago, IL

Data Warehouse Developer Jul’05 – Apr’06

Daily Report Generation

The client is Chicago's leading local newspaper. The project involved analysis and reporting on customer service and customer relationship management (CRM) related issues such as feedback, complaints, delivery issues, billing issues, promotions and discounts etc.

Data extraction was done using SQL joins against Universal Production Tables and user objects built on Oracle/DB2. Various manipulations were performed on the extracted data, and MIS reports were generated. The project used Business Objects for report generation.


• Undertook data extraction from ORACLE tables using SQL queries and subqueries.

• Generated reports using Business Objects

• Fixed identified bugs

Environment: Business Object, SQL, Oracle
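The extraction step described above, SQL joins against the production tables feeding the daily report, might look like the following; all table and column names (complaints, subscribers) are invented for the sketch:

```shell
#!/bin/sh
# Write an illustrative daily-report extraction query; in the project a
# query like this would run against Oracle and feed the Business Objects
# report layer.
cat > /tmp/daily_report.sql <<'EOF'
SELECT s.account_no,
       s.delivery_zone,
       c.category,
       COUNT(*) AS issue_count
  FROM complaints c
  JOIN subscribers s
    ON s.account_no = c.account_no
 WHERE c.logged_date >= TRUNC(SYSDATE) - 1
 GROUP BY s.account_no, s.delivery_zone, c.category
 ORDER BY issue_count DESC;
EOF
cat /tmp/daily_report.sql
```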


Bachelor of Engineering, Pune University, Pune, India
