
Data Developer

Tempe, Arizona, United States
February 23, 2018





Over * years of IT work experience in ETL development, testing, support, and administration projects using DataStage 8.0x/8.5x/11.5, SQL (IBM DB2, Oracle), and UNIX.

Extensive experience using DataStage 8.x and 11.5 parallel editions in UNIX/Windows environments.

Worked as an onsite coordinator with direct customer interaction, gathering and understanding requirements.

Managed teams of 5-10 members in development projects as module lead and team lead.

Expertise in creating complex mappings with various transformations and in developing strategies for Extraction, Transformation and Loading (ETL) using DataStage 8.x/8.5x/11.5.

Designed and developed scalable, complex solutions using an optimal number of stages.

Expertise in tuning a parallel application to determine where bottlenecks exist and how to eliminate them.

Excellent experience with DataStage Designer, creating complex mappings using stages such as Transformer, Filter, Join, Merge, Lookup, file stages, other active and passive stages, Oracle and IBM DB2 database connectors, and Aggregator transformations to pipeline data to the target.

Experienced in ETL and query performance tuning.

Experienced in DataStage administration activities such as IBM WebSphere and InfoSphere suite installations in Windows environments.

Strong experience in business analysis, design, and user requirement gathering; analyzing high-level and detailed designs, source-to-target mappings, and data transformations; identifying scope and estimates; and developing ETL jobs/sequences and SQL queries for data handling and problem solving, in the Insurance and Retail domains.

Performed data validation through unit, integration, and user acceptance testing.

Experienced in UNIX scripting.

Conducted DataStage ETL trainings and groomed associates for ETL projects.


Bachelor of Technology, Information Technology, India

IBM Certified Solution Developer – InfoSphere DataStage v8.5

DB2 9 Family Fundamentals

ITIL 2011 Foundation Certificate in IT Service Management


ETL Tools: IBM InfoSphere DataStage 11.5, IBM WebSphere DataStage 8.5/8.1

RDBMS: Oracle, IBM DB2, Sybase

Operating systems: IBM AIX 5.2/4.x, Windows 2000/2003/XP.

Languages: SQL

Third Party Tools: Citrix, Control M Scheduling Tool, SQL Developer, Plutora, Service Now



WYCAN UI Modernization

Period: January 2016 to present

LOB: Insurance

Position: Datastage Developer & Initiative Lead

Location: USA


Delivered the Data Migration Strategy document, presented the target table structure to the client, and conducted kick-off workshops with the State, followed by detailed minutes of meetings (MOMs) describing the source and target data structures and data.

The scope of UI Modernization data migration is to migrate legacy data from the data sources as declared in the data inventory (finalized by WYCAN and TCS Teams) to operate the new UI Modernization application.

The State of Wyoming provides data in segment flat files; data from these flat files is loaded into the 'intermediate' database.

Analyzing the source data for data anomalies and understanding the source data with respect to the target table structure

Collaborating with the client to obtain data in a standardized format and with good quality

Created an 'Extraction Issue' spreadsheet to inform the client and hold the respective teams accountable for poor-quality data.

Working with the client on data mapping, mapping data between the source columns and the target columns.

Exclusively using DB2 Connector stage, Sequential file stage, Transformer stage, Join stage, Lookup stage, Surrogate key stage, Remove duplicate stage, Funnel, Filter and some other stages for developing jobs to reflect the business rules.

Using Job Sequences to control the execution of the job flow using various activities like Job Activity, Email Notification, Sequencer, Routine activity and Exec Command Activities.

Used DataStage Director to schedule, monitor, cleanup resources and run jobs with several invocation ids for various run streams for a particular system.

Constructing and testing of conversion scripts/programs

Loading data into UI Modernization database after performing the necessary transformations on the source data.

Verifying and validating the data loaded in collaboration with the state

Validating and verifying the data loaded using ETL Reconciliation Programs or database scripts.

Defect Fixing and performing Root Cause Analysis on the defects by categorizing them as Extraction, Mapping or Code defects

Performing impact analysis on change requests and proposing solutions for defects, since a change or enhancement in one module might affect other modules. Proposed changes and solutions are analyzed and prioritized based on impact, time constraints, and resource constraints of the application and team.

Delivering the Strategy, Mapping, Weekly Defect Tracker and other relevant Data Migration Documents

Documenting the Database Changes, Extraction Issues, Performance Improvement steps, Review Comments, Best Practices, Verification and Validation scenarios during the Development Phase and Defects during Testing Phases.
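The count-based verification and reconciliation described above can be sketched in shell. This is a minimal, hypothetical example, not code from the project: the file names and the audit-file layout (table name and rows-loaded count per line) are illustrative assumptions.

```shell
# Hypothetical sketch: compare the record count of a source segment flat
# file against the total rows reported loaded in a load-audit file, to
# flag reconciliation gaps. File names/layouts are illustrative only.
reconcile_counts() {
    src_file="$1"      # source segment flat file (one record per line)
    audit_file="$2"    # audit file: "<table> <rows_loaded>" per line
    src_count=$(wc -l < "$src_file" | tr -d ' \t')
    loaded_count=$(awk '{ t += $2 } END { print t+0 }' "$audit_file")
    if [ "$src_count" -eq "$loaded_count" ]; then
        echo "RECONCILED: $src_count records"
    else
        echo "MISMATCH: source=$src_count loaded=$loaded_count"
    fi
}
```

In practice the same comparison would typically run against database row counts rather than flat files, but the shape of the check is the same.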


EDW Support

Period: April 2015 to Jan 2016


Position: DataStage Support & Enhancements

Location: USA

The “Target Legacy .Com” team supports around 45 non-production applications, almost all of which interact with the downstream database system, EDW. We worked on Control-M scripts, ETL job flows, and performance tuning, providing analysis and resolutions. Processes are based on ITIL.


Analyzing jobs on data performance issues; performance tuning long running jobs.

Implementing CRs after analyzing the impact to the application, making changes to jobs wherever necessary.

Working with scheduling tools like Control M Enterprise Manager

Coordinating with other teams, both internal and external, engaging in the process, driving incidents and issues to resolution, and handling escalations.

Independently performing analysis of issues, participating in the development of potential solutions, and making recommendations to ensure accurate and timely resolution.

Encouraging performance tuning of ETL jobs using parallel pipelining techniques and proper use of stages.

Understanding ETL Job Flows between applications through jobs scheduled in Control M and providing a quicker resolution.

Worked with DataStage Designer to import/export metadata from database, jobs and routines between DataStage projects

Implemented Error Handling scenarios in Datastage jobs to write the rejected records to a table.

Performed debugging, troubleshooting, monitoring and performance tuning using DataStage.

Used different Parallel Extender Partitioning techniques in the stages to facilitate the best parallelism in the Parallel Extender jobs.

Used Datastage director to run and monitor the jobs for performance statistics

Performed performance tuning of jobs by interpreting performance statistics of developed jobs


Understanding, analyzing and documenting requirements, asking the right questions to the Client

Single Point of Contact for the Client and Onsite Coordinator for Offshore Team

Constantly questioning the status quo, evaluating current processes, and identifying newer, simpler solutions.

Reporting and abiding by industry standards, following ITIL principles.

Moderating between different Interfacing Teams, identifying issues, prioritizing the solutions based on time and resource constraints in the Project/Team

Knowledge Management:

Collecting and creating knowledge articles for issues, removing the need to analyze and resolve them from scratch the next time they occur. Spearheaded this initiative across the support team and successfully created a database of issues integrated into the ServiceNow (ITSM) tool.
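Running jobs with multiple invocation ids for different run streams, as mentioned in the support work above, can be sketched with the `dsjob` command-line interface. The project and job names here are illustrative, and the function defaults to a dry run that only prints the commands it would issue:

```shell
# Hypothetical sketch: launch one instance of a multi-instance DataStage
# job per run stream, using invocation ids (jobname.invocationid) via the
# dsjob CLI. Project/job names are illustrative; with DRY_RUN unset or 1,
# the commands are printed instead of executed.
run_streams() {
    project="$1"; job="$2"; shift 2
    for stream in "$@"; do
        # -jobstatus waits for completion and reflects it in the exit code
        cmd="dsjob -run -jobstatus $project $job.$stream"
        if [ "${DRY_RUN:-1}" -eq 1 ]; then
            echo "$cmd"
        else
            $cmd
        fi
    done
}
```

A real wrapper would also check each exit status and alert on failure; the sketch only shows how the per-stream invocation ids are formed.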

Client: AIG

The GCI Program delivered the functionality to process financial transactions within OneClaim as its final phase of delivery for the UK Commercial Automobile Line of Business rollout, and hence required all financial information to be present in OneClaim.

To ensure that all claims that can be processed are processed in OneClaim, the claims data for those claims specifically excluded from the R1 migration should also be in the new system, to avoid dual-system working practices. This drove the need to migrate both claim and financial data from the back-end system (GOALD) to the front-end OneClaim system.


Period: June 2013 to March 2015


Position: Developer


Extensively used DataStage Designer to develop various parallel jobs to extract claims related data, policies related data, provider and practitioners related data and to do necessary transformations and load into target tables or send output files.

Used Datastage Director to schedule, monitor, cleanup resources and run jobs with several invocation ids for various run streams for a particular system.

Worked with DataStage designer to import/export metadata from database, jobs and routines between Datastage projects

Developed various SQL scripts for extracting the source data

Design DataStage Jobs for One time, Daily & Biweekly Incremental Loading.

Developed SQL scripts for validating the data loaded in to target tables and also prepared flow charts, design documents and unit test documents.

Wrote UNIX shell Scripts for file validations and scheduling DataStage jobs and for preprocessing and post processing of the files.

Extensively worked on parallel jobs using various stages such as Join, Transformer, Lookup, Pivot, Modify, InfoSphere Change Data Capture, Difference, Filter, Funnel, Copy, Sort, FTP, Sequential File, Complex Flat File, Data Set, DB2 Enterprise/Connector, XML Stage, Merge, Aggregator, Row Generator, Shared Container, and Remove Duplicates.
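The file validation and preprocessing scripts mentioned above can be sketched in shell. This is a minimal, hypothetical example: the header/trailer layout (an HDR first line, detail records, and a TRL|&lt;count&gt; trailer) is an illustrative assumption, not the project's actual file format.

```shell
# Hypothetical sketch: validate an inbound flat file before a DataStage
# job picks it up. Assumes an illustrative layout with a HDR first line,
# detail lines, and a pipe-delimited TRL|<count> trailer line.
validate_file() {
    f="$1"
    [ -s "$f" ] || { echo "FAIL: $f missing or empty"; return 1; }
    expected=$(tail -n 1 "$f" | awk -F'|' '$1 == "TRL" { print $2 }')
    [ -n "$expected" ] || { echo "FAIL: $f has no TRL trailer"; return 1; }
    total=$(wc -l < "$f" | tr -d ' \t')
    actual=$((total - 2))      # exclude the HDR and TRL lines
    if [ "$actual" -eq "$expected" ]; then
        echo "OK: $f ($actual detail records)"
    else
        echo "FAIL: trailer says $expected, file has $actual"
        return 1
    fi
}
```

A preprocessing wrapper would typically run this check and only then move the file into the job's landing directory.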

GCI Release 3.1 – Data Migration

Period: December 2010 to May 2013


Position: Developer


Processing the data through three stages: loading the stage-in tables, manipulating the data, loading the stage-out tables, and then loading the target tables.

Participated in technical workshops and discussions with the AIG client.

Prepare and maintain Technical Design Documents of data stage jobs.

Solve the data related and design related issues raised by QA Team.

Ensuring proper end to end delivery and correctness of the Output data.

Used the Hierarchical XML transformer stage and MQ stages to prepare the item XMLs and send them to downstream systems per business needs.

Maintain the coding standard.

Designing ETL jobs & sequences to extract, load datasets as per the ETL data mappings with my own HLD and LLD.

Wherever required, interpreting business requirement specifications into detailed ETL data mappings.

Designing, documenting and executing ETL system and integration test plans, working along with testing team, gathering test data wherever required

Major modules: Load & Unload Stage In and Stage Out tables, Currency Conversion Logic, Enumeration Logics.

Involved in Unit Testing, Functional Testing, Integration Testing and Regression Testing

Review of code, technical trouble shooting and debugging the code for defects.

Broad understanding of dimensional data modeling, including Star schema, Snowflake schema, and fact and dimension table design through normalization techniques.
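The currency conversion module listed above can be sketched as a shell filter over a pipe-delimited stage extract. This is purely illustrative: the record layout, the fixed rate table, and the target currency are assumptions for the sketch (the real logic lived inside DataStage jobs against the stage tables).

```shell
# Hypothetical sketch of a currency-conversion step between stage-in and
# stage-out: convert claim amounts to GBP using an illustrative, hard-coded
# rate table. Input lines: claim_id|currency|amount
convert_to_gbp() {
    awk -F'|' '
        BEGIN { rate["GBP"] = 1.0; rate["USD"] = 0.80; rate["EUR"] = 0.85 }
        # known currency: emit claim_id|GBP|converted amount
        $2 in rate { printf "%s|GBP|%.2f\n", $1, $3 * rate[$2]; next }
        # unknown currency: emit an error record for the reject path
        { printf "%s|ERR|unknown currency %s\n", $1, $2 }
    '
}
```

In a production job the rates would come from a reference table with effective dates, and rejected records would be routed to an error table rather than inline ERR rows.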

Contact Details:

Ph: 612-***-****

