
ETL Datastage Lead /Architect

Location:
Tampa, FL, 33637
Posted:
June 07, 2008


Objective

• Take a lead position in the development of software projects.

• Establish good communication with the user/client.

• Understand user requirements and complete deliverables on time.

• Analyze, design, develop, and implement data warehousing applications and data marts using IBM DataStage, SQL Server, DB2, Teradata, and UNIX.

Summary

• Over 15 years of experience in the Information Technology industry.

• Strong skills in ETL architecture, design, and development (DataStage 8.0.1/PX/EE and Orchestrate), including performance tuning of Parallel Extender DataStage ETL jobs.

• Strong skills in the analysis, design, development, and implementation of software applications in a C/C++/UNIX/Korn shell environment.

• Strong working experience with data warehousing applications; directly responsible for the extraction, staging, transformation, pre-loading, and loading of data from multiple sources into the data warehouse.

• Strong knowledge of and experience in the data warehouse development life cycle, dimensional modeling, and slowly changing dimensions.

• Strong skills in Master Data Management (MDM) tasks such as data collection and data transformation.

• Involved in full life cycle product development

• Expertise in C++, sed, awk, Perl, and socket programming.

• Expertise in object-oriented analysis, design, and programming using C++.

• Expertise in developing multithreaded applications.

• Strong skills in developing graphical user interfaces (GUIs) using Motif and Java per user requirements.

• Led the mission-critical implementation of passenger reservation systems at all major locations in India and the networking of these sites, including the migration of data from legacy systems to a client/server environment.

• Good skills in working on Alpha systems in an OpenVMS/C/FORTRAN/DCL environment.

• Willing to work in challenging environments and determined to succeed.

• Interact well with users/clients, understanding their requirements and attending to their requests.

• Independent problem solver and a good team player with strong interpersonal and communication skills.

Technical Skills

Operating Systems: HP-UX B.11.11, AIX 5.2 (UNIX), Solaris 2.6-7, Linux, VMS, Windows XP

ETL: DataStage 8.0.1, DataStage 7.5.1A, Orchestrate (Parallel Extender)

Languages: C, C++, Visual C++, VB, FORTRAN, SQL, PL/SQL, Pro*C

UNIX: Korn/C shell scripting, make, awk, sed, Perl, cron, tar, vi, regular expressions, UNIX internals (IPC, TCP/IP sockets, POSIX threads, signals, memory management, I/O subsystem)

Java/Internet: J2EE, Servlets, RMI, HTTP, JDBC, multithreading, sockets, Applets, AWT, Swing, HTML, DHTML, CGI, XML, XSL, JavaScript, VBScript, JSP, EJB

Hardware: Alpha/VAX, PDP-11, Pentium-class PCs (Pentium, PII, PIII)

Networks: Sockets, TCP/IP, DECnet on OpenVMS

Middleware: RTR (Reliable Transaction Router), CORBA

APIs: Trillium Software, SyncSort

Tools: Rational ClearCase 6.0, Visio 2000, ERwin

Databases: Teradata, DB2, Oracle, SQL Server 2005

Professional Experience

Baxter Healthcare, Round Lake, IL August ’07 – Present

Sr. Software Programmer/Architect

Global Sales Transparency

The objective of the GST (Global Sales Transparency) data warehouse project is to collect global item sales and cost data from each country, make it available in the GST repository, conform it into single reportable sales and cost tables, and cleanse and populate it into dimensional fact tables. All attributes related to the fact table (Global Item, Company, Geography, etc.) are populated into dimension tables. The GST data mart (facts and dimensions) is in turn analyzed and used by the Business Objects group for reporting and for business and marketing decisions.

Responsibilities

• Involved in the design of the GST architecture and code reviews of DataStage jobs, and implemented best practices in DataStage jobs.

• Designed DataStage jobs, job sequences, and shared containers to implement the business requirements and design specifications.

• Worked as DataStage administrator: defined projects, environment variables, user groups, and privileges, and set up the Dev/QA/Prod environments. Created master controlling sequencer jobs using the DataStage Job Sequencer.

• Worked with various operational sources, including Teradata, DB2, SQL Server 2005, and flat files, extracting data into a staging area.

• Used shared containers across multiple jobs that share the same business logic.

• Worked extensively with job sequences, using Job Activity, Email Notification, Sequencer, and Wait For File activities to control and execute DataStage parallel jobs.

• Developed functional specifications for the data extraction, transformation, and load processes.

• Designed and developed UNIX shell scripts for file validation and for scheduling DataStage jobs (a sketch follows this list).

• Developed DataStage jobs; defined job parameters and reference/range lookups.

• Used the CDC (Change Data Capture) stage extensively to implement slowly changing dimension and fact tables.

• Used stages such as Transformer, Sequential File, Aggregator, Data Set, File Set, Remove Duplicates, Sort, Join, Merge, Lookup, Funnel, Copy, Modify, Filter, Change Data Capture, Surrogate Key, External Source, External Target, and Compare; scheduled jobs through UNIX shell scripts using the UNIX cron utility.

• Performed debugging, troubleshooting, monitoring, and performance tuning in DataStage.

• Used different Parallel Extender partitioning techniques in the stages to achieve the best parallelism in Parallel Extender jobs.

• Modified the configuration file according to space constraints.

• Wrote technical specification documents and designed process diagrams.

• Made extensive use of restartability techniques in the job sequencers.

• Provided standard documentation, best practices, and common ETL project templates.
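
Below is a minimal sketch of the kind of validation-and-scheduling wrapper described above. The project name (GST_DEV), job name (seqLoadFacts), parameter, and paths are hypothetical; dsjob is DataStage's standard command-line client, though exit-code conventions should be verified against the installed release.

    #!/bin/ksh
    # Validate the day's extract file, then launch the DataStage load sequence.
    # Project, job, parameter, and path names are illustrative only.

    LANDING=/data/gst/landing
    FILE=$LANDING/global_sales_$(date +%Y%m%d).dat

    # Basic file validation: the file must exist and be non-empty.
    if [[ ! -s $FILE ]]; then
        print -u2 "ERROR: extract file $FILE missing or empty"
        exit 1
    fi

    # Run the master sequence; -jobstatus makes dsjob wait and return the
    # job's finishing status (typically 1 = OK, 2 = finished with warnings).
    dsjob -run -jobstatus -param SourceFile=$FILE GST_DEV seqLoadFacts
    rc=$?
    if [[ $rc -ne 1 && $rc -ne 2 ]]; then
        print -u2 "ERROR: seqLoadFacts ended with status $rc"
        exit 1
    fi

An illustrative cron entry such as 30 2 * * * /opt/etl/bin/run_gst_load.ksh >> /var/log/gst_load.log 2>&1 would run the wrapper nightly at 02:30, matching the cron-based scheduling mentioned above.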

Environment: DataStage Enterprise Edition 8.0.1 (IBM WebSphere DataStage), C/C++, Orchestrate (ETL Parallel Extender), UNIX (Sun Solaris), Korn shell scripts, vi, SFTP, Perl, SQL Server 2005, MS Visio 2003, Oracle 9i

Wells Fargo Capital Markets, Clayton, MO Oct ’06 – July ‘07

Sr. Software Programmer

Capital Markets Operational Data Store (CMODS)

The ODS (Operational Data Store) is the core component of the CMODS system; it receives and integrates data from all the feeder systems. The Capital Markets Operational Data Store (CMODS) system was developed to support the lending requirements for pooling/delivery and asset sales. CMODS is currently used primarily by the Finance and Pipeline/Warehouse Asset Valuation (PWAV) groups; data extracts are also provided to the Servicing Portfolio Management (SPM) and Asset Sales groups. CMODS merges multiple loan origination systems into a single foundational repository. The ODS contains normalized data from origination and servicing systems for loans from rate lock to settlement; this data is used for analysis and decision making in valuation and profit-and-loss diagnostics. The data mart contains historical data to support Finance reporting and analysis to isolate undiagnosed P&L. Business Objects is used as the reporting and query tool, supported by daily and monthly reports for the Finance group.

Responsibilities

• Designed DataStage jobs, job sequences, containers, and reusable components to implement the business requirements and design specifications.

• Wrote technical specification documents and designed process diagrams.

• Developed functional specifications for the data acquisition, transformation, and load processes.

• Developed Network Data Mover (Connect:Direct) scripts to transfer data files to and from the mainframe data source and other originating feeder systems.

• Developed DB2 Export/Load scripts for the data extract and load process, and created spreadsheet-format data files using the DB2 Export utility (a sketch follows this list).

• Created DB2 triggers and SQL scripts.

• Developed UNIX shell scripts for data file processing and process integration tasks.

• Developed DataStage jobs; defined job parameters, reference lookups, and filter criteria.

• Used DB2, Sort, Join, Merge, Filter, Funnel, and Modify stages; scheduled jobs through UNIX shell scripts via the UNIX cron utility.

• Responsible for DataStage project export/import and for creating the release patches.

• Prepared unit, integration, and performance test plan documents; performed testing per the approved plans, documented test results, and escalated issues to the manager.

• Supported users during functional, UAT, regression, and performance testing.

• Developed appropriate documentation for the ETL jobs.
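
As an illustration of the DB2 Export/Load scripting mentioned above, here is a minimal sketch using the DB2 command-line processor. The database, table, column, and file names are hypothetical, not the actual CMODS objects.

    #!/bin/ksh
    # Export loan records to a comma-delimited (DEL) file, then bulk-load
    # them into a staging table. All object names are illustrative.

    db2 "CONNECT TO cmods"

    # DEL format yields delimited text that opens directly in Excel.
    db2 "EXPORT TO /data/extracts/loans.del OF DEL
         SELECT loan_id, rate_lock_dt, settle_dt, upb FROM ods.loan"

    # NONRECOVERABLE skips logging for faster bulk loads; the table must
    # be reloaded from source if the load fails mid-way.
    db2 "LOAD FROM /data/extracts/loans.del OF DEL
         INSERT INTO stg.loan NONRECOVERABLE"

    db2 "CONNECT RESET"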

Environment: DataStage 7.5.2, C/C++, Orchestrate (ETL Parallel Extender), UNIX (AIX 5.2), Korn shell scripts, vi, SFTP, Perl, DB2 UDB 8.2/9.0, DB2 Load/Import/Export utilities, WinSQL, NDM/Connect:Direct, Rational ClearCase 6.0, MS Visio 2003, PAC2000 change control tool

Verizon, Tampa, FL Oct ’02 – Sep ’06

Sr. Software Programmer/Lead (DataStage Architect)

Active Enterprise Data Warehouse (AEDW)

The objective of this project is to develop the Active Enterprise Data Warehouse (AEDW) for Verizon customers (billing and service order data), providing a single source of customer data across the Verizon footprint, including the former Bell Atlantic and former GTE legacy systems. This data is in turn analyzed to make business and marketing decisions. The data is sourced from multiple billing and service order systems (VZ450, NOCV) and is housed in the centralized AEDW.

Responsibilities

• Analyzed the source and the target systems to aid the design of the Enterprise Data Warehouse.

• Fully responsible for the coding and implementation of the VZ450 billing data feed into the AEDW environment.

• Participated in the high-level design of the ETL system and mapping of business rules

• Developed ETL extraction routines to extract data from the Teradata database using Teraread, and from XML files using the DataStage XML input stage.

• Developed master key process routines and automated key updates in DataStage.

• Migrated RCAD jobs from the Orchestrate environment to the DataStage environment.

• Conducted performance analysis and optimized the DataStage/Orchestrate jobs.

• Developed code to process large volumes of data (terabytes) using DataStage/Orchestrate.

• Developed low-level design to depict the functionality of each individual system.

• Developed the functions to manage the extract, transform, and load of the EDW using a combination of C/C++, Korn shell scripts, DataStage 7.5, and Orchestrate Parallel Extender.

• Conducted unit testing and system testing of the individual modules.

• Aided system and production support in fixing post-release glitches.

• Developed MLOAD and BTEQ scripts in the HP-UX environment (a sketch follows this list).

• Developed the production run scripts for the DataStage project.

• Involved in DataStage administration activities in the UNIX environment.

• Mentored the team on programming DataStage/Orchestrate parallel jobs; developed presentation and training materials.

• The project team comprised 40 software developers.

• Led a team of 8 developers and was involved in the development of the project from scratch to implementation (full lifecycle).
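
A minimal sketch of a BTEQ extract script of the kind mentioned above, driven from a Korn shell here-document. The TDPID, logon, table, column, and file names are hypothetical.

    #!/bin/ksh
    # Nightly Teradata extract via BTEQ; logon details and object names are
    # illustrative (credentials would normally come from a secured file).

    bteq <<'EOF'
    .LOGON tdprod/aedw_batch,secret;

    .EXPORT REPORT FILE = /data/aedw/vz450_billing.out;

    SELECT acct_id, bill_dt, bill_amt
    FROM   aedw.vz450_billing
    WHERE  bill_dt = CURRENT_DATE - 1;

    .EXPORT RESET;
    .LOGOFF;
    .QUIT;
    EOF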

Environment: DataStage 7.5.1A, C/C++, Orchestrate (ETL Parallel Extender), HP-UX B.11.11 (UNIX), QualityStage 7.5, ProfileStage, XML, NCR Teradata SQL 6.2, MLOAD, FASTEXPORT, BTEQ, Korn shell scripts, vi, SFTP, Perl, ERwin, Visio 2000, Lotus Notes, CMIS (Configuration Management Information System), Continuus

IBM, RTP, NC Oct ’00 – Sep ‘02

Sr. Software Programmer/Analyst

CDB/CKI (Customer Key Integration)

Customer Key Integration (CKI) is one of the processes of the IBM Customer Information initiative. The objective of CKI is to create an update process that maintains unique reference numbers at the customer establishment/site and contact levels, providing a single enterprise view of IBM customers. This view follows a site-and-contact model that supports IBM business objectives. The process flow starts with the creation of the standard input file, which is the input to the CKI process; the CKI process then creates the master reference file / cross-reference file, which is used in the post-processing of CDB.

Responsibilities

• Fully responsible for CKI development and enhancements.

• Designed and developed new features and enhancements using C/C++.

• Involved in the design and development of data structures using OO methodologies (data encapsulation, inheritance, and polymorphism).

• Developed the CKI run flow for various processes using Korn Shell

• Extensively modified the existing Korn Shell programs to add new functionality to the CKI application

• Developed programs in C++ to eliminate duplicate records from different sources (see the sketch after this list).

• Developed tools for the analysis of CKI results using Korn shell and Perl.

• Involved in the design and review of the transactional CKI process.

• Performed system-level integration of various components.

• Performed extensive unit and integration testing

• Provided installation and training support for the product
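
The production dedupe was implemented in C++ with real matching logic; purely to illustrate the idea in the shell environment also used on this project, the sketch below collapses exact duplicates keyed on a hypothetical site-id field, keeping the first occurrence per key.

    #!/bin/ksh
    # Illustration only: drop records whose key field (field 1, '|'-delimited)
    # repeats, keeping the first occurrence. The file name and field layout
    # are hypothetical; the real CKI matching was a C++ program.

    sort -t'|' -k1,1 merged_sources.dat |
    awk -F'|' '$1 != prev { print; prev = $1 }' > deduped.dat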

Environment: C, C++, Korn shell script, vi, SFTP, Perl, AIX (UNIX), Java, XML, Trillium (API), Relationship Guardian, SyncSort software, CMVC (Configuration Management Version Control), cron scheduler

Indian Railways, India (www.indianrail.gov.in) Jan ’95 – Mar ’00

Senior Software Programmer/Analyst

COUNTRYWIDE NETWORK FOR COMPUTERIZED ENHANCED RESERVATION AND TICKETING (CONCERT)

The objective of this project is to build a network of railway passenger reservation systems (PRS) across the five major cities in India. The computerized passenger reservation system in this networking environment provides convenience to passengers by enabling them to obtain a reservation on any train, in any class, from any counter at any location where computerized reservation terminals are provided. To achieve this objective, a distributed approach has been used to implement the global reservation facility. CONCERT has a client/server architecture enabling distributed transaction processing on RTR (Reliable Transaction Router, a message middleware and TP monitor that is data driven and fault tolerant and provides a two-phase commit protocol and network transparency to the application). The clients were designed to provide services to the passengers, and the servers to provide services for the global reservation facility. This software has been implemented at five locations of Indian Railways, and these sites have been networked over 64 Kbps leased lines using Cisco routers.

Responsibilities

• Involved in the analysis and design of the on-line clients, the IVRS (Interactive Voice Response System), and the fare modules of the CONCERT application.

• Designed, developed, and implemented the Interactive Voice Response System using C++.

• Designed, developed, and implemented the Reservation Availability Position and Information module using C.

• Technical lead and project lead for the project.

• In charge of collecting and analyzing all requirements from the customer and management.

• Developed technical design and requirements analysis.

• Designed and developed the on-line functions of the CONCERT application; this module comprises 60 online functions providing various facilities to passengers, including foreign tourist reservation, cancellation, and enquiry clients.

• Designed and developed the credit card subsystem for validation of hot-listed credit card numbers.

• Involved in the design and development of the fare module of CONCERT.

• Wrote several shell scripts for backup and batch procedures (a sketch follows this list).

• Performed system-level integration of various components, including IVRS, display systems, clients, the credit card server, and the fare module of CONCERT.

• Performed extensive unit and integration testing

• Provided installation and training support for the product

• Developed the server-side programs for an Internet enquiry website using Java.

• Involved in updating the database and generating reports.
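
A minimal sketch of the kind of backup script referred to above, with hypothetical paths and a seven-day retention chosen for illustration.

    #!/bin/ksh
    # Nightly archive of application data; paths and retention are illustrative.

    STAMP=$(date +%Y%m%d)
    tar -cf /backup/concert_$STAMP.tar /app/concert/data

    # Keep only the seven most recent archives.
    ls -1t /backup/concert_*.tar | tail -n +8 | while read f; do
        rm -f "$f"
    done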

Environment: C, C++, FORTRAN, Java, TCP/IP, UNIX, shell scripting, Perl, RTR, OpenVMS

Indian Railways, India July ’94 – Dec ’94

An Expert System for Dynamic Allocation of Facilities at Stations

It is a rule-based system designed and developed to facilitate the allocation of facilities such as platforms, washing lines, and stabling lines to trains arriving at and/or departing from a railway station. The two major tasks are the generation of the master berthing chart, which changes periodically with the train schedules, and the on-line re-allocation of trains when the regular schedule is disrupted by unforeseen circumstances.

Responsibilities:

• Involved in the analysis, design and development of the system

• Developed the complete user interface in Visual C++.

• Involved in coding, testing and implementation of the system

• Performed extensive unit and integration testing

• Provided installation and training support for the product

Environment: C, Visual C++, Windows NT

Indian Railways, India Dec ’92 – June ’94

Visual Interactive Simulator for Train operations and performance

It is a simulator incorporating operations research and discrete-event simulation techniques for planning train operations and analyzing train performance. The visual interactive simulator provides dynamic train charting, depicting the movements of all passenger and freight trains.

Responsibilities:

• Involved in the design and development of the graphical user interface in OSF/Motif.

• Developed the plotting routines for the CalComp plotter.

• Wrote several shell scripts to automate system-related processes.

• Performed extensive unit and integration testing

Environment: C, UNIX, OSF/Motif, X Windows

BN Railroad and Indian Railways, India Nov ’91 – Nov ’92

Line Capacity Model

This is a computer-based simulation model developed initially by the BN railroad (USA). It was written in FORTRAN and SIMSCRIPT on an IBM 3090 in batch mode. The project included porting this software to DEC workstations, which involved rewriting some portions of the program in C and modifying programs to increase the model's scope and applicability so that it could be used by other railways.

Responsibilities:

• Involved in the development of a graphical user interface using SIMGRAPHICS, C, and SIMSCRIPT II under X Windows on a DEC Workstation 5125.

• Designed and developed the plotting routines for statistical reports.

• Involved in system study, analysis, design, and coding.

• Wrote several shell scripts.

Environment: C, FORTRAN, UNIX under X Windows

Indian Railways, India Jan ’90 – Oct ’91

Locomotive Information Network

The Locomotive Information Network was designed to assist in the maintenance of locomotives and to generate loco failure analysis and loco punctuality reports. All locations are connected through railway hotlines. The system at central control is used for day-to-day analysis of reports and for long-term planning. The loco shed systems aim at monitoring shed activities to help managers make the right decisions.

Responsibilities:

• In charge of collecting and analyzing all requirements from customers and management for the module.

• Developed technical design and requirements analysis for various control modules

• Involved in the development of MIS reports

• Wrote several shell scripts.

• Performed extensive unit and integration testing

• Provided installation and training support for the product

Environment: UNIX, C, shell scripting

Education

Master's in Computer Science and Engineering


