
ETL DataStage Developer

Location:
Texas City, TX
Salary:
$60/Hr
Posted:
March 02, 2018


Resume:

K. Ranjith Kumar

ac4n9i@r.postjobfree.com – 408-***-****

Professional Summary:

9+ years of experience as an ETL Developer using DataStage Parallel Extender.

Hands-on experience with DataStage 9.1, 8.5, 8.1, and 7.5.2 as an ETL tool for extracting data from sources such as databases, flat files, and Excel sheets and loading it into target systems.

Skilled in designing, developing, documenting, and testing ETL mappings and parallel jobs in DataStage to populate target tables.

Familiar with the DataStage client components: Designer, Director, and Manager.

Experienced in the full system development life cycle (SDLC) of enterprise warehouses, with a strong understanding of data warehouse concepts and dimensional modeling.

Coordinated with onshore teams to gather requirements and deliver code on time.

Worked extensively on data cleansing and data scrubbing transformations to improve data quality, as well as on data transformation, data loading, and data type conversion.

Well versed in parallel stages such as Dataset, Sequential File, Transformer, Filter, Join, Lookup, Merge, Sort, Funnel, Aggregator, Copy, and Surrogate Key Generator.

Extensively used database stages: Oracle, ODBC, Sybase Enterprise, and Dynamic RDBMS.

Used real-time stages such as the XML stage to read data from XML/JSON files.

Proven track record of troubleshooting DataStage jobs and addressing production issues, including performance tuning and enhancements.

Experienced in building on the highly scalable parallel processing infrastructure of DataStage Parallel Extender.

Good knowledge of SQL and UNIX.

Able to work effectively and efficiently both in a team and individually, with excellent interpersonal, technical, and communication skills.

Completed a POC project to call APIs from DataStage and read JSON files as input.

Trained on Hadoop, with knowledge of HDFS, Pig, Sqoop, and Hive.

Willing to adapt to new environments and learn new technologies.

Technical Skills:

ETL : DataStage Parallel Extender 7.5/8.1/8.5/8.7/9.1

Reporting : Tableau 8.x/9.x, Qlik Sense

RDBMS : Oracle 9i/8i, Sybase, Sybase IQ, SQL Server

Languages : C, C++

O/S : UNIX, MS Windows XP/2003/7

Tools : UltraEdit, PuTTY, WinSCP, HPSM, ALM

Projects Profile:

Project #1

Project : Appetize

Client : MGM Resorts

Role : ETL Developer

Duration : Aug 2017 – Present

Project Description: After the successful completion of the Zenoti project, I was assigned a similar project to extract data from another vendor, Appetize. Working with APIs for data extraction was a project that opened a pathway for many other vendors to work with our team. This project pulls data from the parking vendor Appetize through its APIs using DataStage and loads it into the database.

Responsibilities:

Created tables in SQL Server based on the data dictionary.

Used the Execute Command activity to download the JSON files from the APIs with the curl command.

Imported metadata using the XML stage.

Used the XML stage to read the JSON files and load them into the database.

Created a loop in the sequencer jobs to fetch all order-related data for a given day, since each file holds only one hundred orders (a sketch of this loop follows the project below).

Exported/imported ISX packages using IBM Information Server Manager.

Performed unit testing on the developed jobs to ensure all fields are populated as per the data dictionary.

Environment: DataStage 9.1, SQL Server, UNIX, and Windows 7
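
The download-and-loop steps above can be pictured with the following shell sketch; the endpoint URL, JSON field name, token variable, and paths are hypothetical stand-ins for the vendor's actual API:

    #!/bin/sh
    # Page through the orders API for one day; each response holds at most 100 orders.
    RUN_DATE=$1
    PAGE=1
    while : ; do
        OUT="/data/json/orders_${RUN_DATE}_${PAGE}.json"
        curl -s -H "Authorization: Bearer $API_TOKEN" \
            "https://api.example.com/v1/orders?date=${RUN_DATE}&page=${PAGE}" -o "$OUT"
        # Stop once a page comes back with no orders (field name is assumed).
        grep -q '"orderId"' "$OUT" || { rm -f "$OUT"; break; }
        PAGE=$((PAGE + 1))
    done

In the actual jobs, an Execute Command activity ran the curl step and a sequencer loop provided the paging.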

Project #2

Project : Zenoti SPA

Client : MGM Resorts

Role : ETL Developer

Duration : May 2017 – Sept 2017

Project Description: MGM Resorts offers many kinds of hospitality services, and the spa is one of the main ones. The idea was to extract data from a vendor called Zenoti, whose application lets MGM customers book appointments in advance. The trickiest part was the extraction: the Zenoti application is unlike any other application at MGM, and the vendor provided REST APIs to extract the data from their servers. I worked with the Security, UNIX, and DataStage admin teams to set up the environment required to pull data from the API servers. Setting up the ports was challenging, as I had to go through every security protocol. Moreover, DataStage 9.1 had no stage for calling REST APIs directly, so I came up with an approach that saved not just time but also DataStage resources: using the curl command to call the token API first, followed by the data calls for the different levels of information.
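
The token-then-data pattern described above looks roughly like the shell sketch below; the URLs, JSON field names, and credential handling are hypothetical, as the vendor's real endpoints are not reproduced here:

    #!/bin/sh
    # Step 1: call the token API to obtain a short-lived access token (field name assumed).
    TOKEN=$(curl -s -X POST "https://api.example.com/v1/token" \
        -H "Content-Type: application/json" \
        -d "{\"apikey\": \"$API_KEY\"}" |
        sed -n 's/.*"access_token" *: *"\([^"]*\)".*/\1/p')

    # Step 2: use the token on the data calls for each level of information.
    curl -s -H "Authorization: Bearer $TOKEN" \
        "https://api.example.com/v1/appointments?date=$1" \
        -o /data/json/appointments.json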

Responsibilities:

Created tables in SQL Server based on the data dictionary.

Used the Execute Command activity to download the JSON files from the APIs with the curl command.

Imported metadata using the XML stage.

Used the XML stage to read the JSON files and load them into the database.

Created a loop in the sequencer jobs to fetch each property's data for a given day, since each property has a different authentication string.

Exported/imported ISX packages using IBM Information Server Manager (a command-line sketch follows the project below).

Performed unit testing on the developed jobs to ensure all fields are populated as per the data dictionary.

Environment: DataStage 9.1, SQL Server, UNIX, and Windows 7
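
For reference, the ISX export/import can also be scripted from the command line with the istool client. The host, project, and path names below are placeholders, and option spellings vary across Information Server releases, so treat this as an illustrative sketch rather than exact syntax:

    # Export DataStage assets from a project into an ISX archive (all names are placeholders).
    istool export -domain ishost:9080 -username dsadm -password '******' \
        -archive /tmp/zenoti_jobs.isx \
        -datastage ' "ishost/ZENOTI/Jobs/*/*.pjb" '

    # Import the archive into the target environment.
    istool import -domain ishost2:9080 -username dsadm -password '******' \
        -archive /tmp/zenoti_jobs.isx -datastage ' "ishost2/ZENOTI" '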

Project #3

Project : Super Stage Initial load

Client : MGM Resorts

Role : ETL Developer

Duration : Jan 2017 – Apr 2017

Project Description:

In this project, our goal was to load data from two source systems (Infogen and Opera) into a single server called Super Stage. I worked with different teams, such as Data Architecture, Business Analysts, and Data Profilers, and came up with a plan for a successful implementation. The goal was to move the data smoothly and turn on GoldenGate replication as the final task. I made sure the source systems' data was built on time according to the project requirements.

Responsibilities:

Created generic jobs to load the data from Oracle to SQL Server.

Used grid parameters to improve the performance of the job runs.

Exported/imported ISX packages using IBM Information Server Manager.

Developed validation jobs to check the source record count against the target record count (a sketch follows the project below).

Performed unit testing on the developed jobs to ensure all fields are populated as per the data dictionary.

Environment: DataStage 9.1, SQL Server, Oracle, UNIX, and Windows 7
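
A minimal sketch of such a source-versus-target count check, assuming the sqlplus and sqlcmd clients are available and using placeholder connection and table names:

    #!/bin/sh
    # Count rows on the Oracle source.
    SRC=$(sqlplus -s rpt_user/rpt_pass@ORASRC <<'EOF'
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0
    SELECT COUNT(*) FROM reservations;
    EXIT
    EOF
    )
    SRC=$(echo $SRC)   # trim stray whitespace

    # Count rows on the SQL Server target (-h -1 suppresses headers).
    TGT=$(sqlcmd -S sqlhost -d SuperStage -U rpt_user -P rpt_pass -h -1 -W \
        -Q "SET NOCOUNT ON; SELECT COUNT(*) FROM dbo.reservations")
    TGT=$(echo $TGT)

    echo "source=$SRC target=$TGT"
    [ "$SRC" -eq "$TGT" ] || echo "WARNING: row counts differ" >&2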

Project #4

Project : Finance Cube

Client : MGM Resorts

Role : ETL Developer

Duration : Apr 2016 – Dec 2016

Project Description: This project is to maintain an analytical cube to aid in the analysis of the MGM Resorts general ledger. This is a key initiative to aid in analyzing the general ledger according to industry standard metrics and definitions. The result will also be used for gauging the effect of various Profit Growth Plan Initiatives. This cube will serve to slice the general ledger based on a new chart of accounts developed by Enterprise Analytics and allow time period comparisons of like dimensions.

Responsibilities:

Involved in the design, development, and implementation phases of the project.

Designed the DataStage jobs based on the mapping documents.

Created technical specification, unit test case (UTC), and deployment documents.

Created the sequencer jobs using the job sequencer.

Exported ISX packages using IBM Information Server Manager.

Performed unit testing on the developed jobs to ensure they meet the business requirements.

Environment: DataStage 9.1, Teradata, UNIX, and Windows 7

Project #5

Project : GRS HOSRVH04EOL

Client : Walmart

Role : ETL Developer

Duration : Jan 2016 – Mar 2016

Project Description: Decommission a UNIX server reaching end of life that hosts 59 Informix databases, nearly 300 application components, and 50 socket programs connecting to similar UNIX boxes within Walmart. One key database, RULES, is widely used by different business units in the Buy domain for the US home office. The Global Replenishment Solution DataStage processes that connect to this RULES database needed to be remediated to point to its new home.

Responsibilities:

Involved in the design, development, and implementation phases of the project.

Created technical specification, UTC, and deployment documents.

Prepared performant queries to fetch the data as per the mapping documents.

Created the sequencer jobs using the job sequencer.

Exported ISX packages using IBM Information Server Manager and DataStage Designer.

Performed unit testing on the developed jobs to ensure they meet the business requirements.

Environment: DataStage 9.1, Informix, Oracle, UNIX, and Windows 7

Project #6

Project : CSP Facets Upgrade

Client : UHG

Role : ETL Developer

Duration : Sep 2015 – Dec 2015

Project Description: UnitedHealth Group Inc. is a diversified managed health care company that offers a spectrum of products and services through two operating businesses, UnitedHealthcare and Optum. UnitedHealth Group serves approximately 70 million individuals nationwide.

As part of the CSP Facets upgrade project, Facets was upgraded from version 4.7 to 5.2, the database was migrated from Sybase to Oracle, and DataStage was upgraded from version 8.5 to 8.7 for all existing processes without changing their functionality.

Responsibilities:

Involved in the development, testing, and implementation phases of the project.

Updated the jobs by replacing Sybase Enterprise stages with Oracle Connector stages.

Involved in writing Oracle queries that implement the same logic as the existing Sybase queries (a sketch follows the project below).

Handled data issues such as nulls as part of development.

Involved in unit testing and dev parallel testing.

Worked as onsite coordinator, coordinating with the offshore team to ensure deliverable deadlines were met.

Exported ISX packages using IBM Information Server Manager and created the FAST BPD package to deploy jobs to higher environments.

Environment: DataStage 8.7/8.5, Oracle 10g, Sybase, UNIX, and Windows 7
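
A dev-parallel check of the query migration might look like the sketch below, with hypothetical table and column names; the point is the mapping of Sybase functions (isnull, getdate, datediff) onto their Oracle counterparts (NVL, SYSDATE, date arithmetic):

    #!/bin/sh
    # Original Sybase logic, run through the Sybase isql client.
    isql -U etl_user -P etl_pass -S SYB_SRV -D facets <<'EOF'
    SELECT member_id, isnull(term_date, getdate())
    FROM   dbo.enrollment
    WHERE  datediff(dd, eff_date, getdate()) <= 30
    go
    EOF

    # Equivalent Oracle logic for the upgraded jobs.
    sqlplus -s etl_user/etl_pass@ORAFACETS <<'EOF'
    SELECT member_id, NVL(term_date, SYSDATE)
    FROM   enrollment
    WHERE  (SYSDATE - eff_date) <= 30;
    EXIT
    EOF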

Project #7

Project : Kaiser Foundation Health Plan

Client : Kaiser Permanente

Role : ETL Developer

Duration : Jun 2015 – Sep 2015

Project Description: Software that handles all document generation for health plans related to Kaiser group employees. Using this software, the company can generate different health plans and version-control them at the department, office, and employee level. The key modules are intake, submission, generation, and publishing.

Responsibilities:

Involved in the design, development, and implementation phases of the project.

Designed the DataStage jobs based on the mapping documents.

Created technical specification, UTC, and deployment documents.

Prepared performant queries to fetch the data as per the mapping documents.

Created the sequencer jobs using the job sequencer.

Exported ISX packages using IBM Information Server Manager and DataStage Designer.

Performed unit testing on the developed jobs to ensure they meet the business requirements.

Environment: DataStage 8.5, Oracle 10g, Qlik Sense, UNIX, and Windows 7

Project #8

Project : CSP – Capitation Flat File Extract

Client : UHG

Role : ETL Developer

Duration : Nov 2014 – Mar 2015

Project Description: The main purpose of this project is to create flat files at the payment level for the Capitation Operations team, to help them review the details of each payment before releasing it to providers.

Responsibilities:

Involved in the design, development, and implementation phases of the project.

Created technical specification, UTC, and deployment documents.

Prepared performant queries to fetch the data as per the mapping documents.

Created CSV files as final output for each combination of pay-to provider ID and check payment reference ID (a sketch follows the project below).

Environment: DataStage 8.5, Sybase IQ, Oracle 10g, UNIX, and Windows 7
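
Producing one CSV per (pay-to provider ID, check payment reference ID) pair can be sketched with awk; the input layout and column positions below are assumptions for illustration:

    #!/bin/sh
    # Assumed input: a combined extract whose first two comma-separated columns are
    # pay_to_provider_id and check_payment_ref_id; NR > 1 skips the header row.
    # With very many distinct pairs, sort by key first and close each file as the key changes.
    awk -F',' 'NR > 1 { print > ("cap_" $1 "_" $2 ".csv") }' capitation_extract.csv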

Project #9

Project : CSP – 834 User Friendly

Client : UHG

Role : ETL Developer

Duration : Apr 2014 – Oct 2014

Project Description:

The project creates a process for different states to load raw data from the 834 and supplemental files into the database. The main purpose of storing this data is to generate reports based on each state's requirements.

Responsibilities:

Created technical specification, UTC, and deployment documents.

Created a UNIX shell script that calls a LOAD TABLE script to load the data into the Sybase IQ database (a sketch follows the project below).

Performed unit testing on the developed jobs to ensure they meet the business requirements.

Prepared the CFW scripts for the DataStage jobs.

Environment: DataStage 8.5, Sybase IQ, Oracle 10g, UNIX, and Windows 7
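
A minimal sketch of that wrapper, assuming the Sybase isql client and placeholder server, table, and path names; LOAD TABLE options differ across Sybase IQ versions, so the clause list here is illustrative only:

    #!/bin/sh
    # Bulk-load a raw 834 extract into Sybase IQ; the file must be readable by the IQ server.
    FILE=/data/inbound/834_${1}.txt

    isql -U loader -P '******' -S IQ_SRV <<EOF
    LOAD TABLE stg_834_raw ( raw_record '\n' )
    FROM '$FILE'
    ESCAPES OFF QUOTES OFF
    NOTIFY 100000
    go
    COMMIT
    go
    EOF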

Project #10

Project : CSP – Capitation 837/835 Process

Client : UHG

Role : ETL Developer

Duration : May 2013 – Mar 2014

Project Description: The project creates a process for the state of New Jersey that calculates the manual adjustments for providers. The main purpose is to hold each month's New Jersey provider details, calculate the fund amount after all adjustments made for the providers, and send those details to the state in the form of an 837 file. We then receive an acknowledgement for the 837 file in the form of an 835 file with the ICN number.

Responsibilities:

Involved in the requirement analysis, planning, design, development, and implementation phases of the project.

Created the sequencer jobs using the job sequencer.

Created technical specification, UTC, and deployment documents.

Exported ISX packages using IBM Information Server Manager and DataStage Designer.

Created tables with proper indexes to improve performance by eliminating full table scans.

Performed unit testing on the developed jobs to ensure they meet the business requirements.

Split the output files with a UNIX shell script when they exceed 25 MB (a sketch follows the project below).

Prepared the CFW scripts for the DataStage jobs.

Environment: DataStage 8.5, Sybase, Oracle 10g, UNIX, and Windows 7
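
The 25 MB split can be sketched as follows; the paths and part-name suffix are placeholders, and split -C (a GNU coreutils option) keeps whole lines together so records are not cut mid-row:

    #!/bin/sh
    # Split any outbound file larger than 25 MB into line-aligned parts.
    MAX=26214400   # 25 MB in bytes

    for f in /data/outbound/*.dat; do
        size=$(wc -c < "$f")
        if [ "$size" -gt "$MAX" ]; then
            split -C "$MAX" "$f" "${f}.part_"
            rm "$f"
        fi
    done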

Project #11

Project : CSP – FSDB Medical Claim Processing

Client : UHG

Role : ETL Developer

Duration : Dec 2011 – Apr 2013

Project Description: The project's main purpose is to create a single claim processing system. The company currently has three different platforms for claims processing, Diamond, Cosmos, and River Valley Facets, and decided to adopt Facets as the new claim processing system for UnitedHealthcare Community & State, naming it CSP (Community and Strategic Platform). The claim feeds have three different extracts: PAID, PNPB, and Membership. The claim feeds were designed for the state of Maryland and are being replicated to the remaining states: Texas, Nebraska, Washington, Kansas, Ohio, New York, New Jersey, and Wisconsin. Each state migration requires implementing state-specific rules without affecting the existing ones.

Responsibilities:

Analyzed the HLD and mapping documents.

Designed the DataStage jobs based on the mapping documents and created the fixed-width file as final output.

Worked in HP Quality Center to resolve data-related defects.

Involved in preparing the technical specification document.

Developed the queries to extract the required information from the Sybase database.

Sequenced the jobs using the job sequencer so they execute in order at the scheduled time (a scheduling sketch follows the project below).

Performed performance tuning on the developed jobs.

Involved in fixing production defects.

Involved in preparing the unit test cases.

Environment: DataStage 7.5.2, Sybase, Oracle 10g, UNIX, and Windows 7
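
Scheduled execution of a sequence job can be sketched with cron driving the dsjob client. The project and job names are placeholders, and dsjob's exit-code conventions vary slightly by release, so confirm them against the installed version:

    #!/bin/sh
    # run_claim_feed.sh - invoked from cron, e.g.:
    #   0 2 * * * /opt/etl/bin/run_claim_feed.sh >> /var/log/etl/claim_feed.log 2>&1

    . "$DSHOME/dsenv"    # source the DataStage engine environment

    dsjob -run -jobstatus \
        -param RUN_DATE="$(date +%Y%m%d)" \
        CSP_PROJECT Seq_Claim_Feed
    RC=$?

    # With -jobstatus, dsjob conventionally exits 1 for "finished OK" and 2 for
    # "finished with warnings"; treat anything else as a failure here.
    case "$RC" in
        1|2) : ;;
        *)   echo "Seq_Claim_Feed failed with status $RC" >&2 ;;
    esac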

Project #12

Project : Automatic Upload of Fee Schedules

Client : UHG

Role : ETL Developer

Duration : Jun 2011 – Dec 2011

Project Description: This project creates an automatic upload of fee schedules to load thousands of fee schedules into Networx. UHN sends one mass file containing all fee schedules; a second file is sent containing header information for new fee schedules. A backend process is created with the capacity to auto-upload mass amounts of fee schedules and perform retro processing for both facility and professional fee schedules. The process detects new fee schedules in the form of a CSV flat file, which UHN places in an FTP location on a monthly or quarterly basis. The file is validated against existing records in the Networx tables, and the records from the work table are then used to create individual files. If specific records cause a failure, those records are moved to an error report. If the validation process passes, the program continues with the insert of the new records.

Responsibilities:

Extensively used parallel stages such as Row Generator, Column Generator, and Peek for development and debugging purposes.

Involved in daily calls with the onsite coordinator.

Created complex CSV files as final output using DataStage Designer.

Improved the performance of several jobs, reducing runtime through different partitioning techniques.

Used various DataStage Designer stages, including Join, Lookup, Transformer, Funnel, Filter, Copy, Aggregator, Column Generator, Sequential File, Dataset, and Sort.

Environment: DataStage 7.5.2, Sybase, Oracle 10g, UNIX, and Windows 7

Project #13

Project : The Mortgage Integration Business Intelligence Program

Client : Lloyds Bank Group, UK

Role : ETL Developer

Duration : Jun 2010 – Feb 2011

Project Description: The creation of LBG through the merger of LTSB and HBOS initiated the need to unify the mortgage data of both organizations. In the current state, the FDM system addresses the reporting/MI requirements of LBG Mortgages for the LTSB and C&G brands. Going forward, this will be achieved by establishing a Mortgage Data Warehouse (MDW) with industry-standard data architecture, strong data governance, a new platform, and standard tools for data exploration. This is planned to be achieved under 'The Mortgage Integration Business Intelligence Program'.

The primary objective of the project is to develop, test, and implement a robust, automated, and reliable Extract, Transform, and Load (ETL) process to populate data from the MSP system into the MDS (Mortgage Data Store). This ETL process gathers data from source system interfaces, transforms it, and loads it into the data warehouse. The process is periodic in nature, supporting the data requirements of the data marts and analytical applications. A data quality process to audit and validate the data across this pipeline will also be realized as part of the ETL system.

The MDS is based on the FSLDM model and the subject areas that are covered as part of this program are Application, Location, Party, Product, Property, Revenue, Risk, Channel and Brand.

Responsibilities:

Extracted data from various sources and transformed it before loading it into the target (warehouse) tables.

Analyzed the functional specifications provided by the BAs.

Designed and created jobs using various stages, including Flat File, Dataset, Lookup, Change Capture, Sort, Filter, Funnel, Transformer, and database stages.

Involved in preparing the low-level design documents.

Designed the DataStage jobs for the extract, transform, CDC, and load processes.

Sequenced the jobs using the job sequencer so they execute in order at the scheduled time.

Involved in writing the shell scripts.

Involved in preparing the Unit test cases.

Environment: DataStage 8.1, Oracle 10g, UNIX, and Windows XP

Project #14

Project : Silver Mark EDW

Client : Silver Mark, USA

Role : ETL Developer

Duration : Aug 2009 – Apr 2010

Project Description: Silver Mark is a US-based insurance company and one of the largest providers of home, motor, travel, and card insurance. This project involves building a data warehouse for motor insurance. After developing the warehouse, we integrate this motor warehouse with the existing warehouses, thereby building an enterprise data warehouse.

The source data includes:

Types of policies

Information about policy holder

Policy covered item details

Premium Payment details

Types of claims, such as fast claim payments and fast repair claims.

Responsibilities:

Understood the ETL specifications and prepared design documents.

Created DataStage jobs with performance in mind.

Extracted data from various sources such as flat files, Excel sheets, and Oracle using the ODBC Enterprise, Sequential File, and Dynamic RDBMS stages.

Worked with processing stages such as Sort, Filter, Transformer, Modify, Funnel, Join, and Lookup.

Used all stage types: processing, file, and database.

Performed export and import activities (whenever required) on a daily basis with the DataStage client tool, DataStage Manager.

Environment: DataStage 7.5.2, Oracle 9i, Windows XP Professional, UNIX

Project #15

Project : South Financial Data warehouse

Client : The South Financial Group

Role : ETL Developer

Duration : Oct 2008 – Jul 2009

Project Description: The South Financial Group is the largest publicly traded bank holding company headquartered in South Carolina. The South Financial Group uses a super-community bank strategy, serving small and middle-market businesses and retail customers. The DW gives the business an enterprise-wide view of information that can be used for analysis. At a high level, decision makers can use the DW to make decisions on historical business trends, business performance, and possible fraud cases. Using these indicators, relevant business decisions can be taken to further improve business performance.

Responsibilities:

Worked extensively with different components: DataStage Manager, DataStage Designer, and DataStage Director.

Performed data cleansing and aggregation during transformation.

Performed performance tuning in DataStage.

Used DataStage Designer to create the table definitions for the CSV and flat files, import the table definitions into the repository, import and export the projects, and release and package the jobs.

Implemented change requests after analyzing the requirements.

Environment: DataStage 7.5.2, Oracle 9i, Windows 2003, UNIX

Qualification:

Master of Computer Applications from Sri Krishnadevaraya University.


