Data Developer

Manteca, California, United States
January 11, 2018

Email id:

Contact No: 925-***-****

Kevin Palraj

Professional Summary

• 7 years of total IT experience, with extensive ETL experience using IBM InfoSphere DataStage (v9.1/8.5/8.1/7.5), designing and developing jobs in IBM WebSphere DataStage.

• Experienced in using a highly scalable grid architecture with DataStage Parallel Extender (DS-PX); highly proficient in the DataStage server engine.

• Provided ETL design solutions for multiple assignments and data integration/migration projects for various clients.

• Strong in writing UNIX/Linux shell scripts required for pre- and post-validations, process-control mechanisms, and unit testing.

• Excellent experience working with Oracle, Teradata, and DB2, writing PL/SQL stored procedures and triggers.

• Involved in all stages of the Software Development Life Cycle, with a proven track record of troubleshooting DataStage jobs and addressing UAT and production issues such as performance tuning and enhancement.

• Experience in Data Analysis, Data Cleansing (Scrubbing), Data Validation and Verification, Data Conversion, Data Migrations and Data Mining.

• Experienced in interacting with Clients, Business Analysts, UAT Users and Developers.

• Experienced in testing reconcile (initial) and delta (daily) loads; created automated jobs for both.

• Created many automated processes and implemented new designs as part of supporting applications.

• Strong understanding of the principles of DW and EDW using Fact Tables, Dimension Tables and star schema modelling.

• Skilled in writing technical specification documents, translating user requirements into technical specifications (LLD and HLD); strong understanding of business processes and their interface with IT.

• Excellent track record as a leader and team player with effective communication skills.

• Experience with Tivoli Workload Scheduler (TWS).

• Experience in production support and Remedy management work processes (INC, WO, CRQ, PBI), including high-severity bridge calls.

• Effective in cross-functional and global environments to manage multiple tasks & assignments concurrently.

• Multilingual, highly organized, detail-oriented professional with strong technical skills.

• Basic knowledge of Python: web maps with folium, Tkinter, and pandas; self-taught hands-on experience.

• Knowledge of Big Data and Hadoop, with an aspiration to learn new technologies.

Education

• Bachelor of Engineering in Electronics and Communications, Anna University, India, 2010

Certifications Completed

IBM Certified Solution Developer – InfoSphere DataStage v8.5 (IBM Corp)

Technical Skills

Operating Systems: Windows, UNIX, iOS
Languages: UNIX/Linux shell scripting, SQL, PL/SQL, Python 3.x
Databases: Oracle 9i/10g, MS SQL Server 2000/2005, MySQL
ETL: IBM Information Server 9.1/8.5, IBM DataStage 9.1/8.5/8.1/7.x, IBM Information Analyzer, Informatica 9.x
Reporting: Cognos 11


Professional Experience/Projects

Project 1:

A single web-based tool that seamlessly links information about Kaiser's financial suppliers, invoices, and inventories in one place. OneLink FDW/EPM comprises mainly Pharmacy, Drug, Medical-Surgical Materials, SCM, and Financial data.

Project 2:

The project is to upgrade the HRDW data warehouse application from DataStage 8.5 to 9.1.

Project 3:

LDGL, a bolt-on application built on the MyHR platform in 2007, is currently integrated with the OneLink platform to process and post summarized Labor Costing accounting entries into the General Ledger. LDGL on the MyHR platform will continue its core functions, with the exception of GL processing, which will be processed entirely at the detail level in the OneLink system (as opposed to the current summary-level integration). This will give the OneLink ERP journal drill-down capability to Labor Costing detail, such as Payroll and Leave Accounting detail.

Project 4:

POU stands for Point Of Use. Kaiser put the POU enhancements in place to help the Health Connect teams; one of its primary revenue-generating businesses is Operations. In the Health Connect system, doctors and patients use the EPIC application to look at patient history or operation history. The EPIC system also maintains all the material information related to an operation. Health Connect is a team that maintains all the information in the EPIC system, starting from scheduling surgery for patients. Once an operation is scheduled, an operating room is allocated to the patient, and for each type of operation there exists a procedure card listing all the materials required to perform that particular operation. The materials are classified into two types depending on how they are used: Implants and Supplies. An implant is an item that will be used inside the body; supplies are the materials used to carry out the operation, such as scissors, bandages, and knives.



• Responsible for analyzing and interpreting complex data on all target systems to provide resolutions for data/performance issues, and for coordinating with data analysts to validate all requirements.

• Participates in requirement gathering; develops and performs unit testing on all ETL components for new enhancements to existing applications.

• Responsible for root-cause analysis on all processes, resolving production issues on the spot, validating all data, performing routine tests on databases, and providing application support.

• Creating tasks for Team members in Rally

Kaiser Permanente, Pleasanton, CA, USA Dec 2014 to Date
ETL DataStage Lead


• Extensively developed Datastage server routines using Datastage Basic Language as part of the development process.

• Used XML stages to receive and process files into databases.

• Managed estimation and capacity for team members.

• Migrated jobs from Datastage 8.5 to 9.1

• Created new automated jobs to reconcile the source and target data

• Performed regression testing for all projects migrated to the new version.

• Coordinated with offshore team and other teams to successfully implement this project.

• Prepared unit test cases and the corresponding unit test case document, aligned with the standard test case document. Involved in code reviews and peer reviews, suggesting enhancements and performance improvements.

• Identified processes that could be enhanced, tuned, or turned off to reduce cost and improve customer satisfaction.

• Wrote UNIX scripts to run the jobs and handle SFTP processes.

• Production support and Remedy management for the ETL/Cognos project.

Environment: IBM DataStage 9.1/8.5 (MPP system), Linux, Oracle, SQL Server, Tivoli

Walgreens IND-CHE July 2013 to Nov 2014

ETL Developer

Title: 340B

340B is a government-mandated program through which drug manufacturers provide discounted drugs to qualifying patients of participating 340B clinics across the US. These clinics can choose to dispense the drugs onsite or through a third party such as Walgreens. Walgreens participates in this program through a three-way relationship including federally qualified health care providers (FQHCs) and drug wholesalers. Walgreens dispenses medication on behalf of the FQHCs (clients) and is replenished by the wholesalers at no cost to Walgreens. The 340B Complete application accumulates and maintains Walgreens' 340B Program information. This information is further used by Marketing, Sales, Accounting/Finance, and other Walgreens departments for storing, retrieving, and updating 340B-related data, procuring 340B drugs from the wholesaler/vendor, 340B claims processing with the clients/clinics, purchase order generation, and invoicing.

Project 1:

Migrating the DataStage 8.01 code to 9.1


• Responsible for migrating code from DataStage 8.01 to 9.1 and performing end-to-end parallel testing and data validation between 8.01 and 9.1 in four environments (DEV, UAT, Performance, PROD).

• Created manual migration plans and checklists.

• Created mount space at the UNIX level to point to the newer tool version's file system.

• Responsible for monitoring performance/data sync between 8.01 and 9.1 and sharing reports with the client.

• Responsible for performing root-cause analysis on all processes, resolving all production issues, validating all data, finding new bugs in the newer tool version, and creating documentation to submit to IBM.

Environment: IBM DataStage 9.1/8.1, Linux, DB2

USAA June 2012 to June 2013

ETL Developer

Title: USAA-Fin Crimes

The USAA DCCTD team develops an ETL model to capture bank file data and load it into a database, to monitor suspicious airline transactions by customers who booked tickets through a USAA credit or debit card. This project was developed using agile methodology.



• Understood the technical specifications and developed DataStage server jobs for the Extract, Transformation, and Loading process of the EDW.

• Extensively developed Datastage server routines using Datastage Basic Language as part of the development process

• Developed job sequences by identifying independent and dependent flows with proper roll-back strategies incorporated in the sequence

• Coordinated closely between the client and the team to ensure smooth delivery.

• Ad-hoc testing, analysis, and problem solving.

• Prepared and sent daily status reports pertaining to the test activity.

Environment: IBM DataStage 8.7, Linux, Oracle
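The job sequences with roll-back strategies described above can be sketched as a small wrapper script. This is a minimal sketch only: `dsjob -run -jobstatus` is the standard DataStage CLI invocation, but the project name (DWPROJ) and job names (LoadStage, LoadTarget, RollbackTarget) are hypothetical placeholders.

```shell
#!/bin/sh
# Minimal sketch of a job-sequence wrapper with a roll-back step.
# The RUNNER variable is overridable so the control flow can be
# exercised without a DataStage engine installed.
RUNNER="${RUNNER:-dsjob}"

run_job() {
    # -run -jobstatus blocks until the job finishes and returns its status
    "$RUNNER" -run -jobstatus DWPROJ "$1"
}

run_sequence() {
    # Independent flow first; the dependent flow runs only after it succeeds
    run_job LoadStage || { echo "LoadStage failed" >&2; return 1; }
    if ! run_job LoadTarget; then
        echo "LoadTarget failed - rolling back" >&2
        run_job RollbackTarget
        return 1
    fi
}
```

In a real sequence each branch would also capture job logs and notify the scheduler; this sketch keeps only the dependency and roll-back logic.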


WellPoint IND-CHE Feb 2011 to April 2012

ETL Developer

Title: Provider finder

Provider Finder is an initiative to enhance the customer experience by providing consumers with search capabilities on providers, and to encourage insurance customers to visit the web site to make informed decisions about providers. Based on consumer feedback from ForeSee surveys and logged customer-care issues, WellPoint is enhancing the usability and functionality of the online Provider Finder tool, currently hosted by the vendor Ingenix. As part of the new, redesigned consumer web portal, the Provider Finder function and tool need to be enhanced to support the improved consumer experience, have a consistent look and feel matching the overall consumer portal, provide an intuitive, simple-to-navigate approach, and reduce 'jargon' in content.

Responsibilities:

• Developed ETL code and performed tests (unit, integration) on all ETL code for system data, and analysed all data.

• Responsible for creating BTEQ scripts for bulk loads that were loaded through the ETL tool (Informatica).

• Created TSD documents and loaded them into SharePoint.

• Checked the logs and debugged the jobs.

Environment: Informatica 9.x, Teradata, BTEQ scripts, Oracle, UNIX
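The BTEQ bulk-load scripts mentioned above typically pair a .IMPORT of a delimited file with a .REPEAT-ed INSERT. Below is a minimal sketch, generated here by a shell wrapper; the logon string, file name, and table/column names are hypothetical placeholders, while .LOGON, .IMPORT VARTEXT, .REPEAT, and the USING clause are standard BTEQ constructs.

```shell
#!/bin/sh
# Generates a minimal Teradata BTEQ bulk-load script of the kind
# described above. .IMPORT VARTEXT reads a delimited file; .REPEAT *
# re-runs the INSERT once per input record. All names are placeholders.
cat > load_providers.bteq <<'EOF'
.LOGON tdhost/etl_user,etl_password
.IMPORT VARTEXT ',' FILE = providers.csv
.REPEAT *
USING (provider_id VARCHAR(10), provider_name VARCHAR(100))
INSERT INTO stg.providers (provider_id, provider_name)
VALUES (:provider_id, :provider_name);
.LOGOFF
.QUIT
EOF
echo "generated load_providers.bteq"
```

The generated file would then be executed on a machine with Teradata client tools via `bteq < load_providers.bteq`.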
