ETL Developer Package Delivery

Location:
Bentonville, AR
Posted:
May 26, 2023

Resume:

Sirish Kumar

Email: ***********.*@*****.***

Cell: 512-***-****

Professional Summary

11+ years of IT experience in System Analysis, Design, Development, Implementation, Testing and Production Support of Database and Data Warehousing applications, using Data Modeling, Data Extraction, Data Transformation, Data Loading and Data Analysis.

Extensive experience in ETL development using IBM Ascential DataStage 9.1/8.5/8.1/7.5.x (DataStage Manager, DataStage Designer, DataStage Director, Parallel Extender), creating Fact and Dimension tables using Star Schema modeling.

Extensive knowledge in the development, analysis and design of ETL methodologies across all phases of the Data Warehousing life cycle.

Experience with IBM InfoSphere (DataStage, QualityStage) 9.1/8.1.

Good knowledge of IBM InfoSphere (DataStage, QualityStage), Information Analyzer 9.1/8.1 and IBM FastTrack.

Experience in both Parallel Extender Jobs and Server Jobs in DataStage.

Played an integral part in the building of a multi-server, multi-database enterprise Data warehouse using DataStage ETL (extract, transform and load) tools and SQL Server to load legacy business data.

Knowledge of Data Warehouse architecture and designing Star Schema, Snowflake Schema, Fact and Dimension tables, and Physical and Logical Data Modeling using Erwin.

Technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.

Hands-on experience with relational databases (RDBMS): Oracle, SQL Server, DB2 and MS Access.

Strong experience in writing PL/SQL, Stored Procedures, Functions and Database Triggers (a brief trigger sketch follows this summary).

Designed Mapping documents.

Good knowledge of data profiling and metadata management using the Profiling and Metadata stages.

Maintained all project-related documents (TTD, TSD, FD, UTD, SIT, UAT) in SharePoint.

Experience using QualityStage for maintaining and cleansing data.

Solid experience in using UNIX and writing UNIX shell scripts.

Experienced in working with scheduling tool Control-M.

Knowledge of Hadoop ecosystem tools such as Pig and Hive, and NoSQL databases such as MongoDB.

Good understanding of MapReduce and Balanced Optimization.

Good understanding of HDFS for handling unstructured data.

Good knowledge of the XML Parser, XML Composer, JSON Parser and JSON Composer stages.

Extensive experience in loading high volume data and performance tuning.

Capable of working as a team member or individually with minimum supervision.

Flexible to adapt to any new environment with a strong desire to keep pace with latest technologies.

Participated in discussions with Project Manager, Business Analysts and Team Members on any technical and/or Business Requirement issues.

Capable of working in a high-stress environment with resource constraints.

Excellent analytical, communication, and facilitation skills with the ability to gain consensus across multiple teams.
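
To illustrate the PL/SQL and database-trigger work referenced earlier in this summary, here is a minimal, hedged sketch; the orders and orders_audit tables are hypothetical and not taken from any of the projects below.

CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW
BEGIN
  -- Record who changed which row, when, and how (I/U/D).
  INSERT INTO orders_audit (order_id, action_cd, changed_by, changed_at)
  VALUES (NVL(:NEW.order_id, :OLD.order_id),
          CASE WHEN INSERTING THEN 'I' WHEN UPDATING THEN 'U' ELSE 'D' END,
          USER,
          SYSTIMESTAMP);
END;
/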

Technical Skills

ETL Tools:

IBM InfoSphere DataStage and QualityStage 7.x/8.x/9.x client components (Designer, Director, Manager, Administrator).

Languages:

SQL, SQL*Plus, PL/SQL, XML, HTML 4.0, DHTML, Korn Shell Scripting, AIX Scripting

Database:

Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, Teradata V2R6/V2R5/V2R3, DB2 UDB.

Version Control:

VSS, Power, Tortoise SVN

Tools:

SQL*Plus 9.2, SQL*Loader 9.2, Excel, TOAD, Queryman, Control-M, Autosys, TWS

Operating Systems:

Windows (7/XP/2000/2003/2007), UNIX

Professional Experience

DESCRIPTION OF THE PROJECT

UPS - Wayne, NJ

ETL Developer/DataStage Developer

UPS (FDM, RTR, NOAD and LONGVIEW): UPS is the world's largest package delivery company. The Customer Solutions division of UPS provides world-class technology solutions to facilitate package shipping, package tracking and monitoring, returns package management and other similar business use cases for its corporate customers.

The Financial Data Mart (FDM) is one of the applications that maintains UPS finance data. GL data maintained in Oracle is processed by batch jobs that balance the GL against the FDM database, which is then used to generate reports through OBIEE; a minimal illustration of that kind of GL reconciliation is sketched below.
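
A minimal sketch of that kind of GL-to-FDM balance check; the table and column names (gl_source_balances, fdm_gl_balances) are hypothetical, and the actual batch jobs were implemented in DataStage rather than hand-written SQL.

-- Flag accounts/periods whose GL balance in the Oracle source
-- does not match what landed in the FDM target.
SELECT src.account_id,
       src.period,
       src.gl_balance                           AS source_balance,
       NVL(tgt.gl_balance, 0)                   AS fdm_balance,
       src.gl_balance - NVL(tgt.gl_balance, 0)  AS variance
  FROM gl_source_balances src
  LEFT JOIN fdm_gl_balances tgt
    ON tgt.account_id = src.account_id
   AND tgt.period     = src.period
 WHERE src.gl_balance <> NVL(tgt.gl_balance, 0);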

Responsibilities

Worked extensively on data warehousing; used the ETL tool DataStage to design mappings that move data from source to target databases using stages.

Obtained a detailed understanding of the data sources and flat files.

Extensively used DataStage Designer, Administrator, and Director for creating and implementing jobs.

Automation of ETL processes using DataStage Job Sequencer and Transform functions.

Involved in Performance Tuning on the source and target at DataStage Level and Data Loading.

Performed manual unit testing of all jobs and monitored the data to verify that source and target matched (see the comparison sketch after this list of responsibilities).

Used the job log in DataStage Director and the Peek stage for debugging.

Strictly followed the change control methodologies while deploying the code from DEV to QA and Production.

Involved in the DataStage migration from 8.7 to 11.5.

Good knowledge of SOX compliance.

Involved in setting up the DB connections to the new server.

Used the CCMigration tool to make code changes and migrate the code to 11.5.

Involved in 24/7 ETL Production Support, maintenance, troubleshooting, problem fixing and ongoing enhancements to the Data mart.
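
As referenced in the unit-testing bullet above, a hedged sketch of the kind of source-to-target checks used; the schemas and table (stage.gl_txn, fdm.gl_txn) are hypothetical.

-- Row-count comparison between source and target.
SELECT (SELECT COUNT(*) FROM stage.gl_txn) AS source_rows,
       (SELECT COUNT(*) FROM fdm.gl_txn)   AS target_rows
  FROM dual;

-- Rows present in the source that are missing or different in the target.
SELECT account_id, period, amount FROM stage.gl_txn
MINUS
SELECT account_id, period, amount FROM fdm.gl_txn;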

Environment

DataStage 8.7, SQL Server, Windows Server 2008, IBM Rational Software Architect, SQL Developer, ESP Scheduler, UNIX, TFS

WAL-MART-Bentonville, AR

ETL Developer/DataStage Developer

WAL-MART PROFIT (GSM): Data is sourced from multiple sources such as UI, WCC and DB2. Multiple business rules were defined to extract the data and transform it into XML, which is streamed downstream to Big Data and SAP; a rough SQL analogue of the XML composition is sketched below. Analytics are performed on the Big Data platform, while current business updates are referenced through SAP. With this project, suppliers doing business with Walmart can be onboarded in 24-48 hours.
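
The XML construction itself was done with DataStage's XML/hierarchical stages; purely as a rough Oracle SQL/XML analogue, with hypothetical suppliers and supplier_items tables, the shape of the output looks like this.

-- Compose one XML fragment per supplier, with its items nested inside.
SELECT XMLELEMENT("Supplier",
         XMLELEMENT("SupplierId",   s.supplier_id),
         XMLELEMENT("SupplierName", s.supplier_name),
         XMLELEMENT("Items",
           (SELECT XMLAGG(XMLELEMENT("Item", i.item_nbr))
              FROM supplier_items i
             WHERE i.supplier_id = s.supplier_id))
       ) AS supplier_xml
  FROM suppliers s;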

Responsibilities

Worked extensively on data warehousing; used the ETL tool DataStage to design mappings that move data from source to target databases using stages.

Obtained a detailed understanding of the data sources and flat files.

Extensively used DataStage Designer, Administrator, and Director for creating and implementing jobs.

Used QualityStage stages such as Investigate, Standardize, Match and Survive to address data quality and data profiling issues during design (a rough SQL analogue of the survivorship step is sketched after this list of responsibilities).

Involved in creating technical documentation for source to target mapping procedures to facilitate better understanding of the process and incorporate changes as and when necessary.

Automation of ETL processes using DataStage Job Sequencer and Transform functions.

Extensively used DataStage Director for job scheduling and for emailing production support with troubleshooting details from the log files.

Involved in Performance Tuning on the source and target at DataStage Level and Data Loading.

Performed manual unit testing of all jobs and monitored the data to verify that source and target matched.

Used SAP Accelerator Stage (Add-on) to process the SAP data.

Used the job log in DataStage Director and the Peek stage for debugging.

Strictly followed the change control methodologies while deploying the code from DEV to QA and Production.

Involved in 24/7 ETL Production Support, maintenance, troubleshooting, problem fixing and ongoing enhancements to the Data mart.
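
The investigate/standardize/match/survive work itself was done in QualityStage; as a rough SQL analogue of the survivorship step, with a hypothetical stg_supplier table, a "best record" per business key can be picked like this.

-- Keep one survivor per (standardized name, postal code) group,
-- preferring the most recently updated record.
SELECT *
  FROM (SELECT s.*,
               ROW_NUMBER() OVER (
                 PARTITION BY UPPER(TRIM(s.supplier_name)), s.postal_code
                 ORDER BY s.last_update_ts DESC) AS rn
          FROM stg_supplier s)
 WHERE rn = 1;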

Environment

DataStage 8.5, DB2, Teradata, Windows Server 2008, IBM Rational Software Architect, Teradata Assistant, TWS Scheduler, UNIX. Project implemented in Agile methodology.

WAL-MART-Bentonville, AR

ETL Developer/DataStage Developer

WAL-MART PROFIT (DI-BI): Multiple support tickets are maintained through a Remedy application at Walmart. DataStage jobs using an API connect to the Remedy application's Informix database, pull the compliance data and load it into an Oracle database. BO is then used to generate reports for the business. The project was an upgrade from 8.1 to 9.5.

Responsibilities

Worked extensively on data warehousing; used the ETL tool DataStage to design mappings that move data from source to target databases using stages.

Obtained a detailed understanding of the data sources and flat files.

Extensively used DataStage Designer, Administrator, and Director for creating and implementing jobs.

Automation of ETL processes using DataStage Job Sequencer and Transform functions.

Extensively used DataStage Director for job scheduling and for emailing production support with troubleshooting details from the log files.

Migration of DataStage jobs from 8.1 to 9.5.

Involved in Performance Tuning on the source and target at DataStage Level and Data Loading.

Involved in 24/7 ETL Production Support, maintenance, troubleshooting, problem fixing and ongoing enhancements to the Data mart.

Environment

DataStage 8.5, DB2, Teradata, Windows Server 2008, IBM Rational Software Architect, Teradata Assistant, TWS Scheduler, UNIX. Project implemented in Agile methodology.

WAL-MART-Bentonville, AR

ETL Developer/DataStage Developer

WAL-MART PROFIT (DI-BI): As part of this project we extract data from WCC views, DB2 tables and source files, transform the data and load it into SAP to generate different types of reports. In parallel, data is extracted from SAP via the BW Pack to share with the business and is further loaded into Essbase cubes for generating various reports.

Responsibilities

Worked extensively on data warehousing; used the ETL tool DataStage to design mappings that move data from source to target databases using stages.

Obtained a detailed understanding of the data sources and flat files.

Used BWPack to extract and load data to SAP.

Used the SAP BW DataStage Administrator client to set up user connectivity between SAP and DataStage.

Extensively used DataStage Designer, Administrator, and Director for creating and implementing jobs.

Involved in creating technical documentation for source to target mapping procedures to facilitate better understanding of the process and incorporate changes as and when necessary.

Automation of ETL processes using DataStage Job Sequencer and Transform functions.

Extensively used DataStage Director for job scheduling and for emailing production support with troubleshooting details from the log files.

Involved in Performance Tuning on the source and target at DataStage Level and Data Loading.

Performed manual unit testing of all jobs and monitored the data to verify that source and target matched.

Used SAP Accelerator Stage (Add-on) to process the SAP data.

Used the job log in DataStage Director and the Peek stage for debugging.

Strictly followed the change control methodologies while deploying the code from DEV to QA and Production.

Involved in 24/7 ETL Production Support, maintenance, troubleshooting, problem fixing and ongoing enhancements to the Data mart.

Environment

DataStage 8.5, DB2, Teradata, Windows Server 2008, IBM Rational Software Architect, Teradata Assistant, TWS Scheduler, UNIX. Project implemented in Agile methodology.

Optum (UHG)-Eden Prairie, MN

ETL Developer/DataStage Developer

OPTUM provides software and information products, advisory consulting services and business process outsourcing to participants in the health care industry: hospitals, physicians, commercial health plans, government agencies, life sciences companies and other organizations that comprise the health care system.

We built a data warehouse specific to the client using a top-down approach.

Responsibilities

Involved in business requirements analysis with the stakeholders.

Worked closely with Subject Matter Experts on the Requirement analysis, Source/Target data analysis.

Prepared the Implementation plan for the code migration to QA/Production.

Coordinated and led the offshore team.

Migrated projects from 8.5 to 9.1.

Created the shell scripts for pre/post processing of the files.

Provided code approvals after peer review of the DataStage Jobs, steered Committee meetings and conducted Impact analysis.

Used TWS Scheduler to schedule the DataStage jobs.

Utilized the IS Manager tool for import/export.

Performed end-to-end testing of the flow.

Provided production support through successful execution.

Environment

DataStage 8.5, DB2, Teradata, Windows Server 2008, IBM Rational Software Architect, Teradata Assistant, TWS Scheduler, UNIX. Project implemented in Agile methodology.

Baxter-Chicago, IL

ETL Developer/DataStage Developer

Baxter Healthcare Inc. (one of the leading healthcare companies in the US, with a global presence) develops, manufactures and markets products that save and sustain the lives of people with hemophilia, immune disorders, infectious diseases, kidney disease, trauma, and other chronic and acute medical conditions. As a global, diversified healthcare company, Baxter applies a unique combination of expertise in medical devices, pharmaceuticals and biotechnology to create products that advance patient care worldwide. Baxter had 2013 sales of $15.3 billion.

Responsibilities

Involved in business requirements analysis with the stakeholders.

Worked closely with Subject Matter Experts on the Requirement analysis, Source/Target data analysis.

Prepared the Technical Design Documents and Lower level Design documents for the technical specs.

Prepared the Implementation plan for the code migration to QA/Production.

Coordinated and led the offshore team.

Migrated projects from 8.5 to 9.1.

Designed ETL jobs with complete understanding on how GRID works and how to reduce the job wait time in the queue before getting submitted to the grid.

Created the shell scripts for pre/post processing of the files.

Processed the files received from the business and ensured that the downstream team was able to consume the data.

Provided code approvals after peer review of the DataStage Jobs, steered Committee meetings and conducted Impact analysis.

Used Control-M Scheduler to schedule the DataStage jobs.

Tuned DataStage jobs for better performance to bring design parallelism.

Used the CDC (Change Data Capture) stage to capture new and updated records and implemented SCD Type 2 (a minimal SQL sketch of the Type 2 pattern follows this list of responsibilities).

Utilized the IS Manager tool for import/export.

Utilized the migration tool to replace Oracle Enterprise stages with Oracle Connector stages.

Performed end-to-end testing of the flow

Provided production support through successful execution.

Supported DataStage-specific problems on a 24/7 on-call rotation.
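
As referenced in the CDC/SCD Type 2 bullet above, the load was implemented with DataStage's CDC stage; this is only a minimal SQL sketch of the same Type 2 pattern, with hypothetical stg_customer and dim_customer tables.

-- Step 1: close out current dimension rows whose attributes changed.
UPDATE dim_customer d
   SET current_flag = 'N',
       effective_end_dt = TRUNC(SYSDATE) - 1
 WHERE current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name <> d.customer_name
                       OR s.city <> d.city));

-- Step 2: insert new versions for changed keys and rows for brand-new keys.
INSERT INTO dim_customer (customer_id, customer_name, city,
                          effective_start_dt, effective_end_dt, current_flag)
SELECT s.customer_id, s.customer_name, s.city,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
-- NULL-safe comparisons and surrogate keys are omitted to keep the sketch short.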

Environment

GRID DataStage 9.1/8.5, DB2, Oracle 11g, Exadata, Windows Server 2008, IBM Rational Software Architect, TOAD/SQL Developer, Control-M Scheduler. Project implemented in Agile methodology.

Wal-Mart – Bentonville, AR

ETL Developer/DataStage Developer

WAL-MART: Wal-Mart is one of the world's foremost retailers, headquartered in Bentonville, AR. It has branches across the globe and, using a top-down approach, is implementing a warehouse specific to each country.

Responsibilities

Worked extensively on data warehousing; used the ETL tool DataStage to design mappings that move data from source to target databases using stages.

Obtained a detailed understanding of the data sources, flat files and complex data schemas.

Designed parallel jobs using various stages like Aggregator, Join, Transformer, Sort, Merge, Filter and Lookup, Sequence, ODBC.

Broadly involved in Data Extraction, Transformation and Loading (ETL process) from Source to target systems using DataStage PX.

Extensively used DataStage Designer, Administrator, and Director for creating and implementing jobs.

Created shared containers to use in multiple jobs.

Involved in creating technical documentation for source to target mapping procedures to facilitate better understanding of the process and incorporate changes as and when necessary.

Automation of ETL processes using DataStage Job Sequencer and Transform functions.

Extensively used DataStage Director for job scheduling and for emailing production support with troubleshooting details from the log files.

Involved in Performance Tuning on the source and target at DataStage Level and Data Loading.

Developed PL/SQL stored procedures for source and target pre-load steps to verify the existence of tables (a minimal sketch follows this list of responsibilities).

Performed manual unit testing of all jobs and monitored the data to verify that source and target matched.

Used SAP Accelerator Stage (Add-on) to process the SAP data.

Used the job log in DataStage Director and the Peek stage for debugging.

Strictly followed the change control methodologies while deploying the code from DEV to QA and Production.

Involved in 24/7 ETL Production Support, maintenance, troubleshooting, problem fixing and ongoing enhancements to the Data mart.
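
As referenced in the pre-load bullet above, a minimal sketch of such an existence check; the procedure name and error code are illustrative only.

CREATE OR REPLACE PROCEDURE chk_table_exists (p_table_name IN VARCHAR2) IS
  v_cnt NUMBER;
BEGIN
  -- Abort the load early if a required table is missing from the schema.
  SELECT COUNT(*)
    INTO v_cnt
    FROM user_tables
   WHERE table_name = UPPER(p_table_name);

  IF v_cnt = 0 THEN
    RAISE_APPLICATION_ERROR(-20001,
      'Required table ' || p_table_name || ' not found; aborting load.');
  END IF;
END chk_table_exists;
/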

Environment

IBM InfoSphere Information Server V8.7 Suite [DataStage, QualityStage], Oracle 11g, PL/SQL, Windows Server 2003, TOAD, UNIX Shell Scripting.

Univar –Richmond, VA

ETL Developer/DataStage Developer

Univar is a leading global distributor of chemistry and related innovative products and services. They deliver chemistry (products, expertise, and relationships) that helps their customers improve the quality of life.

Responsibilities

Installed and configured BODS (IPS and BODI) 4.0 SP2 on Development and Production environments.

Involved in mapping sessions with the client functional and technical teams.

Designed technical architecture and documented technical requirements for data conversion.

Extensively used platform transforms like SQL, Query, Merge and Validation to transform and load the data from staging database to target database based on complex business rules.

Designed the sequence of jobs to load the data into target without violating the constraints.

Tuned the jobs and dataflows by using appropriate parallelism techniques to load the data into the target database within the conversion run window.

Experience in using proper recovery mechanisms to reduce the overall runtime in the event of job failure.

Developed custom SQL queries to extract the data from multiple sources.

Created custom functions to handle Julian dates and times, and extensively used functions like lookup_ext and key_generation (a small SQL illustration of Julian-date conversion follows this list of responsibilities).

Used aliases at the datastore level for easy migration to new database builds.

Extensively used substitution parameters for storing constant values which can be used throughout the repository.

Exported jobs between the repositories by selecting appropriate export options to avoid corruption of the repositories.

Administered and maintained Central Management Console to configure multiple repositories and also to manage the users and their privileges.

Extensively used Data Services Management Console for job scheduling.

Experience in debugging execution errors using Data Integrator logs and also by analyzing the target data.

Created file formats for both fixed-width and delimited flat files, Excel workbooks and CSV files (an external-table sketch for a comparable delimited file follows this list of responsibilities).

Performed Unit testing and documented the test results.

Logged and tracked defects using HP Application Lifecycle Management.
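
As referenced in the Julian-date bullet above, the actual conversion functions were built in SAP BODS; as a small Oracle SQL illustration of the underlying idea (using Oracle's 'J' Julian-day format; the values are examples only):

-- Julian day number to calendar date, and back.
SELECT TO_DATE('2457024', 'J')          AS calendar_date,  -- 01-JAN-2015
       TO_CHAR(DATE '2015-01-01', 'J')  AS julian_day      -- 2457024
  FROM dual;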
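
The file formats themselves were defined in BODS; as a hedged Oracle external-table sketch for a comparable comma-delimited flat file (the directory object, file name and columns are hypothetical):

-- Expose a delimited flat file as a queryable staging table.
CREATE TABLE stg_customers_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  signup_date   VARCHAR2(10)    -- converted with TO_DATE downstream
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stg_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('customers.csv')
)
REJECT LIMIT UNLIMITED;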

Environment

SAP BODS, Oracle 11g, PL/SQL, SQL Developer, UNIX Shell Scripting.

AIG-Parsippany, New Jersey

DataStage Developer / MDP Developer

Responsibilities

Analyzed the existing ETL process and came up with an ETL design document that listed the jobs to load, the logic to load and the frequency of load of all the tables.

Analyzed, designed, developed, implemented and maintained Parallel jobs using Enterprise Edition of DataStage.

Migrated the project from 7.5 to 8.0.1.

Developed complex jobs using various stages like Lookup, Join, Merge, Sort, Transformer, Dataset, Row Generator, Column Generator, Sequential File and Aggregator Stages.

Extracted data from disparate sources - relational databases, Oracle databases and flat files - and loaded it into the data warehouse.

Converted complex job designs to different job segments and executed through job sequencer for better performance and easy maintenance.

Extensively worked on Data Acquisition and Data Integration of the source data.

Worked with Metadata Definitions, Import and Export of DataStage jobs using DataStage Manager.

Designed mapping documents with transformation rules.

Defined projects and tuned parameters for fine-tuning of the projects.

Defined & implemented DataStage jobs process monitoring.

Implemented Quality Stage for data cleansing, data standardization and matching process.

Used plug-in stages such as Stored Procedure, Merge and also various stages like Sequential, Hashed, ODBC, Aggregator and Inter-Process.

Integrated data from various sources into the staging area in data warehouse for integrating and cleansing data.

Used DataStage Director to schedule and run the solution, test and debug its components, and monitor the resulting executable versions.

Defined production support methodologies and strategies.

Environment

IBM InfoSphere DataStage and QualityStage 7.5/8.0.1, Oracle 9i, Erwin, Queryman, Autosys 4.0, Windows 2003.


