
Data Manager

Location:
Mahwah, NJ
Posted:
August 03, 2015


Resume:

Richa Sharma

609-***-****

******.****@*****.***

Current Location: New Jersey

Professional Summary

* ***** ** ************* ********** as an ETL lead and technology analyst in IBM InfoSphere DataStage (ETL) and Cognos (reporting).

Strong understanding of data warehousing concepts and end-to-end implementation of ETL processes and reporting.

Experience interacting with business users to gather requirements, including data sources, targets, definitions, relationships and business rules, and translating them into a data warehouse design.

Experience designing fact, dimension and aggregate tables using snowflake and star schema design approaches.

Extensive experience implementing slowly changing dimensions, with a good understanding of SCD Types 1, 2 and 3 (a SQL sketch of a Type 2 load follows this summary).

Extensive experience coding and testing complex ETL to support data warehouses using DataStage 7.5 and IBM InfoSphere Information Server 8.5/9.1.

Proficient in implementing server jobs, parallel jobs and job sequences.

Extensive use of different source and target stages, such as Sequential File, ODBC, DB2 Connector and Data Set, to extract and store data.

Experience using various stages such as Aggregator, Join, Sort, Lookup, Transformer, Funnel, Filter, Change Capture, Peek, Row Generator, Column Generator and Copy.

Experience in working with shared/local containers.

Experience in performance tuning of ETL using different partitioning techniques.

Good experience with Oracle and DB2 databases.

Experience creating and altering Oracle database objects such as tables, views, materialized views, triggers and stored procedures.

Experience in UNIX shell scripting.

Experience creating list and crosstab reports using Cognos 8 Report Studio as a reporting tool.

Good understanding of Framework Manager and Cognos 8 Query Studio for creating framework models, analyzing data and generating reports.

Good domain knowledge of the telecom and publishing industries.

Certified in Big Data and Hadoop from Edureka, with a mini project exploring the MapReduce, Pig and Hive components of Hadoop.
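
For illustration only, below is a minimal SQL sketch of the SCD Type 2 pattern referenced above; the table and column names (customer_dim, customer_stg, address) are hypothetical, not taken from any project described here.

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE customer_dim d
       SET d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND s.address    <> d.address);  -- tracked attribute

    -- Step 2: insert a new current version for changed or brand-new customers.
    INSERT INTO customer_dim
          (customer_id, address, effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.address, SYSDATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.address      = s.address);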

Skill Set

ETL Tools: IBM (formerly Ascential) DataStage 7.5.1, IBM DataStage 8, IBM DataStage 9.1

Reporting Tools: Cognos Impromptu 7.2, Cognos 8.3 (Report Studio, Query Studio, Framework Manager)

Databases: Oracle 9i, DB2, SQL Server

Operating Systems: UNIX

Education

MCA (Master's Degree in Computer Applications) from Banasthali Vidyapith, Jaipur, India, 2007.

Experience Summary

Working as a DataStage Consultant

o Company: K-Force

o Client: UPS

o Duration: August 2013 to present

o Location: New Jersey, USA

Worked as an ETL lead and technology analyst

o Company: Infosys Ltd.

o Client: Dow Jones

o Duration: September 2011 to August 2013

o Location: New Jersey, USA

Worked as an ETL developer

o Company: Infosys Ltd.

o Client: Telenet

o Duration: October 2007 to August 2011

o Location: Pune, India

Professional Experience

#1 Name: Mainframe to DataStage Migration

Client: UPS - CPR (Contract Performance Reporting)

Team Size: 8

Environment: DataStage 8, Oracle, DB2, Linux

Description: United Parcel Service of North America, Inc., commonly known as UPS, is an American global package delivery company. It delivers more than 15 million packages a day to more than 6.1 million customers in more than 220 countries and territories around the world.

UPS has a complex costing process: data is categorized as domestic or international, and separate costing is performed on the basis of territory, service, shipper, etc. The objective of this migration project is to migrate the domestic costing logic from the mainframe to the DataStage environment, producing files that feed the Calculator for costing. This involves understanding the existing mainframe logic, converting it into DataStage objects, comparing the results of the two systems until they match within the accepted tolerance, and providing debugging and release support.

Responsibilities as ETL Consultant:

Involved in all phases of the SDLC, i.e. planning, design, coding, testing, implementation and support.

Understood the mainframe costing logic and aligned the DataStage code to that process.

Used various DataStage extraction stages, i.e. Sequential File, Data Set, Oracle Enterprise and DB2 Connector.

Used various DataStage transformation stages such as Transformer, Funnel, Filter, Sort, Remove Duplicates, Aggregator, Change Capture, Copy, Checksum, Join, Lookup, Merge and Pivot.

Used various loading stages such as Sequential File, Data Set and Peek.

Used DataStage Director to schedule, monitor, trigger, stop, reset and validate jobs and sequences.

Used different partitioning techniques to improve job performance.

Provided consultation and coordination in implementing DataStage jobs.

Used shell commands to navigate directories, validate data in files and run jobs from the command line.

Acted as the DataStage SME for all DataStage-related queries, bringing mainframe resources who moved to DataStage up to speed.

Worked on alternative, more efficient design approaches for the conversion while keeping the core mainframe design intact.

Performed parallel runs, comparing legacy and new results to ensure both sides match (see the comparison sketch below).
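
As an illustration of the parallel-run comparison above, here is a minimal SQL sketch using set difference; the table names (legacy_cost_feed, datastage_cost_feed) are hypothetical, not from the actual project.

    -- Rows produced by the legacy process but missing from the DataStage output.
    SELECT * FROM legacy_cost_feed
    MINUS
    SELECT * FROM datastage_cost_feed;

    -- Rows produced by DataStage but absent from the legacy output.
    SELECT * FROM datastage_cost_feed
    MINUS
    SELECT * FROM legacy_cost_feed;

An empty result from both queries indicates the two feeds match row for row.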

#2 Name: Mainframe Migration (MFM), Account Merge

Client: Dow Jones (September 2011 to August 2013)

Team Size: 14 (onsite-offshore model)

Environment: DataStage 8, DataStage 9, Cognos 8.3, DB2, UNIX

Description: Dow Jones is a worldwide leader in news and business information, spanning newswires, websites, newspapers, newsletters, databases, magazines, radio and television. The objective of this project was to migrate print customers from the legacy mainframe system to the new framework (MOSAIC) and to give Dow Jones users a single view of all their accounts. These projects provide various reports and support ad hoc queries for making intelligent business decisions based on data collected over time in various tables and views. Project activities ranged from data extraction and staging to loading into the data warehouse through DataStage.

Responsibilities as ETL Lead:

Involved in all stages of the system development life cycle.

Worked closely with business analysts and end users to determine data sources, targets and business rules for implementing ETL code and Cognos reports.

Prepared technical specifications and Detail Design Documents (DDD) for the development of DataStage Extraction, Transformation and Loading (ETL) mappings to load data into various tables, and defined ETL standards.

Used DataStage Designer to develop processes for extracting, cleansing, transforming and loading data into the DB2 data warehouse.

Coded new programs to client specifications, modified existing programs and reviewed code to ensure it met requirements and standards.

Used DataStage Designer to import metadata into the repository and to import and export jobs across projects.

Used DataStage Director to validate, execute and monitor jobs and to check log files for errors.

Used various DataStage Designer stages such as Sequential/Hashed File, Transformer, Merge, Oracle Enterprise/Connector, Sort, Lookup, Join, Funnel, Filter, Copy and Aggregator.

Improved the performance of several jobs, reducing runtime through different partitioning techniques.

Populated Type I and Type II slowly changing dimension tables from several operational source files.

Performed manual unit testing of all jobs and verified that the data matched.

Used several sequence activities, such as Abort Job, Wait For Job and Mail Notification, to build an overall main sequence and achieve restartability.

Created views, MQTs (materialized query tables) and other DB2 objects per application group requirements (see the MQT sketch after this list).

Involved in altering object definitions in DB2.

Used the Toad tool to access the database.

Involved in basic shell scripting.

Coordinated with the offshore team as the onsite coordinator.
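
As an aside, here is a minimal DB2 sketch of the MQT pattern mentioned in the list above; the table and column names (sales, region, amount) are hypothetical, not from the actual application.

    -- A deferred-refresh materialized query table that pre-aggregates sales by region.
    CREATE TABLE sales_by_region_mqt AS
      (SELECT region, SUM(amount) AS total_amount
         FROM sales
        GROUP BY region)
      DATA INITIALLY DEFERRED
      REFRESH DEFERRED;

    -- Populate (and later re-populate) the MQT on demand.
    REFRESH TABLE sales_by_region_mqt;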

#3 Name: Various DWH Applications, Production Support

Client: Telenet (October 2007 to August 2011)

Team Size: 10

Environment: DataStage 8, Cognos 8.3, Oracle, SQL Server, UNIX

Description: Telenet Group is the largest provider of broadband cable services in Belgium. Its business comprises the provision of analog and digital cable television, high-speed internet, and fixed and mobile telephony services, primarily to residential customers in Flanders and Brussels. In addition, Telenet offers services to business customers across Belgium and in Luxembourg under its Telenet Solutions brand. The project involved building and maintaining the data warehouse application for Telenet's various products and customers so that senior management could make strategic decisions. This included ETL creation, report builds, UNIX scripting and writing SQL procedures (a brief PL/SQL sketch follows).
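
For illustration only, a minimal PL/SQL sketch of the kind of rerunnable load procedure referred to above; the object names (daily_usage_fact, usage_stg) are hypothetical, not taken from the project.

    CREATE OR REPLACE PROCEDURE refresh_daily_usage (p_load_date IN DATE) AS
    BEGIN
      -- Delete-and-reload keeps the load rerunnable for a given day.
      DELETE FROM daily_usage_fact WHERE load_date = p_load_date;

      INSERT INTO daily_usage_fact (customer_id, usage_mins, load_date)
      SELECT customer_id, SUM(usage_mins), p_load_date
        FROM usage_stg
       GROUP BY customer_id;

      COMMIT;
    END refresh_daily_usage;
    /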

Responsibilities as ETL Developer:

Used DataStage Designer to develop processes for extracting, cleansing, transforming and loading data into the Oracle data warehouse.

Prepared technical specifications and Detail Design Documents (DDD) for the development of DataStage Extraction, Transformation and Loading (ETL) mappings to load data into various tables, and defined ETL standards.

Worked on migrating data and jobs from DataStage 7.5 to DataStage 8.7.

Coded new programs to client specifications, modified existing programs and reviewed code to ensure it met requirements and standards.

Used DataStage Designer to import metadata into the repository and to import and export jobs across projects.

Used DataStage Director to validate, execute and monitor jobs and to check log files for errors.

Used various DataStage Designer stages such as Sequential/Hashed File, Transformer, Merge, Oracle Enterprise/Connector, Sort, Lookup, Join, Funnel, Filter, Copy and Aggregator.

Improved the performance of several jobs, reducing runtime through different partitioning techniques.

Populated Type I and Type II slowly changing dimension tables from several operational source files.

Developed complex ETL using dimensional modeling.

Performed manual unit testing of all jobs and verified that the data matched.

Used several sequence activities, such as Abort Job, Wait For Job and Mail Notification, to build an overall main sequence and achieve restartability.

Coordinated with onsite team members, performed production deployments, monitored production jobs and fixed production aborts per SLA.

Certifications

Domain:

1. OSS-BSS (Telecommunication Sector Fundamental Concepts)

2. Infosys Quality Systems

Technical:

1. Oracle OCA (Paper 1: Introduction to Oracle 9i SQL, 1Z0-007)

2. Oracle OCA (Paper 2: Program with PL/SQL, 1Z0-147)

3. Cognos 8 Developer Certification (BI0-112)

4. Big Data and Hadoop certification from Edureka

Personal Achievements

Awarded the Infosys On-the-Spot Award in the Customer Delight category.

Automated the patch creation process in the Telenet project; the tool was adopted internally at Infosys and substantially reduced the manual work involved.

Awarded the Infosys Bravo Award for exceptional performance and winning customer confidence from the start.


