
Shashi Ranjan Kumar

Mobile: +1-732-***-****

E-Mail: acgtfd@r.postjobfree.com

Professional Snapshot

A competent professional with over 7 years of experience in IT Business Analysis and Data Warehousing, covering ETL (IBM DataStage/QualityStage 7.5x, 8.1, 8.5, 8.7, 9.1), IBM Information Analyzer 9.1, IBM Business Glossary, IBM FastTrack, and IBM Metadata Workbench; databases (Oracle, Microsoft SQL Server, Teradata, Netezza); and UNIX shell scripting.

• Extensive domain knowledge of the Telecom, Banking & Finance domains.

• Good knowledge of ITIL processes, Agile methodology, and the SDLC.

• Experienced in the CA ERwin data modeling tool 9.5 and IBM InfoSphere Data Architect for data modeling.

• Played an integral part in building a multi-server, multi-database enterprise data warehouse, using DataStage ETL (extract, transform, load) tools and SQL Server to load legacy business data.

• Expertise in data analysis, data conversion, design, and data modeling, specializing in Data Warehousing and Business Intelligence; experienced in design and in preparing HLDs for business requirements.

• Experience in database management, data mining, software development fundamentals, strategic planning, operating systems, requirements analysis, data warehousing, data modeling, and data marts.

• Experienced in IBM Information Analyzer 8.5.

• Experienced in creating data lineage, impact analysis, and business lineage reports through IBM Metadata Workbench.

• Able to write SQL*Loader, MultiLoad, FastLoad, and FastExport scripts as well as UNIX shell scripts, and able to handle large volumes of data; a loader wrapper sketch follows below.
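
A minimal sketch of such a loader wrapper, assuming a hypothetical pipe-delimited customer feed and staging table (all names illustrative):

    #!/bin/ksh
    # Hypothetical SQL*Loader wrapper: load a pipe-delimited feed into a staging table.
    cat > cust_stg.ctl <<'EOF'
    LOAD DATA
    INFILE 'cust_feed.dat'
    APPEND INTO TABLE cust_stg
    FIELDS TERMINATED BY '|' TRAILING NULLCOLS
    (cust_id, cust_name, load_dt DATE 'YYYY-MM-DD')
    EOF
    sqlldr userid=$DB_USER/$DB_PASS control=cust_stg.ctl log=cust_stg.log bad=cust_stg.bad
    # non-zero exits include warning cases such as rejected rows
    [ $? -eq 0 ] || { echo "sqlldr failed" >&2; exit 1; }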

• Well versed in Netezza 6.0, with extensive knowledge of NZPLSQL and NZLOAD; see the nzload sketch below.
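
A minimal nzload sketch, with host, database, table, and file names assumed for illustration:

    #!/bin/ksh
    # Hypothetical NZLOAD call: bulk-load a pipe-delimited flat file into a Netezza table.
    nzload -host $NZ_HOST -db EDW -u $NZ_USER -pw $NZ_PASS \
           -t SALES_FACT -df sales.dat -delim '|' \
           -lf sales_load.log -bf sales_load.bad -maxErrors 10
    [ $? -eq 0 ] || { echo "nzload failed" >&2; exit 1; }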

• Extensive experience with Parallel Extender. Efficient in all phases of the development lifecycle, including data cleansing, data conversion, performance tuning, and system testing. Expertise in creating reusable components such as shared containers and local containers.

• Expertise in different load types, including normal and bulk loading. Involved in initial, incremental, daily, and monthly loads. Efficient in troubleshooting, performance tuning, and optimization of ETL and reporting analysis. Involved in massive data cleansing prior to data staging.

• Experienced in creating entity-relationship and dimensional data models using the Kimball methodology, i.e., star and snowflake schemas.

• Area of expertise encompasses database design and the ETL phases of data warehousing, with an emphasis on relational and dimensional data modeling for OLTP and OLAP systems.

• Experienced in writing AutoSys scheduler scripts by analyzing ETL job and time dependencies; a wrapper sketch follows below.
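
A sketch of the kind of shell wrapper used around AutoSys, driving a hypothetical box job with the standard sendevent/autorep CLI (the job name and the status-column position are assumptions):

    #!/bin/ksh
    # Hypothetical AutoSys helper: force-start an ETL box job, then poll until it finishes.
    JOB=EDW_DAILY_LOAD_BOX
    sendevent -E FORCE_STARTJOB -J $JOB
    while :; do
        # autorep prints a status column (SU/FA/RU/...); its position can vary by version
        STATUS=$(autorep -J $JOB | awk -v j=$JOB '$1 == j {print $5}')
        case "$STATUS" in
            SU) echo "$JOB succeeded"; exit 0 ;;
            FA) echo "$JOB failed" >&2;  exit 1 ;;
            *)  sleep 60 ;;
        esac
    done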

• Worked on Slowly Changing Dimensions (Types 1, 2, 3 & 6) and Change Data Capture, implemented to keep track of historical data; a minimal SCD Type 2 sketch follows below.
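
As a minimal sketch, an SCD Type 2 load reduces to expiring the changed current rows and then inserting new versions; here in Oracle SQL via sqlplus, with the dimension, staging table, and sequence names invented for illustration:

    #!/bin/ksh
    # Hypothetical SCD Type 2 load for a customer dimension.
    sqlplus -s $DB_USER/$DB_PASS <<'EOF'
    -- Step 1: close out current rows whose tracked attribute changed in the staging feed
    UPDATE cust_dim d
       SET d.eff_end_dt = SYSDATE, d.curr_flag = 'N'
     WHERE d.curr_flag = 'Y'
       AND EXISTS (SELECT 1 FROM cust_stg s
                    WHERE s.cust_id = d.cust_id AND s.cust_addr <> d.cust_addr);

    -- Step 2: insert a fresh current version for every id with no open row left
    INSERT INTO cust_dim (cust_key, cust_id, cust_addr, eff_start_dt, eff_end_dt, curr_flag)
    SELECT cust_dim_seq.NEXTVAL, s.cust_id, s.cust_addr, SYSDATE, NULL, 'Y'
      FROM cust_stg s
     WHERE NOT EXISTS (SELECT 1 FROM cust_dim d
                        WHERE d.cust_id = s.cust_id AND d.curr_flag = 'Y');
    COMMIT;
    EOF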

• Worked with QualityStage data quality stages such as Data Rules, Match Frequency, QualityStage Legacy, MNS, Investigate, Reference Match, Standardize, Unduplicate Match, and Survive to keep data accurate per customer requests.

• Designed and developed Oracle PL/SQL procedures; experienced in writing PL/SQL packages, stored procedures, functions, materialized views, and triggers using TOAD/SQL Developer. Leveraged EXPLAIN PLAN and TKPROF to improve query performance, as in the sketch below.
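
A minimal tuning sketch with sqlplus (the query and table names are hypothetical); runtime traces were formatted with TKPROF in the same spirit:

    #!/bin/ksh
    # Hypothetical plan check: capture and display the optimizer plan for a slow query.
    sqlplus -s $DB_USER/$DB_PASS <<'EOF'
    EXPLAIN PLAN FOR
      SELECT o.order_id, c.cust_name
        FROM orders o JOIN cust_dim c ON c.cust_key = o.cust_key
       WHERE o.order_dt >= TRUNC(SYSDATE) - 7;
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    EOF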

• Migrated DataStage from version 7.x to 8.x.

• Very good knowledge of connecting to XML sources using DataStage and of transforming and integrating data using the WebSphere DataStage XML and Web Services packs.

• Strong experience with C++ routines, Java, and OOP concepts (copy constructors, inheritance, virtual functions), as well as advanced Object-Oriented Analysis & Design methods.

• Developed industry-standard solutions with DataStage ETL jobs based on business requirements, using various DataStage stages such as Sort, Column Import, Modify, Aggregator, Filter, Funnel, Join, Lookup, Merge, Change Capture, Datasets, MQ Series, Sequential File, and Transformer.

• Experience in troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancement.

• Handled complex web services in InfoSphere Information Server through DataStage ASB Packs v2.0.

• Expertise in OLTP/OLAP system study, analysis, dimensional modeling, and E-R modeling. Involved in ODS design and dimensional model design (star and snowflake schemas), and in logical and physical data modeling with star schemas using ERwin.

• Experienced in Physical and Logical Data Modeling.

• Sound knowledge of BI (Cognos reporting tool).

• Experienced in FTP & Connect:Direct file transfer mechanisms; a batch-mode sftp sketch follows below.
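
A batch-mode sftp sketch (host, account, and paths are assumptions; Connect:Direct transfers used the equivalent process scripts):

    #!/bin/ksh
    # Hypothetical batch file transfer: push the daily extract to a partner host.
    sftp -b - etluser@partner.example.com <<'EOF'
    cd /inbound/daily
    put /data/extracts/sales_daily.dat
    bye
    EOF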

• Experienced in using IBM Metadata Asset Manager to maintain the metadata repository.

• Experienced in making prototypes for ETL processes with mapping templates and producing reports on them.

Certification

• Oracle Certified Associate, Oracle 9i

Technical Skills

Operating Systems: Windows XP, 98, 2000, UNIX, Linux

Languages: SQL, PL/SQL, UNIX shell scripting

Databases: Oracle 9i/10g, SQL Server 2000/2005/2008, Teradata, Netezza 6.0

ETL Tools: DataStage 7.5.1, 8.5, 8.7, 9.1 (parallel, server & sequence jobs; Administrator, Manager, Designer, Director), IBM Information Analyzer, IBM FastTrack

Tools & Utilities: SQL*Loader, BTEQ, FastLoad, MultiLoad, CA ERwin data modeling tool 9.5, IBM InfoSphere Data Architect, IBM Metadata Asset Manager

Domain Knowledge: Telecom, Banking & Insurance

Professional Summary

AIG, Greensboro, NC Sept 2014 – Present

Project: Data Governance

IBM Information Analyzer & Metadata Workbench Analyst

Responsibilities:

• Analyzed business requirements and created business rules to identify enterprise critical data elements (ECDEs) and business value data elements (BVDEs).

• Imported metadata into Information Analyzer from the production environment.

• Profiled data in IBM Information Analyzer using rule and column analysis against the identified ECDEs and BVDEs.

• Submitted analysis on results and provided LoB/EF recommendations for data quality rule development.

• Configured and coded data quality validation rules within IBM Information Analyzer (or via SQL), and scheduled, reviewed, and packaged LoB/EF monthly/quarterly data quality results.

• Created projects, added data sources, and wrote, configured, and executed rules/rule sets within Information Analyzer.

• Developed data profiling solutions, ran analysis jobs and reviewed results, and created and managed data quality controls using Information Analyzer.

• Performed column analysis, rule analysis, primary key analysis, natural key analysis, foreign-key analysis, and cross-domain analysis.

• Imported/exported projects, along with rules and bindings, from one environment to another.

• Created scorecards for all requested elements and shared them with the business.

• Developed SQL and ran and analyzed data quality results; a minimal profiling sketch follows below.
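
A minimal sketch of the kind of SQL behind such results, assuming a hypothetical critical data element cust_id in a crm.customer table:

    #!/bin/ksh
    # Hypothetical completeness/uniqueness profile for one data element.
    sqlplus -s $DB_USER/$DB_PASS <<'EOF'
    SELECT COUNT(*)                                             AS total_rows,
           COUNT(cust_id)                                       AS populated_rows,
           COUNT(DISTINCT cust_id)                              AS distinct_values,
           ROUND(100 * COUNT(cust_id) / NULLIF(COUNT(*), 0), 2) AS pct_complete
      FROM crm.customer;
    EOF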

• Strong knowledge of data system platforms, practices and data management policies.

• Designed, developed, and documented data-related policies, standards, procedures, and processes.

• Used Import Export Manager to bring metadata about data files, data tables, business terms, reports, and models into IBM Metadata Workbench.

• Established manual and automated links between assets in IBM Metadata Workbench.

• Created data lineage, impact analysis, and business lineage reports through IBM Metadata Workbench.

AT&T Middletown NJ May 2011 – Aug 2014

Project : SXP(Service Express platform)

IBM InfoSphere Analyst

Responsibilities:

• Analyzed business requirements and created source-to-target mapping documents for ETL development. Involved in preparing high-level and detailed design documents, and acceptable-differences documents for the end users.

• Worked with IBM Information Analyzer 8.5 on data profiling for column analysis, rule analysis, primary key analysis, natural key analysis, foreign-key analysis, and cross-domain analysis. As a data analyst, analyzed the customer's global data, identified existing data issues in the source systems using the Information Analyzer tool, and profiled the data by generating Information Analyzer reports.

• Created projects, added data sources, and wrote, configured, and executed rules/rule sets within Information Analyzer.

• Developed data profiling solutions, ran analysis jobs and reviewed results, and created and managed data quality controls using Information Analyzer.

• As a senior data analyst, analyzed, designed, and implemented ODSs, data marts, data warehouses, and operational databases.

• Extracted data from fixed-width files, transformed it per the requirements, and loaded it into the IW Oracle tables using SQL*Loader scripts.

• Used QualityStage stages such as Investigate, Standardize, Match, and Survive to resolve data quality and data profiling issues during design.

• Involved in designing the ETL process migration as per Netezza architecture.

• Used BCP to unload data from SQL Server and NZLOAD to load it into Netezza tables, as sketched below. Used Netezza window analytic functions to implement complex business logic.
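
A sketch of that two-step hop, with server, table, and file names assumed:

    #!/bin/ksh
    # Hypothetical SQL Server -> Netezza transfer: BCP out a character-mode
    # pipe-delimited extract, then bulk-load it with NZLOAD.
    bcp "EDW.dbo.sales" out sales.dat -S $SQL_HOST -U $SQL_USER -P $SQL_PASS -c -t "|"
    nzload -host $NZ_HOST -db EDW -u $NZ_USER -pw $NZ_PASS \
           -t SALES -df sales.dat -delim '|' -lf sales_nz.log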

• Developed stored procedures on Netezza (NZPLSQL scripts) for data manipulation and data warehouse population.

• Used the NZLOAD utility to load data from flat files into Netezza tables.

• Implemented a process to establish Referential Integrity between related dimension/fact tables.

• Worked on overall performance improvement of data loads by leveraging the Netezza MPP architecture and following Netezza best practices.

• Unloaded data from Netezza into flat files using external table functionality, for loading into Oracle.

• Created DataStage parallel jobs using Designer; extracted data from various sources, transformed it according to the requirements, and loaded it into target databases such as Oracle 10g and Sybase.

• Extensively worked with DataStage Designer to develop jobs that format data from different sources, cleanse, summarize, aggregate, and transform the data, implement partitioning and sorting methods, and finally load the data into the data warehouse.

• Performed extensive data quality checks on the source data.

• Used DataStage Designer for creating new job categories, metadata definitions, and data elements; importing/exporting projects, jobs, and DataStage components; and viewing and editing repository contents.

• Used DataStage Designer to design summary tables for monthly sales, and UNIX scripts to automate and run the jobs; a dsjob wrapper sketch follows below.
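
A sketch of such an automation script around the dsjob CLI (project, job, and parameter names hypothetical):

    #!/bin/ksh
    # Hypothetical dsjob wrapper: run a DataStage job and fail the script on a bad exit.
    PROJ=SXP_PROJ
    JOB=J_LOAD_MONTHLY_SALES
    dsjob -run -jobstatus -param LOAD_DT=$1 $PROJ $JOB
    RC=$?
    # with -jobstatus, dsjob exits with the job status: 1 = finished OK, 2 = finished with warnings
    if [ $RC -ne 1 ] && [ $RC -ne 2 ]; then
        dsjob -logsum $PROJ $JOB | tail -20
        echo "$JOB failed with status $RC" >&2
        exit 1
    fi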

• Worked with Oracle Connector and Enterprise, DB2 Connector, Peek, Dataset, Lookup, File Set, Filter, Copy, Join, Remove Duplicates, Modify, Surrogate Key Generator, Change Capture, and Funnel stages.

• Involved in integration testing, coordination of development activities, and maintenance of ETL jobs.

• Preparing the HLD for requirements.

• Preparing the unit test case package, preparing unit test cases and unit test logs, and performing unit testing of code.

• Creating job sequences using Job Activity, Wait For File Activity, Notification Activity, etc.

• Performing system integration testing for data sources and checking connectivity.

• Developing the framework job on which the other main jobs depend for notification and failure status.

• Performance tuning: identifying job process issues and resolving performance issues.

• Implemented DataStage QualityStage to filter out unwanted data from tables and finalize data types and lengths, which helped reduce database utilization.

• Interacting with business users for requirement gathering and system analysis; gathering information on the software and hardware required for the project.

• Preparing the timeline for code development, DIT, and unit testing durations.

• Preparing analysis reports of database (tablespace/schema) sizes.

• Conducting meetings with the data modeler on table structure definitions and other requirements mandatory for development.

• Used different stages such as Transformer, CDC, Remove Duplicates, Aggregator, ODBC, Join, Funnel, Dataset, and Merge to develop different jobs.

• Involved in performance tuning of the jobs while developing them.

• Handled complex web services in InfoSphere Information Server through DataStage ASB Packs.

• Transformed and integrated data using the WebSphere DataStage XML and Web Services packs.

• Coordinating with the DBA/UNIX teams on requirements mandatory for developing the code.

• Providing team members with an overview of the low-level design.

• Conducting meetings with the testing team on the test cases and how they would test the code.

• Code (job) development and modifications as per requirement changes.

• Preparing job parameters (local and global) for the code rather than using manual parameters.

• Worked on standardization of files to audit the input data and make sure that the data is valid.

• Exporting & Importing Jobs and Importing the Metadata from repository as and when required.

• Involved in the development of DataStage jobs and UNIX shell scripts for data loading.

• Writing reconciliation queries as per business requirements; a sketch follows below.
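
A minimal reconciliation sketch, comparing counts and totals between an assumed staging source and warehouse target:

    #!/bin/ksh
    # Hypothetical source-vs-target reconciliation after a load.
    sqlplus -s $DB_USER/$DB_PASS <<'EOF'
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(sale_amt) AS total_amt
      FROM stg.sales
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(sale_amt)
      FROM dw.sales_fact;
    EOF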

• Created reusable components for local and shared use in the ETL process using shared containers.

• Providing UAT support, Deployment support.

• Developed DataStage loads into Oracle for a star schema.

• Used Subversion (SVN source control management) to push code to higher environments (QA/IT/Staging/PROD); a promotion sketch follows below.
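
A sketch of the promotion flow with standard SVN commands (repository URL and tag name assumed):

    #!/bin/ksh
    # Hypothetical SVN promotion: commit exported .dsx job files, then tag the drop for QA.
    REPO=http://svn.example.com/repos/sxp
    svn add --force jobs/*.dsx
    svn commit -m "SXP monthly load jobs, QA drop" jobs
    svn copy $REPO/trunk $REPO/tags/QA_drop_2014_11 -m "Tag QA drop"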

• Migrated jobs from development to QA to Production environments.

• Preparing UNIX scripts for post-job-completion activities.

• Responsible for overseeing the quality procedures and standards related to the project.

• Migrated 7.5 server jobs to 8.7 parallel jobs on a stringent timeline, which was highly appreciated by the customer.

• Designed and configured the routines and scripts for sending critical alerts from the production support environment.

• Monitoring process progress against scheduled deadlines for various tasks and taking necessary steps to ensure completion within time, cost, and effort parameters.

ABN AMRO March 2010 – May 2011

Project: ABN Data Integration

Sr. DataStage Developer

Key Responsibilities:

• Used DataStage Manager for importing metadata from the repository, creating new job categories, and creating new data elements; exported and imported jobs between the production, development, and test servers.

• Used several stages such as Sequential File, Datasets, Copy, Aggregator, Row Generator, Join, Merge, Lookup, Funnel, Filter, Column Export, etc., in the development of parallel jobs.

• Used the DataStage Director and its run-time engine to schedule running the solution, test and debug its components, and monitor the resulting executable versions (on an ad hoc or scheduled basis).

• Created job sequences using Job Activity, Wait For File Activity, Notification Activity, etc.; developed and modified code (jobs) as per requirement changes; performance tuning, identifying job process issues, and resolving performance issues.

• Running, monitoring, and scheduling DataStage jobs through DataStage Director.

• Exporting & Importing Jobs and Importing the Metadata from repository as and when required.

• Involved in the development of DataStage jobs and UNIX shell scripts for data loading.

• Involved in Performance Tuning of Queries and Jobs.

• Responsible for overseeing the quality procedures and standards related to the project.

• Successfully delivered one of the major milestones and created a process to provide RCA for production defects post-deployment.

• Received the source data as Oracle tables, sequential files, and Excel sheets; developed processes to extract the source data and load it into the data warehouse after cleansing, transformation, and integration.

• Developed various shared container jobs for reusability.

• Worked with hash files, Parallel Job Extender for parallel processing, and IPC stages to improve job performance.

• Wrote SQL queries and PL/SQL procedures to ensure database integrity.

• Created shell scripts for production and scripts for small changes using before/after subroutines; used Korn shell scripts for scheduling DataStage jobs, e.g. via cron as sketched below.
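
For example, scheduling via cron could be as simple as a crontab entry pointing at the ksh wrapper (paths hypothetical):

    # Hypothetical crontab entry: run the nightly DataStage sequence at 1:30 AM.
    30 1 * * * /opt/etl/bin/run_nightly_seq.ksh >> /opt/etl/logs/nightly.log 2>&1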

• Used partitioning and collecting methods to implement parallel processing.

• Tuned SQL queries for better performance when processing business logic in the database.

Marathon Oil Corporation, US June 2008 – March 2010

Project: MOP Business Analytics

DataStage Developer

Key Responsibilities:

• Used stages such as Transformer, Sequential File, Oracle, ODBC, Aggregator, Data Set, File Set, CFF, Remove Duplicates, Sort, Join, Lookup, Funnel, Copy, Modify, Filter, Change Data Capture, Change Apply, Head, Tail, Sample, Surrogate Key, and SCD.

• Extensively worked on capturing change data.

• Extensively worked on slowly changing dimension concepts.

• Used DataStage Manager for importing metadata from the repository, creating new job categories, and creating new data elements; exported and imported jobs between the production, development, and test servers.

• Used the DataStage Director and its run-time engine to schedule running the solution, test and debug its components, and monitor the resulting executable versions (on an ad hoc or scheduled basis).

• Wrote SQL queries and PL/SQL procedures to ensure database integrity.

• Created job sequences using Job Activity, Wait For File Activity, Notification Activity, etc.; developed and modified code (jobs) as per requirement changes; prepared unit test cases and unit test logs.

• Performance tuning: identifying job process issues and resolving performance issues.

• Running, monitoring, and scheduling DataStage jobs through DataStage Director.

• Exporting & Importing Jobs and Importing the Metadata from repository as and when required.

• Involved in the development of DataStage jobs and UNIX shell scripts for data loading.

• Writing Reconciliation Queries as per Business Requirements.

• Involved in Performance Tuning of Queries.

• Worked with the XML Transformer stage to convert and load XML data into the data warehouse; designed jobs, scripts, and processes to rectify data corruption and perform cleanup.

• Responsible for overseeing the quality procedures and standards related to the project.

sCubes Systems India Private Limited July 2007 – June 2008

DataStage Developer

Project: Business Systems Monitor

Key Responsibilities:

• Extensively used DataStage Designer stages such as ODBC, native plug-ins, Sequential File, Remove Duplicates, Filter, Aggregator, Transformer, Join, Pivot, Lookup, XML Input, XML Output, MQ Connector, Sort, Funnel, Dataset, Copy, Modify, Row Generator, and Merge.

• Used DataStage Software Development Kit (SDK) transforms; used Director for executing jobs, analyzing logs, and scheduling; prepared unit test cases and unit test logs.

• Developed and modified code (jobs) as per requirement changes; ran, monitored, and scheduled DataStage jobs through DataStage Director; wrote reconciliation queries as per business requirements.

• Exporting & Importing Jobs and Importing the Metadata from repository as and when required.

• Involved in the development of DataStage jobs.

• Designed jobs, scripts, and processes to rectify data corruption and perform cleanup.

• Preparing a knowledge-base repository for knowledge sharing.

Trainings

• Entry-level training program at MphasiS, an HP company, 2007.

• UNIX shell scripting & programming concepts at MphasiS, an HP company, 2009.


