Manager Data

Location:
Lawrenceville, GA
Posted:
May 30, 2015

Resume:

Seshaiah Ambati

PH:630-***-****

********.******@*****.***

Professional Summary

. More than 12 years of experience in the IT industry, especially in the analysis, design, development, testing, and implementation of client/server business systems and Decision Support Systems (DSS).

. Over 8 years of experience in loading and maintaining data warehouses and data marts using tools such as DataStage ETL processes, profiling, QualityStage 6.x/9.x, and IBM Information Server 9.0 (integrated DataStage and QualityStage).

. Strong knowledge of Extraction, Transformation/Verification, and Loading (ETL/EVL) processes using Ascential DataStage, UNIX shell scripting, and SQL*Loader.

. Expertise in working with various operational sources such as DB2, DB2 UDB, SQL Server, Oracle, Teradata, Sybase, and flat files, loading them into a staging area.

. Extensively worked with Parallel Extender in the Orchestrate environment to split bulk data into subsets and dynamically distribute it to all available processors for better job performance.

. Experience in evolving strategies and developing architectures for building data warehouses using the Erwin data modeling tool.

. Designed and developed data models such as Star and Snowflake schemas.

. Excellent database migration experience with Oracle 9i/11g, DB2 UDB, and MS SQL Server.

. Hands-on experience in writing, testing, and implementing database-level triggers, procedures, and functions using PL/SQL.

. Excellent experience with XML stages, such as the XML Input, XML Transformer, and XML Output stages.

. Extensive experience in loading high-volume data and in performance tuning.

. Played a significant role in various phases of the project life cycle, such as requirements definition, functional and technical design, testing, production support, and implementation.

. Experienced in the manufacturing industry, maintaining sales, inventory, profit & loss, HR, product category, and time data.

. Excellent team member with problem-solving and troubleshooting capabilities; a quick learner, highly motivated, result-oriented, and an enthusiastic team player.

Technical Skills

ETL Tools: IBM DataStage 5.x/6.0 XE/7.5/8.7/9.1, Informatica, SAP BW

OLAP Tools: Business Objects 4.1/5.0/5.1 (Supervisor, Designer, Reports, Broadcast Agent, Set Analyzer), Web Intelligence 2.5/2.6, SDK; HIPAA EDI X12 transactions such as 834, 837, 835, 270, 271, 276, 277

Programming Languages: COBOL, C, C++, Java 1.2, SQL, and PL/SQL (with TOAD)

Internet Technologies: HTML 4.0, JavaScript 1.2, JSP, IIS 8.0, and Servlets

Databases: Oracle 9i, SQL Server 7.0, DB2, Teradata, Greenplum 4.2

Data Modeling Tools: Erwin, Oracle Designer

Scripting: UNIX awk/nawk, Windows PowerShell

GUI: Oracle D2K with Forms 4.x/5.x/6.x, Reports 4.x/5.x/6.x, Applets, Visual Basic 5.0, Visual C++, Crystal Reports

Hardware: Sun SPARC, IBM-compatible PC

Operating Systems: Windows NT/2000, UNIX, AIX, Solaris

Education

. Master of Computer Applications, Madras University.

Professional Experience

Axiall, Atlanta, GA
Nov 2014 - Present

Sr. ETL Developer

Axiall is a leader in the chemistry sector, with holdings and expertise in chlor-alkali and chlorovinyl materials. It provides a breadth of chemistries and derivatives essential to the creation of a vast array of consumer, professional, and industrial products and applications, including retail household chemical products.

Role & responsibilities:

. Used Ascential profiling and Integrity for cleansing source data coming from heterogeneous sources such as ANSI X12 (fixed-width flat files) and CSV files, and loaded it using DataStage jobs.

. Used plug-in stages such as FTP, Merge, and Pivot, as well as stages such as Sequential File, Hashed File, ODBC, DB2, Aggregator, Link Partitioner/Link Collector, and Inter-Process.

. Created batches (DS job controls) and sequencers to control sets of DataStage jobs.

. Extensively used built-in DataStage transforms, functions, and routines for conditioning, conversion, validation, and loading, and developed custom ones where required.

. Created different types of reports, such as Master/Detail, Cross Tab, and Chart (for trend analysis).

. Scheduled BO reports using Broadcast Agent and monitored them through the Broadcast Agent console.

. Involved in unit testing, system testing, and integration testing.

. Involved in the Analysis of Physical Data Model for ETL mapping and

the process flow diagrams for all the business functions.

. Involved in designing procedures using Ascential Integrity for data cleansing; used pre-built procedures to cleanse customer address data for internal business analytics.

. Used DataStage Manager for importing metadata into the repository, creating new job categories, and creating new data elements.

. Worked with DataStage Director to schedule and run the solution, test and debug its components, and monitor the resulting executable versions (on an ad hoc or scheduled basis).

. Developed user-defined routines and transformations to implement business logic, and shell scripts to automate file manipulation and data-loading procedures.

Environment: Erwin, DataStage 9.1, PeopleSoft EPM, Oracle 11g, DB2 UDB, JDE, AIX, MS Excel 2000, SQL Server 2012, Cognos, Greenplum 4.2, Windows PowerShell, WebI, SQL Navigator, SQL*Loader, Mainframes, AS/400 (iSeries)

UnitedHealth Group, Basking Ridge, NJ

Feb 2013 to Nov 2014

Datastage ETL Sr. Developer

UnitedHealth Group is a major player in the healthcare industry. I was involved in multiple projects, Invoice & Delinquency Letter generation and HPN Panel Extraction, with the TriZetto Facets solution group; the TriZetto Facets application manages providers, claims, billing, and letters.

Role & Responsibilities:

. Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage and Ascential DataStage.

. Worked on DataStage tools like DataStage Designer, DataStage Director

and DataStage Administrator.

. Strong understanding of the principles of Data Warehousing using fact

tables, dimension tables and star/snowflake schema modeling.

. Worked extensively with Dimensional modeling, Data migration, Data

cleansing, ETL Processes for data warehouses.

. Developed parallel jobs using different processing stages

like Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel,

CDC, Change Apply and Filter.

. Used Enterprise Edition/Parallel stages such as Datasets, Change Data Capture, Row Generator, and many others in accomplishing the ETL coding.

. Familiar with highly scalable parallel-processing infrastructure using parallel jobs and multiple-node configuration files.

. Experienced in scheduling Sequence and parallel jobs using DataStage

Director, UNIX scripts and scheduling tools.

. Experience in troubleshooting jobs and addressing production issues such as data issues, environment issues, performance tuning, and enhancements.

. Knowledge in using Erwin as leading Data modeling tool for logical

(LDM) and physical data model (PDM).

. Extensive experience in design and development of Decision Support

Systems (DSS).

. Assisted in development efforts for Data marts and Reporting.

. Technical and analytical skills with clear understanding of design

goals of ER modeling for OLTP and dimension modeling for OLAP.

. Extensive experience in Unit Testing, Functional Testing, System

Testing, Integration Testing, Regression Testing, User Acceptance

Testing (UAT) and Performance Testing.

. Worked with various databases, including Oracle 11g/9i/8i, DB2, Sybase, SQL Server, and Teradata.

. Used SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.

. Implemented multi-node declaration using configuration files

(APT_Config_file) for performance enhancement.

. Debugged, tested, and fixed the transformation logic applied in the parallel jobs.

. Deployed different partitioning methods like Hash by column, Round

Robin, Entire, Modulus, and Range for bulk data loading and for

performance boost.

. Designed DataStage jobs to extract data from XML files using the XML Input stage, and used the XML Transformer stage to cleanse and transform the data before loading it into the data mart.

. Created Universes and reports in Business Objects Designer.

. Created, implemented, modified, and maintained simple to complex business reports using the Business Objects reporting module.
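
The Type I/Type II slowly changing dimension work above can be sketched outside DataStage. This is a minimal, hypothetical in-memory illustration of the Type II pattern only; the `city` attribute and field names are invented for the example, not taken from the actual warehouse:

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Apply one incoming record to a Type II dimension (a list of dicts).
    A changed attribute expires the current row and appends a new version."""
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["city"] == incoming["city"]:
                return dimension          # attribute unchanged: nothing to do
            row["current"] = False        # expire the old version
            row["end_date"] = today
            break
    # brand-new member, or a changed attribute: add a fresh current row
    dimension.append({"key": incoming["key"], "city": incoming["city"],
                      "eff_date": today, "end_date": None, "current": True})
    return dimension
```

A Type I change, by contrast, would simply overwrite `city` in place with no history row.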

Environment: Ascential DataStage 9.1/8.7/8.5, Parallel Extender (Designer, Director, Manager), Teradata SQL Assistant, BTEQ, MultiLoad, FastLoad, Erwin, Toad, dBase III files, MS Access, CSV files, XML files, Oracle 9i, Greenplum, Windows 2003, IBM AIX 4.2/4.1, TWSd Work Scheduler

Caterpillar, Peoria IL.

April 2012 - Jan 2013.

Datastage ETL Sr. Developer

EPS and Order Life Cycle are part of the Assurance Supply Center (ASC) project.

The EPS and Order Life Cycle components of the Assurance Supply Center are being developed for Caterpillar as part of a central repository of all order-schedule life-cycle and EPS data, such as demand and forecast of parts (material) per month. A detailed description of the business justification for the application is beyond the scope of this document; it is used to identify and organize the logical units of work that will be performed during the project, which are then planned for and managed throughout the project life cycle.

Roles & Responsibilities:

The ASC application comprises a typical data warehouse infrastructure implementation, consisting of the following components, each described in more detail in the sections that follow:

. The underlying data storage for ASC will reside on the existing PDW/SCDW, MRC, MDW, RCCP Oracle platform in a new schema specifically for the application.

. The data model will be third normal form in design, extending the

Teradata Manufacturing (MFG) reference model in use at Caterpillar.

. Data flow will follow EIM standards, using Stage and Core layers for the underlying table structures, with additional aggregate tables and reporting views built in the semantic layer to support application reporting.

. Used DataStage Manager to store, manage reusable Metadata and created

custom routines and transforms for the jobs.

. Experienced in using DataStage Director to validate, run, schedule, and monitor DataStage jobs.

. Developed complex Teradata SQL queries to generate ad-hoc reports for the business analysts and to implement complex business rules.

. Used DataStage Administrator to assign privileges to user groups.

. Worked extensively on all the stages such as OCI, Sequential,

Aggregator, Hashed Files, Sort, Link Partitioner, Link Collector and

ODBC.

. Created proper PIs (primary indexes), taking into consideration both the planned access of data and the even distribution of data across all available AMPs.

. Developed jobs in Parallel Extender using different stages like

Transformer, Aggregator, lookup, Source dataset, external filter, Row

generator, column generator, peek stages.

. Performed ETL Using the Teradata utilities like BTEQ, Multiload,

Fastload, TPump and Fastexport.

. Distributed load among different processors by implementing the

Partitioning of data in parallel extender.

. Worked with business functional lead to review and finalize

requirements and data profiling analysis.

. Used QualityStage stages such as investigate, standardize, match and

survive for data quality and data profiling issues during the

designing.

. Created, modified, deployed, optimized, and maintained Business

Objects Universes using Designer and exported the universe to the

Repository to make resources available to the users

. Experience in using Ascential QualityStage GUI tools for customizing data mart business logic.

. Fixed various defects in a set of wrapper scripts that executed the Teradata BTEQ, MLOAD, and FLOAD utilities and UNIX shell scripts.

. Designed and developed interfaces using DataStage/Information Server SAP BW/ECC Packs.
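
The partitioning used to distribute load among processors works roughly as follows. This is a minimal sketch of the idea, not DataStage itself; the row shape and key names are illustrative:

```python
def hash_partition(rows, key, nodes):
    """Hash partitioning: rows with equal keys always land on the same
    node, so per-key operations (joins, aggregations) stay node-local."""
    partitions = [[] for _ in range(nodes)]
    for row in rows:
        partitions[hash(row[key]) % nodes].append(row)
    return partitions

def round_robin_partition(rows, nodes):
    """Round-robin partitioning: even row counts per node, useful for
    bulk loads where key locality does not matter."""
    partitions = [[] for _ in range(nodes)]
    for i, row in enumerate(rows):
        partitions[i % nodes].append(row)
    return partitions
```

Entire, Modulus, and Range partitioning are variations on the same theme: each picks a node per row by a different rule, trading balance against key locality.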

Environment: Ascential DataStage 8.5, Parallel Extender (Designer, Director, Manager), Oracle 9i, Windows 2003, IBM AIX 4.2/4.1, Teradata SQL Assistant, BTEQ, MultiLoad, FastLoad, Greenplum 4.2, Erwin, Toad, dBase III files, MS Access, CSV files, XML files, Tidal Work Scheduler

Wells Fargo Bank, Concord, CA

Jan 2011 to April 2012

Data Warehouse Consultant

Wells Fargo is a diversified financial services company with $349 billion in assets and over 16 million customers. This data warehouse was developed for the Auto Finance Group, which is involved in loans and leases of new and pre-owned cars.

Role & Responsibilities:

. Defined roadmaps, project plans and steps to implement data and DSS

architectures.

. Worked with business users and SMEs to determine the best approaches to satisfy user requirements, gaps, and risks.

. Developed prototype solutions to verify capabilities for new systems

development, enhancement, and maintenance of decision support.

. Developed approaches, methodologies, policies, and standards to govern

data.

. Implemented IBM WebSphere DataStage and Quality Stage 8.1 as an ETL

tool.

. Provided ETL architecture guidance and recommendations for the DataStage environment; assisted the project manager in ETL project planning, resourcing, ETL designs, and developing conceptual designs.

. Loaded data into the Teradata data warehouse from Oracle and DB2 databases using the Teradata load utilities (FastExport, FastLoad, MultiLoad, and TPump).

. Responsible for delivery of multiple parallel projects.

. Designed and developed various DataStage jobs using versions 8.1/8.5.

. Developed advanced DataStage jobs to read data from JDBC (Java applications).

. Documented, developed, and reviewed the technical specification and data mapping documents.

. Designed and developed ETL jobs.

. Design, develop, and implement test strategies for the ETL process

. Extensively used Parallel Extender to extract, transform, and load data into the data warehouse and data mart with partitioning in an MPP environment.

. Worked on different stages in PX like Join, Lookup, Funnel, Filter,

Merge, Aggregator and Transformer.

. Developed parallel Shared Containers and reused them in multiple jobs.

. Performed Import and Export of DataStage components and table

definitions using DataStage Manager.

. Extensively used the Hashed File, Aggregator, Sequential File, and Oracle OCI stages.

. Created Business Objects sales reports and queries with constant interaction with the end users.

. Wrote Routines for Data Cleansing.

. Created master controlling sequencer jobs using the DataStage Job

Sequencer.

. Performance tuning of ETL jobs.

. Used DataStage Director and its run-time engine to schedule jobs, test and debug their components, and monitor the resulting executable versions, implementing the full life cycle.
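
A master controlling sequencer essentially gates each job on its upstream dependencies. A simplified sketch of that control flow; the job names and the `run` callback are hypothetical placeholders, not the actual project jobs:

```python
def run_sequence(jobs, deps, run):
    """Run each job only after all of its dependencies have completed,
    the way a master sequencer gates downstream DataStage jobs."""
    done, order = set(), []
    while len(done) < len(jobs):
        ready = [j for j in jobs
                 if j not in done and all(d in done for d in deps.get(j, []))]
        if not ready:
            raise RuntimeError("circular dependency among jobs")
        for job in ready:
            run(job)          # e.g. invoke a shell wrapper for the real job
            done.add(job)
            order.append(job)
    return order
```

In each pass, every job whose dependencies are satisfied becomes eligible, which is also where a real sequencer would fan jobs out in parallel.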

Environment: Oracle 11g/10g/9i, Oracle Applications ERP, Web Applications,

Erwin, PL/SQL, SQL*Loader, Business Information Server (IBM WebSphere

DataStage and Quality Stage 8.1), Business Objects XI R2, Perforce, Python,

Perl, Actuate 9.2, UNIX, Toad, XML Spy, Fast Export, Fast Load, MultiLoad,

and Tpump, Windows 2003 Server, Web Services.

New York Life, Lebanon, NJ

May 2009 - Jan 2011

Datastage ETL Sr. Developer

NYL is one of the leading life insurance companies and publishes a detailed financial report for its policyowners. The business covers the whole value-added chain in insurance.

This project covered ALO flat reporting: loading data per contract for all base, rider, and benefit information, flattened at the target as derived tables for the BO reporting universe.

Role & Responsibilities:

. Extensively used Parallel Extender to extract, transform, and load data into the data warehouse and data mart with partitioning in an MPP environment.

. Worked on different stages in PX like Join, Lookup, Funnel, Filter,

Merge, Aggregator and Transformer.

. Developed parallel Shared Containers and reused them in multiple jobs.

. Performed Import and Export of DataStage components and table

definitions using DataStage Manager.

. Extensively used the Hashed File, Aggregator, Sequential File, and Oracle OCI stages.

. Created Business Objects sales reports and queries with constant interaction with the end users.

. Wrote Routines for Data Cleansing.

. Created master controlling sequencer jobs using the DataStage Job

Sequencer.

. Performance tuning of ETL jobs.

. Used DataStage Director and its run-time engine to schedule jobs, test and debug their components, and monitor the resulting executable versions, implementing the full life cycle.

. Tuned the DataStage server processes.

. Created, modified, deployed, optimized, and maintained Business

Objects Universes using Designer and exported the universe to the

Repository to make resources available to the users

. Extensively used slice and dice for generating master/detail and tabular reports.

. Used Ascential MetaStage for data integration and standardization before loading into the Oracle data warehouse.

. Developed HIPAA 835 X12 transactions into a comma-delimited file for dissemination into an Oracle database, for Crystal Reports generation and posting to a secure server for client viewing.

. Created and performed business analysis of proprietary IRL for customer usage.

. Performed business crosswalk and EDI analysis of all X12-to-IRL data elements.

. Developed 835 algorithms for claim balancing and checks from X12.

. Developed UNIX scripts to automate the Data Load processes to the

target Data warehouse using Autosys Scheduler.

. Used partitioning and collecting methods to implement parallel processing across the full life cycle.
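
Turning an 835 X12 transaction into a comma-delimited file, as described above, amounts to splitting segments and extracting payment fields from CLP segments. A simplified sketch assuming the common default delimiters (`*` between elements, `~` between segments); a production feed needs a full EDI parser that reads the delimiters from the ISA header:

```python
import csv
import io

def clp_to_csv(x12_text):
    """Pull claim-level payment fields out of CLP segments in a
    (simplified) 835 string and emit them as CSV rows."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["claim_id", "charged", "paid"])
    for segment in x12_text.split("~"):        # '~' = segment terminator
        elements = segment.strip().split("*")  # '*' = element separator
        if elements[0] == "CLP":
            # CLP01 = claim id, CLP03 = charge amount, CLP04 = paid amount
            writer.writerow([elements[1], elements[3], elements[4]])
    return out.getvalue()
```

Claim balancing, also mentioned above, would then compare the summed CLP04 amounts against the BPR payment total.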

Environment: IBM DataStage 7.5/8.1 (Server Edition and Parallel Extender) with IIS 8.0, Mainframes, AS/400 (iSeries), Oracle 10g, Erwin, Toad, DB2 UDB 8.1, SQL, PL/SQL, IBM AIX UNIX shell scripts, Windows XP

Freddie Mac, McLean, VA

Aug 2007 - May 2009

Sr ETL DataStage Developer

Involved in various data attestation, restatement, close-the-books, core business earnings, analytics, and external and internal reporting projects for Freddie Mac. I was involved in multiple projects (Ops Security, NYS) with a mission to improve end-to-end management of data for the enterprise: meeting required data quality and control standards, reliably providing data externally to our regulators, and provisioning data and tools to enable corporate performance management and business use of stronger management information.

Roles & Responsibilities:

. Used DataStage Manager to store, manage reusable Metadata and created

custom routines and transforms for the jobs.

. Experienced in using DataStage Director to validate, run, schedule, and monitor DataStage jobs.

. Used DataStage Administrator to assign privileges to user groups.

. Worked extensively on all the stages such as OCI, Sequential,

Aggregator, Hashed Files, Sort, Link Partitioner, Link Collector and

ODBC.

. Developed jobs in Parallel Extender using different stages like

Transformer, Aggregator, lookup, Source dataset, external filter, Row

generator, column generator, peek stages.

. Distributed load among different processors by implementing the

Partitioning of data in parallel extender.

. Extensively used MetaBroker for importing metadata from Erwin and

export warehouse data to Business Objects for reporting purpose

. Created, modified, deployed, optimized, and maintained Business

Objects Universes using Designer and exported the universe to the

Repository to make resources available to the users

. Extensively used slice and dice for generating master/detail and tabular reports.

. Used Ascential MetaStage for data integration and standardization before loading into the Oracle data warehouse.

. Used partitioning and collecting methods to implement parallel processing across the full life cycle.

. Developed UNIX scripts to automate the Data Load processes to the

target Data warehouse using Autosys Scheduler.

. Used derived tables to create the universe for best performance, and used contexts and alias tables to resolve loops in the universe.

. Created complex reports showing revenue generation per year using cascading and user objects, such as measure objects, with the @aggregate_aware function to create summarized reports.

Environment: Ascential DataStage 8.1/7.5/7.0, Parallel Extender, (Designer,

Director, Manager), DataStage BASIC language Expressions, Oracle 9i, Linux,

Windows 2000, IBM AIX 4.2/4.1, Java, Sybase, PVCS, MQ Series, Erwin, Toad,

Dbase3 Files, MS Access, CSV Files, XML Files, Tivoli Work Scheduler.

(MARC) Munich American Reassurance Company, Princeton, NJ May 2003 - Aug 2007

Datastage ETL Sr. Developer

MARC is one of the subsidiaries of the Munich Re Group, one of the world's leading risk carriers. The business covers the whole value-added chain in insurance and reinsurance.

This project migrated data from a legacy system to SAP FS-RI: GRIP (Global Risk Information Profiling) and FRP (Facultative Risk Processing) data, including risk clearance system information, were migrated into the reinsurance data in SAP FS-RI.

Role & Responsibilities:

. Worked on error handling, creating hashed files, and performing lookups for faster access of data.

. Created tables, indexes, and synonyms in an Oracle 10g database to load data.

. Extensively used Parallel Extender to extract, transform, and load data into the data warehouse and data mart with partitioning in an MPP environment.

. Worked on different stages in PX like Join, Lookup, Funnel, Filter,

Merge, Aggregator and Transformer.

. Developed parallel Shared Containers and reused them in multiple jobs.

. Performed Import and Export of DataStage components and table

definitions using DataStage Manager.

. Wrote Routines for Data Cleansing.

. Created master controlling sequencer jobs using the DataStage Job

Sequencer.

. Performance tuning of ETL jobs.

. Used DataStage Director and its run-time engine to schedule jobs, test and debug their components, and monitor the resulting executable versions, implementing the full life cycle.

. Built DataStage ETL processes to load data into PeopleSoft PS EPM 8.9

warehouse.

. Loaded data into Teradata using DataStage, FastLoad, BTEQ, FastExport,

MultiLoad, and Korn shell scripts.

. Developed WTX maps for healthcare X12 transaction sets 837I/837P, 835, 270/271, and 276/277, and migrated from 8.0 Event Servers to WebSphere Message Broker v6.1.0.5 workflows.

. Converted 4010A1 data sets to 5010 X12 for HIPAA.

. Installed EDIFECS 6.8.1 on a Sun Solaris server, and set it up and configured it for integration as a front end for WebSphere Message Broker transactions.

. Architected and designed the coordination of X12 transactions through Qualedix applications to client destinations.

. Participated in Proof of Concepts and architectural design, and client

implementations.

. Created X12 data for 5010 and 5010A1 negative and positive testing.

. Designed and developed interfaces using DataStage/Information Server SAP BW/ECC Packs.
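
The hashed-file lookup pattern used above for faster data access can be illustrated with an ordinary dictionary: build the reference data once as a hash keyed on the lookup column, then probe it per source row in O(1) instead of scanning. Column names here are invented for the example:

```python
def build_lookup(reference_rows, key):
    """Build the 'hashed file' once: an in-memory dict keyed on the
    lookup column."""
    return {row[key]: row for row in reference_rows}

def enrich(source_rows, lookup, key, field, default=None):
    """Resolve each source row against the lookup, copying one field;
    unmatched rows get a default, like a reject-handling rule."""
    for row in source_rows:
        match = lookup.get(row[key])
        row[field] = match[field] if match else default
    return source_rows
```

The same trade-off applies as in the real tool: the lookup side must fit in memory, so large reference sets are better served by a join stage.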

Environment: IBM DataStage 7.5/8.1 (Server Edition and Parallel Extender) with IIS 8.0, Teradata SQL, Teradata (TD) utilities (BTEQ, FastLoad, FastExport, MultiLoad Update/Insert/Delete/Upsert, TD Administrator 7.1, TD SQL Assistant 7.1, TD PMon, TD Warehouse Miner 5.0), Linux, Mainframes, AS/400 (iSeries), DB2 UDB 8.1, SQL, PL/SQL, UNIX shell scripts, Windows XP, Windows PowerShell scripting


