Data Manager

Location:
Dayton, OH, 45420
Posted:
August 15, 2010

Resume:

Sarala Sethuraman

******.**********@*****.***

301-***-****

Summary

* ***** ** *** ********** as a consultant implementing Client/Server and Data Warehousing applications in major companies with the ETL tools Ab Initio 1.15/1.14/1.13 and Informatica

Proficient in the integration of various data sources such as DB2, SQL

Server, Oracle, XML Files, Fixed Width Flat Files, Delimited Flat Files,

VSAM files, Oracle 11i ERP, PeopleSoft and SAP into ODS

Good knowledge in client/server application development using Oracle9i/8i,

MS SQL Server, PL/SQL, Stored Procedures, Triggers, Functions and Visual

Basic 6.0, VC++ and their .net frameworks

Proficient with various AbInitio Parallelism and Multi File System

techniques

Excellent analytical, problem solving, interpersonal, presentation and communication skills; able to keep executive staff and team members apprised of goals and project status and to resolve issues and conflicts

Extensive knowledge in Dimensional Data Modeling, Star schema, Creation of

Fact and Dimension Tables, OLAP, OLTP and thorough understanding of ETL

concepts

Thorough knowledge of DML, Unix Shell scripting and PL/SQL Programming

Worked extensively in the GDE (Graphical Development Environment) configuring, connecting, and testing components to produce executable flow graphs in a Unix environment

Efficient report generation and analysis using Business Objects/Crystal Reports

Proficient in design and development of logical and physical data models

using Erwin tool

Educational Qualification:

Master of Science in Software Systems - M.S.

Birla Institute of Technology & Science - BITS, Pilani, India

Master of Computer Applications - M.C.A.

Bharathidasan University, India

Bachelor of Science - B.Sc.

Bharathidasan University, India

Technical Skills

ETL Tools: AbInitio GDE 1.15.x, Co>Operating System 2.15.x, EME, BRE

Database: Sybase, Teradata V2R3, Informix, IBM DB2 OS/390 8.1.0, Oracle

9i/8i/7i, MS SQL Server 2000/7.0/6.5, MS Access 2000/7.0

Reporting Tools: Business Objects XI/6.0/5.1, Crystal Reports 8.5/10, MicroStrategy 7.1/7.0/6.0/5.0

Operating Systems: Sun Solaris 8, IBM OS/390 version 2.8, IBM AIX Unix 5, Red Hat Linux 8, Microsoft Windows XP/2003/2000/9x/NT, Windows 2000/2003 Server, Novell Netware 4.1, and DOS

Data Modeling: Erwin 4.1/3.7, Microsoft Visio 2000

D/W Concepts: OLTP, OLAP, Normalization, Data Marts, Dimensional Modeling,

Facts, Dimensions, Star Schema, Snow Flake Schema

Programming Languages: SQL, PL/SQL, T-SQL, Unix Shell Scripting (Korn, Bourne), DML, C, Java, Visual Basic 6.0/5.0, ASP, ASP.NET, HTML, JavaScript, VB Script

Servers: Apache web server, Microsoft IIS 6.0, Windows 2000/2003 Enterprise

& Standard Edition, Advanced Windows 2000 Server and Red Hat Linux

Other D/W Tools: Schedulers AutoSys, Tivoli, SQL Loader, WinSQL 5.0, TOAD,

Webconnect Pro WC6.4, Siebel Tools 7.5

Web Technology: VB.Net, VC++ 6.0, MFC 6.0, ATL 5.0, COM/DCOM, CORBA

Scheduling Tools: AutoSys 4.5/r11, Crystal Reports, Control-M

Professional Experience

ETL Consultant

Sep 2008 - Present

Blue Cross Blue Shield, Washington DC

Worked on the Care Coordination Technical Infrastructure (CCTI) project for CareFirst BCBS to maintain a cleansed, centralized global database. FEP Statistical Records (FSR) and Drug Statistical Records (DSR) are used as the claims source data for the FEP Integrated Data Store (FEPIDS). FSR and DSR are for approved claims only; other types of claims transaction records, held in Transaction Work Records (TWR), are generated directly from the claims transactions. The TWR are converted to FSR and DSR in the CCTI environment. Medical claims are submitted in the FEP Claims Record (FCR) format by the local Plans, and pharmacy claims are submitted in the Drug Billing Record (DBR) format by CareMark, the vendor contracted to process the pharmacy claims. The NDM files from the Plans and CareMark are collected and submitted into the claims processing system. All FCR records for the same claim are combined and converted into one TWR record. DBR records are converted to TWR and Drug Billing Interim Record (DBIR) format. Claims that fail the VTC (Validity Translation Compatibility) edits are suspended from further processing, updated to the unedited claims tables in the FEPENRL database, and added as new transactions to the Output_Portal_TWR table (a temporary holding area) in the FEPENRL database. At the end of the day, all claims transactions processed that day are downloaded into either Denied or non-Denied (Approved) TWR files. The claims transactions are purged from the Output_Portal_TWR table after a one-day retention period.

Responsibilities

Transformed and conditioned the non-formatted data into standardized format

using Transform components like Reformat, FilterByExpression, Join and Sort

components in Ab Initio graphs

Worked on complex Ab Initio XFRs to derive new fields and solve various

business requirements

Developed generic graphs to extend a single functionality to many processes

and reduce redundant graphs

Used Ab Initio to create Summary tables using Rollup and Aggregate

components

Created complex Ab Initio graphs and extensively used Partition and Departition components to process huge volumes of records quickly, thus decreasing execution time

Worked on improving the performance of Ab Initio graphs by employing Ab

Initio performance components like Lookups (instead of joins), In-Memory

Joins, Rollup and Scan components to speed up execution of the graphs

Worked on Continuous Flows including Continuous Rollup, Continuous Lookup, and other continuous components

Worked on Common Projects and Created Conditional Components

Loaded formatted data to DB2 database using Load DB and Output Table

components

Responsible for creating test cases to test the production-ready graph for integrability and operability, and to make sure the data originating from the source makes it into the target in the right format

Developed Unix scripts in the Korn shell environment to automate specific load processes, and stored procedures using PL/SQL
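
A minimal sketch of the kind of Korn shell load wrapper described above; the file, connection, and procedure names (claims.dat, prc_load_claims) are illustrative placeholders, not the actual project objects.

#!/bin/ksh
# Nightly load wrapper: check the feed file, then run a PL/SQL load procedure.
DATA_DIR=/data/feed
LOG=/logs/load_$(date +%Y%m%d).log

if [[ ! -s $DATA_DIR/claims.dat ]]; then
    print "claims.dat missing or empty - aborting" >> $LOG
    exit 1
fi

# Invoke the (hypothetical) PL/SQL procedure through SQL*Plus.
sqlplus -s $DB_USER/$DB_PASS@$DB_SID >> $LOG <<EOF
WHENEVER SQLERROR EXIT FAILURE
EXEC prc_load_claims('$DATA_DIR/claims.dat')
EXIT
EOF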

Involved in building and deploying Meta Data repository

Implemented the parallel application by replicating the components - datasets and processing modules - into a number of partitions

Extensively used Ab Initio Co>Operating System commands like m_ls, m_wc,

m_dump, m_copy, m_mkfs
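
For illustration, a few of these Co>Operating System commands as they might be run against a multifile; the MFS path and DML file are hypothetical, and exact options vary by Co>Operating System release.

#!/bin/ksh
# Inspect a multifile laid out across an MFS (paths are placeholders).
m_ls /data/mfs_4way/claims.dat      # list the multifile and its partitions
m_wc /data/mfs_4way/claims.dat      # count records across partitions
m_dump claims.dml /data/mfs_4way/claims.dat -start 1 -end 5   # show the first records using their DML record format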

Developed PL/SQL and UNIX shell scripts and wrapper scripts

Created test data for critical graphs and wrote documentation for these graphs

Developed Ab Initio Multi File Systems (MFS) to monitor performance in

parallel layout

Worked independently as well as part of a team with terabytes of data and loaded them into Data Marts

Efficiently used Data Profiler to read data directly from database tables or files and compute field- and dataset-level statistics that completely characterize the dataset.

The Data Profiler results are published in Ab Initio's Enterprise Meta>Environment (EME), attached to the objects that were analyzed; the EME displays the Data Profile analysis both graphically and in tabular form.

Used Data Profiler to speed up development by identifying and understanding data anomalies such as out-of-range values, duplicate keys, and misdeclared data formats up front, and to aid in developing data cleansing modules, thus maintaining data quality and usability.

Used Queryman as a front-end tool to run SQL queries against Teradata, and also used other Teradata utilities like BTEQ, FastExport, FastLoad, MultiLoad, and TPump for specific processes
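
As an example of the BTEQ-style checks referred to above, a minimal session of this form could be run from a shell; the TDPID, credentials, and table name are placeholders.

#!/bin/ksh
# Run a simple row-count check against Teradata through BTEQ.
bteq <<EOF
.LOGON tdprod/etl_user,etl_password
SELECT COUNT(*) FROM edw.claims_fact WHERE load_dt = CURRENT_DATE;
.IF ERRORCODE <> 0 THEN .QUIT 1
.LOGOFF
.QUIT 0
EOF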

Redesigned existing graphs to bring down processing time and scheduled the

jobs using Tivoli Maestro job scheduler

Worked on creating complex SQL queries to run data checks against data loaded in the Oracle database, and also to obtain specific results from the database for Ab Initio graphs
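
A sketch of the kind of reconciliation query used for such data checks; the schema and table names (stg_claims, dw_claims_fact) are invented for the example.

#!/bin/ksh
# Compare staged and loaded row counts per load date in Oracle.
sqlplus -s $DB_USER/$DB_PASS@$DB_SID <<EOF
SET PAGESIZE 100 LINESIZE 132
SELECT s.load_dt, s.src_cnt, t.tgt_cnt, s.src_cnt - t.tgt_cnt AS diff
FROM  (SELECT load_dt, COUNT(*) AS src_cnt FROM stg_claims     GROUP BY load_dt) s
JOIN  (SELECT load_dt, COUNT(*) AS tgt_cnt FROM dw_claims_fact GROUP BY load_dt) t
  ON  s.load_dt = t.load_dt
WHERE s.src_cnt <> t.tgt_cnt;
EXIT
EOF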

Developed Ab Initio custom components for specific business needs using

shell scripts

Monitored and launched job queues using Autosys and Tivoli

Created job definitions using the AutoSys GUI utility as well as AutoSys JIL through the command-line interface.
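
A hedged example of defining one such job through the JIL command-line interface rather than the GUI; the job, machine, and script names are placeholders.

#!/bin/ksh
# Pipe a JIL definition to the AutoSys event server.
jil <<EOF
insert_job: bcbs_twr_load   job_type: c
command: /apps/etl/bin/run_twr_load.ksh
machine: etlhost01
owner: abinitio@etlhost01
start_times: "02:00"
condition: s(bcbs_twr_extract)
std_out_file: /logs/bcbs_twr_load.out
std_err_file: /logs/bcbs_twr_load.err
alarm_if_fail: 1
EOF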

Scheduled workflows on Autosys using Unix Scripts

Scheduled different processes based on the success/fail codes they returned.

Efficiently used the AutoSys Event Processor, a multi-threaded process which selects events from the Event Server and processes them.

Scheduled jobs using multiple event batching and dynamic thread creation.

Used AutoSys WCC, which consists of applets that provide job management, including the Job Editor, Job Status Console, Job Flow Design, Flow Monitoring, Event Console, and Reporting.

Provided 24/7 production support

Environment: Ab Initio 2.15.2, GDE 1.15.5, Data Profiler, BRE, Erwin, Unix, DB2 UDB, Korn Shell, SQL, Oracle 10g, IBM WebSphere MQ V6, AutoSys r11, Business Objects XI

ETL Developer

Apr 2007 - Sep 2008

Marketing Associates Inc., Detroit, MI

This is a daily, ongoing project that deals with Ford and LM dealers across the US, promoting their sales by identifying optimum sales and other criteria, updating the good-thru date, and maintaining a master table of all existing dealers registered online through the company's websites. The daily registered customers are extracted, and good and bad records are analyzed to maintain a clean and valid data set in the data warehouse.

Responsibilities:

Used Teradata utilities extensively for bulk data loading, including FastLoad, MultiLoad, TPump, and others
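
An illustrative FastLoad script for a bulk dealer load of this kind; the logon string, staging table, and file layout are assumptions, and exact statement order can vary by Teradata release.

#!/bin/ksh
# Bulk-load a pipe-delimited dealer extract into an empty staging table.
fastload <<EOF
LOGON tdprod/etl_user,etl_password;
DATABASE stg;
SET RECORD VARTEXT "|";
DEFINE dealer_id   (VARCHAR(10)),
       dealer_name (VARCHAR(60)),
       good_thru   (VARCHAR(10))
FILE = /data/feed/dealers.txt;
BEGIN LOADING stg.dealer_stg ERRORFILES stg.dealer_err1, stg.dealer_err2;
INSERT INTO stg.dealer_stg VALUES (:dealer_id, :dealer_name, :good_thru);
END LOADING;
LOGOFF;
EOF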

Created logical and physical data models in Erwin 4.1.

Efficiently used Data Profiler to read data directly from database tables or files and compute field- and dataset-level statistics that completely characterize the dataset.

Worked on Common Projects and Created Conditional Components

Working knowledge of Ab Initio Continuous Flows

Efficiently modified Business Rules (BRE) based on changes in client requirements.

Worked on Start Script and End Script for initial load and final validation

Transformed data into standardized format using Transform components like

Reformat, Filters, Join and Sort components in AbInitio graphs

Worked on complex AbInitio XFRs to derive new fields and solve various

business requirements

Developed generic graphs to extend a single functionality to many processes

and reduce redundant graphs

Used AbInitio to create Summary tables using Rollup and Aggregate

components

Created complex AbInitio graphs and extensively used Partition and Departition components to process huge volumes of records quickly, thus decreasing execution time

Designed, developed, and tested custom reporting functionality in a database and web-based environment using Crystal Reports

Developed and maintained custom Oracle SQL and PL/SQL code.

Integrated reporting with existing user interfaces.

Tested, debugged, built, and integrated the developed reports.

Monitored and launched job queues using Autosys and Tivoli

Created job definitions using the AutoSys GUI utility as well as AutoSys JIL through the command-line interface.

Scheduled workflows on Autosys using Unix Scripts

Scheduled different processes based on the success/fail codes they returned.

Efficiently used the AutoSys Event Processor, a multi-threaded process which selects events from the Event Server and processes them.

Scheduled jobs using multiple event batching and dynamic thread creation.

Used AutoSys WCC, which consists of applets that provide job management, including the Job Editor, Job Status Console, Job Flow Design, Flow Monitoring, Event Console, and Reporting.

Provided 24/7 production support

Produced functional design specifications and documentation.

Environment: Ab Initio 2.14, GDE 1.15, BRE, Data Profiler, AIX Unix Version 5, IBM DB2 OS/390 8.1.0, Teradata, SQL Server 2005, IBM WebSphere MQ V5.3, Korn Shell, Siebel Tools 7.5, AutoSys 4.5

ETL Developer

Feb 2005 - Mar 2007

Pfizer Inc., La Jolla, CA

Pfizer Pharmacy is a large manufacturer and major retailer of pharmacy and general products. It runs the CVS data mart, which contains sales data, purchase data, customer profile information, employee information, and modification data. The main aim of the project is to deliver and transform the data extracted from various source systems and to load it into the CVS data marts.

Responsibilities:

The project involved a data warehouse for the sales and inventory reporting system using AbInitio and the Co>Operating System, as part of an enterprise pharmaceutical data warehouse management system

Analyzed requirements and extracted, populated, and loaded data from Oracle into the Oracle 8i data warehouse using AbInitio as the ETL tool.

Designed and used different Transform components in GDE for extracting data

from flat files, DB2 and Oracle and loaded into target data warehouse.

Created graphs to load data, using components like Aggregate, Rollup,

Filter, Join, Sort, Partition and Departition.

Tested graphs using Validate Components like Compare-Records, Check Order,

Validate Records and Generate-Records.

Created checkpoints, phases, and worked with multi-file data for parallel

processing using AbInitio (ETL tool) to load data from flat files, SQL

Server and Oracle.

Coded and implemented shell scripts for migration of database from

production to development system.

Loaded data into warehouse tables from the staging area using SQL*Loader.
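
A sketch of a SQL*Loader control file and invocation of the sort this load would use; the table, columns, and file names are invented for the example.

#!/bin/ksh
# Write a minimal control file, then invoke SQL*Loader against the staging extract.
cat > sales_load.ctl <<EOF
LOAD DATA
INFILE '/data/staging/sales.dat'
APPEND INTO TABLE dw_sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(store_id, product_id, sale_dt DATE "YYYY-MM-DD", qty, amount)
EOF

sqlldr userid=$DB_USER/$DB_PASS@$DB_SID control=sales_load.ctl log=sales_load.log bad=sales_load.bad errors=50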

Developed reports with parameters, sub-reports, cross-tabs, and charts based on stored procedures and queries using Crystal Reports 8.5

Environment: AbInitio 2.12, GDE 1.13, Teradata Utilities, Oracle 9i, SQL Server 2003, PL/SQL, AIX Unix Version 5, Tivoli Maestro 6.1, Solaris 8, Stored Procedures, SQL*Loader, UNIX, and Crystal Reports 8.5

ETL Developer

Jan 2003 - Feb 2005

Hartford Hospital, Hartford, CT

The Patient Data Analytics Solution provides business

intelligence analysis services to the billing department through

interactive client tools. Data from various online transaction processing

(OLTP) applications and other sources is selectively extracted, related,

transformed and loaded into the Oracle data warehouse using Informatica

PowerCenter 6.2 ETL tool. Then the transformed data from data warehouse is

loaded into an OLAP server to provide Business Intelligence Analysis

Services

Responsibilities:

Analyzed user requirements for the system

Identified business rules for data migration and performed data

administration

Installed and configured Informatica Power Center 6.2 on Client (windows)

/ Server (UNIX) for Development, Test and Production Environments

Extensively used Informatica Client Tools -- Source Analyzer, Warehouse

Designer, Transformation Developer, Mapping Designer, Mapplet Designer to

develop mappings

Used designer debugger to test the data flow and fix the mappings

Parsed high-level design specifications into simple ETL coding and mapping standards

Developed stored procedures and invoked them through the PowerCenter Stored Procedure transformation
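
A minimal sketch of the kind of PL/SQL procedure a Stored Procedure transformation can call per row; the procedure name, reference table, and lookup logic are illustrative only.

#!/bin/ksh
# Compile an example lookup procedure for use from a Stored Procedure transformation.
sqlplus -s $DB_USER/$DB_PASS@$DB_SID <<EOF
CREATE OR REPLACE PROCEDURE get_billing_region (
    p_zip    IN  VARCHAR2,
    p_region OUT VARCHAR2)
AS
BEGIN
    -- Hypothetical reference table mapping ZIP prefixes to billing regions.
    SELECT region_cd INTO p_region
    FROM   ref_zip_region
    WHERE  zip_prefix = SUBSTR(p_zip, 1, 3);
EXCEPTION
    WHEN NO_DATA_FOUND THEN
        p_region := 'UNKNOWN';
END;
/
EXIT
EOF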

Developed workflows and tasks using Informatica PowerCenter Workflow Manager.

Analyzed newly converted data to establish a baseline measurement for data

quality in data warehouse

Developed Universes and reports using Business Objects Designer

Developed shell scripts to set up the runtime environment
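
A hedged sketch of such a runtime environment script; all paths are placeholders, and the pmcmd argument style shown follows later PowerCenter releases, so treat it as an assumption rather than exact 6.2 syntax.

#!/bin/ksh
# Set up the ETL runtime environment before kicking off sessions (paths are placeholders).
export ORACLE_HOME=/opt/oracle/8.1.7
export INFA_HOME=/opt/informatica/pc
export PATH=$ORACLE_HOME/bin:$INFA_HOME/bin:$PATH
export LD_LIBRARY_PATH=$ORACLE_HOME/lib:$INFA_HOME/lib:$LD_LIBRARY_PATH
export NLS_LANG=AMERICAN_AMERICA.UTF8

# Illustrative workflow start; pmcmd flags differ across PowerCenter versions.
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u $INFA_USER -p $INFA_PASS -f BILLING wf_load_patient_dw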

Used Informatica version control to check in all versions of the objects used in creating the mappings and workflows, to keep track of changes across the development, test, and production environments

Environment: Informatica Power Center 6.2 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Power Connect, Business Objects 5.1, Oracle 8i, UNIX Shell Scripting, TOAD


