
Manager Data

Location:
Minneapolis, MN, 55418
Posted:
April 30, 2010


Sarathi Talla 952-***-****

********@*****.***

__________________________________________________________

OBJECTIVE

To obtain challenging opportunities that leverage my technical skill set, together with my proven design and development record, to deliver successful technology-based applications.

• Ten years of experience in the Software Development Life Cycle, ETL development, Business Intelligence, and database design and development

• Expertise in database architecture and modeling, project methodology, data integration, metadata management, and ETL performance tuning

• Experience in the design and development of ETL processes using Informatica PowerCenter; extensive experience in loading high-volume data and in performance tuning

• Experience in integrating various data sources such as DB2, Oracle, SQL Server, MS Access, Teradata, and XML on Windows and UNIX platforms

• Extensive experience in SQL, PL/SQL, stored procedures, functions, triggers, and packages

• Very good exposure to the entire SDLC, from ground-up planning, strategy, gathering specifications and data, and interacting with users and departments, through coding, development, and testing, to final implementation

• Managed the metadata associated with the ETL processes that populate the data warehouse

• Implemented data cleansing, transformation scripts, stored procedures/triggers, and the test plans necessary to ensure successful execution of the data-loading processes

• Highly motivated team player who works with the same commitment on small and large teams; capable of making sound decisions, taking on extra responsibility, and leading the team when the need arises

• Experience in Financial, Healthcare, Insurance, Retail, and E-Commerce applications

TECHNICAL PROFILE

ETL: Informatica PowerCenter 8.6/8.1/7.2/6.x (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), PowerExchange, Administration, Data Quality

Data modeling: Erwin 3.5.2, Power Designer 10, Visio, Star schema, Snowflake

Reporting: Business Objects (XI Rel 3/2/6.5/6.1/5.5), Desktop Intelligence, Web Intelligence (WebI), InfoView, SSRS

Databases: Oracle 10g/9i/8.1/7.x, Oracle RAC, SQL Server 2005/2000, DB2 UDB 8, Teradata, MS Access

Tools / Scheduling tools: Quest Central for DB2, TOAD, SQL Query Tool, CVS version control, Aqua Studio 4.5, PuTTY, Secure CRT, Mercury Test Director 8.0, MS Office tools, Visio, MS Project, MS SharePoint Portal, Autosys 4.5, Control-M

Environment: Windows Vista/2000/98/95, Windows NT, Sun Solaris, AIX UNIX, Linux

Internet Technologies: XML, Java, Web Development, J2EE, JSP, HTML, DHTML, UML, Rational Rose, JavaScript, Shell Scripting

Others: PeopleSoft JD Edwards, CGI AMS Advantage, Enterprise Origination (EO)

EDUCATION

Bachelor of Science (B.Sc.) in Mathematics, Physics and Chemistry, Osmania University, India

Diploma in Systems Management (DSM), NIIT, India

PROFESSIONAL EXPERIENCE

Target Corporation Dec 09 – Present

Senior ETL Developer

Target Corporation is one of the leaders in the retail industry. The project was to analyze and evaluate requirements for Configuration Management Database (CMDB) auto-population and to develop ETL flows that extract configuration items from different sources and load them into the CMDB database accessed by HP Service Desk.

Responsibilities:

• Responsible for studying the existing ETL process and recommending conversion strategies using Informatica best practices

• Prepared ETL technical specifications based on data analysts' requirements

• Recommended best practices in developing the mappings, sessions, and workflows used to extract data from different sources, transform it, and load it into the CMDB database

• Documented the process flow of each ETL process, including load strategies, update strategies, error processing, and job dependencies

• Worked with XML files and XSD structures to parse data into targets using the XML transformation

• Developed mappings for Logical, Hardware, HPNA, and Relationship CIs based on standards; used variables passed in through parameter files; performed code reviews

• Implemented insert, update, and upsert strategies, mapping dependencies, batch processing, and balance-and-control approaches; used the Sequence Generator to generate serial numbers

• Tuned mappings and transformations and recommended tuning options for source/target databases, mappings, and sessions

• Designed, tested, and deployed data quality processes, in the form of plans for Logical, Hardware, and HPNA, using the Workbench component

• Designed pre-built data quality plans to perform cleansing, standardization, and matching operations and passed these plan instructions into transformations for Logical, Hardware, and HPNA

• Developed mapping sessions to load data into the staging area and scripts to schedule jobs using the Control-M automation tool

• Recommended best practices and coding standards to the development team

• Interacted with the HP Service Desk development team to understand the business and mentored them on the ETL process

• Designed UML activity diagrams for HPNA and Relationship, depicting the flow of control from activity to activity and the value provided to each actor, drawn as horizontal ellipses

• Acted as team lead in implementing the CMDB project

• Promoted releases from Development to Test for system testing and then to Production

Environment: Informatica PowerCenter 8.1, Administration, Oracle 10g, SQL Server 2008, SQL, XML, XSD

Structure, Parser, Aqua Studio, Rational Rose, Control M, Unix, Shell Scripting, Windows XP, MS Office Suite,

MS SharePoint

Wells Fargo Advisors Feb 09 – Nov 09

Senior ETL Developer

Wells Fargo Advisors, part of the Wells Fargo group, is a brokerage and financial advisory business formed from the merger with Wachovia Securities. Projects were implemented to integrate Wells Fargo data into the Wachovia Securities domain, and vice versa, using different ETL processes. Data from third-party vendors is also extracted and transformed into the BDW and ODS systems, which are used by downstream applications.

Responsibilities:

• Implemented the Wells Fargo to Wachovia historical brokerage data merger, Morningstar asset classification, Kaplan insurance annuities, and compliance conversion projects

• Participated in project team meetings to gather source-to-target field analysis, conversion transformations, and impact assessments for ETL conversions

• Designed ETL technical documents covering ETL jobs, Autosys and UNIX script jobs, and process flows, based on input from data analysts' and data modelers' SSA documents

• Developed ETL mappings for truncate/full loads and delta loads from different source systems into the BDW and ODS environments, based on requirements

• Used Expression, Lookup, Update Strategy, Router, Sequence Generator, and Filter transformations and reusable mapplets

• Developed complex mappings with insert/update logic in Router transformations and surrogate keys from Sequence Generator transformations

• Created simple and complex mappings in Designer and workflows using Workflow Manager

• Worked extensively with XML files and XSD structures and parsed data from XML into relational databases

• Identified data sources and configured a rule-based analyzer to define the data quality plan

• Integrated PowerCenter users with the Data Quality repository and passed plan instructions into transformations; executed workflows containing these transformations, in which plan instructions are sent to the Data Quality engine for execution and the data quality results are retrieved back into the workflow

• Performed code reviews on Informatica jobs to check adherence to ETL development standards

• Coded and executed UNIX scripts to run ETL jobs in the DEV, ITE, and CTE environments

• Created and monitored Autosys command jobs to run UNIX scripts, workflows, and file-watcher jobs

• Attended regular meetings with the project team on SDLC phases and followed project timelines for implementations

• Acted as team lead for the conversion projects and coordinated on a daily basis with the offshore team

• Coordinated with Central File Distribution (CFD) teams to automate the file extraction process from third-party vendors

Environment: Informatica PowerCenter 8.6, Administration, Oracle 10g, DB2 UDB, SQL, XML, XSD structure,

TOAD for Oracle, TOAD for DB2, Autosys 4.5, Secure CRT, Unix, Shell Scripting, Windows XP, MS Office Suite,

MS SharePoint, Visio, Documentum

OptumHealth, Minneapolis May 07 – Jan 09

Senior ETL Analyst

OptumHealth, part of the UnitedHealth Group (UHG), optimizes health and well-being through personalized health management solutions. The Ovations integration project (OVTS) builds an Optum ODS containing clinical and care data. The project focuses on extracting, transforming, and cleansing Ovations care data to build the Optum Operational Data Store (ODS) for data warehouse and Business Intelligence solutions. Also implemented the ETL process for the H3C Wellness project.

Responsibilities:

• Performed data profiling on complex relational source files

• Created ETL technical documents based on business requirements, including functionality, file and data sources, a process overview in the form of DFDs, and a source-to-target field matrix

• Interacted with data architects on staging data and on understanding the OVTS ODS build; implemented reference tables for codes and descriptions; identified, documented, and analyzed physical and conceptual metadata

• Created shared source/target definitions for complex file structures and Oracle tables; designed and developed shared components

• Implemented transformations, update strategies, error processing, mapping dependencies, batch processing, and balance-and-control approaches

• Developed integration, dimension, and copy mappings and mapplets based on standards; performed QA and code reviews

• Developed complex mappings with Type 1 and Type 2 logic to insert/update clinical and care data into Oracle tables based on the last update date in the ETL control table; maintained pre- and post-session tasks to update the ETL control table

• Used Expression, Lookup, Update Strategy, Router, Sequence Generator, and Filter transformations and reusable mapplets

• Tuned queries in custom ETL mappings and coded MERGE SQL scripts, run as post-session SQL, to rectify duplicate records (a minimal sketch follows this list); used views and procedures in ETL mappings

• Developed shell scripts to manage parameter files, job scheduling, and ETL maintenance

• Created, executed, and monitored Autosys command jobs to schedule ETL mappings; raised tickets and USRs for database and Informatica issues

• Coordinated with offshore/onshore teams on ETL issues; mentored other ETL developers on the team; performed code reviews

• Attended functional and technical project review meetings and updated the weekly OVTS team assignment tracker
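
For illustration only, a minimal sketch of the kind of post-session MERGE script described above. The table and column names (STG_CARE_EVENT, ODS_CARE_EVENT, MEMBER_ID, EVENT_DT) are hypothetical, not the actual OVTS objects; the idea is to upsert on the business key, keeping only the latest staged row, so reprocessed data updates the existing record instead of creating a duplicate.

    -- Hypothetical post-session SQL: collapse re-extracted rows into the target
    -- so reruns update the existing record rather than inserting a duplicate.
    MERGE INTO ods_care_event tgt
    USING (
        SELECT member_id, event_dt, event_code, last_update_dt
        FROM (
            SELECT s.member_id, s.event_dt, s.event_code, s.last_update_dt,
                   ROW_NUMBER() OVER (PARTITION BY s.member_id, s.event_dt
                                      ORDER BY s.last_update_dt DESC) AS rn
            FROM stg_care_event s
        )
        WHERE rn = 1                      -- keep only the latest staged row per key
    ) src
    ON (tgt.member_id = src.member_id AND tgt.event_dt = src.event_dt)
    WHEN MATCHED THEN UPDATE SET
        tgt.event_code     = src.event_code,
        tgt.last_update_dt = src.last_update_dt
    WHEN NOT MATCHED THEN INSERT (member_id, event_dt, event_code, last_update_dt)
        VALUES (src.member_id, src.event_dt, src.event_code, src.last_update_dt);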

Environment: Informatica Powercenter 8.6, Oracle 10g, SQL Server 2005, SQL, PL/SQL, XML, TOAD 9.6, Aqua

Studio 4.5, Autosys 4.5, Putty, Windows 2000, UNIX, Shell Scripting, MS Office, MS Sharepoint, Mercury Test

Director

Wells Fargo, Des Moines Aug 05 – Apr 07

ETL Designer / Analyst

Wells Fargo is a major financial institution with interests in banking, home mortgage, lending, and credit card services. CORE (Common Opportunities, Results and Experience) is a major project to integrate Home Mortgage legacy business services data into a single ERP system called Enterprise Origination (EO).

Responsibilities:

• Interacted with the data architecture team to design the data marts for Get Risk Decision (GRD), Compliance, and Fidelity business services

• Worked with business, data design, and customer-facing teams (CFT) to understand current business rules and documented the changes needed in the ETL process

• Performed data analysis, gap analysis, and source-to-target mapping to identify source data to be extracted to the EO system

• Extracted data from the source systems; designed and documented batch processing, error processing, update strategies, and referential integrity; developed balance-and-control mappings and implemented error processing

• Created complex and robust mappings and workflows using Workflow Manager; monitored tasks using Workflow Monitor

• Developed mappings to load data from various sources using Filter, Joiner, Lookup, Rank, Expression, Aggregator, Update Strategy, Router, and Sequence Generator transformations

• Created reusable transformations and used them in different mappings

• Used TOAD extensively to write SQL queries and procedures and to test source and target Oracle tables; tuned SQL queries for faster data retrieval

• Documented error-handling issues and provided on-call support for production issues; created incidents for support issues

• Created pre-session and post-session scripts and mail notifications

• Provided production support, monitoring the workflows for different clients

Business Objects:

• Designed, developed, and maintained Business Objects universes using Designer and validated the integrity of the universes

• Resolved loops and aliases while developing the universes

• Exported the universes to the repository to make resources available to users

• Created several classes and objects (dimension, detail, measure)

• Analyzed, designed, and created reports for users using the Business Objects Reporter

• Developed drill-down reports for users depending upon requirements

• Used indexes to improve report performance

• Created templates and generated reports for publishing on the dashboard

• Interacted with users from time to time to collect requirements

• Created several report types, such as master-detail, drill-down, cross-tab, and charts

Environment: Informatica PowerCenter 7.1, Oracle 10g/9i, SQL Server 2005, SQL, PL/SQL, TOAD, CGI

Enterprise Origination (EO), Business Objects 6.5, Power Designer 10, Windows XP, UNIX

Cargill Foods, KS Oct 02 – Aug 05

ETL Developer / Analyst

Excel Foods is the meat subsidiary of the Cargill group, which processes red meat products and sells them to customers worldwide. The project involves developing a Laboratory Information Management System (LIMS), which is a collection of test lab data from various labs. LIMS reports are developed using the Business Objects BI reporting model.

Responsibilities:

• Implemented the LIMS system as per FDA government regulations

• Involved in designing the star schema data model with fact and dimension tables

• Installed and configured the Business Objects server and created the LIMS repository using Repository Manager

• Identified the LIMS reports to be developed and documented their specifications in a template

• Created dimension, measure, and detail class objects

• Resolved loops by creating aliases and contexts, defined complex contexts, and tested them to get correct results

• Created condition objects and used multiple section breaks, alerts, and filters in ad hoc reports

• Used prompts, cross-tab, slice-and-dice, master-detail, @Functions, and formulas

• Created reports with graphical representations using line and bar charts

• Improved data access performance by fine-tuning SQL queries, joins, and indexes

• Coded VBA macros to display the tab name in a cell, passing the macro value to a function, for multi-tab reports

• Designed LIMS templates and legends as a report details tab and exported reports to the WebI server to make them accessible on the LIMS web page on the intranet

• Scheduled and published reports using Broadcast Agent, monitored the Broadcast Agent Console, and rescheduled reports as per user profiles

• Used Business Objects Supervisor to create users and user groups and to assign privileges depending on user profiles

Environment: Informatica 7.1, Oracle 9i/8i, SQL Server, SQL, PL/SQL, Business Objects 5.5, Designer,

Supervisor, Broadcast Agent Server (BCA), Console, WebI 2.6, VBA Macros, SDK, Windows NT 4.0, UNIX

Cybrant Corporation, CA Feb 00 – Sept 02

ETL/BI Analyst

Cybrant is an e-commerce company that customizes customer user interfaces (UI) with its products Velocity, Design Studio, Pricing, and Quote Manager. A Java interface is used to code these products, with Oracle as the database.

Responsibilities:

• Involved in data analysis and documenting reporting requirements

• As the BO Supervisor, involved in setting up the repository with the database; created users and groups and assigned access levels to them

• Designed and created classes, objects, and conditions using Designer

• Used formulas, local variables, and @Functions for ad hoc reporting

• Used slice-and-dice, drill-down, and master-detail methods in full-client and WebI reporting

• Used graphical representations such as line and bar charts in reporting

• Involved in testing Business Objects reports and cross-checking data results using SQL*Plus

• Used Broadcast Agent Server to schedule and refresh daily/weekly/fortnightly reports

• Resolved performance issues in Business Objects reporting

• Used Supervisor to create users and user groups and to assign privileges to various users and universes

• Prepared user documentation for the developed reports

• Worked thoroughly with PL/SQL, tables, views, synonyms, and indexes

• Analyzed the existing stored procedures, functions, triggers, and cursors; worked thoroughly on performance issues

• Worked extensively on exception handling; used SQLERRM and SQLCODE to troubleshoot PL/SQL code (a minimal sketch follows this list)
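
As an illustration of the SQLCODE/SQLERRM usage mentioned above, a minimal PL/SQL sketch; the ACCOUNT_BALANCE table and the update itself are hypothetical, not taken from the actual project.

    -- Hypothetical example: capture the Oracle error code and message text
    -- in the WHEN OTHERS handler so the failure can be investigated later.
    DECLARE
        v_err_code NUMBER;
        v_err_text VARCHAR2(512);
    BEGIN
        UPDATE account_balance              -- hypothetical table
           SET balance = balance * 1.05
         WHERE account_type = 'SB';
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            v_err_code := SQLCODE;                    -- numeric Oracle error code
            v_err_text := SUBSTR(SQLERRM, 1, 512);    -- associated message text
            ROLLBACK;
            DBMS_OUTPUT.PUT_LINE('Error ' || v_err_code || ': ' || v_err_text);
            RAISE;                                    -- re-raise so the caller still sees the failure
    END;
    /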

Environment: Business Objects 4.1, Designer, Reporter, Velocity, Design Studio, Document

Agent Server, PL/SQL, Windows NT 4.0

OMC Technologies Ltd, India Mar 93 – Jan 00

Programmer/Analyst

A banking application called BRACS was developed for the State Bank of India, one of the largest banks in India. The software is meant for individual branches, which run it in parallel with the manual system. The modules are divided into Current Account (CA), Savings Bank (SB), Recurring Deposit (RD), and Advances accounts.

Responsibilities:

• Developed the front-end accounting screens for the CA, SB, RD, and Advances accounts using Oracle Forms

• Developed various PL/SQL programs, database triggers, functions, and stored procedures (a brief trigger sketch follows this list)

• Used Developer 2000 and Reports 2.5 to generate various reports

• Involved in developing cross-reference and report-generation programs

• Involved in interacting with the Advertisement Department to prepare the acceptance plan

• Developed and implemented programs for data conversion from COBOL to the Oracle database using Pro*COBOL


• Developed transactional and summary reports, which are routed to the spool

• Wrote C shell menu programs as and when required in UNIX

• Responsible for debugging, enhancing, and testing the developed programs

• Involved in onsite application support
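
For illustration, a minimal sketch of the kind of database trigger referenced above; the SB_ACCOUNT and SB_BALANCE_AUDIT tables and their columns are hypothetical, not the actual BRACS objects.

    -- Hypothetical audit trigger: record every balance change on a savings account
    CREATE OR REPLACE TRIGGER trg_sb_balance_audit
    AFTER UPDATE OF balance ON sb_account          -- hypothetical table
    FOR EACH ROW
    BEGIN
        INSERT INTO sb_balance_audit (account_no, old_balance, new_balance, changed_on)
        VALUES (:OLD.account_no, :OLD.balance, :NEW.balance, SYSDATE);
    END;
    /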

Environment: COBOL, Oracle 7.3, Forms 4.5, SQL *Plus, SQL *Loader, Reports 2.5, Windows NT, HP

UNIX


