
Data Sales

Location:
Bangalore, KA, India
Posted:
March 26, 2014


PROFESSIONAL SUMMARY

. Over **+ years of IT experience as a BI Architect/ETL Data Modeler/ETL Lead Developer, including strong data warehouse design and development experience in Informatica 9x/8x/7x/6x/5x, Oracle 11i, SQL Server 2008, and Teradata v12/v13.

. Experience in all aspects of the data warehousing project development life cycle: estimation, requirement documents, design documents, unit test plans, system test plans, and implementation plans, with a lead role in all phases of the SDLC.

. Excellent knowledge and use of data warehousing concepts such as dimensional modeling, the Kimball methodology, star schema, snowflake schema, and master data management.

. Solid experience in Informatica PowerCenter: Mappings, Mapplets, Transformations, Workflow Manager, Workflow Monitor, and Repository Manager.

. Extensive experience with data extraction, transformation, and loading from disparate data sources such as RDBMS, COBOL, and XML into a common reporting and analytical data model, covering serial/parallel batch processing and real-time ETL, including CDC using queues (MQ Series, TIBCO, etc.).

. Expertise in performance tuning of Informatica mappings, sessions, and workflows: throughput analysis, session partitioning, and caching techniques (static, dynamic, persistent).

. Expertise in performance tuning of PL/SQL code and SQL queries: partitioned tables, indexes, function-based indexes, SQL tuning, cursor optimization, stage-and-process techniques, bulk loading, etc.

. Experienced in installation, configuration, and administration of Informatica PowerCenter/PowerMart client and server.

. Strong understanding of data mining concepts and techniques such as decision tree analysis, cluster analysis, association analysis, and neural networks.

. Solid communicator at all levels of management, with experience coordinating client meetings, seminars, presentations, and group discussions.

. Able to work in groups as well as independently with minimum supervision, with the initiative to learn new technologies and tools quickly.

EDUCATION & TRAINING

. Master's Degree in Computer Applications

. Certified as a Level 1 Informatica Mapping Designer - Informatica Corp.

. Teradata training for ETL processing and warehouse applications.

TECHNICAL SKILLS

Business Intelligence Tools: Informatica PowerCenter 9x, Ab Initio GDE, PowerConnect, SAS 9.1 Base Programming & SAS Statistics

Data Reporting Tools: COGNOS PowerPlay 6.5, Impromptu Administrator 6.0, Data Reports, Business Objects 5.0/5.1, Developer 2000, MS Access Reports, Crystal Reports

Programming Languages: C, C++, Java, XML, HTML, SQL, JavaScript, Perl, PHP, Shell Scripting

Databases: Oracle 7.3/8i/9i, Sybase 11.5/11.2, MS SQL Server 2008, MySQL, MS Access, DB2, XML, SQL*Plus, SQL*Loader

Software Engineering Tools: Rational Rose, RequisitePro, UML, MS Visio

Data Modeling & Data Integration: ERwin, Microsoft Visio, PowerBuilder, OLAP, ROLAP, MOLAP

Testing Tools: WinRunner, TestDirector, QuickTestPro, Rational Robot, PVCS Bug Tracker

PROFESSIONAL EXPERIENCE

Nomura Securities - World Financial Center, New York
Nov-09 - Present

Environment: Informatica 8.6, SQL Server 8, Oracle 10g, PL/SQL, UNIX, Autosys

Description:

The project implemented Operations' new fixed income processing system, IMPACT. It was a major effort that touched almost all areas of Nomura US and many other global businesses; its size and scope were exemplified by the more than 400 people involved either directly or peripherally, with 150 working over the conversion weekend. The areas affected included new security reference data platforms, interfaces for the fixed income front office systems, separation and partitioning of NTAPS for the equities business, major re-plumbing of finance and regulatory reporting and general ledger feeds, new interfaces for risk, a new suite of compliance reports, and a significant change to the way Nomura confirms, settles, and balances its fixed income trades.

Role and Responsibilities:

. Designed the ETL architecture for integration of the new settlement system IMPACT: staging area, interface tables, and dimensional modeling.

. Designed and developed ETL processes for incremental extracts and real-time ETL, including CDC using queues (MQ Series, TIBCO), interface table design, and dimension- and fact-load ETL using Informatica, PL/SQL packages, and UNIX scripting for file processing.

. Designed and developed the Instrument Master, a conformed dimension sourcing data from in-house systems (IMPACT, NTAPS, PDP) and third-party market data providers (Bloomberg, Fitch, Moody's, etc.).

. Completed ETL design and development of the Instrument Master dimension to capture real-time updates, with real-time integration to facilitate real-time trade fact loads using one version of instrument data.

. Designed ETL processes for fact tables, including complex fact loads such as risk market value, market data, security aggregated, position, and outstanding/fails/settlement trade facts.

. Designed the module for the accounting data feed to PeopleSoft for balance sheet posting.

. Designed and developed mappings to load and implement a large fact table holding daily snapshots of securities, positions, and other market data (rating, margin price, pool factor, and coupon).

. Onshore-offshore coordination: worked with the offshore team on design, development, and day-to-day issue resolution.

. Worked with business analysts to create the technical requirements document for Phase II, covering implementation and integration of additional business functionality such as equity trades, trade transactions, and international exposure.

. Post-implementation support: analyzed issues raised daily and resolved them to meet SLAs; provided on-call support for data integration issues; worked as part of the Run the Bank team, with responsibility for day-to-day issue analysis, issue fixing, code review, code migration, and performance tuning of late-running batches.

OppenheimerFunds - World Financial Center, New York
Sep-09 - Nov-09

Environment: Informatica 8.6, SQL Server 8, Oracle 10g, PL/SQL, UNIX, Autosys

Description: OppenheimerFunds, Inc. has been helping investors achieve their financial goals since 1960. It is one of the nation's largest and most respected asset management companies.

Role and Responsibilities:

. Designed and developed ETL processes for batch and real-time data extracts, interface table design, and dimension- and fact-load ETL using Informatica and PL/SQL packages.

. Designed and developed the Instrument Master, a conformed dimension sourcing data from in-house systems and third-party market data providers (Bloomberg, Fitch, Moody's, etc.).

. Completed ETL design and development of the Instrument Master dimension to capture real-time updates, with real-time integration to facilitate real-time trade fact loads using one version of instrument data.

. Designed ETL processes for fact tables, including complex fact loads such as security, trade, and position facts.

. Designed and developed mappings to load and implement reporting tables.

. Created reusable transformations and Mapplets for data integration.

Priceline.com, Norwalk, CT, USA
Jun-08 - Aug-09

Environment: Informatica 8.x/7.x, Oracle 10g, PL/SQL, UNIX

Description: Priceline.com is one of the leading commercial websites helping users obtain discount rates on travel-related items such as airline tickets, hotels, and car rentals. It operates in several countries under different subsidiaries and is a national leader in travel and leisure.

Role and Responsibilities:

. Created/modified dimension modules to accommodate the new ERP integration.

. Analyzed the current ETL data flow from the old ERP to the corporate data warehouse (CIS) and designed the data flow connecting the new ERP to the corporate data warehouse.

. Documented the current scope of each mapping.

. Changed mappings to match the new target structure.

. Tested critical mappings after the upgrade from version 7 to version 8.

. Modified existing PL/SQL procedures and shell scripts to achieve standardization.

. Developed new requirements per specifications.

GE Capital Solutions, Danbury, CT, USA
Nov-06 - May-08

SmartStream (TLM) Data Integration

Environment: Informatica 8.x/7.x, Oracle PL/SQL, UNIX
Source Systems: Oracle Database, Flat Files, Teradata, XML

Description: GE has an enormous diversity of businesses; this project focused on two critical divisions of Commercial Finance (CommFin): GE Capital Solutions and GE Real Estate. Both divisions reconcile their account-related activities, validating General Ledger balances against sub-ledger and other source-document balances.

A new initiative was undertaken to move the reconciliation process to TLM. These accounts are representative of the entire population of accounts within CommFin. The TLM implementation accomplishes the following objectives:

. Automate a predominantly manual process.

. Reduce the time taken for the reconciliation process.

. Ensure that timescales for the reconciliation process are met.

. Provide a central repository for all reconciliation data and supporting documentation.

. Allow rapid access to reconciliation information and supporting documentation.

. Add validation and verification to enforce business rules based on GE corporate policies and business-specific requirements.

The automation process fetches data from 134 sources, including Oracle GL, sub-ledger systems (such as WebCash and PMS), and other external systems, and loads formatted data (pipe-delimited flat files) into the TLM-designated path, from where the TLM engine processes it further to meet business expectations for account reconciliation.
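The pipe-delimited handoff described above can be sketched in a few lines. This is only an illustration of the file format, not the project's actual layout: the field names and values here are hypothetical.

```python
import csv
import io

# Minimal sketch of a pipe-delimited extract of the kind a TLM engine
# might consume. Field names and sample values are hypothetical.
records = [
    {"account": "1001", "balance": "2500.00", "source": "ORACLE_GL"},
    {"account": "1002", "balance": "-75.10", "source": "WEBCASH"},
]

buf = io.StringIO()
writer = csv.DictWriter(
    buf,
    fieldnames=["account", "balance", "source"],
    delimiter="|",          # pipe-delimited, per the interface spec
    lineterminator="\n",
)
writer.writeheader()
writer.writerows(records)

extract = buf.getvalue()
print(extract)
```

In a real feed the buffer would be written to the TLM-designated path; the `csv` module handles quoting if a field ever contains the delimiter.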

Role and Responsibilities:

. Gathered requirements for each business and provided an integrated view to business users to establish uniform business rules across all businesses.

. Prepared data requirement documents for the different sources for all businesses.

. Analyzed the source systems, data flows, and dependencies, and designed the ETL data flow and staging area to maintain code reusability.

. Set up the environment and designed the ETL architecture and processes to extract financial data from different sources, including Oracle GL, the corporate warehouse, PMS, and customized application systems, and created a common module to perform integrated business rule validation and data loads.

. Defined and designed code/lookup tables to store business rules for use in ETL processing.

. Created complex mappings, reusable transformations, and reusable Informatica code components.

. Defined table and index structures for better performance.

. Managed the team, estimated tasks, and communicated daily project status to the client.

. Helped the team develop ETL processes and maintain design reusability.

Motorola Inc., Fort Worth, TX
Aug-04 - Oct-06

eMART Lean Supply Chain

Environment: Oracle 8i, Oracle PL/SQL, Oracle Web DB, Power Analyzer 3.5/5.0

Description: This project was initiated to give higher management Motorola-wide visibility into the supply chain. It includes extraction of sales order details for all Motorola businesses and provides a platform for integrated reporting. Business reports created under this project include: supply plan summary, OTD, delivery span, and shipped/unshipped reports.

Role and Responsibilities:

. Gathered specific requirements for each business and provided an integrated view to business users to establish uniform business rules across all businesses.

. Prepared requirement documents for all businesses.

. Designed the ETL process to extract sales order data for all businesses from different sources, including the data warehouse, ERP, and customized application systems, and created a common module to perform integrated business rule validation and data loads.

. Converted business specifications into technical report requirements and mapped them to the logical database schema.

. Defined table and index structures for better performance.

. Wrote PL/SQL procedures for the data extraction and load (ETL) process.

. Helped the team with performance tuning and job scheduling.

Motorola Inc. USA, GTSS Data Warehouse

Environment: Informatica 6.1/7.1, Oracle 8i/9i, Oracle PL/SQL, Oracle Web DB, Power Analyzer 3.5/5.0

Description: Motorola is a world leader in providing telecom solutions; its operations are managed by different units/sectors. GTSS DW is the warehouse for the Network CSI sector. Under this project, new business functionality called procurement was added, using Informatica-provided standard mappings customized to the business needs.

Role and Responsibilities:

. Created requirement, design, and test plan documents for different enhancements.

. Performed detailed analysis of issues reported by business users and provided immediate solutions.

. Performed analysis, estimation, and implementation of change requests.

. Prepared unit test cases and system test cases.

. Developed code in Oracle PL/SQL, Oracle Web DB, Informatica 7.1, and Power Analyzer 3.5/5.0.

GSMDW Procurement

Environment: Oracle 9i, Informatica 7.1, Power Analyzer 5.0

Description: The Global Supply Chain Management DW is a very large Motorola data warehouse containing supply chain data from all sectors. I worked on several enhancements (EMS Known Issues, Foundation Improvements, Supplier Diversity, and FSS-11i Connectivity), including production support.

Role and Responsibilities:

. Performed detailed analysis of issues reported by business users and provided immediate solutions.

. Performed analysis, estimation, and implementation of change requests.

. Prepared detailed design documents based on the requirements documents.

. Prepared unit test cases and system test cases.

. Developed code in Oracle PL/SQL, Informatica, and shell script.

. Performed unit, system, and QA testing.

OneVIEW Data Purge Project

Environment: Oracle 9i, Informatica 7.1, Power Analyzer 5.0

Description: OneVIEW is a large Motorola data warehouse for the personal communication sector. This project purged data in the OneVIEW data warehouse by deleting records not required for business analysis (such as historical and obsolete data). The objective was to develop a system to delete records from selected tables based on predefined conditions. The scope of the project included modifications to OneVIEW for the following:

- Purge data identified for performance and/or data growth issues.

- Purge obsolete data based on the source of the data.

- Rebuild tables and indexes to regain space.

- Prepare guidelines for data purging.
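The predicate-based, batched deletes at the heart of such a purge can be sketched on a toy SQLite table. This is an illustration of the technique, not the project's actual PL/SQL; the table, columns, and predicates are hypothetical.

```python
import sqlite3

# Toy table standing in for a warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales_fact (id INTEGER PRIMARY KEY, order_date TEXT, status TEXT)"
)
rows = [
    (i,
     "2001-01-01" if i % 2 else "2005-06-01",     # odd ids are historical
     "OBSOLETE" if i % 3 == 0 else "ACTIVE")      # every third id is obsolete
    for i in range(1, 101)
]
conn.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)", rows)

BATCH = 25  # delete in small batches to limit rollback/undo pressure


def purge(conn, predicate_sql):
    """Delete rows matching the predicate in batches; return rows removed."""
    total = 0
    while True:
        cur = conn.execute(
            "DELETE FROM sales_fact WHERE id IN "
            f"(SELECT id FROM sales_fact WHERE {predicate_sql} LIMIT ?)",
            (BATCH,),
        )
        conn.commit()
        total += cur.rowcount
        if cur.rowcount < BATCH:
            return total


# Purge historical (pre-2003) and obsolete rows, mirroring the project scope.
removed = purge(conn, "order_date < '2003-01-01' OR status = 'OBSOLETE'")
remaining = conn.execute("SELECT COUNT(*) FROM sales_fact").fetchone()[0]
print(removed, remaining)  # 66 rows purged, 34 remain
```

In Oracle the equivalent loop would commit per batch and be followed by index/table rebuilds to regain space, as listed in the scope above.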

Role and Responsibilities:

. Created the design for the CRM module.

. Created PL/SQL packages and procedures for data purging.

. Created test plans.

. Performed testing of all modules.

. Followed CMMi processes to ensure the quality of deliverables.

. Communicated with the client for approval on project deliverables.

. Provided support to the implementation team.

. Led the team in weekly status calls with the Motorola PM.

Moritz Sales Data Warehouse, India
Nov-05 - Jun-06

Moritz Group, Malaysia

Environment: Oracle 9i, Informatica 6.1

Description: This system is mainly concerned with sales and distribution analysis of the company's product lines. The company has several branches spread all over Malaysia, and the branches send their data to the head office. Like every emerging company, it is searching for business enhancement and a competitive edge, using BI techniques to make better business decisions. The client wanted a system that could provide intelligent information reports on the existing business situation and support specific business analysis.

Role and Responsibilities:

. Designed, developed, and implemented ETL specifications through Informatica 6.1 to load data from various sources into the target Oracle database.

. Prepared various test cases to verify that mappings load data per the ETL specification.

. Designed complex mappings using various transformations (Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy) to load data into slowly changing dimensions, transaction-level facts, and summary facts.

. Performed debugging and tuning of mappings and sessions.

. Created and configured workflows and session tasks to load data.

. Scheduled sessions using tasks newly introduced in Informatica 6.1.

. Configured the Informatica server for loading data.
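The slowly-changing-dimension loads mentioned above follow the standard Type 2 pattern: expire the current row when a tracked attribute changes and insert a new current version. A minimal sketch of that logic in plain Python, with hypothetical attribute names (a real implementation would be an Informatica Update Strategy mapping):

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Type 2 SCD: expire the current row on a tracked-attribute change
    and insert a new current version; insert brand-new keys directly."""
    current = {r["cust_id"]: r for r in dimension if r["current"]}
    for rec in incoming:
        match = current.get(rec["cust_id"])
        if match is None:
            # New business key: insert as the first current version.
            dimension.append({**rec, "eff_date": today, "end_date": None,
                              "current": True})
        elif match["city"] != rec["city"]:  # tracked attribute changed
            match["end_date"] = today       # expire the old version
            match["current"] = False
            dimension.append({**rec, "eff_date": today, "end_date": None,
                              "current": True})
        # Unchanged rows fall through: no update needed.
    return dimension

dim = [{"cust_id": 1, "city": "Delhi", "eff_date": date(2005, 1, 1),
        "end_date": None, "current": True}]
incoming = [{"cust_id": 1, "city": "Mumbai"},  # changed attribute
            {"cust_id": 2, "city": "Pune"}]    # new key
dim = apply_scd2(dim, incoming, date(2006, 1, 1))
print(len(dim))  # 3: expired version, its replacement, and the new key
```

The same decision (insert vs. expire-and-insert vs. no-op) is what an Update Strategy transformation expresses with DD_INSERT/DD_UPDATE flags.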

Tesco Retail DSS
Jan-05 - Oct-05

TESCO Ltd., U.K.

Environment: Oracle 8i, Informatica 6.1

Description: The company has a retail chain all over the world. This project was developed to support business analysts in complex decision making: sales performance is measured across various dimensions, and based on these observations management can make better decisions.

Role & Responsibilities:

. Prepared the ETL specification and test cases, designed mappings per the ETL specification, and after design and development performed testing against the unit test cases.

. Created various reports per client requirements using Cognos (Impromptu Administrator).

. Scheduled sessions using tasks newly introduced in Informatica 6.1.

. Developed dimension loads for the location, date, item, branch, and customer dimensions.

. Developed mappings and tuned them for better performance.

. Used most of the transformations, such as Source Qualifier, Aggregator, Joiner, Lookup, Expression, Filter, and Sequence Generator, to integrate data into the target.

. Performed unit testing and data validation.

Strategic Intelligence System
Apr-04 - Dec-04

Strategic Systems Architects, Carrollton, Texas

Environment: Oracle 8i, Informatica 5.1

Description: The client is a successful accessory store chain with retail outlets across the US, selling many products across different product lines. This project is a management decision support system that quickly and effectively gives a high-level snapshot of company performance with respect to product lines. Business reports include the company's revenue analysis, estimation of the impact of regions and seasons on revenue, quarter-to-quarter sales progress, sales analysis of product lines, and projected versus actual sales comparison.

Role & Responsibilities:

. Analyzed the various data sources and the formats in which data arrives from each source.

. Designed mappings to extract data from different sources into the data staging area.

. Developed ETL for dimension (SCD, SGD) loads and fact loads.

. Developed mappings and tuned them for better performance.

. Performed unit testing (qualitative and quantitative) to check data consistency.

. Created reports in Cognos Impromptu Administrator.

Hotel Management Information System
Jul-03 - Mar-04

ITDC, New Delhi, India

Environment: Visual Basic 6.0, Excel, Oracle 8i, Crystal Reports

Description: ITDC was operating 36 hotels across India, each sending summary data monthly to the ITDC Hotel Division in New Delhi. The objective of the system was to store data in one place to fulfill on-demand queries and provide unit-wise and integrated reports, such as details of room and bed occupancy percentage, income, expenditure, operating profit/loss, operating ratio, average room realization, and agency-wise room realization.

Role and Responsibilities:

. Involved in the design and development of the database.

. Developed code and designed complex queries/reports using Visual Basic tools such as Data Environment, Data Report, ADO, Hierarchical FlexGrid, and FlexGrid.

. Involved in testing and documentation.

. Trained end users on the software.


