
Data Project

Location: New York, NY

Posted: July 29, 2014


Jyothy Venkataraman

ace5ph@r.postjobfree.com

973-***-****

SUMMARY

• Eight and a half years of experience in Informatica, SQL, PL/SQL, UNIX shell scripting, and Perl scripting.

• Jyothy is an Informatica Certified Consultant (7.x) and an Oracle Certified Associate (PL/SQL).

• Knowledge of dimensional data modeling: physical and logical data modeling, star schema modeling, and snowflake schema modeling.

• Has successfully led many full-lifecycle data warehousing projects, with over 3 years of experience working as an Onsite Coordinator.

• Has good experience in ETL design and implementation, especially in Informatica, and experience in developing ETL architecture.

• Has worked extensively with Teradata V2R5, V12, V13, and V13.10, optimizing queries, stored procedures, FastExport, MultiLoad, and BTEQ scripts.

• Has good experience in PL/SQL development, including stored procedures, PL/SQL scripts, and triggers.

• Experience in optimizing the performance of SQL scripts and in Oracle/DB2 database and application tuning.

• Has worked extensively on creating screen door scripts in UNIX shell, Perl, and nawk. Developed batch jobs using UNIX shell scripts (AIX, csh, bash, ksh) to automate loading, pushing, and pulling data.

• Good knowledge of and hands-on experience in UNIX shell scripting; created complex generic scripts that can be reused across applications.

• Worked as architect for a major SAP migration project for a Fortune 500 company.

• Has worked on Informatica-SAP integration using IDocs, BAPIs, and ABAP code.

• Has worked extensively with Informatica command-line utilities such as pmcmd and used them to automate jobs (a minimal batch-wrapper sketch follows this list). Has experience automating Informatica jobs through third-party schedulers such as Dollar Universe and Tivoli.

• Excellent interpersonal and communication skills and the ability to work as part of a team.
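
As an illustration of the pmcmd-based automation mentioned above, the following is a minimal sketch of a ksh batch wrapper that starts an Informatica workflow and waits for it to finish; the integration service, domain, folder, user, and workflow names are placeholders, not actual project values.

#!/bin/ksh
# Minimal sketch: start an Informatica workflow with pmcmd and wait for completion.
# Service, domain, folder, user and password-variable names are placeholders.

WF_NAME=${1:?usage: run_wf.ksh <workflow_name>}
INFA_USER=etl_batch_user
export PMPASS=changeme          # assumed: password held in a secured environment variable

pmcmd startworkflow \
    -sv INT_SVC_DEV -d Domain_Dev \
    -u "$INFA_USER" -pv PMPASS \
    -f EDW_FOLDER \
    -wait "$WF_NAME"
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow $WF_NAME failed with return code $rc" >&2
    exit $rc
fi
echo "Workflow $WF_NAME completed successfully"

A scheduler such as Tivoli or Dollar Universe can then invoke a wrapper of this shape and act on its exit code.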

SKILLS

Data Modeling: Logical and Physical Data Modeling, Star Schema Modeling, Snowflake Modeling, E-R Modeling, FACT and Dimension Tables, ERWIN 3.5/4.1

Data Warehousing: Informatica 7.1.3, Informatica 8.1.1, Informatica 9.1.0

Databases: Oracle 10g/9i/8.x/7.x, Teradata V2R5/V12/V13/V13.10

Operating Systems: MS Windows XP, 98, Windows NT/2000, UNIX

Languages: PL/SQL, SQL*Plus 8, UNIX Shell Script, Perl Script

Tools & Utilities: SQL*Plus 8, Erwin 3.5/4.0/4.1/7.1, Toad for Oracle V8.6, MS Office

EDUCATION & CERTIFICATIONS

Education:

Degree - University

MCA - Cochin University

B. Sc. - Calicut University

Certifications:

• Informatica Certified Consultant.

• Oracle Certified Associate, PL/SQL Developer.

EXPERIENCE

Organization - Designation - Duration

Mahindra Satyam - Software Engineer - 10/2005 to 07/2009

Cap Gemini Ltd - Project Lead - 08/2009 to date

PROJECT EXPERIENCE

1. Organization Name: Cap Gemini Ltd.

Project Name: Type Ahead

Role: Onsite Coordinator

Duration: Oct 2013 to date

Environment: Informatica PowerCenter 9.1.0, Teradata V13, UNIX

Client: Morgan Stanley, NY

The Morgan Stanley 3D type-ahead (look-up data) functionality for FAs enhances the search experience by providing suggestions in the input field drop-down, based on recent keyword searches, for matching client names, SSN, TIN, etc. The look-up keywords appear in the search field after the user types the first two characters of the intended search word, and FAs see only their own authorized account details as suggestions.

SQL Server caches account details from Teradata on a daily basis: data is retrieved from Teradata, grouped, reformatted, and stored in SQL Server, and XML and look-up feeds are generated for consumption by 3D, which FAs use to search accounts.

Responsibilities

• Working with users, clients, and business analysts to understand and document enhancement and new project requirements.

• Providing order-of-magnitude and effort estimates to prepare project plans.

• Working with BSAs to create test designs and end-to-end test scenarios.

• Creating ETL data models and ETL design documents for new projects and work requests.

• Coordinating with downstream and upstream teams, following the release calendar and dates.

• Providing periodic status reports on capacity management and deliverables.

• Coordinating and working on fixing production abends and break-fix tickets, and ensuring all tasks are responded to and completed well within SLA limits.

2. Organization Name: Cap Gemini Ltd.

Project Name: Enterprise Data Warehouse (EDW) Core Process

Role: Onsite Coordinator

Duration: Oct 2012 to date

Environment: Informatica PowerCenter 9.1.0, Teradata V12, UNIX, BO XI R2

Client: Morgan Stanley, NY

Morgan Stanley Wealth Management EDW is the single golden source of data: all related information is maintained in EDW, from which each downstream application team consumes data for reporting and application-level logic. EDW contains firm-wide information on accounts, swing accounts, cross references, clients, FAs, CUSIPs, positions, activities, etc. Along with maintaining the data in the warehouse for downstream applications, self-serve reporting is also given precedence, so that all business users can check data themselves.

The main source of information is mainframe/DB2, and the data is loaded into the warehouse using the generic CORE process, which is used to maintain all SCD types. This process has its own SCD-type logic, listed below, all of which is implemented using UNIX scripts and Informatica.

There are six types of core load processes:

Type 1: Truncate & replace

Type 2: Daily/monthly incremental file & append history

Type 3: Daily propagator file - apply delta

Type 4: Full refresh & apply delta

Type 5: Daily propagator file - apply delta, with archival logic

Type 6: Full refresh & apply delta, with archival logic
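
As an illustration only, a Type 1 (truncate & replace) load can be wrapped in a generic BTEQ-driven UNIX script that takes the target table name as a parameter, roughly as sketched below; the database names, logon file path, and staging convention are assumed placeholders, not the actual CORE process objects.

#!/bin/ksh
# Minimal sketch of a Type 1 (truncate & replace) core load.
# EDW_DB, EDW_STG and the logon file path are illustrative placeholders.

TBL=${1:?usage: core_load_type1.ksh <table_name>}
LOGON_FILE=/apps/edw/config/tdlogon.txt   # assumed to contain ".LOGON tdpid/user,password"

bteq <<EOF
.RUN FILE=${LOGON_FILE};
/* Type 1: empty the target, then reload it from the staged copy of today's extract */
DELETE FROM EDW_DB.${TBL} ALL;
INSERT INTO EDW_DB.${TBL}
SELECT * FROM EDW_STG.${TBL};
.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
    echo "Type 1 load failed for ${TBL} (rc=${rc})" >&2
    exit $rc
fi
echo "Type 1 load completed for ${TBL}"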

For complex data processing, dynamic SQL was used via SQL transformations, along with parallel processing, partitioning, mapplets, and Teradata procedures and macros. Since data quality is critical, along with the challenge of meeting SLAs, special effort was taken to maintain the stability and integrity of the data through reconciliation and dashboard reports.

Responsibilities

• Working with users, clients, and business analysts to understand and document enhancement and new project requirements.

• Providing order-of-magnitude and effort estimates to prepare project plans.

• Working with BSAs to create test designs and end-to-end test scenarios.

• Creating ETL data models and ETL design documents for new projects and work requests.

• Creating job flows involving UNIX scripts, the pmcmd command, and the Tivoli scheduler.

• Providing periodic status reports on capacity management and deliverables.

• Coordinating and working on fixing production abends and break-fix tickets, and ensuring all tasks are responded to and completed well within SLA limits.

3. Organization Name: Cap Gemini Ltd.

Project Name: CRM / Class Action / MSSB Advent / NexJ

Role: Onsite Coordinator

Duration: Oct 2013 to date

Environment: Informatica PowerCenter 9.1.0, Teradata V13, UNIX

Client: Morgan Stanley, NY

Morgan Stanley 3D is the online application accessed by FAs for their calculations. Data is retrieved from Teradata, grouped, reformatted, and stored in SQL Server; all of these small module projects retrieve data from Teradata and either populate SQL Server directly or send files to SQL Server.

CRM sends customer relationship data, such as profile data, from UDB DB2 (source) to SQL Server.

NexJ sends a fixed-width text file of account and client information data to the mainframe.

Class Action writes client position, asset, and activity data directly to SQL Server. In law, a class action is a form of lawsuit in which a large group of people collectively bring a claim to court and/or in which a class of defendants is sued. Clients of Morgan Stanley have filed class actions against various companies (such as Bank of America and Citibank, following the 2008 subprime crisis). Since these clients traded in the securities of those companies via Morgan Stanley, once a class action is won, the FA (as well as the client) needs to know which clients are eligible for settlement and for how much.

Thus there is a need for an application that enables FAs to get details about client transactions and other relevant client data related to class actions.

Responsibilities

• Working with users, clients, and business analysts to understand and document enhancement and new project requirements.

• Providing order-of-magnitude and effort estimates to prepare project plans.

• Working with BSAs to create test designs and end-to-end test scenarios.

• Creating ETL data models and ETL design documents for new projects and work requests.

• Coordinating with downstream and upstream teams, following the release calendar and dates.

• Providing periodic status reports on capacity management and deliverables.

• Coordinating and working on fixing production abends and break-fix tickets, and ensuring all tasks are responded to and completed well within SLA limits.

4. Organization Name: Cap Gemini Ltd.

Project Name: Near Real Time Accounts to Data Warehouse

Role: Onsite Coordinator

Duration: Feb 2013 to Dec 2013

Environment: Teradata V13, UNIX

Client: Morgan Stanley, NY

The Morgan Stanley Wealth Management team has online systems that look up data from the EDW data warehouse to show details on the front end while a client performs transactions over the internet. Currently these details are only as of the previous day. Near-real-time data helps FAs, clients, and business users see the latest data and make decisions accordingly.

Data from the mainframe is moved to Teradata through IBM InfoSphere Change Data Capture, a high-performance, heterogeneous, near-real-time data replication solution that captures and delivers business-critical data changes throughout the enterprise. Once the data is in Teradata, the same DPROP (Data Propagator) logic is applied on Teradata and the tables are loaded with the near-real-time data at regular intervals. Generic BTEQ scripts were developed to maintain the CDC data in the database; each script takes a table name as a parameter and populates that table, which avoids replicating the same logic for every table (see the sketch below).
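
A minimal sketch of such a parameterized apply script is shown below, assuming the CDC tool lands changed rows in a staging table named <table>_CDC with a matching key column; the database, table, and column names are placeholders rather than the actual project objects.

#!/bin/ksh
# Minimal sketch of a generic CDC apply script driven by a table-name parameter.
# EDW_DB, EDW_STG, the _CDC suffix, ACCT_KEY and LAST_UPD_TS are assumed names.

TBL=${1:?usage: apply_cdc.ksh <table_name>}
LOGON_FILE=/apps/edw/config/tdlogon.txt   # assumed to contain ".LOGON tdpid/user,password"

bteq <<EOF
.RUN FILE=${LOGON_FILE};
/* Apply the latest captured changes to the warehouse copy of the table */
MERGE INTO EDW_DB.${TBL} tgt
USING EDW_STG.${TBL}_CDC src
  ON (tgt.ACCT_KEY = src.ACCT_KEY)
WHEN MATCHED THEN UPDATE SET
     LAST_UPD_TS = src.LAST_UPD_TS
WHEN NOT MATCHED THEN INSERT (ACCT_KEY, LAST_UPD_TS)
     VALUES (src.ACCT_KEY, src.LAST_UPD_TS);
.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF
exit $?

The same wrapper can then be scheduled per table at the desired interval, so the apply logic is written once rather than repeated for every table.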

Responsibilities

• Working with the DB2/mainframe team to understand the DPROP setup.

• Contributing to the data model design and ETL architecture design.

• Preparing and reviewing design artifacts and code, and ensuring defect-free deliverables are passed to clients.

• Leading the offshore team for task assignment and status updates.

• Coordinating and reviewing unit testing, QA, and code review efforts.

• Working with QC teams on defect triage and resolution.

• Coordinating UAT and warranty support, which includes batch monitoring, defect resolution, and change request management.

• Working on value-adds and performance improvement plans to enhance the existing functionality.

5. Organization Name: Cap Gemini Ltd.

Project Name: Integrated Data Warehouse

Role: Offshore Team Lead

Duration: Jan 2011 to Oct 2012

Environment: Informatica PowerCenter 8.6.1, Teradata V12, UNIX, BO XI R2

Client: Morgan Stanley, NY

Morgan Stanley needed to take data sources spread across multiple organizations, which prevented the cross-referencing of information, and deliver a centralized information store to enable analysis of information by business area and organizational function. Information is extracted from the back office, the market itself, call centers, and the internet, as well as a corporate repository, which features a scheduler, the file name of origin/destination, format, source, and connection. Both the large volumes of information from external sources and the small volumes of open database connectivity (ODBC) data from the repository undergo minor transformations of format, office correspondence, and simple calculations.

This engagement involves the following business areas: Wealth Management, Assets, Revenues, Flows, FA Compensation, FA Expenses, Banking, PLA and Tailored Lending, and core analytics for the Smith Barney data integration into the Morgan Stanley data warehouse framework.

Responsibilities

• Leading the offshore team, overseeing production support activities.

• Coordinating and working on the design and implementation of enhancements and work requests.

• Coordinating and reviewing testing, QA, and code review efforts.

• Providing periodic status reports on capacity management and deliverables.

• Coordinating and working on fixing production abends and break-fix tickets, and ensuring all tasks are responded to and completed well within SLA limits.

• Working on value-adds and performance improvement plans to enhance the existing functionality.

• Coordinating and providing month-end and year-end support.

6. Organization Name: Cap Gemini Ltd.

Project Name: Shared Client Information - Joint Venture

Role: Informatica Architect

Duration: August 2009 to Dec 2010

Environment: Informatica PowerCenter 8.6.1, Teradata V12, UNIX, BO XI R2

Client: Morgan Stanley, NY

On June 1, 2009, Morgan Stanley and the Smith Barney division of Citigroup Inc. completed a joint venture that resulted in the formation of Morgan Stanley Smith Barney (MSSB). At the time of closing, both Morgan Stanley and Smith Barney had their own bank deposit programs (BDPs). With the joint venture, specific requirements were laid out within the deposit sweep agreement (DSA) between Citigroup and Morgan Stanley that address the handling of deposit sweep accounts relating to Morgan Stanley's and Smith Barney's respective BDPs. We tagged all client accounts on both the legacy Morgan Stanley and legacy Smith Barney platforms according to the requirements detailed in the DSA.

This engagement involves the following business areas: Wealth Management, Assets, Revenues, Flows, FA Compensation, FA Expenses, Banking, PLA and Tailored Lending, and core analytics for the Smith Barney data integration into the Morgan Stanley data warehouse framework.

Responsibilities

• Requirement analysis and design.

• Coordinating and working on the design and implementation of enhancements and work requests.

• Coordinating and reviewing testing, QA, and code review efforts.

• Understanding the business flow and working with the business analyst to get input on various business processes.

• Coordinating and working on fixing production abends and break-fix tickets, and ensuring all tasks are responded to and completed well within SLA limits.

• Working on value-adds and performance improvement plans to enhance the existing functionality.

7. Organization Name: Mahindra Satyam.

Project Name: Project ETL

Role: Informatica Architect / Onsite Coordinator

Duration: Dec 2007 to July 2009

Environment: Informatica PowerCenter 8.1.6, DB2, SAP R/3

Client: DuPont, Wilmington, Delaware

DuPont Chemicals has huge volumes of inventory data stored in disparate legacy systems. This project is part of their large endeavor to migrate all the different systems to a centralized SAP system, with the data migration done in two phases of validation.

Data is moved from disparate sources (Oracle, DB2, Excel, flat files) to staging DB2 databases. Once the data is moved to the relational tables and all validations pass, good records are posted to SAP in different ways for different objects, such as IDocs and BAPI/RFC.

This was handled in phases, covering all major areas such as material master, customer master, bill of material, sales, and master recipe.

Responsibilities

• Played a key role from the start of the project through its completion, from understanding the requirements and designing documents through development, testing, and project go-live.

• Analyzed the data requirements and understood the complex structure of heterogeneous legacy systems.

• Gained hands-on experience with each object in the SAP development environment and made sure the rule templates were valid and no rules had been missed. For each object, manually checked in SAP with transaction codes such as XD01 (customer master), MM01 (material master), VD01 (customer info records), etc.

• Wrote DB2 stored procedures for validating business rules and transformations.

• Made sure each material was ready to be extended in SAP by adding a checkpoint verifying that the material has all mandatory views, such as Basic Data, MRP, Purchasing, Sales, Plant Data Storage, Warehouse Management, Unit of Measure, etc., depending on its material type (VERP, ZMAT-FPP, ZMAT-FPM, DIEN, etc.).

• Loaded data to both Keystone (480 K cluster) and other clusters (W cluster, T cluster, etc.) as per the business requirements.

• Created Informatica mappings for pre-product-costing runs.

• Created and ran the UNIX scripts for all pre- and post-session ETL jobs.

• Provided ad hoc reports to the clients as per their requirements.

8. Organization Name: Mahindra Satyam.

Project Name: Central ETL

Role: Developer / Informatica Architect / Module Lead

Duration: March 2006 to Dec 2007

Environment: Informatica PowerCenter 8.1.6, Oracle 10g

Client: DuPont, Wilmington, Delaware

DuPont Chemicals has huge volumes of inventory data stored in disparate legacy systems. This project is part of their large endeavor to migrate all the different systems to a centralized SAP system. The data migration is done in two phases; this is the second phase of the data load, which loads data from the DB2 target tables to SAP through Informatica PowerConnect.

The incoming data is first checked for validations and business errors against the DB2 database, and the correct data is then loaded into SAP R/3 via Informatica PowerConnect for SAP R/3. Valid data is posted to SAP through both IDocs and RFC, depending on the object.

The project involved the design, development, and implementation of multiple solutions:

• A data cleansing solution for SAP master data.

• A data validation and loading tool for master data management.

The data cleansing solution is used to extract, cleanse, and validate the data coming in from disparate sources. It has been designed as a reusable solution that can be used across multiple businesses across DuPont.

The data validation and loading tool project (CET) involved the design, development, and implementation of a data cleansing solution to be used across DuPont to cleanse, validate, and load data into SAP R/3 using Informatica PowerCenter and PowerExchange for SAP. The template-based tool is designed as a replacement for LSMW and loads data into 20 master data objects in SAP.

1. The solution helped identify 8 million errors, resulting in an 800 person-day saving in effort and ensuring the maintenance of data quality.

2. CET is now the preferred load tool for SAP R/3 data loads at DuPont.

Responsibilities

• Played an instrumental role in pioneering the first data migration to SAP through Informatica at Mahindra Satyam.

• Played a key role from the start of the project through its completion.

• Did a POC for loading master data to SAP and extracting data from SAP using IDocs and BAPI/RFC with Informatica PowerConnect and PowerExchange, which helped win this and several other projects for the company.

• Was involved in requirements gathering and in understanding SAP master objects and their interdependencies.

• Designed the Informatica architecture and data flow model for the data migration to SAP, handled differently for different objects using IDocs and BAPI/RFC.

• Extended the knowledge gained from SAP consultants to check IDoc errors (WE02, WE05), understand the customized ABAP code, test it in BDR, and work with the SAP development box directly via transaction codes.

• Helped the SAP functional consultants understand how Informatica works and made sure the Informatica code conformed to SAP.

• Created listener mappings using Informatica Interpreter transformations to download data from SAP.

• Developed ABAP mappings to extract data from SAP T (configuration) tables to check the values.


