
Data Project

Location:
Westerville, OH, 43081
Posted:
March 09, 2010


Resume:

Madhu Akkala

614-***-****

abnf49@r.postjobfree.com

PROFESSIONAL SUMMARY

More than nine years of professional experience in Information Technology, including six years in Data Warehousing as a Systems Analyst, primarily in the finance domain.

Extensive experience in Software Development Life Cycle (including Data Modeling).

Strong data warehousing concepts; capable of deriving analytical, logical, physical, dimensional and statistical data models.

Knowledge of Star and Snowflake schemas and a clear understanding of metadata.

Excellent knowledge of Ab Initio (GDE 1.13, 1.14) and Informatica Power Center/Power Mart 5.1/6.2. Designed and developed strategies for extraction, transformation and loading (ETL) using external loaders with Informatica and Ab Initio.

Created complex stored procedures, functions and triggers in Sybase, Oracle and DB2 UDB.

Excellent SQL tuning and optimizing skills in Oracle, Sybase and DB2 UDB.

Experience in UNIX shell (including awk and sed) and Perl scripting, and in automating ETL processes using the Autosys and Control-M scheduling systems.

Strong business knowledge of the finance industry, especially derivatives and Basel II, as well as the insurance industry.

Two years of in-depth experience in Basel II implementation projects.

Implemented FIN 39 netting and capital management netting for OTC derivatives.

Excellent analytical, communication, documentation, interpersonal and client-interaction skills. Able to work well as part of a team or individually.

Excellent Leadership and mentoring skills.

Executed multiple projects simultaneously as tech lead and delivered all of them successfully.

Actively participated in estimating project deliverables.

Excellent coordination skills especially with offshore teams.

SKILLS

Databases : DB2 UDB v7.2 & 8.2, Oracle 8i, 9i, Sybase 11 & 12

Data ETL Tools : Ab Initio (GDE 1.13, 1.14, Co-op 2.13, 2.14), Informatica 5.1 & 6.2

Languages : C, C++, SQL, UNIX shell & Perl scripting, awk, sed

Tools : MS Excel, MS Word, PVCS, Autosys, Control-M, Visio 2000

Operating Systems : DOS, Windows 98/NT/2000/XP, UNIX (HP-UX, AIX, Solaris).

PROFESSIONAL EXPERIENCE

Employer: JPMorgan Chase & Co Sep 2005 to present

Basel II Aug 2007 to present

ETL Lead

Basel II is the second of the Basel Accords, which are recommendations on banking laws and regulations issued by the Basel Committee on Banking Supervision. The purpose of Basel II, which was initially published in June 2004, is to create an international standard that banking regulators can use when creating regulations about how much capital banks need to put aside to guard against the types of financial and operational risks banks face. Advocates of Basel II believe that such an international standard can help protect the international financial system from the types of problems that might arise should a major bank or a series of banks collapse. In practice, Basel II attempts to accomplish this by setting up rigorous risk and capital management requirements designed to ensure that a bank holds capital reserves appropriate to the risk the bank exposes itself to through its lending and investment practices.

This is a highly visible project, and the reports generated are reviewed by the CEO.

Responsibilities:

Designed the ETL part of the Basel II system.

Designed a calculation-rules table to automate the calculations and built graphs to generate the dynamic calculations.

Analyzed requirements and estimated the effort.

Involved in the entire product development life cycle.

Developed MD70 and Source to Target Mappings.

Analyzed and validated source data.

Created Ab Initio graphs to load data into the Basel system.

Mentored the offshore team in building the Basel II application.

Assisted the BO team in creating different FFIEC 101 reports.

Supported the Basel application until a separate operations team was set up.

Assisted the testing team in creating test plans/cases for system and integration testing.

Supported and assisted user acceptance testing; as part of this process, developed many reports to assist line managers and end users.

Environment: Ab Initio (GDE 1.14), Oracle 9i, UNIX AIX, Perl 5.6, SQL, Java, Windows XP, MS Office.

Derivatives Regulatory Reporting Mar 2006 to May 2007

ETL Lead

The FXDRRS Notional Attestation process allows LE/LOB Controllers to reconcile the Notional Position information on their source systems to the notional information present on GlobalNet. Variances are investigated and returned to Regulatory Reporting to be included in the final calculations to compile a complete notional derivative position for the firm for regulatory purposes.

A file is extracted from GlobalNet containing all “Active” and “New” derivative trades. The file is supplemented with some additional information (LE, LOB/SUBLOB, etc.) and then the notional amounts are summed by LE, LOB/SUBLOB and then instrument type. The compiled files and the underlying trade detail are distributed to the LE controllers and/or LOB Operations contacts for verification. The LE controllers reconcile the spreadsheets to their source systems and return to Regulatory Reporting the total positions by instrument type that make up the difference between their source systems and GlobalNet. The adjustments are actually at the LOB/Product/Trade Type/Funding Type/Maturity Bucket level. Counterparty information is also requested on adjustments. The adjustment amounts are reviewed and then used to substantiate the difference between GlobalNet and the source trading/accounting systems in compiling the RC-L/HC-L, SEC disclosures and other regulatory reports.
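The summing step described above can be sketched with awk; the field layout below is hypothetical, since the actual GlobalNet extract format is not part of this description:

```shell
# Sum notional amounts by LE, LOB/SUBLOB and instrument type.
# Hypothetical pipe-delimited layout:
#   field 1 = LE, 2 = LOB/SUBLOB, 3 = instrument type, 4 = notional
cat > trades.txt <<'EOF'
LE1|IB/RATES|SWAP|100
LE1|IB/RATES|SWAP|250
LE1|IB/FX|FWD|50
EOF

# Accumulate per composite key, then print one summed row per key.
awk -F'|' '{ sum[$1 FS $2 FS $3] += $4 }
           END { for (k in sum) print k FS sum[k] }' trades.txt | sort
```

In an Ab Initio graph the equivalent step would typically be a rollup keyed on the same three fields.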

Responsibilities:


Involved in the entire product development life cycle.

Developed MD70 and Source to Target Mappings.

Extensively involved in the Operational Data Store data model.

Created many complex Ab Initio graphs to load data into the FXDRRS system.

Developed various UNIX/Perl scripts to verify the loaded data.

Developed the FIN 39 netting and capital management netting processes.

Reviewed and optimized ODS, DW and acceptance-testing SQL queries.

Worked with the BO team to create the monthly/quarterly/annual reports.

Created a Perl script to write data from a flat file to an Excel file.

Developed and supported the FXDRRS application until the operations team took over.

Mentored the operations/offshore teams in understanding the FXDRRS application.

Assisted the testing team in creating test plans/cases for system and integration testing.

Environment: C++, Ab Initio (GDE 1.13, 1.14), DB2 UDB v8.2, UNIX (AIX, HP-UX & Solaris), Perl 5.6, Windows XP, MS Office.

Budgeting and Forecasting Oct 2005 to Feb 2006

ETL Lead

The Planning Project is intended to produce the foundation for broad Line Manager engagement in the planning process across the firm, specifically in the area of active participation in the creation of budgets at the position-by-position level for salary, benefits and contractors, and for key direct expenses at the cost center level. Additionally, the project will streamline the current Corporate Planning consolidation platform through the convergence and standardization of reporting dimensionality with Actuals reporting, and introduce a more efficient architecture across the current Essbase Planning application that will build upon efficiencies already realized in the environment.

Responsibilities:

Developed the Planning system from scratch.

Involved in the entire product development life cycle.

Developed MD70 and Source to Target Mappings.

Analyzed and validated source data.

Created Ab Initio graphs to load data into the Planning system.

Worked with the Essbase team to create the extracts.

Developed and supported the Planning application until the planning cycle was complete.

Mentored the offshore team in developing the Planning system application.

Assisted the testing team in creating test plans/cases for system and integration testing.

Supported and assisted user acceptance testing; as part of this process, developed many reports to assist line managers and end users.

Environment: Ab Initio (GDE 1.14), DB2 UDB v8.2, UNIX (AIX, HP-UX & Solaris), Perl 5.6, Windows XP, MS Office, Essbase 7.1.5, Business Objects.

Employer: Covansys Corporation Jan 2005 to Sep 2005

Client: VISA International, Austin TX


CCDRI – Enhanced data processing

ETL Developer

This is a new system developed to serve as the ODS for Corporate Cards. Transactions and invoices are extracted from files received from the different clients and banks associated with VISA. This system is responsible for inbound and outbound data processing, enrichment of data, auditing and reporting. The input and output types processed are fixed-length, delimited and XML.
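Of the three input types, fixed-length records are the least self-describing. A minimal sketch of slicing such a record into delimited fields with awk (the column layout here is invented for illustration):

```shell
# Slice a fixed-length record into pipe-delimited fields.
# Hypothetical layout: cols 1-6 card id, 7-14 date (YYYYMMDD), 15-22 amount.
cat > txns.dat <<'EOF'
C000012006011500125.50
EOF

# substr(s, start, len) extracts each fixed-width column by position.
awk '{ print substr($0,1,6) "|" substr($0,7,8) "|" substr($0,15,8) }' txns.dat
```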

Responsibilities:

Involved in the entire product development life cycle.

Extensively involved in business requirements gathering.

Developed Ab Initio graphs to implement data consolidation.

Developed various UNIX/Perl/awk/sed scripts to verify the loads.

Extensively involved in the Stage and Operational Data Store data models.

Created performance and acceptance test plans for the DW.

Environment: Ab Initio (Co-op 2.13), DB2 UDB v8.2, UNIX (AIX, HP-UX & Solaris), Perl 5.6, SQL, Erwin, DB Artisan, Windows XP, MS Office.

Employer: Cyber Resource Group Inc Nov 2004 to Dec 2004

Client: Strong Mutual Funds, Milwaukee WI Nov 2004 to Dec 2004

Fund Conversion

QA Analyst

The main objective of the Fund Conversion project was to synchronize the Strong Mutual Funds order-entry system with the Wells Fargo mutual funds order-entry system.

Responsibilities:

Developed various UNIX and Perl scripts to enable successful data conversion and communication between the Strong and Wells Fargo systems.

Environment: DB2 UDB v8.2, UNIX (AIX, HP-UX & Solaris), Perl 5.6.

Employer: Infosys Technologies Ltd, India Jul 2000 to Oct 2004

Client: Northwestern Mutual, Milwaukee WI

Northwestern Mutual is a client of Infosys, and I worked with Northwestern Mutual for four and a half years as a developer, analyst, tester and implementation coordinator. Northwestern Mutual is an insurance and financial institution (brokerage firm) involved in selling life and long-term disability insurance, securities, mutual funds, asset management and trade execution.


IPS Data Asset Management (IDAM) Nov 2003 – Oct 2004

Data warehouse Analyst/Developer/Implementation coordinator

The IPS Data Asset Management initiative will transition the current environment into an IPS reporting environment, providing a view into most IPS accounts and securities, not just those held at Baird. This is made possible by sourcing data from an additional third-party vendor, Statement One (Albridge), which aggregates information about most IPS accounts. As part of this project, an ODS (Operational Data Store), DW (Data Warehouse) and DM (Data Mart) will be built to simplify report generation.

Responsibilities:

Involved in the entire product development life cycle.

Extensively involved in business requirements gathering and analysis of the source (Albridge) data.

Extensively involved in the ODS, DW and DM data models.

Development lead for the development and deployment of the ODS and Data Warehouse.

Created stored procedures/triggers to populate data from Stage tables into the ODS.

Converted the business requirements into technical specifications; also involved in mapping source and target data.

Developed a conceptual design to meet the requirements, followed by the logical process flow.

Analyzed and understood the metadata and source data; identified the facts and dimensions.

Designed and developed complex Aggregate, Join and Lookup transformation rules (business rules) to generate consolidated fact/summary data identified by dimensions, using Informatica.

Loaded data into the DW with the Informatica DB2 UDB (EE) external loader as well as the native UDB LOAD utility.

Developed regression test beds for the ODS & DW using UNIX shell scripts.

Created performance and acceptance test plans for the DW.

Reviewed and optimized ODS, DW and acceptance-testing SQL queries.

Environment: Informatica 6.2, DB2 UDB v7.2 & v8.2, Sybase 12, UNIX (AIX, HP-UX & Solaris), DB2 UDB Import/Export utilities.

SEC B&R Nov 2002 – Oct 2003

Data Warehouse Analyst/Developer

The SEC project was to analyze, design and build a warehouse to monitor changes to investors' objectives and personal information. The warehouse is built from data received from BETA (the clearing platform) and offers easy report generation for investor changes.

Responsibilities:

Developed a data transfer strategy from flat files to the Mart via the ODS & DW.

Analyzed the clearing-platform flat files and designed the data models accordingly.

Developed mappings/sessions using Informatica Power Center for data loading.

Assisted in developing the regression test bed.

Created test plans for unit, integration and system testing.

Prepared technical specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables, and defined ETL standards.

Created mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator and Update Strategy.


Designed and developed pre-session, post-session and batch execution routines using the Informatica server to run Informatica sessions.

Involved in performance tuning of sources, targets, mappings and sessions.

Tuned all SQL queries and suggested the necessary indexes to enhance performance.

Developed stored procedures, functions and triggers to populate data in the ODS.

Environment: Informatica 5, Sybase 12, UDB 7.2, UNIX HP-UX and Windows NT

Validation of Servicing Agent Replication Sep 2001 – Oct 2002

Developer

The scope of the project included validation of mainframe Agent data against the replicated Sybase Agent data. Agent information is critical for Northwestern Mutual to pay commissions to the financial representatives; hence, loading this data into the Sybase database without bugs was crucial. The objective of this project was to find mismatches in the replicated Sybase data so that the client could take the necessary steps.

Responsibilities:

Designed and developed Sybase stored procedures and triggers to capture daily changes to Agent information into Sybase tables.

Created flat files using Informatica to get the daily changes.

Developed UNIX scripts to format the Sybase data into flat files to match the mainframe files.

Developed a generic C program to compare any two files in the same format; this program was used to compare the mainframe files with the Sybase data files above and report the mismatches.

Used Mapping Designer and Mapplet Designer to generate different mappings/mapplets for different loads into flat files.

Developed and deployed batches, parameter files, variables and other related scripts using UNIX shell scripts to pull data from the UNIX environment.

Created stored procedures, triggers, tables, views and test data in Sybase.

Developed procedures, functions and triggers to populate the validation tables.
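The comparison utility above was written in C; the core idea, reporting records present on one side but not the other for two same-format files, can be sketched in shell with sort and comm (the sample data is invented):

```shell
# Report mismatched records between two same-format flat files.
cat > mainframe.txt <<'EOF'
A001|SMITH|ACTIVE
A002|JONES|ACTIVE
EOF
cat > sybase.txt <<'EOF'
A001|SMITH|ACTIVE
A002|JONES|LAPSED
EOF

# comm requires sorted input; -3 suppresses lines common to both files,
# leaving only the records unique to each side (the mismatches).
sort mainframe.txt > mf.sorted
sort sybase.txt   > sy.sorted
comm -3 mf.sorted sy.sorted
```

A dedicated C program earns its keep over this when records must be matched key-by-key with field-level diffs rather than as whole lines.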

Environment: Informatica 5, UNIX shell scripts, C, Sybase, SQL, Unix-Solaris and Windows NT

Replication of Servicing Agent/Agency Mar 2001 – Aug 2001

Programmer Analyst

The scope of the project included building replication from Supra/DB2 files to the Enterprise Operational Data Store (EODS). These files were replicated to the Customer database. In addition, the Agent/Agency-Contract relationship was built and maintained on the Enterprise ODS using the TINPI file on DB2. The data from the Supra/DB2 files is replicated to the Enterprise ODS database through two different replication streams: Supra-to-EODS replication and Customer-to-EODS replication. This project consolidated the above-mentioned replication streams.

Responsibilities:

Extensively involved in business requirements analysis and in translating the requirements into technical specifications for database design.

Involved in designing the Entity Relationship Diagram and created tables, indexes, sequences, constraints and snapshots.

Designed and developed procedures, functions and triggers to populate the replicated mainframe data.


Fixed software bugs and interacted with developers to resolve technical issues.

Developed shell scripts to automate system testing.

Tuned/optimized the initial-load SQLs and UNIX scripts.

Environment: UNIX, Sybase, SQL, C

Long term care product Jul. 2000 – Feb. 2001

Developer

The main objective of the project was to modify the configuration items in the LTC (Long Term Care) interface application to support the new version of the LTC product, called Quiet Care RS. Configuration items are the items that make up the LTC interface application: stored procedures, C code, mainframe jobs and VB code constitute the whole application. The existing functionality affected by the introduction of Quiet Care RS was also modified.

Responsibilities:

Extensively involved in requirements gathering and data gathering to support developers in handling the design specification.

Involved in designing and coding functional specifications for the development of user interfaces.

Involved in designing Entity Relationship Diagrams.

Created tables, indexes, sequences, constraints and snapshots.

Developed Sybase procedures, functions and triggers to load new policy data.

Fixed software bugs and interacted with developers to resolve technical issues.

Maintained and supported the application for four months.

Environment: Sybase 11, C, SQL, UNIX, PL/I, SAS and VB.

EDUCATION

Bachelor of Technology, Jawaharlal Nehru Technological University, India.

Working status: Authorized to work in the US (EAD)
