Profile:
> ** years of IT experience in total
> 12 years of extensive experience in designing, developing and
implementing Data Warehouse Projects using Informatica, Oracle & Unix
shell scripting
> 1 year of Informatica Administration & MDM experience
> 1 year of BI consulting experience in Financial services sector
> 2 years of Master Data Management consulting experience in Insurance
Services sector
> Good experience in Planning & Efforts estimation
> Expertise in Business Requirement gathering and converting the same
into Technical Specification documents (HLD & LLD)
> Possess excellent interpersonal and communication skills; a quick
learner and team player
Technical Skills:
Data Warehousing: Informatica PowerCenter 9.5.1/9.0.1/8.6.1/8.1.1/7.1.3/6.0/5.1,
Informatica PowerExchange, Business Objects XI, Trillium,
Informatica MDM Multidomain Edition 9.5.1
Database: Oracle Exadata/10g/9i/8i, DB2, SQL Server, Salesforce.com, Postgres
Languages: Unix shell scripting, SQL, PL/SQL
Operating Systems: HP-UX 11i, IBM AIX 5.1/5.2/5.3
Scheduling Tools: Tidal, Unicenter AutoSys, Maestro Scheduler
Experience Summary:
. Senior ETL Consultant in New York Technology Partners
Worked in the following client places during the tenure with NYTP:
. Senior ETL Consultant in Teach for America, NYC, US from Mar'
2012-Till Date
. Technical Manager in Headstrong Services LLC from Feb 2011-Feb 2012.
Worked in the following client places during the tenure with
Headstrong:
. ETL Lead in Barclays Capital, Jersey City, US from Jan' 2012-
Mar' 2012
. ETL Lead in MF Global, NYC, US from Feb' 2011-Nov' 2011
. ETL Architect in Wipro Technologies from Jan 2003 to Feb 2011
Worked in the following client places during the tenure
with Wipro:
. ETL Architect in The Hartford, Connecticut, US from Sep' 2008-
Feb' 2011
. Technical Leader in Prudential, Reading, UK from Aug' 2004-Aug'
2008
. Associate Consultant in Mascot Systems Ltd from Mar 2001 to Jan 2003
Professional Experience:
TFACT, Senior ETL Consultant, Teach for America, US
Mar' 12 - Till date
Teach for America plans to extend its current implementation of
Salesforce.com (SFDC), broadening the user base within the organization and
incorporating several line-of-business operational silos and processes
within this single product suite. In conjunction with related data
migration and system consolidation, several remaining operational systems,
on an on-going basis, need to share data to and from SFDC in order to
facilitate a holistic view of the individuals whose information is shared
across teams and operational systems.
Responsibilities:
. Acted as the technical lead for the ETL team, composed of both internal
and contractor developers, creating and maintaining business intelligence
and data warehousing design principles using industry-leading practices
. Provided technical leadership and guidance to the development team for
the design and development of highly complex or critical ETL architecture,
following leading industry practices
. Collaborated with project, architecture, and release teams, providing
input on architectural design recommendations, driving standards, and
planning and executing an effective transition to production operations
. Studied the existing source systems, analysed the business requirements &
prepared ETL Specifications
. Explored the Salesforce.com application, created POC mappings to verify
Salesforce integration using Informatica
. Designed ETL framework to integrate data from various source systems such
as Postgres, DB2, Files into Salesforce.com
. Designed & implemented the following concepts:
o Automatic reprocessing of Salesforce rejections
o Data Threshold governance
o Table driven parallelization
. Configured the Web Services Consumer transformation to read data from the
Workday system
. Analyzed the Change request/Defects. Conducted meetings with BAs, DBAs &
Tech leads to finalize the design & implementation
. Hands-on experience in Informatica PowerCenter administration
. Installed Informatica 9.0.1 on the Unix platform
. Experience with Informatica PowerExchange, the pmcmd command line
interface, and security (including native and LDAP security)
. Installed and configured LDAP, Sales Force & Web-service Plugins with
Power Center
. Extensively worked on the PowerCenter upgrade from 9.0.1 to 9.5.1 HF3
. Created Deployment groups for code promotion
. Performed activities such as user creation and recycling/disabling
services through the Informatica Admin console
. Hands-on experience working in Informatica MDM
. Created Data model elements, Defined relationships & lookups within the
Data model using Hub console schema tool
. Configured mappings that use Functions and Cleanse Lists, setting options
for delta detection and raw data retention using the mapping tool
. Configured exact matching, fuzzy matching & merge processes using the
Schema tool
. Configured Batch jobs to execute the stage, load, match & merge processes
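Much of the operational work above (starting workflows, checking their status) is driven through the pmcmd command line interface. As a minimal sketch, assuming placeholder domain, service, folder and workflow names (none of these are the actual project values), a pmcmd wrapper typically looks like this:

```shell
#!/bin/sh
# Hedged sketch of driving an Informatica workflow via pmcmd.
# IS_Example, Domain_Example, and the folder/workflow names passed in
# are illustrative placeholders, not actual project settings.
# PMCMD can be overridden (e.g. for dry runs); defaults to pmcmd on PATH.
PMCMD="${PMCMD:-pmcmd}"

run_workflow() {
    # $1 = repository folder, $2 = workflow name.
    # -wait blocks until the workflow finishes; pmcmd's exit code
    # reflects the workflow outcome.
    "$PMCMD" startworkflow -sv IS_Example -d Domain_Example \
        -u "$INFA_USER" -p "$INFA_PASSWORD" \
        -f "$1" -wait "$2"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "Workflow $2 failed with return code $rc" >&2
    else
        echo "Workflow $2 completed"
    fi
    return "$rc"
}
```

Keeping pmcmd behind a variable makes the wrapper easy to exercise without a live Integration Service, and the exit code propagates the workflow status to whatever scheduler calls the script.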
BASEL-3, ETL Lead, Barclays Capital, US
Jan' 12 - Mar' 12
The Basel Committee on Banking Supervision has issued a new regulatory
framework, known as Basel 3, to replace the existing Basel 2 regulations.
As part of the Basel 3 requirements, Eagle requires sourcing non-netted or
gross long/short position data, or netted position data in case of auto
'close out' or 'tear up' in the trade settlement process, and margin data
from upstream systems for all centrally cleared products. TDB is identified
as one of the main upstream data source systems to feed that data to Eagle.
ODH is a strategic data store for Exchange Traded Products and will be the
single point of integration for Exchange Traded Product data extraction.
For the BASEL III CCP project, ODH will be responsible for providing GMI,
all RANSYS regional instances (excluding the India business), and ISTAR and
DOLPHIN Futures and Options position and margin data to TDB, in order to
meet the BASEL III requirements.
Responsibilities:
. Analysed the ESM & CHORUS Feed thoroughly before loading the data into
ODH
. Responsible for loading static data from ESM & CHORUS into ODH, which
then allows ODH to do the necessary data mapping
. Actively participated in analysis of ISTAR & DOLPHIN data feed via ODH
. Responsible for generating the position & margin feeds of the ISTAR &
DOLPHIN systems from ODH & sending them to TDB
CFS, ETL Lead, MF Global, US
Feb' 11 - Nov' 11
Core Foundation Services (CFS) provides a single, global, consolidated
source for MF Global's business-critical data, enabling cross-system,
cross-region reports with roll-ups, drill-downs and historical analysis.
Currently CFS manages roughly 14 source systems. CFS CAFE & Taxonomy are
enhancement projects: CAFE aims to add new source systems into CFS, and
Taxonomy aims to implement the business taxonomy for each of the source
systems available in CFS.
Responsibilities:
. Understanding and analyzing new and changing business requirements for
adding new source systems and their impact on the CFS design. Proposing
enhancements and changes to the technical and business solution to meet
the new requirements.
. Estimated efforts accurately & worked actively with PMs in completing
the project plan
. Involved in discussions with CFS SMEs to finalise the design for
integrating the new source systems into the CFS platform. Managed the
design of the taxonomy logic for a few source systems
. Coordinated with the offshore team & clarified their design &
requirement queries. Worked with Tidal administrators to implement some
complicated scheduling designs
. Conducted meetings with QA to demonstrate the design for each source
system. Provided support to QA for SIT releases. Promptly responded to
users during UAT
. Actively monitored the releases into higher environments. Participated in
all production release calls & resolved the issues that arose during the
release
PAVE Auto NB, ETL Architect, The Hartford, US
Sep' 08 - Feb' 11
The PAVE program objective is to fully replace the legacy Policy
Administration system (PLA) with a product from CSC called Exceed.
Currently, PLA quote, policy and renewal sourced data is available in a
warehouse called the Personal Lines Data Warehouse (PLDW). The goal of this
project is to ensure that the new business data from the Exceed system is
stored in PLDW in the same format as the PLA data is available today, so
that the downstream applications are not disturbed in any way.
Responsibilities:
. Involved in overall estimation and planning for the Project.
. Actively participated in all design discussions and prepared design
documents (HLD & LLD) for certain load stages. Trained offshore team
members in Exceed product & auto insurance business knowledge
. Shared all design & transformation rules documents with the offshore
team & guided them to build the necessary components. Worked with offshore
team members to prepare the Coding Standards, ETL Specification & Test
case document templates. Provided Informatica/Oracle/Unix/PowerExchange
technical consultation to offshore team members & helped resolve key
technical issues
. Assisted Project Business Analysts by providing key Design & Data mapping
inputs for documenting the FSD. Involved in QA Test plan & Testcase
review meetings. Worked with Release Management team in migrating the
components to QA environment for QA testing. Provided necessary technical
assistance in fixing the QA defects
. Led a team of 8 members offshore & 1 member onsite
Thames O-date, Technical Leader, Prudential, UK
July' 07 - Aug' 08
Prudential has initiated Project Thames to implement the process of
Reattribution of the Inherited Estate of the With Profit Sub Funds of
Prudential Assurance Company Limited, which could potentially generate
additional value to the policyholders and shareholders. Thames - O Date
project will be executed to deliver the capability of reattribution of the
inherited estate.
Responsibilities:
. Managed the end-to-end delivery of the UVE, Offer Mailing & MI streams
from requirement analysis to build and test
. Led a team of 5 people onsite & 5 people offshore. Involved in the
high-level design and architecture for the operational data store TPDB and
the calculation engine TVDB application using Informatica, Oracle PL/SQL
and Business Objects
. Analysed the Source system thoroughly by going through the existing
Design & Data Model documents, querying the database to capture the data
quality issues & prepared the Source system Analysis document.
. Participated in Business Requirements workshops, gathered thorough
knowledge of the requirements & then prepared the Requirement Analysis
document
. Actively worked with the project manager, data modeller & designers to
come up with the build estimate
. Provided knowledge transfer to offshore team by sharing & explaining
about the necessary project related documents
. Reviewed Low Level Design & Testcase documents, Informatica, Oracle &
Unix components
. Assisted Offshore Team members in clarifying any Informatica, PL/SQL &
Shell script related queries
. Involved in the support for Link Testing, System Testing & Performance
Testing
. Created a few Business Objects reports while working in the MI stream
OCDB 2007, Technical Leader, Prudential, UK
Jan' 07 - June' 07
The aim of this project is to extend the existing single customer view
(OCDB) to support the Thames program by adding 3 new source systems,
ADMINISTRATOR (ADMIN), UAPS and IB, into OCDB.
Responsibilities:
. Worked as the Technical Leader for the ADMINISTRATOR stream. Led a team
of 5 people offshore.
. Involved in Design of the Streams, ADMINISTRATOR & UAPS.
. Identified the list of components & performed the efforts estimation.
. Prepared Source System Analysis & Requirement Analysis documents.
. Assisted the offshore team members to do the Link Testing & Regression
Testing.
. Involved in the support for System Testing & Performance Testing.
. Assisted the team members to perform the following activities as part of
a Trillium code modification:
. Creating a new Trillium project & modifying the existing Converter
driver file.
. Creating a new Converter Input DDL file as per the definition.
Customer Output Improvement, Senior Developer, Prudential, UK
July' 06 - Dec' 06
The aim of this project is to extract customer & policy data from Mainframe
and to produce XML files for a document composition tool called
Thunderhead. Customer letters will then be generated from Thunderhead.
Responsibilities:
. Developed a generic Unix shell script to process the files produced by
the Mainframe and to run all the mappings used in this project.
. Created complex Informatica mappings that read COBOL source files &
load into XML files.
. Designed XML Schemas, which will be used to create XML Source Qualifier
transformations.
. Automated all ETL processes through Maestro Scheduler.
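A generic mainframe file-processing script of the kind described above usually follows this shape. This is an illustrative sketch only: the directory names and the `.trg` trigger-file convention are assumptions, not the actual project layout.

```shell
#!/bin/sh
# Hedged sketch of a generic mainframe-file pre-processing step.
# The directory layout and the ".trg" trigger-file convention are
# assumptions for illustration.
IN_DIR="${IN_DIR:-/data/inbound}"
WORK_DIR="${WORK_DIR:-/data/work}"

process_files() {
    # Pick up each data file only when its matching trigger file has
    # landed, strip carriage returns left by the mainframe transfer,
    # then stage it in the work area for the ETL session to consume.
    for trg in "$IN_DIR"/*.trg; do
        [ -f "$trg" ] || continue          # no trigger files present
        dat="${trg%.trg}.dat"
        [ -f "$dat" ] || continue          # data file not arrived yet
        tr -d '\r' < "$dat" > "$WORK_DIR/$(basename "$dat")"
        rm -f "$trg" "$dat"
    done
}
```

Waiting for the trigger file avoids picking up a data file that is still mid-transfer, and `tr -d '\r'` normalises the line endings typically left by mainframe-to-Unix file transfers.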
SID Software Refresh Project, Senior Developer, Prudential, UK
Jan' 06 - June' 06
SID (Single IFA Database) has a Siebel v5.0 front end and therefore has to
use Oracle v7.3.4, which is out of support. This project will replace SID
with a brand new Oracle 9i database called the Distribution Partner
Database (DPDB), and will also replace the Siebel v5.0 front end with a
WebLogic application called the Distribution Partner System (Dipas). This
project uses Informatica to rewrite the existing feeds (some of the feeds
are written in Unix shell scripts and some in PL/SQL) into and out of SID,
so that they work with DPDB.
Responsibilities:
. Designed and developed the ETL layer using Informatica for extracting
adviser firm related information to & from DPDB.
. Created various Triggers in DPDB Database to capture any changes done
through Dipas front end.
. Analysed the existing Unix & PL/SQL code in order to replace it with
Informatica mappings.
BI Program Drop 1, Developer, Prudential, UK
Jan' 05 - Dec' 05
This project will deliver a complete campaign management solution for
Prudential's marketing business area. The aim of this project is to enhance
OCDB by integrating feeds from three new source systems called FAST,
Prudential GI systems and PROSPECT.
Responsibilities:
. Created Reusable Transformations, Mapplets, and made use of the Shared
Folder Concept using shortcuts wherever possible to avoid redundancy.
. Reviewed Mappings, Sessions and Workflows and logged all review comments.
. Prepared & Reviewed LLD & Testcase documents. Prepared Testdata for
Component Testing.
. Used Debugger by making use of Breakpoints to monitor data movement and
troubleshoot the mappings.
OCDB, Developer, Prudential, India
Jan' 03 - Dec' 04
This solution delivered the Operational Customer Data Base (OCDB) which
integrates the legacy systems in Prudential and presents a single customer
view. The single customer view will enable significant efficiency gains for
the Customer Services in order to handle customer calls. Marketing will be
able to design effective outbound strategies by building the picture of
customers across different product relationships.
Responsibilities:
. Contributed to the technical architecture and high level design for data
extraction, cleansing and integration including reusable frameworks for
Change Data Capture, Matching & Merging, Load Batch Management and
Exception handling.
. Extracted, transformed and loaded data into the staging area and Data
Warehouse (Oracle) using Informatica mappings which contain complex
transformations.
. Created PL/SQL stored procedures to be used in Informatica mappings.
. Developed Unix shell scripts to Pre-process the files.
. Worked with Informatica PowerConnect (now called PowerExchange).
Created data maps for bulk extraction & Changed Data Capture (CDC) using
Detail Navigator for ADABAS sources & VSAM files. Tested the data maps
using the row test feature.
Life CDT Maintenance, Developer, GE, India
Aug' 02 - Dec' 02
This project covers 4 systems used to track, maintain and report on
various types of reinsurance treaties, certificates and policies. The
scope of the project includes migration of data from Mainframe DB2 and
Oracle into flat files (both fixed-width and delimited) using Informatica.
Responsibilities:
. Studied the existing process by interacting with the users and collecting
documents that were being used for their day to day working. Modified
existing Mappings that are used for migration of data from DB2 to
FlatFiles, based on user requirements.
. Created New mappings for migration of data from Oracle database to
Flatfiles.
GE-F-Navi, Analyst Programmer, GE, India
Mar' 01 - Jul' 02
GE Fleet Services is one of the largest suppliers of vehicle leasing and
fleet management solutions in Japan. GE Fleet Services aims at making its
web reporting system, F-navi, a user-interactive application. F-navi is
intended to provide its users with comprehensive reports on the status of
the various vehicles leased by them, along with some screens where
customers can input details pertaining to their vehicles.
Responsibilities:
. Implemented the business rules in ASP code, with client side validation
done in JavaScript.
. Prepared Low Level Design (LLD) and Unit Test Cases. Responsible for Peer
Review & Testing.