PRABHU RAMAMURTHY
Email: **********@*****.***
Contact #: 904-***-****
PROFESSIONAL SUMMARY:
. Around 10 years of IT experience, with professional expertise in data
  integration and data warehousing systems.
. Over 7 years of ETL tool experience using Ab Initio and IBM Information
  Server (DataStage) 7.x in the banking domain.
. Strong understanding of data warehousing techniques and ETL methodologies
  supporting data extraction, transformation and loading.
. Well versed with Ab Initio parallelism techniques; implemented Ab Initio
  graphs using data parallelism and multifile system (MFS) techniques.
. Well versed with various Ab Initio components such as Join, Rollup,
  Partition by Key and Sort, Gather and Merge.
. Good experience with data migration, transformation and loading using
  Ab Initio into Oracle and flat files.
. Excellent experience in designing and developing Ab Initio graphs using
  the GDE.
. Expertise in developing transformations between source and target using
  Ab Initio.
. Worked on the Ab Initio production job scheduling process using the
  Autosys tool.
. Extensive database experience using SQL, Oracle 10g, DB2 UDB and SAP.
. Strong in writing UNIX shell scripts for data validation, data cleansing,
  etc. (see the sketch after this list).
. Case study on building analyses & dashboards using OBIEE.
. Hands-on experience working in the USA & UK IT industries.
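A minimal sketch of the kind of shell-based validation referenced above; the
file names, delimiter and expected field count are all hypothetical:

#!/bin/ksh
# Minimal data-validation sketch. File names, the pipe delimiter and the
# expected field count are assumptions for illustration only.
INFILE=/data/inbound/accounts.dat      # pipe-delimited feed (assumed)
GOODFILE=/data/clean/accounts.dat
BADFILE=/data/reject/accounts.rej
EXPECTED_FIELDS=12                     # assumed record layout

# Keep records with the right field count; route the rest to rejects.
awk -F'|' -v n="$EXPECTED_FIELDS" -v good="$GOODFILE" -v bad="$BADFILE" '
    NF == n && $0 !~ /^[[:space:]]*$/ { print > good; next }
                                      { print > bad }
' "$INFILE"

# Fail the run if anything was rejected so the scheduler can alert.
if [ -s "$BADFILE" ]; then
    print "Rejected $(wc -l < $BADFILE) record(s); see $BADFILE" >&2
    exit 1
fi
exit 0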
CERTIFICATIONS:
. DataStage 8.1 Certified Professional.
. Teradata 12 Certified Professional.
. SAS Certified Base Programmer for SAS 9.
. DB2 700 Certified Professional.
TECHNICAL SKILLS:
ETL               Ab Initio, DataStage
BI                OBIEE
Languages         SQL, PL/SQL, UNIX Shell Scripting
Databases         Teradata 12, Oracle
Scheduling Tools  Autosys, CA7
SQL Tools         Teradata SQL Assistant, Toad for Oracle
PROFESSIONAL EXPERIENCE:
Client: Bank of America, Chennai, India & Jacksonville, FL.
Apr 11 - Present
Project #1: FSR ETL NON-PROFIT
Project Description:
The following applications are part of the FSR ETL NON-PROFIT project.
FSR ETL Recon - A core component of E-Ledger Reconciliation. It integrates
ledger balances from different sources and loads the data into the SAP BW
system, mapping general ledger account numbers to SAP account numbers.
FSR ETL FPP - FPP (Financial Provisional Point) provides the processes and
technologies necessary to retrieve data from SAP ECC FICO (General Ledger
(GL), Fixed Assets (FA)) and/or SAP Business Warehouse (BW) InfoCubes for GL
and FA, and to provide selected data to specific downstream applications.
Role: ETL Team Lead
Responsibilities:
. Reviewed the functional documentation and converted functional documents
  into technical design documents.
. Developed, tested and reviewed complex Ab Initio graphs, sub-graphs, DML,
  psets, XFRs, deployed scripts and DBC files for connectivity, and created
  packages and exports.
. Extracted data from various sources, performed data transformations and
  loaded the results into the target interface files and tables.
. Improved the performance of Ab Initio graphs using techniques such as
  in-memory joins and rollups, and by avoiding deprecated components.
. Developed wrapper scripts to notify users of any failures, with debugging
  information (see the wrapper sketch after this list).
. Extensively worked with the Ab Initio Enterprise Meta Environment (EME) to
  obtain initial setup variables and maintain version control during the
  development effort.
. Supported System Integration and User Acceptance Testing cycles executed
  by the automation team, fixing design issues.
. Wrote UNIX scripts as required for preprocessing steps and to validate
input and output data elements.
. Involved in migration of code from development to other test environments
  and to production.
. Involved in change management using Maximo, creating RFCs and IRs to raise
  defects.
. Created dashboards and reports for test completion, extracting data from
  Quality Center.
. Provided expert guidance to the rest of the team on design and development
  issues.
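A minimal sketch of such a wrapper script; the paths, deployed-graph name and
mail alias shown here are hypothetical:

#!/bin/ksh
# Wrapper sketch: run a deployed Ab Initio graph, capture the log and mail
# the support team debugging output on failure. Paths, the graph name and
# the distribution list are assumptions for illustration only.
RUN_DIR=/apps/fsr/run
LOG_DIR=/apps/fsr/log
GRAPH=load_recon_balances.ksh          # deployed graph script (assumed)
NOTIFY="etl-support@example.com"

LOG=$LOG_DIR/${GRAPH%.ksh}.$(date +%Y%m%d%H%M%S).log
$RUN_DIR/$GRAPH > $LOG 2>&1
RC=$?

if [ $RC -ne 0 ]; then
    # Send the last 50 log lines as debugging information.
    tail -50 $LOG | mailx -s "FAILED rc=$RC: $GRAPH" $NOTIFY
fi
exit $RC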
Environment: Ab Initio (GDE 1.15, Co>Operating System 2.15), AIX UNIX, SQL,
PL/SQL.
Client: Bank of America, Concord, CA & Chennai, India.
May 08 - Mar 11
Project #2: VISION GEN 1 ETL
Description:
Vision Gen 1 Essbase is a system used for financial analysis and reporting.
Ab Initio is the ETL tool used to extract data from Insight, the source
system that stores its data in an Oracle database, and to load it into
Vision's Oracle database.
Role: Ab Initio Developer
Responsibilities:
. Used Korn shell scripting to maximize Ab Initio parallelism capabilities
  and developed numerous Ab Initio graphs using data parallelism and
  multifile system (MFS) techniques (see the MFS sketch after this list).
. Used the Run Program and Run SQL components to run UNIX and SQL commands
  from Ab Initio graphs.
. Performed unit and system testing using sample and generated data,
  manipulating dates and verifying the functionality, data quality and
  performance of graphs.
. Used Ab Initio components such as Partition by Key and Sort, Dedup,
  Rollup, Reformat and Join in various graphs.
. Used phasing to sequence graph execution.
. Performed program construction and modification for problem fixes and new
  development.
. Involved in code reviews and performance tuning strategies.
. Conducted various testing cycles.
. Participated in project discussions to consolidate current status and
  requirements.
. Conducted quality reviews of design documents, code and test plans.
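A hedged sketch of the MFS setup behind the data-parallelism work above; the
directory names are hypothetical, and the exact m_* utility arguments can
vary by Co>Operating System version:

#!/bin/ksh
# Sketch: create a 4-way multifile system (MFS) so graphs can run
# data-parallel, one partition per data directory. Directory names are
# assumptions; confirm m_mkfs arguments against the installed Co>Op release.
MFS=/apps/vision/mfs/mfs4way           # multifile control directory
P1=/data01/vision/part1                # one data partition per disk/node
P2=/data02/vision/part2
P3=/data03/vision/part3
P4=/data04/vision/part4

# Create the MFS once; graphs then address files under $MFS and the
# Co>Operating System fans the work out across the four partitions.
m_mkfs $MFS $P1 $P2 $P3 $P4

# Verify the layout.
m_ls $MFS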
Environment: Ab Initio (GDE 1.14, Co>Operating System 2.15), UNIX, SQL.
Client: Bank of America, Chennai, India.
Aug 06 - Apr 08
Project #3: W - Data Warehousing
Project Description:
W Info Load, the ETL team of the bank, ensures the loading of data from
different domains, including Card, Customers, Deposits, eCommerce, Marketing
& Loans, as per Service Level Agreements (SLAs). As per business rules,
extracted data is loaded into W after transformation.
Any delay in data availability impacts the bank's business, so meeting the
SLA for all applications' data is a challenge given the size of the system,
the data volume and the large number of users accessing it. Meeting the SLA
requires quick resolution whenever a load job fails, along with enhancements
and proper planning of system usage to accommodate all users of the data
warehouse. Any data issues that occur in the loading process must also be
addressed.
Role: Lead DataStage Analyst
Responsibilities:
. Extensively used IBM Information Server Designer to develop various jobs
  to extract, cleanse, transform, integrate and load data into target
  tables.
. Prepared technical designs/specifications for data extraction,
  transformation, cleansing and loading.
. Used IBM Information Server Designer to transform data through multiple
  stages, and prepared documentation.
. Extracted data from various sources, performed data transformations and
  loaded the results into the target Oracle database.
. Imported and exported DataStage jobs using IBM Information Server
  Designer.
. Designed DataStage parallel jobs with Change Capture and Change Apply
  stages.
. Implemented logic for Slowly Changing Dimensions.
. Automated and fine-tuned IBM Information Server Designer jobs and
  sequences for loading source system data into the data warehouse (see the
  dsjob sketch after this list).
. Extensively wrote user-defined SQL to override the auto-generated SQL
  queries in DataStage.
. Used local and shared containers to increase object reusability and the
  throughput of the system.
. Used different partitioning methods such as Auto, Hash, Same and Entire.
. Developed job sequences with restartability checkpoints and implemented
  proper failure actions.
. Participated in the review of technical and business transformation
  requirements documentation.
. Actively participated in team meetings to gather business requirements and
  develop specifications.
. Participated in on-call support on a 24/7 basis.
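A hedged sketch of scripted job automation using the dsjob command-line
client; the project, job and parameter names are hypothetical, and the
exit-code convention should be confirmed against the installed release:

#!/bin/ksh
# Sketch: run a DataStage job from a script via dsjob and check the result.
# Project, job and parameter names are assumptions for illustration only.
PROJECT=WH_PROJ
JOB=seq_daily_load
RUN_DATE=$(date +%Y-%m-%d)

# -jobstatus waits for completion and derives the exit code from the
# finishing status of the job.
dsjob -run -mode NORMAL -param pRunDate=$RUN_DATE -jobstatus $PROJECT $JOB
RC=$?

# In 7.x, 1 = "Finished OK" and 2 = "Finished (see log)" are commonly
# treated as success (an assumption worth confirming on site).
if [ $RC -ne 1 ] && [ $RC -ne 2 ]; then
    print "dsjob returned $RC for $PROJECT/$JOB" >&2
    exit 1
fi
exit 0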
Environment: IBM DataStage 7.x, UNIX, Teradata SQL.
Client: TESCO, Hertfordshire, UK and Chennai, India.
Apr 04 - Jul 06
Project #4: Retail/Commercial System
Description:
TESCO is the world's 4th largest and Europe's 2nd largest retailer,
operating around 2,200 stores worldwide. TESCO IT Services develops and
maintains the software systems that support the retail functions.
This project is envisaged to provide an EDW platform and the capabilities to
analyze marketing campaign effectiveness, enabling the Decision/Strategy
team to make effective business plans. TESCO uses a variety of applications
such as CMS, Net tracker, Campaign Management System, in-house delivery
management systems, SAP ERP, etc.
The requirement is to design and develop a staging ODS to consolidate data
from varied sources into a single business data mart/warehouse to support
analytics and business intelligence.
Role: ETL Developer
Responsibilities:
. Designed ETL jobs for extracting data from heterogeneous source systems
  and automated the job runs for the ETL process.
. Involved in designing ETL processes to extract data from Oracle and load
  it into target Oracle databases, applying business transformation logic
  for data cleansing and record insertion.
. Loaded historical data and performed daily extraction of data.
. Scheduled jobs using shell scripts and the Autosys utility (see the sketch
  after this list).
. Monitored the process on a daily basis and coordinated with onsite teams
  on resolving issues in the daily pass.
. Designed additional ETL processes as per requirement changes from the
  client.
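A minimal sketch of a daily-extract script of the kind an Autosys job
definition would invoke on schedule; the connect string, table and file
layout are hypothetical:

#!/bin/ksh
# Daily-extract sketch, intended to be invoked by an Autosys job. $ORA_CONN,
# the table and the file layout are assumptions; credentials would come
# from a secured source rather than being hard-coded.
EXTRACT=/data/ods/daily_sales.$(date +%Y%m%d).dat

# Spool yesterday's rows to a pipe-delimited flat file for the ODS load.
sqlplus -s "$ORA_CONN" <<EOF > /dev/null
WHENEVER SQLERROR EXIT 1
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON LINESIZE 4000
SPOOL $EXTRACT
SELECT store_id || '|' || sale_date || '|' || amount
  FROM sales
 WHERE sale_date >= TRUNC(SYSDATE) - 1
   AND sale_date <  TRUNC(SYSDATE);
SPOOL OFF
EXIT 0
EOF
RC=$?

# Fail loudly if the extract errored or produced no data.
if [ $RC -ne 0 ] || [ ! -s "$EXTRACT" ]; then
    print "Daily extract failed or produced no data (rc=$RC)" >&2
    exit 1
fi
exit 0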
Environment: Oracle, UNIX.
Education: Master of Computer Applications - 2004, University of Madras.
References: Will be provided upon request.