Ramesh Gundubogula
***********@*****.***
Sr. ETL & Sr. Teradata Consultant
IBM Certified DataStage & NCR Teradata Certified Master
CAREER OBJECTIVE
Obtain a challenging and rewarding position in data warehousing with an
organization that is both progressive and offers opportunities for
professional career growth, nurturing my technical skills and
competencies, and to serve the organization to the best of my abilities
and help meet its ever-growing business targets.
Major Strengths & Summary:
Over 11 years of experience in Information Technology, with 8+ years of
experience in the analysis, design, development, and implementation of data
warehousing applications using Oracle 8i/9i, Teradata, SQL Server, and DB2
UDB database applications on MS Windows, Unix, and Linux platforms with
Informatica PowerCenter v8.6 and IBM WebSphere Information Server
(DataStage) v8.x ETL tools.
> Analyzing and understanding the data model, talking to business users
  to gather insight into user requirements and the types of business
  queries to be performed, and discussing with DBAs how to keep the
  data warehouse optimized.
> Involved in the LDM design phase and delivering the Detailed Design
  Document; creating design documents for building jobs according to the
  physical (PDM) and logical (LDM) structures of the data models.
> Extensive experience designing relational (logical, physical), star
  schema, snowflake schema, and dimensional data models for data
  warehouses and business intelligence environments.
> Extensive performance tuning of DataStage jobs and Teradata utilities
  across entire job sequences to keep all job runs within the runtime
  window.
> Experienced in the design and development of data warehouses and data
  marts using Informatica PowerCenter v8.6 and IBM WebSphere
  Information Server (DataStage) v8.1 ETL tools and Teradata SQL
  scripts, with performance-tuning techniques for ETL processes.
> Writing technical specifications (source-target mappings) and
designing/preparing detailed design documents for ETL mappings
> Designing the process methodologies to be followed for overall job
  design; optimizing DataStage EE parallel jobs and Informatica
  workflows to minimize database access and provide the maximum window
  for business users on the production box.
> Proficient knowledge of the Teradata database architecture, versions
  V2R5 through V13.
> Expert in performance tuning of Teradata queries and views and in the
  judicious use of PPI, AJI, and JI, as well as statistics collection.
> Involved in database administration and performance monitoring, using
  PMON for analysis and NetVault for backup and data security.
> Extensive knowledge and expertise in Teradata utilities such as BTEQ,
  MLOAD, FastExport, and FastLoad in Teradata V2R5 through V13.
> Assigning various subject areas within the team, monitoring the
  progress of work, making sure the various work products integrate,
  and resolving issues as needed.
> Provided and followed data governance, adhering to standards, design,
  and best practices set by Telstra, Walgreens, State Farm Insurance,
  and McDonald's.
> Extensively created ETL jobs, workflows, data flows, R/3 flows, ABAP
  extraction, functions, transformations, and scripts.
> Led various teams in major projects to design a Data Mart and an
  Atomic Data Store, respectively; involved extensively in both design
  and development with Teradata utilities.
> Experience with DataStage routines for special jobs with specific
  purposes.
> Experience in Performance Tuning of Data Warehouses including the
creation of materialized views, bitmapped indexes and partitions.
> Used the Teradata Connector stage, i.e. the Teradata Parallel
  Transporter (TPT / Teradata PT) stage, to load Teradata tables;
  familiar with Teradata parallel transport concepts.
> Maintained and coordinated onsite/offshore development activities,
  combining coordination with development and leading at onsite.
> Created JCL scripts in mainframe environments to submit jobs on a
  channel-attached client to the Teradata server.
> Worked on Slowly Changing Dimensions (SCDs) and their implementation
  with Type 1 and Type 2.
> Programming knowledge and experience in using SQL, PL/SQL, Unix Shell,
and Perl
> Implemented the full lifecycle of data warehouses and business data
  marts with star and snowflake schemas.
> Designing business views and using various utilities, per specific
  business requirements, to load data into Teradata, Oracle 10g, DB2,
  and SQL Server database tables.
> Experience in Telecommunication, Banking, Finance, Insurance, and
Retail domains
> Excellent Analytical Knowledge and Good Communication Skills.
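The Teradata utility work summarized above can be illustrated with a minimal shell sketch that generates a BTEQ load script from a template. The TDPID, credentials, table, and file names below are placeholder values, and the actual bteq submission is left commented out since it requires the Teradata client:

```shell
#!/bin/sh
# Hypothetical sketch: generate a BTEQ script that loads a staging table
# from a delimited file. All names and credentials are placeholders.
TDP=prodtd                 # placeholder TDPID
TABLE=stg.daily_sales      # placeholder staging table
SRC=/data/in/daily_sales.csv

cat > load_daily_sales.bteq <<EOF
.LOGON ${TDP}/etl_user,etl_pw;
DELETE FROM ${TABLE};
.IMPORT VARTEXT ',' FILE=${SRC};
.REPEAT *
USING (store_id VARCHAR(10), sale_dt VARCHAR(10), amount VARCHAR(12))
INSERT INTO ${TABLE} VALUES (:store_id, :sale_dt, :amount);
.LOGOFF;
.QUIT;
EOF

# bteq < load_daily_sales.bteq > load_daily_sales.log 2>&1  # needs Teradata client
grep -c "INSERT INTO" load_daily_sales.bteq   # quick sanity check on the script
```

In a real run, the commented bteq line would submit the generated script and the log would be checked for a zero return code before downstream jobs start.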
Technical Skills:
Languages             C, SQL, PL/SQL, KShell, Perl, and JCL
Operating Systems     MS Windows 95/98/2000/XP/NT, Unix, and Linux
Databases             Oracle 8i/9i/10g, Teradata V2R6/V12/V13, DB2,
                      MS Access, and MS SQL Server
Testing Tools         Test Director and ClearQuest
Automation Tools      QTP, WinRunner, and LoadRunner
Reporting Tools       Crystal Reports, Business Objects, MicroStrategy
                      7.2.3, Brio 6.2 (Hyperion), Cognos BI 8, PL/SQL,
                      and SQL
Data Warehousing      IBM WebSphere DataStage 7.5.2/8.x, QualityStage
                      8.x, and Informatica PowerCenter 5.1/6.1/8.1/8.6
                      ETL (Extract, Transform, and Load)
Version Control Tools MS Visual SourceSafe, ClearCase, and Lotus Notes
MS Office Tools       MS Word, MS Excel, MS PowerPoint, MS Access, and
                      MS SharePoint
Case/SCM Tools        MS Visio 2007
Hardware/Server       IBM Mainframe and Unix
Certifications:
> Teradata Certified Master V2R5, NCR
> Advanced Teradata Certified Professional, NCR
> Teradata V2R5 Basics Certified Professional, NCR
> Teradata Certified Implementation Specialist, NCR
> Teradata Certified SQL Specialist, NCR
> Teradata Certified Application Developer in V2R5, NCR
> Teradata Certified Administrator in V2R5, NCR
> Certified in IBM WebSphere IIS DataStage Enterprise Edition v7.5, IBM
Project Experience:
McDonald's Corporation, Oakbrook, IL
May 2010 to Present
True Consulting LLC
As Sr. ETL Consultant,
McDonald's restaurants are found in 119 countries and territories around
the world and serve 58 million customers each day. McDonald's operates over
31,000 restaurants worldwide, employs more than 1.5 million people, and is
one of the largest restaurant chains in the world.
Market Analytics System (BI)
McDonald's daily POS data will be processed for each store worldwide and
sent back to McDonald's for use by the Global Data Warehouse (Polaris).
Only cleansed and validated POS data will be sent to McDonald's Global Data
Warehouse. Providing cleansed and validated data for Market Analytics, the
Market Analytics system will process the daily POS data accumulated on a
daily basis. The daily processing will aggregate the weekly POS data and
then apply baseline calculations along with other ACNielsen fact and
subtotal calculations. The enhanced sales data will then be sent to
McDonald's Global Data Warehouse on a country-by-country basis. This
system is intended to handle POS data from all of McDonald's stores.
Several operational reports will be developed for monitoring the Market
Analytics system, including exception reports as well as Service Level
Agreement (SLA) reports.
> Developing sequences and Jobs to load data to target tables in
Teradata for Global Data warehouse using Data Stage ETL tool.
> Developed Unix shell scripts to unload files and FTP them to their
  target directories.
> Experience in Preparing the detailed design, ETL field mapping,
strategy, naming convention, test case, and status report documents
> Extensive work and tuning expertise in Teradata utilities such as
  BTEQ, MLOAD, FastExport, and FastLoad in Teradata V13.
> Expertise in setting up and designing processes to extract, load, and
  transform data using various utilities and processes.
> Extensively used stages such as Transformer, Sorter, TPT, Copy,
  RemDup, CDC, Aggregator, Lookup, Join, and Merge, etc.
> Design and development of DataStage jobs to extract, load, and
  transform (ELT) data from operational .csv source files to tables in
  the Teradata database.
> Wrote functions and procedures in Teradata SQL and used utilities to
  read and write data to and from target tables.
> Extensively tuned existing jobs to bring run times under control and
  achieve the highest performance.
> Used the Teradata Connector stage, i.e. the Teradata Parallel
  Transporter (TPT / Teradata PT) stage, to load Teradata tables;
  familiar with Teradata parallel transport concepts.
> Involved in database tuning, mainly studying a few tables and then
  deciding on changes to primary indexes; in addition, involved in
  identifying the columns on which to collect stats and automating the
  same.
> Responsible for writing BTEQ scripts for tactical queries, including
  joins, grouping sets, ANSI MERGE, parallel and sequential data
  loading, business logic, etc.
> Implemented a generic Unix shell script, called AbaC (Audit, Balance,
  and Control), to automate the job-logging process, capturing job
  execution times and loading them into maintenance tables.
> Supported the QA team during the QA phase: explaining the business
  logic, fixing identified issues, and addressing environmental issues.
> Involved in meetings with BAs, analysts, and users to gather business
  requirements and solution challenges.
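The AbaC job-logging process above can be sketched as a generic shell wrapper. The function, file, and job names here are hypothetical, and in the real system the timing record was loaded into Teradata maintenance tables rather than a flat file:

```shell
#!/bin/sh
# Hypothetical sketch of an Audit-Balance-and-Control (AbaC) wrapper:
# record start/end timestamps and exit status for any ETL job command.
LOG=abac_runlog.csv

abac_run() {
    job_name="$1"; shift
    start=$(date '+%Y-%m-%d %H:%M:%S')
    "$@"                              # run the actual job command
    status=$?
    end=$(date '+%Y-%m-%d %H:%M:%S')
    # In production this row would be loaded into a Teradata control table.
    echo "${job_name},${start},${end},${status}" >> "$LOG"
    return $status
}

# Sample invocation with a stand-in for a DataStage job command:
abac_run sample_job echo "job body runs here"
```

The wrapper preserves the wrapped command's exit status, so schedulers still see job failures while every run leaves an audit row behind.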
Environment: SQL, shell script, DB2, Teradata V13, IBM Information Server
DataStage v8.x, Unix KShell, AIX 5.2, Perl script, IBM ClearCase,
MicroStrategy, FileZilla, MS Access, PuTTY, and MS Windows XP
State Farm Insurance Company, Bloomington, IL
February 2009 to Present
HTC Global Services Ltd
I) Advance Premium R II
As Sr. ETL Lead/Designer,
The Advance Premium project is to design, build, and implement the new
Transactional Accounting System (TAS), which will handle financial data for
Auto, Fire, and Health; replace existing processing that occurs in TDS,
using existing rules, to support reporting for advance premium, stat,
ledger, earned/unearned premium, premium for agents' commission, suspense,
and tax; and realign existing pended financial data from TDS to TAS.
II) Decision Support System (DSS)
As Sr. ETL Lead/Designer,
The Decision Support Services (DSS) provides a single, consistent source of
enterprise data and tools for analytical and reporting needs. DSS delivers
a consistent, unified approach to gathering and managing data, along with
easier and more efficient Business Intelligence and decision support
services.
Responsibilities:
> Involved in analyzing and understanding the data model, talking to
  business users to gather insight into user requirements and the types
  of business queries to be performed, and discussing with DBAs how to
  keep the data warehouse optimized.
> Involved in LDM design phase and delivering the Detailed Design
Document. Creating design documents for building jobs according to the
physical (PDM) and logical structures of data models (LDM).
> Developing sequences and jobs to load data to target tables in the
  warehouse using the DataStage ETL tool.
> Developing and using shared containers with DataStage QualityStage
  and DataStage Designer.
> Developed JCL scripts on the mainframe to unload files and FTP them
  to Unix directories.
> Experience in Preparing the detailed design, ETL field mapping,
strategy, naming convention, test case, and status report documents
> Extensive work and expertise in Teradata utilities such as BTEQ,
  MLOAD, FastExport, and FastLoad in Teradata V6.1.
> Extensively used stages such as Transformer, CDC, Aggregator, Lookup,
  Join, and Merge, etc.
> Query optimization for business queries as well as queries used in
  ETL, using Aggregate Join Indexes, PPI, JI, and stats collection on
  join keys.
> Developing ETL design templates for data extraction and loading on
Teradata using utilities
> Hands on Designing, scheduling, and running jobs using DataStage
Director
> Extensive usage of Teradata SQL and utilities such as BTEQ, MLOAD,
  and FLOAD to load target tables.
> Involved in unit testing, system testing, and UAT, running the jobs
  and testing performance and functionality.
Environment: ANSI SQL, shell script, DB2, Teradata V2R6/V12, Rumba
Mainframe Host, IBM Information Server DataStage v8.x, Unix KShell, and MS
Windows 2000/NT
Walgreens, Chicago, IL
February 2008 to January 2009
Satyam Computer Services Ltd
EDW Photo Data
As Sr. ETL Lead Consultant,
The objective of the EDW Photo Data project is to extend the data available
from the enterprise data warehouse to include photo data to facilitate data
analytics and reporting for: loss prevention, customer insight, market
basket analysis, product affinity, store traffic patterns, and others.
The project needed a detailed understanding of the PictureCare Plus source
data in order to decide the strategy for loading the data into the EDW.
There were four main subject areas: Customer, Orders, Promotions, and
Order Items. Order data from PC+ also had to be matched with data from
Point of Sale (POS) so that the cost obtained from both systems could be
compared to find irregularities in stores, which would help greatly with
loss prevention. It would also help analyze customer purchase patterns and
store traffic patterns.
Responsibilities:
> Analyzing the existing system and creating data flows according to the
Walgreens standards and specifications.
> Extracted data from various sources such as Oracle, SQL Server, flat
  files, and SAP R/3 through DataStage Information Server.
> Design and development of DataStage EE jobs to extract, transform,
  and load data from the operational Oracle source system to Teradata
  targets.
> Wrote functions and procedures in Teradata SQL and used many
  utilities to read and write data to and from target tables.
> Designing, scheduling, and running jobs using DataStage with
  Control-M.
> Using various stages: Transformer, CDC, Aggregator, Lookup, Join,
  Merge, Copy, etc.
Environment: ANSI SQL, shell script, Oracle 9i, Teradata V2R6, WinSCP,
ExtraPuTTY, DataStage (Parallel Extender) 8.1, Ascential QualityStage
7.1, and MS Windows 2000/NT
Telstra, Melbourne, Australia
Aug 2005 to January 2008
Satyam Computer Services Ltd
Telstra is Australia's largest telecommunications provider, serving the
consumer, business, government, and wholesale markets.
Providing PSTN (Public Switched Telephone Network), mobile, and data
services, it is the leading fully integrated telecommunications company in
Australia, and one of only 17 worldwide. It offers services and competes in
all telecommunications markets throughout Australia, providing more than
9.0 million Australian fixed line and 10.2 million mobile services,
including 6.3 million 3G services.
Engagements:
As Technical Lead/designer,
I. EDW Transformation (TR1)
The enterprise data warehouse layer provides a shared, cross-functional
store of subject-oriented data, based on an application-neutral relational
data model, designed to support historical trend analysis and
decision-support reporting. Since Telstra is undergoing a major
transformation of its internal systems, the existing data warehouse is
also being streamlined around the new source systems. The downstream
applications that previously received feeds from TDW need to receive
remediated feeds from the new data warehouse, EDW.
II. Siebel Extracts (Siebel CRM)
The Siebel Extracts project provides the necessary data feeds from the
Telstra data warehouse to the Siebel Marketing and Analytics system.
Three upstream systems were identified to provide the data unavailable in
TDW. The data from these systems has been pushed into the EDS layer of
the Telstra data warehouse, and the feeds to Siebel have been derived
from the existing and new EDS and data mart tables.
III. BTS SIO - Phase 2 project.
The Basic Telephone Service reporting has been a key regulatory requirement
for Telstra. Previously, the reporting was complicated and required a large
amount of manual intervention to pull data together. This initiative is to
improve the existing process and produce accurate results, in line with
the drive to improve the quality, usability, and performance of the data
warehouse. This project was implemented using Ascential DataStage 7.5.1A
(Parallel Extender).
Responsibilities:
> Participating in meetings with the business team and analysts to
  gather requirements and finalizing the Erwin logical (LDM) and
  physical (PDM) models per the subject areas.
> Table creation, index maintenance, stats collection, access control,
  handling database change requests, and other DBA-related activities.
> Extracting data from the Oracle database and flat files to load target
tables in Teradata
> Writing BTEQ SQL scripts and FastLoad, FastExport, and MultiLoad jobs
  as part of Teradata work.
> Performance tuning of Teradata BTEQ scripts; maintaining data
  currency for the jobs.
> Delivering the complete roadmap for the project and designing the Data
Integration project using the IBM platform of Data Stage
> Developing ETL design templates for data extraction and loading on
Teradata using the ETL tool Data Stage
> Developing jobs in Parallel Extender using various stages such as
Transformer, Aggregator, Lookup, Joins, and Merge and Designing,
scheduling, and running jobs using Data Stage Director
> Designing/writing technical specifications (source-target mappings)
  for the ETL mappings along with unit test scripts; worked on Teradata
  developing business and aggregated views.
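The stats-collection and index-maintenance work above can be illustrated with a small shell generator that writes a batch of COLLECT STATISTICS statements for a list of candidate columns. The database, table, and column names are placeholders, and the resulting file would then be submitted via BTEQ:

```shell
#!/bin/sh
# Hypothetical sketch: generate a COLLECT STATISTICS script for a list of
# database.table.column entries, so the statements can be submitted via
# BTEQ in one batch. All names below are placeholders.
OUT=collect_stats.sql
: > "$OUT"                           # truncate/create the output script

for tc in edw.customer.cust_id edw.orders.order_dt edw.orders.store_id
do
    db_tbl=${tc%.*}                  # e.g. edw.customer
    col=${tc##*.}                    # e.g. cust_id
    echo "COLLECT STATISTICS ON ${db_tbl} COLUMN (${col});" >> "$OUT"
done

cat "$OUT"
```

Keeping the candidate list in one place makes it easy to rerun the same batch after large loads, which is the usual reason to automate stats collection.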
Environment: Teradata SQL V2R5, Perl, Teradata V2R5, Unix MP-RAS, Compare
It!, Turbo Data, UltraEdit, DataStage (Parallel Extender) 7.5.2/v8.x,
Cognos 8, and MS Windows 2000/NT
IBM Global Services July 2003 to August 2005
Vodafone, Melbourne, Australia
Finance Data Mart
As BI Consultant, on the Vodafone product development system, which adapts
a proven benchmark of speed and efficiency. This project aims to create a
Business Intelligence reporting platform for business users of Vodafone
Telecommunications in Australia. The scope of the project also includes
providing production support for the existing MicroStrategy application
and ETL scripts and delivering ad hoc reports for business users on time.
Responsibilities:
> Production support; designing reports using the MicroStrategy OLAP
  tool and Hyperion.
> Creating reports using various filters, metrics, and consolidations
  in MicroStrategy.
> Providing production support: monitoring, analyzing, and reporting
  application issues, i.e., investigating problems and resolving them
  in-house using the third-party tool Informatica.
> Providing initial problem determination, error recovery, and
notification whenever necessary
> Tuned and created MicroStrategy reports as part of development, with
  custom groups and consolidations.
> On-call 24/7 production support: monitoring ETL jobs and MicroStrategy
  reports; responsible for scheduled loads and servers.
Environment: SQL, Teradata V2R5, Brio (Hyperion), MicroStrategy, Unix
MP-RAS, and MS Windows 2000/NT
Florida Power & Light, Florida, US
As ETL Developer, on the redesign of the FPLDSS project.
Responsibilities:
> Interacting with end users to collect functional requirements for the
application
> Participating in the business functionality review meetings
> Developing mappings to extract data from source systems to the staging
area and to load data to the data warehouse
> Identifying and tracking the slowly changing dimension tables and
heterogeneous sources and determining the hierarchies in dimensions
Environment: SQL, PL/SQL, Oracle 9i, Informatica 6.1, Cognos Report Net,
Impromptu, and MS Windows 2000/NT
NS IT Consulting
April 2000 to May 2003
Delphi Auto Systems, Lisbon, Portugal
Auto Business Analysis
As ETL Developer, I was involved in the Auto Business Analysis project.
The client needed a data warehouse to maintain historical data in a
central location for integration, analyze business across various
locations, and make strategic decisions.
Environment: SQL, Oracle, Informatica 5.1, Business Objects 5.1, and MS
Windows XP/2000
Data Migration Information System
As Developer, I was involved in the Data Migration system. The project
involved creating a data mart by migrating the company's policy and claims
data from a mainframe to a client/server environment. This solution would
provide the company's stakeholders faster access to a more comprehensive
and consistent view of the data to better assist customers with their
insurance needs.
Environment: SQL, Oracle 7.3, Informatica 5.1, Cognos Impromptu, UNIX, and
MS Windows 98/2000/NT
Pay On Account System & Data Transformations
As Developer: analysis, design, development, customization, and
implementation of the Pay On Account system, which was developed to
maintain vendor details and the details required throughout the payment
cycle, prepare invoices and payment documents, and generate reports.
Environment: SQL, PL/SQL, Oracle 7.3, Reports 2.5, SQL*Plus, and MS Windows
95
Transport Corporation of India, Mumbai, India
August 1998 to March 2000
As Developer/Data Analyst
Environment: SQL, PL/SQL, Oracle 7.1, reports, MS Visual Basic, and MS
Windows 95
Education:
> Master of Business Administration in Systems, AIMS University,
Visakhapatnam, India, 1997