Naveen Balusu
abmio5@r.postjobfree.com
. Over 8 years of IT experience, including 7+ years in Data
Warehouse/Data Mart design, system analysis, development and database
design.
. Thorough understanding of Software Development Life Cycle (SDLC)
including requirements analysis, system analysis, design, development,
documentation, testing, implementation and post-implementation review.
. Involved in translating Business Requirements into Technical
Specifications.
. Directly responsible for the Extraction, Transformation and Loading of
data from Multiple Sources to Data Warehouse.
. Extensive experience with ETL tool Informatica in designing the
Workflows, Worklets, Mappings, Configuring the Informatica Server and
scheduling the Workflows and sessions using Informatica PowerCenter
8.6/7.1.4/6.2/5.1 and Power Mart 5.1/5.0/4.7.
. Extensive experience in integration of various data sources like
Oracle, SQL Server, DB2, MS Excel, Flat Files, XML files and Mainframe
files into the staging area and then into the Warehouse and Marts.
. Extensive experience in Extracting, Cleaning, Transforming and Loading
the data using all the Transformations in the ETL tool Informatica.
. Vast experience in designing and developing complex mappings from
varied transformation logic like Unconnected and Connected lookups,
Source Qualifier, Router, Filter, Expression, Aggregator, Joiner,
Update Strategy etc.
. Very good knowledge of both Star Schema and Snowflake Schema.
. Experience in debugging and performance tuning of targets, sources,
mappings and sessions.
. Extensive experience in implementing slowly changing dimensions
(Type 1, Type 2, Type 3).
. Extensive experience in developing reusable transformations and
mapplets embedding the business logic.
. Extensive experience in Informatica performance tuning at various
levels.
. Extensive experience in creating and executing Unit and Integration
test plan.
. Experience in user acceptance testing (UAT) with business clients.
. Experience in working with the testing team to help them test
Informatica Mappings.
. Experience in working with the reporting team to help them create
Reports.
. Extensive experience in closely working with the DBA in creating
tables, migrating tables to different environments and obtaining
permissions on various tables.
. Experience in using scheduling tools to automate running of
Informatica Workflows
. Experience with Power Connect For Mainframe (Power Exchange).
. Experience working on-call and Production Support.
. Expertise in Relational Databases like Oracle, SQL Server, Teradata
and DB2.
. Adept at meeting project deadlines based on assigned workloads,
following process methodologies and adapting to the standards of the
client.
. Excellent analytical and problem-solving skills.
. Team player and self-starter with good communication skills and the
ability to work both independently and as part of a team.
Technical Skills:
Data Warehousing: Informatica PowerCenter 8.6/7.1.4/6.2/5.1, Power
Mart 5.1/4.7, Business Objects XI/6.0/5.0, SAS, Web
Intelligence, SQL*Loader, Erwin, Embarcadero ER/Studio,
Power Exchange (Power Connect for Mainframe), Informatica
Power Analyzer, Cognos, XML Spy, CA7, Maestro
Databases: Oracle 7.x/8.1/8i/9i/10g, MS SQL Server, Teradata, DB2 UDB EEE
7.2/8.1/8.2/9, DB2 MVS 7
OS: MS Windows NT/98/2000/XP, AIX (UNIX), Solaris, Linux, MVS
Project Experience:
Shell Trading/Accenture
Sep 2009 - Till date
Blueprint FOCUS
Houston, TX
Blueprint FOCUS is designed to improve business processes and IT systems
within Oil globally by updating aging IT systems, providing one end-to-end
trading system and standardizing global processes. Blueprint FOCUS will
simplify and integrate Shell interfaces. The Blueprint benefits are
organized by the business areas affected, which include: Deal Capture,
Operations, Settlement, Finance, Primary Distribution, Scheduling and
Risk.
. Gathered Business requirements and Prepared Logical design
documents
. Interacted with Business, Modeling and Test teams, and was involved
in the deployment process
. Created Informatica Mappings which involved Slowly Changing
Dimensions
. Involved in Fine-tuning SQL overrides, Look-up SQL overrides and
Partitioning for performance Enhancements
. Defined ETL Test conditions and prepared Test scripts
. Extensively worked on transformations like Lookup, Filter,
Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence
Generator, Rank, etc.
. Created Reusable Transformations and Mapplets and used them in
Mappings to develop the business logic for transforming source data
into target.
. Worked closely with the testing team for the testing of maps.
. Responsible for creating and effectively executing deployment plan
to move all the objects, tables, mappings, workflows, SQL scripts,
etc to AT & PT environments.
. Involved in improving performance of the Informatica Mappings by
identifying and eliminating the Performance Bottlenecks
. Used HPQC and IBM ClearQuest to create and manage defects
. Created Technical Design Document and Dependency Document
. Defined the dependencies and scheduled the jobs through Control-M
Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL Server 2000,
UNIX, Windows NT/2000, Toad, SVN (version control), Embarcadero,
Business Objects, Endur 9.
US Bank
Feb 2009 -Aug 2009
Denver, CO
Data Warehouse Consultant
U.S. Bank is the 6th largest commercial bank in the United States. The
company operates 2,850 banking offices and 5,173 ATMs, and provides a
comprehensive line of banking, brokerage, insurance, investment, mortgage,
trust and payment services products to consumers, businesses and
institutions. The objective of the Fraud Detection project is to provide a
single point of reference for data from the various fraud databases.
Distributed data residing in heterogeneous data sources is consolidated
into the target Oracle enterprise DW database. A complex ETL process
developed in Informatica PowerCenter extracts this data from many disparate
data sources, identifies erroneous data, and transforms and loads clean
data into an Oracle data warehouse.
. Involved in requirement gathering, analysis and study of existing
systems
. Worked on extracting data from source systems and transforming the
data according to business rules and loading the data into target
systems
. Also involved in moving the mappings from the Dev to the Test
repository, and from Test to Production after duly testing all mappings
. Created Reusable Transformations and Mapplets and used them in
Mappings to develop the business logic for transforming source data
into target.
. Worked on Pre-Production and Post-Production Issues.
. Responsible for creating new jobs and their dependencies in Production
. Extensively worked on transformations like Lookup, Filter, Expression,
Aggregator, Router, Source Qualifier, Sorter, Sequence Generator,
Rank, etc.
. Extensively used ETL tool Informatica to load data from Mainframe,
DB2, Oracle, Flat Files to the target database.
. Involved in synching up the Dev Environments with Production once the
release is completed
. Involved in improving performance of the Informatica Mappings by
identifying and eliminating the Performance Bottlenecks
. Responsible for creating and effectively executing deployment plan to
move all the objects, tables, mappings, workflows, SQL scripts, etc to
production.
. Responsible for monitoring all the sessions that are running,
scheduled, completed and failed.
. Unit tested the mappings before including the developed sessions into
the already existing batch.
. Actively involved in creating Test Plans based on the requirements
submitted by the Business analysts.
. Involved in Release coordination
. Extensively used Debugger for trouble shooting the mappings
. Extensively used Clear case tool for revision control of source code
. Design and Development of pre-session, post-session routines and batch
execution routines using Informatica Server to run Informatica
sessions.
. Design and development of ETL process using Informatica to load data
from file system into the target database
. Involved in synching up the Dev Environments with Production once the
release is completed
Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL Server 2000,
T-SQL, SQL, PL/SQL, UNIX Shell Script, Windows NT/2000, Erwin 4.0,
Clear Case, DB2, Toad, Embarcadero, Business Objects, Crystal Reports,
Microsoft Excel, Microsoft Access.
JPMorgan Chase
Apr 2006 -Jan 2009
Houston, TX
Data Warehouse Consultant
JPMorgan Chase is a leader in investment banking, financial services for
consumers, small business and commercial banking, financial transaction
processing, asset management and private equity. Worked on multiple
projects to develop Enterprise Data Warehouse and Data Marts to support
Business Intelligence. These projects support business decisions, allow
management to set directives against any discrepancies, and assist in
meeting goals set by the initiative. These data warehouse projects involve
extraction of data from multiple sources and transforming and loading data
into the target Oracle Enterprise Data Warehouse.
Responsibilities:
. Involved in requirement gathering, analysis and study of existing
systems
. Worked on extracting data from source systems and transforming the
data according to business rules and loading the data into target
systems.
. Extensively used ETL tool Informatica to load data from Mainframe,
DB2, Oracle, Flat Files to the target database.
. Worked on Informatica PowerCenter - Source Analyzer, Warehouse
designer, Mapping Designer, Mapplet Designer and Transformation
Developer.
. Extensively worked on extracting data from XML sources (XML, DTD and
XSD) and also creating XML targets.
. Extensively used Informatica Power Exchange, which is the Informatica
Power Connect for Mainframe to get Mainframe files from the source to
the staging area
. Developed stored procedures, functions, triggers and packages using
PL/SQL
. Extensively worked on transformations like Lookup, Filter, Expression,
Aggregator, Router, Source Qualifier, Sorter, Sequence Generator,
Rank, Union etc.
. Created Reusable Transformations and Mapplets and used them in
Mappings to develop the business logic for transforming source data
into target
. Involved in improving performance of the Informatica Mappings by
identifying and eliminating the Performance Bottlenecks
. Involved in level 2 and level 3 production issues
. Designed and developed transformation rules (business rules) to
generate consolidated (facts and dimensions) data using ETL tool
Informatica.
. Extensively used Change Data Capture (CDC) approach efficiently to
capture only the changes.
. Involved in Release coordination
. Extensively used Debugger for trouble shooting the mappings
. Experience in the Data Warehouse Lifecycle (Requirement Analysis,
Design, Development and Testing).
. Also involved in moving the mappings from the Dev to the Test
repository, and from Test to Production after duly testing all mappings
. Worked closely with the testing team for the testing of maps.
. Responsible for creating and effectively executing deployment plan to
move all the objects, tables, mappings, workflows, SQL scripts, etc to
production.
. Responsible for creating new jobs and their dependencies in
Production.
. Extensively used the scheduling tool to schedule jobs and their
dependencies in production.
. Worked on Pre-Production and Post-Production Issues.
. Extensively used ETL to load data from flat files (excel/access) to
Oracle Database.
. Created Test plans and Test Scenarios.
. Actively involved in creating Test Plans based on the requirements
submitted by the Business analysts.
. Involved in working with the Data Modeler in Data Modeling.
. Extensively used Business Objects for testing reports in the process
of preparing for UAT
. Involved in helping the reporting team with Business Objects reports
and data issues.
. Actively involved in training people new to Informatica
. Extensively used Clear case tool for revision control of source code
. Extensively used workflow manager for creating and scheduling various
sessions.
. Used Informatica Power center workflow manager to create sessions,
workflows and work-lets to run with the logic embedded in the
mappings.
. Experience in Scheduling workflows with Autosys
. Involved in synching up the Dev Environments with Production once the
release is completed
Environment: Informatica PowerCenter 8.5/7.1.4/6.2, Sybase, Oracle 10g/9i,
SQL Server 2000, T-SQL, SQL, PL/SQL, UNIX Shell Script, Autosys, Windows
NT/2000, Microsoft Excel, Microsoft Access, Erwin 4.0, Clear Case, DB2,
Toad, Embarcadero, Murex.
Cargill Meat Solutions
Jan 2005 - Mar 2006
Wichita, KS
Data Warehouse Consultant
Cargill, Incorporated is an international provider of food, agricultural
and risk management products and services, with 101,000 employees in 59
countries. Cargill Meat Solutions is comprised of Cargill Beef, Pork,
Taylor Packing, Case Ready and CVAM (Cargill Value Added Meat) groups. The
aim of CVAM project is to analyze Beef and Pork sales in different states
and different regions. The Data Warehouse is on Oracle Database and we used
Informatica for the process of ETL. The data from different data sources
were extracted to the data warehouse, which was used for reporting through
Business Objects tools.
Responsibilities:
. Involved in requirement gathering, analysis and study of existing
systems.
. Extensively worked on transformations like Lookup, Filter,
Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence
Generator, Rank, etc.
. Designed and developed new mappings using Connected, Unconnected
Look ups and Update strategy transformations.
. Developed joiner transformation for extracting data from multiple
sources.
. Created Reusable Transformations and Mapplets and used them in
Mappings to develop the business logic for transforming source data
into target.
. Extensively used workflow manager for creating and scheduling
various sessions.
. Actively involved in creating Test Plans based on the requirements
submitted by the Business analysts.
. Created and Monitored Batches and Sessions using Informatica
PowerCenter Server.
. Designed and developed various mapplets using maplet designer.
. Extensively used ETL to load data from flat files (excel/access) to
Oracle Database.
. Created Test plans and Test Scenarios.
. Unit tested the mappings before including the developed sessions
into the already existing batch.
. Created Informatica Mappings with PL/SQL procedures/functions to
build business rules to load data.
. Responsible for monitoring all the sessions that are running,
scheduled, completed and failed.
. Involved in improving performance of the Server Sessions by
identifying and eliminating the Performance Bottlenecks.
. Experience in the Data Warehouse Lifecycle (Requirement Analysis,
Design, Development and Testing).
. Also involved in moving the mappings from Test Repository to
Production after duly testing all transformations.
. Experience in Scheduling workflows with Autosys
. Created SAS datasets from database tables using SAS/ACCESS
. Wrote PL/SQL Packages and Stored procedures to implement business
rules and validations.
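The kind of row-level business rule the PL/SQL packages above enforce can be sketched in Python. This is a hypothetical example only (the field names and the rule thresholds are invented); the actual validations lived in Oracle PL/SQL and Informatica mappings:

```python
def validate_sale(row):
    """Return a list of rule violations for one sales record;
    an empty list means the row is clean and may be loaded."""
    errors = []
    # hypothetical set of valid state codes for the example
    if row.get("state") not in {"KS", "TX", "NE", "CO"}:
        errors.append("unknown state code")
    if row.get("pounds", 0) <= 0:
        errors.append("non-positive weight")
    if row.get("price", 0) < 0:
        errors.append("negative price")
    return errors

good = {"state": "KS", "pounds": 1200, "price": 0.89}
bad = {"state": "ZZ", "pounds": -5, "price": 0.89}
# good passes all rules; bad violates the state and weight rules
```

Rows that fail validation are typically routed to an error table for review rather than loaded into the warehouse.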
Environment: Informatica PowerCenter 7.1.1/6.1, DB2, Oracle 9i, Teradata,
SQL Server 2000, T-SQL, SQL, PL/SQL, UNIX Shell Script, Autosys,
SQL*Loader, Microsoft Excel, Microsoft Access, Windows NT/2000, Erwin 4.0,
Business Objects, SAS 8.2, DB2 Visualizer.
Hutchinson Technology
Nov 2002-Dec 2004
Hutchinson, MN
Data Warehouse Consultant
Hutchinson Technology is an acknowledged world leader in precision
manufacturing. Hutchinson Technology designs and manufactures suspension
assemblies for hard disk drives. The objective of the Offline Lot project
is to integrate and report product quality measurements collected from two
distinctly different sources - manufacturing machines on the shop floor and
off-line testing machines in the Quality Assurance area. A complex ETL
process developed in Informatica PowerCenter extracts this data from many
disparate data sources (Oracle databases, DB2 and flat files), identifies
erroneous data, and transforms and loads clean data into an Oracle 9i data
warehouse that is optimized for query and reporting.
Responsibilities:
. Involved in requirement gathering, analysis and study of existing
systems
. Analysis of Source, Requirement, existing OLTP system and
Identification of required dimensions and facts from the Database.
. Responsible for monitoring and troubleshooting daily processes that
includes scripts execution, data loads, and archiving
. Design and development of ETL process using Informatica to load data
from file system into the target Oracle database
. Created shared folders, local and global shortcuts.
. Involved in defining the new dimensions, fact tables and collection
process.
. Extensively worked on transformations like Lookup, Filter, Expression,
Aggregator, Router, Source Qualifier, Sorter, Sequence Generator,
Rank, etc.
. Created Reusable Transformations and Mapplets and used them in
Mappings to develop the business logic for transforming source data
into target
. Design and Development of pre-session, post-session routines and batch
execution routines using Informatica Server to run Informatica
sessions.
. Used Informatica Server Manager to create, schedule, monitor and send
the messages in case of process failures.
. Actively involved in creating Test Plans based on the requirements
submitted by the Business analysts
. Also involved in moving the mappings from Test Repository to
Production after duly testing all transformations
. Extensively used ETL to load data from COBOL Flat Files, Oracle
database, DB2 database.
. Developed and maintained Reports using Power Analyzer 4.1.1.
. Used Informatica Power center workflow manager to create sessions,
workflows and work-lets to run with the logic embedded in the
mappings.
. Used Workflow Manager (Server Manager) for Creating, Validating,
Testing and running the sequential and concurrent Batches and Sessions
and scheduling them to run at specified time.
Environment: Informatica PowerCenter 6.2, Power Analyzer 4.1.1, DB2,
Oracle 9i, SQL Server 2000, T-SQL, SQL, PL/SQL, UNIX Shell Script,
SQL*Loader, Windows NT/2000, Erwin 4.0, Microsoft Excel, Microsoft Access.
Madras Refineries Limited, Chennai, India
Oct 2001- Sep 2002
System Analyst
The main task accomplished by the current delivered system is automation of
the regular activities. A modular approach has been applied in the process
of automation. The entire process is divided into different modules namely
Price listing module, Customer and Location module, Offer and Purchase
Orders module, Bill and Payments module, and GUI Reports module. Generated
reports on Accounting and Billing. The project mainly consists of Company
Master, Asset \ Liability \ Income \ Expenditure, Accounts, Client Master,
Receipts, Payments, Journals, Bill Entry, Client Receipts and Transaction
maintenance details.
Responsibilities:
. Studied the Requirement Specifications
. Performed Analysis, and Design of database
. Involved in SQL Tuning by creation of Indexes, Rebuilding Indexes, and
Clusters etc.
. Developed oracle reports for Location Info, Price Detail, Customer
Info, and Purchase Details
. Created database objects like tables, views, procedures and packages
using Oracle tools like SQL*Loader, SQL*Plus and PL/SQL
. Developed triggers, stored procedures and functions
. Developed various reports like Bills, Payment Voucher, Invoices, P & L
Account, Balance Sheet
. Functional Experience gained in Finance and Accounting
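The SQL tuning by index creation noted above can be demonstrated with a small, self-contained sketch. SQLite stands in here for Oracle purely for illustration, and the `bills` table and `client` column are hypothetical; the point is the same on any RDBMS: once a suitable index exists, the optimizer switches the lookup from a full table scan to an index search:

```python
import sqlite3

# Build a small in-memory table and compare query plans
# before and after adding an index on the filtered column.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bills (bill_id INTEGER, client TEXT, amount REAL)")
con.executemany("INSERT INTO bills VALUES (?, ?, ?)",
                [(i, f"client{i % 100}", i * 1.5) for i in range(1000)])

plan_before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM bills WHERE client = 'client7'"
).fetchall()

con.execute("CREATE INDEX idx_bills_client ON bills(client)")
plan_after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM bills WHERE client = 'client7'"
).fetchall()

# plan_before reports a table SCAN; plan_after uses idx_bills_client
```

On Oracle the equivalent check would use `EXPLAIN PLAN` and the cost-based optimizer, with index rebuilds and clustering applied where statistics showed hot access paths.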
Environment: Oracle 7.3 on Windows NT with Developer/2000.
Education: Bachelors in Engineering from Andhra University, India