Anil Kumar
Phone: 319-***-****  Email: abht89@r.postjobfree.com
Summary of Experience:
. 6+ years of software experience in analysis, design, development, and
testing, including Data Warehousing experience using Informatica
PowerCenter/PowerMart.
. Experience in developing strategies for Extraction, Transformation and
Loading (ETL) mechanism using various ETL Tools.
. Experience in extraction of data from various sources such as Oracle,
Microsoft SQL Server 2000, Flat Files, COBOL, XML and DB2.
. Experience in Study, Design, Analysis, Development, and Implementation
of Data Warehousing concepts.
. Practical understanding of Data Modeling (Dimensional & Relational)
concepts like Star-Schema modeling, Snowflake modeling, and fact and
dimension table modeling at all three levels: view, logical & physical.
. Experience in various features of Erwin like Forward Engineering and
Reverse Engineering.
. Hands on experience in tuning mappings, identifying and resolving
performance bottlenecks in various levels like sources, targets,
mappings and sessions.
. Designed and developed efficient error-handling methods and implemented
them throughout the mappings.
. Strong experience in creating Stored Procedures using Oracle PL/SQL.
. Skilled in tuning SQL using techniques like TKPROF and Explain Plan.
. Skilled in Unit Testing, System Integration Testing, and UAT.
. Good experience in UNIX and writing shell scripts for Informatica pre- &
post-session operations and database administration activities.
. Worked extensively with Dimensional modeling, Data migration, ETL
Processes for Data Warehouses.
. Excellent Analytical, Communication and Problem Solving Skills.
Technical Profile:
ETL Tools Informatica PowerCenter 8.6.1/8.1/7.1
RDBMS Oracle 11g/10g/9i/8i, IBM DB2 UDB 8.1/7.0,
MS SQL Server 2000/7.0/6.5, MS Access 7.0/2000
BI Tools Business Objects 5.1/6.5, Microstrategy 7i
Modeling Tools Erwin 3.5.2/3.x, Microsoft Visio
Other Tools Toad for Oracle, AQT, Quest Central for DB2,
Hummingbird for FTP, PuTTY, MS Visual SourceSafe, Control-M
Report Tools Brio SQR writer 6, Crystal 8
Languages SQL, PL/SQL, UNIX Shell Scripting, C, C++
Operating Systems Windows 2000, XP, UNIX-AIX, IBM MVS/390
Education:
. Bachelor of Technology in Computer Science May'05 JNTU (Jawaharlal
Nehru Technological University)
Professional Experience
AEGON, Cedar Rapids, IA
Apr '08 - Present
Lead ETL Developer
Aegon is one of the world's largest providers of life insurance, pensions,
and long-term savings and investment products. The Investment Division's
goal is to build a new Data Warehouse (IDW) that meets the business needs,
part of which is migrating the data in the existing DW.
The new IDW supports the data from the existing Data Warehouse (PAM) for
the past 18 months (assuming a live date of June 2009, history loads need
to run from Dec 2007 - May 2009). Legal compliance is achieved by archiving
the monthly history financial snapshot (a separate data mart extracted from
the IDW, which is in 3NF).
The new warehouse provides data to various in-house applications such as
PAR (Portfolio Analytics & Reporting application) & Eagle.
Responsibilities:
. Gather requirements from users & create High-Level Design documents.
. Prepare the source to target mapping specification documents for various
steps in the warehouse load like source tables to XML, XML to Staging,
Staging to Data Warehouse & Warehouse to Data Mart Layer for the
Informatica Routines.
. Create & maintain Informatica definitions (Source & Target definitions,
reusable Lookups/Mapplets/Transformations).
. Analyzed the data model and implemented required changes for a better design.
. Designed & developed the Data Mart layer as a snowflake schema to support
reporting through various front-end tools (MS SQL Server 2005 cubes,
Business Objects (BO), MS Excel Services, and POD, a custom application).
. Perform Change Data Capture (CDC) of source data using the Informatica
hash-key function & control tables.
. Implemented a month-end correction/adjustment facility for various
financial data transactions.
. Responsible for the source control and code move procedures in various
environments, Initiating SWIM ID's.
. Implemented mappings to move data from the existing warehouse (PAM) to
the new warehouse (IDW) with all the additional functionality, ensuring
the financial reports/summaries tallied with the existing ones (one month
of data per week for history loads).
. Design, develop & test mappings, sessions & workflows.
. Applied caching optimization techniques in Aggregator, Lookup, and Joiner
transformations.
. Performance tuning of sources, targets, mappings and SQL queries in the
transformations.
. Responsible for scheduling the workflows for the nightly load.
. Used various debugging techniques to debug the mappings.
. Developed Informatica parameter files to filter the daily source data.
. Created Oracle PL/SQL Stored Procedure to implement complex business
logic for good performance and called from Informatica using Stored
Procedure transformation.
. Used various Oracle Index techniques like B*tree, bitmap index to
improve the query performance and created scripts to update the table
statistics for better explain plan.
. Created Materialized views for summary tables for better query
performance.
. Review & Unit Test/Functional test of the mappings developed by the team
members.
. Create & perform Unit Test, System Integration Test and UAT to check
data quality.
. Follow up on and track changes, retesting until issues are resolved.
. Keep a close track on the issues list from business users and make sure
the issues opened are valid ones.
. Update & monitor the team's code against the ETL standards (defined in
the ETL Developer Guide) by running queries on OPB repository tables
rather than individually checking session log file names, transformation
names, connection object names, and description fields.
. Developed shell scripts for data validation, load validation and to
archive files.
. Used command tasks to invoke remote shell scripts for archiving files &
collecting session statistics.
. Extensively used mapping parameters and parameter files to reuse mappings
& sessions across 14 source systems.
. Improved the performance of SQL, transformations and mappings by
identifying bottlenecks & using hints.
. Support the nightly cycle, fix any issues, and follow up to improve
performance & avoid failures.
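The file-archiving step invoked from a command task can be sketched as a
small post-session shell script. The directory layout and the *.dat file
pattern below are illustrative assumptions, not the actual project paths:

```shell
#!/bin/sh
# Sketch of a post-session archive step invoked from an Informatica command
# task. Directory names and the *.dat pattern are illustrative assumptions.

archive_files() {
    src_dir=$1                           # landing area for processed files
    arc_dir=$2/$(date +%Y%m%d)           # dated archive sub-directory
    mkdir -p "$arc_dir"
    for f in "$src_dir"/*.dat; do
        [ -e "$f" ] || continue          # nothing to archive
        mv "$f" "$arc_dir"/              # move out of the landing area
        gzip "$arc_dir/$(basename "$f")" # compress to save space
    done
}

# Self-contained demo on a throwaway directory
demo=$(mktemp -d)
mkdir -p "$demo/in" "$demo/archive"
echo "sample row" > "$demo/in/feed1.dat"
archive_files "$demo/in" "$demo/archive"
ls "$demo/archive/$(date +%Y%m%d)"
```

Keeping the archive directory dated makes reruns idempotent per day and
simplifies purging old history with a single find/delete pass.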
Environment: Informatica PowerCenter 8.6.1, Oracle 11g, Flat Files, XML,
Toad for Oracle v10.1, MS Visio, PuTTY, MS Visual SourceSafe, Windows 2000,
Unix Aix 6, CONTROL-M v6.3.01, Stylus Studio 2009
Principal Financial Groups, Des Moines, IA
Nov '07 - Mar '08
CRM PRA Conversion
Informatica Designer/Developer
The CRM PRA is designed to provide more effective marketing to the huge
customer base of Principal Financial Group, a leading global financial
company offering individuals, institutional clients and businesses a wide
range of financial products and services. These include retirement
solutions, life and health insurance, wellness programs, and investment and
banking products. The project goal is achieved by eliminating the existing
gaps in the effective capture and usage of participant data. This in turn
helps develop marketing touch points that lead to data-driven decisions
rather than mere assumptions. PRA (Personal Retirement Account) helps
capture, store and provide reporting and analytics on participant data.
Responsibilities:
. Gathering requirements from the marketing and customer service groups.
. Analyzed business information from various source systems to understand
the source data (data profiling), data transformation and data
aggregation.
. Designed process to capture needed data points, trigger and event
data, metrics, reporting and analysis data.
. Implemented the design using Informatica objects: Sources, Targets and
various transformations like Expression, Router, Aggregator and Filter.
. Developed processes to capture events occurring in the operational
source systems to feed the CRM PRA conversion stream.
. Designed and developed mappings to automate the process of tracking
the status of participants by feeding the captured events along with
the context data to the CRM PRA State Machine.
. Designed ETL processes to transform data from mainframe files and
other relational sources to DB2.
. Developed a number of Informatica Mappings, Mapplets and
Transformations to load data from relational and flat file sources
into the data mart.
. Worked with Copybooks, Cobol files, flat files and relational sources.
. Developed shell scripts for data validation, load validation and to
archive files.
. Used command task to call the remote shell scripts and decision task
when required.
. Designed and developed reusable transformations, mappings, sessions
and mapplets.
. Used mapping parameters and parameter files to reuse mappings.
. Improved the performance of SQL, transformations and mappings.
. Created and maintained high level design documents, functional
documents and technical design documents.
. Tested and implemented the mappings, scripts and sessions in phases
within the timelines.
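The load-validation scripts mentioned above can be sketched as a
control-count check. The control-file convention (a count file alongside
the extract) is an assumed layout, not the actual project format:

```shell
#!/bin/sh
# Sketch of a pre-load validation: the control file carries the record count
# the source system claims to have sent; the script compares it with the
# actual line count of the extract and fails the step on a mismatch.
# File names and layout are illustrative assumptions.

workdir=$(mktemp -d)
printf 'rec1\nrec2\nrec3\n' > "$workdir/extract.dat"   # sample extract
echo 3 > "$workdir/extract.ctl"                        # expected count

expected=$(cat "$workdir/extract.ctl")
actual=$(wc -l < "$workdir/extract.dat")

if [ "$actual" -eq "$expected" ]; then
    echo "VALIDATION OK: $actual records"
else
    echo "VALIDATION FAILED: expected $expected, got $actual" >&2
    exit 1                      # non-zero status stops the downstream load
fi
```

Exiting non-zero lets the scheduler or calling workflow treat a count
mismatch as a hard failure instead of silently loading a partial file.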
Environment: Informatica PowerCenter 8.6.1, UDB DB2, Flat Files, Cobol,
Quest Central v5.0, Extra Session, MS Visio, Hummingbird, IBM MVS/390,
Windows 2000, Unix Aix 5.1
Netalytics, Greenville, SC
Dec'06-Oct'07
Decision Support System
Sr. Informatica Developer
The project involved customizing Business Intelligence (BI) for one of
Netalytics' clients, WELCOME Health Care Systems. Responsible for ETL
processes in developing the data warehouse for material management and
inventory control, focused on achieving the optimum business solution for
supply chain management involving component materials, inventory
management, and dealer sales across the country.
Responsibilities:
. Primarily responsible for converting business requirements into the
system design.
. Implemented the system model in Informatica, making comprehensive use of
the tool's functionality.
. Used ETL to standardize data from various sources and load it into the
staging area, which was in Oracle.
. Worked on Informatica PowerCenter tools: Source Analyzer, Warehouse
Designer, Mapping and Mapplet Designer, Transformations, Repository
Manager and Server Manager.
. Extensively used ETL to load data from a wide range of sources such as
flat files, Oracle and XML into Oracle.
. Extensively used PL/SQL Procedures/Functions to build business rules.
. The project involved extracting data from various sources, transforming
the data from these files before loading the data into target
(warehouse) Oracle tables.
. Created the Informatica metadata repository using the Repository Manager
as a hub for interaction between the various tools. Security, user
management and repository backups were also done using the same tool.
. Informatica Designer tools were used to design the source definition,
target definition and transformations to build mappings.
. Designed and developed complex aggregate, join and lookup transformation
rules (business rules) to generate consolidated (Fact/Summary) data
identified by dimensions using Informatica PowerCenter.
. Created the mappings using transformations such as the Source
qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update
Strategy.
. Used Server Manager for creating and maintaining sessions, and also to
monitor, edit, schedule, copy, abort and delete them.
. Set up batches and sessions to schedule loads at the required frequency
using PowerCenter Workflow Manager, PMCMD and scheduling tools. Generated
completion messages and status reports using Workflow Manager.
. Extensively worked in the performance tuning of programs, ETL
procedures and processes.
Environment: Informatica PowerCenter 7.1.3, Business Objects 5.1, Oracle
9i, SQL Server 2000, PL/SQL, ERWIN, Toad, Windows 2000, UNIX.
IHOP, CA.
Jan'06 - Nov'06
IHOP Sales and Finance Data Warehouse
IHOP (International House of Pancakes) is a leading chain of restaurants
with more than 1000 outlets in USA and Canada. Data warehouse was developed
to analyze Sales and Financial data for both franchise as well as company
owned stores on a periodical basis.
Responsibilities:
. Involved in the Analysis of Physical Data Model for ETL mapping and
the process flow diagrams.
. Created mappings to extract data from data sources like Oracle 9i and
SQL Server.
. Used Informatica Repository Manager to grant permissions to users and to
create new users and repositories.
. Extensively used the Mapping parameters, variables & parameter file in
the ETL complex mappings.
. Worked extensively with complex mappings using expressions, aggregators,
filters, lookups and procedures to develop and feed the Data Warehouse.
. Coded and used SQL and PL/SQL for stored procedures, packages and
functions as required for the ETL process.
. Created Worklets, Tasks, Timers and reusable objects for scheduling the
Workflows.
. Involved in performance tuning of the complex Informatica mapping for
extracting & loading the data from transactional source tables and
resolving the production issues for Phase 1A.
. Created SQL*Loader scripts & used SQL*Loader to clean up files and for
data validation.
. Performed data accuracy & data integration tests and created the test
cases and test data scripts.
. Involved in SQL query performance tuning by altering session parameters,
using explain plans & using hints.
. Interacted with the business users to identify the business
requirements & to validate the load specifications.
. Wrote UNIX shell scripts & used crontab to schedule the workflows.
. Analyzed and explored Cognos for reporting purposes.
. Built an Analysis Services cube from a fact table with millions of rows
and numerous dimensions, moving data from an Oracle database to SQL
Server 2000 using Analysis Manager for SQL Server 2000.
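The crontab scheduling above might look like the sketch below. The
service, domain, folder and workflow names are hypothetical, and the pmcmd
call is only echoed (a dry run) since pmcmd exists only on a PowerCenter
host:

```shell
#!/bin/sh
# Sketch of cron-driven workflow scheduling. A crontab entry like the one
# below would fire this wrapper nightly at 1 AM:
#
#   0 1 * * * /opt/etl/bin/run_nightly.sh >> /var/log/etl/nightly.log 2>&1
#
# All names (service, domain, folder, workflow, paths) are hypothetical.

WF_FOLDER=SALES_FIN                  # hypothetical repository folder
WF_NAME=wf_daily_sales_load          # hypothetical workflow name

CMD="pmcmd startworkflow -sv INT_SVC -d Domain_ETL -u etl_user -p XXXX \
-f $WF_FOLDER $WF_NAME"

# Dry run: echo the command instead of executing it.
echo "$(date '+%Y-%m-%d %H:%M:%S') would launch: $CMD"
```

Redirecting the wrapper's output to a dated log, as in the crontab line
above, preserves a launch history for troubleshooting failed nightly runs.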
Environment: Informatica PowerCenter 7.1.3, TOAD, Erwin 3.5.2, Oracle 9i,
SQL Server 2000, Win NT, Sun Solaris 5.8, Micro Strategy 7i, Unix AIX
5.1/4.3.
HINDUSTAN MOTORS, India
Jan '05 - Dec '05
Data warehouse Developer
Workshop Management System was developed for Hindustan Motors, India, one
of the largest and leading manufacturers of Automobiles and Locomotives.
The prime objective of this Data Warehousing project was to enable
effective decision-making on the level of inventory to be maintained in the
workshop and minimizing the storage cost. Inventory data from MS SQL
Server, Vehicular information from MS Access and third party Vendor
information from Excel would be loaded to Oracle using Informatica as the
ETL tool.
Responsibilities:
. Analysis of source requirements and the existing OLTP system, and in
turn identification of the required dimensions and facts from the
database.
. Design & develop mappings to extract data from various input sources and
load it into the Oracle Data Warehouse.
. Loading data from the tables into the OLAP application and aggregating
it to higher levels for analysis.
. Creating a temporary repository for the already migrated database for
system analysis.
. Writing batch programs and database triggers in the staging area to
populate the warehouse.
. Develop data conversion, integration, loading and verification
specifications, design the Mapping Specification and create
multidimensional models.
. Setting up error logic to streamline and automate the data loads,
cleansing and trapping incorrect data on the staging servers before
loading it into the data warehouse.
. Creating sessions, managing batches, and performing performance tuning
and initial testing followed by volume testing; resolving technical
issues with software consultants and vendors.
. Documentation of existing mappings as per standards.
. Used Business Objects for reporting.
. Designed Universes in Business Objects.
Environment: Informatica 7.1, Oracle 8, Business Objects, Erwin 3.5.2, HP
UNIX, and Crystal Reports 5.0