Ravindra N
Email: ********@*****.***
Mobile: +1-954-***-****
Experience Summary
. Eight years of professional IT experience.
. Solid experience in the data warehousing field, specifically
Informatica 8.x/9.1.0, SQL Server 2000/2005, SSAS, SSRS and Oracle 9i/10g.
. Experience in Informatica installation/configuration and
administration.
. Expertise in using Informatica client tools - Designer, Repository
Manager, Repository Server Administration Console, Workflow
Manager and Workflow Monitor
. Expertise in testing, debugging, validation and tuning of mappings,
sessions and workflows in Informatica
. Solid experience in analyzing and fixing Production issues in the
Informatica Applications
. Excellent data analysis and data mining skills.
. In-depth understanding of data warehousing concepts such
as Kimball's methodology, OLAP, SCDs (Type 1, Type 2 and
Type 3), star schema and snowflake schema
. Strong programming and debugging skills in PL/SQL, SQL and Unix
shell scripting
. Extensive experience in entire SDLC
. Extensive production support experience, including day-to-day
monitoring of Informatica jobs
. Worked as an onsite coordinator/team lead; good experience working
in the onsite-offshore model.
. Working knowledge of OBIEE, Hyperion Essbase and SSIS.
. Worked on POCs using OBIEE and Essbase.
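As an illustration of the SCD handling listed above, a Type 2 change is typically applied as an expire-then-insert pair. The sketch below is a minimal SQL example against a hypothetical dim_customer table; the table, columns, sequence and bind variables are illustrative, not taken from any of the projects below.

```sql
-- Hypothetical Type 2 dimension: expire the current row when a tracked
-- attribute changes, then insert the new version with an open end date.
UPDATE dim_customer
   SET eff_end_date = SYSDATE,
       current_flag = 'N'
 WHERE customer_id  = :src_customer_id
   AND current_flag = 'Y'
   AND address     <> :src_address;

INSERT INTO dim_customer
       (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
VALUES (dim_customer_seq.NEXTVAL, :src_customer_id, :src_address, SYSDATE, NULL, 'Y');
```

By contrast, Type 1 simply overwrites the attribute in place, and Type 3 keeps the prior value in a dedicated "previous value" column on the same row.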
TECHNICAL SKILLS
Data Warehousing ETL Tools    Informatica PowerCenter 8.1, 8.6, 9.1 and
                              9.5, DTS and SSIS
Databases                     Oracle 8i/9i/10g/11g, SQL Server 2000/2005
Business Intelligence Tools   SSAS, COGNOS, Hyperion Essbase and OBIEE
Reporting Tools               SSRS, COGNOS
Languages                     PL/SQL, SQL, Unix shell scripting
Operating Systems             UNIX, Linux, Microsoft Windows
                              2003/2000/NT/98/95
Defect Tracking Tool          Qubase
Change Management Tool        Remedy
Version Control Tool          MS Visual SourceSafe
Job Scheduling Tools          UC4, cron, Windows Scheduler
Others                        MS Word, Excel, Visio 2007
CERTIFICATION:
Certified Informatica Developer and Microsoft Certified Professional
(Database Developer)
CHRONOLOGY OF EXPERIENCE
ADT Corporation-Boca Raton, FL. June 2013 to Present
Role: Informatica Developer
ADT provides residential and small business electronic security, fire
protection and other related alarm monitoring services in North America.
The Athena data warehouse is a new enterprise data warehouse at ADT. Data
is sourced from multiple source systems and loaded into a staging area,
then into the integration layer and finally into the data mart layer.
OBIEE reports are built on the mart data.
. Prepared technical specification documents.
. Worked on solution development.
. Developed Informatica mappings and workflows.
. Worked on performance tuning of long-running mappings.
. Prepared unit test cases.
. Worked on deployment documents and deployment groups.
Environment:
Informatica 9.1.0, Oracle 11g, AIX 6.1, OBIEE
Bluegreen Vacations-Boca Raton, FL. Jan 2013 to June 2013
Role: Sr Informatica Developer
The Bluegreen BI platform supports the BI needs of different business
units at Bluegreen. Developed Informatica mappings, with SQL Server, XML
and flat files as sources, that load data into dimension and fact tables.
. Gathered business requirements and prepared technical specification
documents.
. Extensively worked on Process Design, Development and Testing
. Effectively interacted with Business Users in analyzing the data
quality issues reported.
. Extensively worked on debugging the programs for data quality issues
and job failures
. Assisted in all aspects of the project to meet the scheduled delivery
time.
. Conducted unit testing of the enhancements done.
. Worked as part of a team and provided 24x7 support when required.
. Involved in enhancements and Production Support.
Environment:
Informatica, SQL Server
Mattel (American Girl)-Madison, WI. Dec 2011 to Dec 2012
Role: Informatica Lead Developer/Administrator
MDW is the centralized DWH at AG. This application sources data from
multiple systems, namely OMS, ECOM and other external vendor systems.
The system currently runs with ETL logic implemented in PL/SQL; AG is
migrating that PL/SQL, along with the entire set of legacy ETL processes,
into Informatica 9.1.
. Worked on Informatica installation and configuration on AIX.
. Gathered business requirements and prepared technical specification
documents.
. Extensively worked on Process Design, Development and Testing
. Trained the DWH team on Informatica 9.1.
. Effectively interacted with Business Users in analyzing the data
quality issues reported.
. Extensively worked on debugging the programs for data quality issues
and job failures
. Created shell scripts for enhancements to the current application.
. Performance-tuned the current process, considerably reducing the job
run time.
. Developed Stored Procedures, Packages and Functions.
. Assisted in all aspects of the project to meet the scheduled delivery
time.
. Provided Knowledge Transfer to the end users and created extensive
documentation on the design, development, implementation and process
flow of the application
. Conducted unit testing of the enhancements done.
. Worked as part of a team and provided 24x7 support when required.
. Involved in enhancements and Production Support.
Environment:
Informatica, Oracle 10g, PL/SQL, Cognos and UNIX shell scripting
Citi Bank LATAM HQ-Ft Lauderdale, FL. Apr 2011 to Nov 2011
Role: BI Developer
AML-MANTAS is an anti-money laundering application at Citi Bank. This
application sources data from multiple systems, namely mainframes and
Oracle. Data is loaded into staging tables using SQL*Loader, then into
an Oracle schema in the format required by the MANTAS tool. Business
validations are applied to provide quality data to MANTAS. The data is
used by the Citi compliance team to track transaction behavior and
notify the concerned departments of any suspicious behavior in a
particular transaction.
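The SQL*Loader step described above is driven by a control file. A minimal sketch for a pipe-delimited feed might look like the following; the file, table and column names are hypothetical, not the actual AML-MANTAS feed definitions.

```
-- Hypothetical SQL*Loader control file for a pipe-delimited staging load.
LOAD DATA
INFILE 'txn_feed.dat'
APPEND
INTO TABLE stg_transactions
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(txn_id, account_id, txn_date DATE 'YYYY-MM-DD', amount)
```

A control file of this shape is passed to the sqlldr utility; records rejected by the load land in the bad file, which supports the corrective-action workflow described above.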
. Extensively worked on Process Design, Development, Testing and
Implementation of the Project.
. Effectively interacted with Business Users in analyzing the data
quality issues reported.
. Created Technical design specification documents for implementing the
solution for new countries.
. Extensively worked on debugging the programs for data quality issues
and job failures
. Created shell scripts for enhancements to the current application.
. Performance-tuned the current process, considerably reducing the job
run time.
. Developed Stored Procedures, Packages and Functions.
. Assisted in all aspects of the project to meet the scheduled delivery
time.
. Provided Knowledge Transfer to the end users and created extensive
documentation on the design, development, implementation and process
flow of the application
. Conducted unit testing of the enhancements done.
. Worked as part of a team and provided 24x7 support when required.
. Involved in enhancements and Production Support.
Environment:
Oracle 10g, PL/SQL, UNIX shell scripting
Central Services IT(CSIT/IBOPS)-Deutsche Bank, London UK/Bangalore India
Jun 2008 to Mar 2011
Role: Tech Lead/onsite co-ordinator
CSIT is a platform supporting multiple business applications in the
Investment Banking wing of Deutsche Bank. It supports users in the
Market Risk, Credit Risk, Liquidity Risk and Operational Risk areas of
the IB division, and comprises the modules dbFeeds (ETL), dbCube (OLAP)
and dbReports (Reporting). Data is sourced from different source systems
and loaded into a staging area, then into dimension and fact tables.
Cubes are built on this data using the MS-OLAP and Hyperion Essbase
tools; reporting is done in Cognos.
. Actively involved in understanding the business requirements and
converting them into design documents.
. Designed the ETL/OLAP processes using Informatica/SSAS to load data
from Oracle, SQL Server and flat files (fixed-width and delimited) into
staging tables.
. Created Mappings and Mapplets using Informatica Designer and
processing tasks using Workflow Manager that populated the Data into
the Targets on Oracle 11g Instance and involved in Performance tuning
of targets, sources, mappings and sessions
. Extensively used Source Qualifier Transformation to filter data at
Source level rather than at Transformation level and used different
transformations such as Source Qualifier, Joiner, Expression,
Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy
and Sequence Generator in extracting data in compliance with the
business logic developed.
. Created Informatica mappings with stored procedures to build and load
cross reference tables
. Developed ETL mappings, transformations using Informatica PowerCenter
8.6. Extensively used Informatica client tools - Source Analyzer,
Warehouse designer, Mapping Designer, Mapplet Designer, Transformation
Developer, Informatica Repository Manager and Informatica Workflow
Manager.
. Used Mapplets and Reusable Transformations to prevent redundant
transformation usage and improve maintainability.
. Generated flat-file targets and reports for the target team with
headers and trailers.
. Created database objects including tables, indexes, views,
materialized views, constraints, jobs, snapshots and table partitioning;
developed SQL, PL/SQL, stored procedures, triggers and packages; and
troubleshot and performance-tuned PL/SQL and SQL.
. Involved in enhancements and maintenance activities including
performance tuning, writing of stored procedures for code
enhancements, creating aggregate tables, and modifying target tables.
. Performance-tuned and troubleshot sources, targets, PowerCenter
transformations and sessions using techniques such as explain plans,
Oracle hints, changing session parameters, re-designing the mappings
and using test loads.
. Responsible for monitoring all scheduled, running, completed and
failed sessions. Involved in debugging the mappings that failed.
. Provided support for Dev/SIT/UAT/Prod runs to analyze and fix issues
at runtime.
. Involved in setting up the Dev/UAT UNIX Environment and Created UNIX
shell scripts to run the workflow from Unix and created PL/SQL
procedures, pre-session and post-session scripts to ensure timely,
accurate processing and ensure balancing of job runs.
. Extensively worked with SSAS and MS-OLAP in building cubes.
. Performed disaster recovery activities to ensure systems availability
during unexpected events.
. Involved in planning and migrating servers from one data centre to
another.
Environment:
Informatica PowerCenter 8.6.1, Oracle 10g, MS SQL Server 2000/2005, Sun
Solaris 5.10 Unix, SQL, PL/SQL, Hyperion Essbase, SSAS and Cognos.
CFU DWH Migration-Deutsche Bank,Brussels Belgium
Jul 2007 to May 2008
Role: Informatica Developer
The objective of the project was to migrate the entire CFU DWH,
comprising 8 SQL Server 2000 databases, to an Oracle 10g environment. It
involved converting SQL Server database objects to Oracle database
objects and converting DTS packages into Informatica workflows.
Responsibilities:
. Analyzed the DTS packages and prepared design documents for developing
Informatica workflows.
. Designed the ETL processes using Informatica to load data from flat
files (fixed-width and delimited) into the staging database, and from
staging to the target data mart.
. Involved in preparing shell scripts for running Informatica workflows.
. Developed standard and re-usable mappings and mapplets using various
transformations like expression, aggregator, joiner, source qualifier,
router, lookup Connected/Unconnected, and filter.
. Implemented the best practices for the creation of mappings, sessions
and workflows and performance optimization.
. Used Workflow Manager for creating, validating, testing and running
the sequential and concurrent sessions and scheduling them to run at
specified time and as well to read data from different sources and
write it to target databases.
. Query Optimization for improving the performance of the data
warehouse.
. Involved in performance tuning of the mappings, sessions and
workflows.
. Used session parameters and parameter files to reuse sessions for
different relational sources or targets.
. Created documentation on mapping designs and ETL processes.
. Documented ETL test plans, test cases, test scripts, test procedures,
assumptions, and validations based on design specifications for unit
testing, system testing, expected results, preparing test data and
loading for testing, error handling and analysis.
. Involved in unit testing and User Acceptance Testing to check whether
the data extracted from the different source systems was loading into
the target according to the user requirements.
. Involved in implementing the solution at onsite in Brussels Belgium.
Environment :
Informatica PowerCenter 8.6, MS SQL Server 2000, DTS, Oracle 10g and UNIX.
Reveleus Service Center-Product (i-flex solutions)
Feb 2007 to Jun 2007
Role: Informatica Developer
Prime Sourcing Reveleus Service Center Project is aimed at building
Services around Reveleus product. There is an increasing need for
integrating Reveleus with a host of different technical components and
these opportunities need to be exploited. Technically, this Service center
would focus on integrating tools with Reveleus Engine.
This project involves loading 21 different products (commercial loans,
options, swaps, bills, etc.) into the staging area by validating data
against business rules, and subsequently into the ODS, using Informatica
8.1.0 as the ETL tool. Records that do not conform to the business rules
are captured in a table for corrective action. Reports are generated
from the ODS by creating views on the ODS tables, using the Business
Objects OLAP tool.
Responsibilities:
. Involved in Logical and Physical Database design and Star Schema
design. Identified Fact tables, and Dimensional tables.
. Designed and developed end-to-end ETL processes to load data from
various source systems.
. Extensively involved in business and functionality of the system,
understanding source systems and requirement analysis through FRD's.
. Developed mappings and workflows to extract data from various sources
like Oracle and flat files.
. Created source and target definitions based on the data in the flat
files.
. Implemented SCD logic in Informatica.
. Developed a number of complex Informatica mappings.
. Used various transformations like filter, expression, sequence
generator, update strategy, joiner, stored procedure, normalizer and
union to develop robust mappings using the informatica designer.
. Developed reusable transformations and mapplets to implement the
common business logic.
. Created sessions and configured workflows to extract data from various
sources and load it into the data warehouse.
. Designed and developed complex procedures, functions and packages to
handle errors and exceptions at the database level using PL/SQL.
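The error-and-exception handling described in the last bullet can be sketched as a PL/SQL procedure of roughly this shape. The procedure, table and column names are hypothetical placeholders, not the actual Reveleus objects.

```sql
-- Hypothetical load procedure: on any error, roll back the batch and
-- record the failure in an err_log table for corrective action.
CREATE OR REPLACE PROCEDURE load_ods_product (p_batch_id IN NUMBER) AS
BEGIN
  INSERT INTO ods_product (product_id, product_name)
  SELECT product_id, product_name
    FROM stg_product
   WHERE batch_id = p_batch_id;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    INSERT INTO err_log (batch_id, err_msg, err_time)
    VALUES (p_batch_id, SQLERRM, SYSDATE);
    COMMIT;
END load_ods_product;
/
```

Logging SQLERRM alongside the batch id keeps failed batches traceable without aborting the surrounding schedule.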
Environment :
Oracle 9i, Informatica PowerCenter 8.1, and UNIX.
MODM Migration-Barclays Bank UK.
Apr 2006 to Jan 2007
Role: Informatica Developer
Barclays is a UK-based financial services group, with a very large
international presence in Europe, the USA, Africa and Asia. It is engaged
primarily in banking, investment banking and investment management. In
terms of market capitalization, Barclays is one of the largest financial
services companies in the world.
The Middle Office Data Mart project was initiated in Jan 2006 as a
Barclays Capital in-house development effort across the TDB and PCG IT
teams within Middle Office IT. The Middle Office Data Mart (MODM) serves
the Finance Regulatory Reporting group and the Product Control Group
(PCG).
The FIAT Trading backend is a series of databases called "UNITY".
These databases load information from a source system, enrich the data
and perform profit-and-loss calculations. The FIAT databases will pull
the necessary data from the MODM repository after Phase 4 and Phase 5
are completed. The reporting databases implement a common schema for
handling the data transferred from the Unities, such that functionality
and reporting are source-system agnostic. More than one FIAT reporting
database exists, to provide horizontal scalability and to isolate
business users per business line in their own environments.
Trade volumes are increasing across all existing data feeds and the FIAT
Trading programme will increase the number of legal entities loaded for the
existing data feeds as well as add new data feeds from previously out of
scope systems. SQL Server performance is hitting limits in terms of
general DB size, log size and processing speed. These issues are
manifesting themselves in the need for constant DB tuning and maintenance
as well as the occasional SLA breach.
Responsibilities:
. Reviewed the logical structure of the data model to understand the
current system.
. Prepared understanding documents after reviewing the SQL Server
system.
. Involved in gathering the new requirements to be added to the Oracle
system.
. Worked on converting the logic in SQL Server stored procedures to
Oracle procedures.
. Involved in preparing the Informatica mappings and workflows by
reviewing the logic in the DTS packages.
. Tuned the Informatica mappings to obtain better performance than the
DTS packages in SQL Server.
. Performance-tuned Oracle SQL queries and views using the Explain Plan,
SQL Trace and TKPROF utilities.
. Prepared and executed test plans to match the data between the SQL
Server and Oracle systems.
. Supported the project in production during the initial days.
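The Explain Plan work mentioned above follows the standard Oracle pattern: capture the optimizer's plan for the statement under tuning into the plan table, then render it with DBMS_XPLAN. The query below is a hypothetical example, not one of the actual MODM statements.

```sql
-- Capture the optimizer's estimated plan for a candidate statement.
EXPLAIN PLAN FOR
SELECT t.trade_id, t.notional
  FROM fiat_trades t
 WHERE t.legal_entity = :entity;

-- Render the stored plan.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

EXPLAIN PLAN shows only the estimated plan; SQL Trace plus TKPROF, also listed above, adds actual run-time statistics for the executed statement.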
Environment:
Informatica PowerCenter 8.1, SQL Server 2000, DTS, Oracle 9i and UNIX.
Exchange Control Mart, Central Bank - Barbados
June 2005 to March 2006
Role: Data Warehouse Developer
The CBB data mart for Exchange Control is a project from the Central
Bank of Barbados. The CBB monitors the inflows and regulates the
outflows of foreign currencies, and collects and collates foreign
exchange data for the formulation of policies. CBB deals with multiple
source systems, and various reports have to be generated to monitor the
exchange flow mechanisms. The analytics are built on top of MS Analysis
Services and the reports use MS Reporting Services.
Responsibilities:
. Translated business requirements into design documents.
. Created the database objects in SQL Server 2000.
. Developed stored procedures in SQL Server.
. Created the design documents.
. Created the DTS packages to load data from flat files into the SQL
Server database.
. Involved in preparing and executing the UTPs and system test plans.
Environment: SQL Server 2000, DTS.