
Data Warehouse

Location:
Mumbai, MH, India
Posted:
June 09, 2016


Resume:

Srinivas Doppalapudi

acu514@r.postjobfree.com

+91-961*******

Lead Consultant

Experience Summary

Overall 5.6 years of experience in Informatica and Data Integration Services.

Expertise in extracting, transforming, and loading data from heterogeneous source systems such as Oracle, SQL Server, flat files, and CSV files.

Expertise in ETL design, writing and reviewing ETL code, BO universe design, Informatica administration, data query enhancements, and writing SQL and PL/SQL code.

Worked on designing the staging of different modules of RBI’s Data Warehouse project, extracting data for the Enterprise Data Warehouse/Data Marts and implementing best practices for efficiency and optimization.

Handling Initial / Incremental Loads, Change Data Capture, Slowly Changing Dimensions, Performance Optimization and Tuning of ETL using Lookups, Bulk Loading, and Parallel Execution.

Business process documentation, technical design documentation, data flow diagrams, and test case writing.

Technical Skills

ETL: Informatica Power Center 9.x
Reporting Tool: SAP BO
RDBMS: Oracle 10g
Languages: SQL, PL/SQL
Operating Systems: MS Windows NT/2000/XP/Vista, UNIX

Education Qualification

MCA, SV University, 2006–09

Professional Details

INFOSTEP India Pvt Ltd as Lead Consultant, Dec 2010 till date

Project Accomplishments

Summary of Recent Engagements

Project Name: Data Warehouse of Indian economy

Client: Reserve Bank of India, Mumbai

Duration: Feb’2011 – Till Date

Role: ETL Consultant.

Description

The Reserve Bank of India (RBI) has historically generated and compiled a large volume of data on various aspects of the economy, and has a rich tradition of publishing these data in several of its publications. Over time, the scope of the data released by the Reserve Bank has grown, and the mode of release has moved from print to electronic form, and now to an interactive database on the Internet.

These data are available in downloadable and reusable formats through its enterprise-wide data warehouse. For the benefit of researchers, analysts, and others outside the RBI, it has provided the public with Internet access to the publishable part of the data warehouse.

About DBIE Application:

This is an enterprise wide data warehouse for the Reserve Bank of India.

It is a repository of current and historical information for decision support and analysis.

Data from various operational information systems is integrated in the warehouse.

It provides users with ad hoc data query facilities (both simple and advanced) and search, in addition to pre-formatted reports arranged by subject area and by various frequencies.

It also provides online analytical processing (OLAP) facilities.

RBI’s Data Warehouse Division extracts data from different source systems. This data can be analyzed through a web browser-based client that supports analysis on a real-time basis and presents the data in the form of dashboards. The application supports data cleansing and classification activities and provides a rich reporting and analysis environment with ad hoc capabilities for analyzing all the data. The division assures end users of the consistency of the data.

Responsibilities:

As part of the Enterprise Data Warehouse team, coordinated technical requirements and solutions between the end users and the offshore team.

Studied the business process and ETL documents such as the design document and the technical design.

Designed enhancements and new processes in Informatica.

Analyzed requests raised by the client regarding the existing applications.

Reviewed mappings developed by the team.

Developed and enhanced ETL mappings to extract data from various sources and load it into the Oracle database.

Validated errors in data moving from various sources to targets.

Tested the mappings and checked the quality of the deliverables.

Wrote scripts for handling dynamic-delimiter files, sources, and targets.

Converted dynamic unstructured files into one standard format.

Created scripts for maintaining the control information of the sources/targets; used Normalizer and SQL transformations such as Bank_id, Data Cleanse, and Match.

Used the central repository to check jobs in and out and maintain version history.

Involved in integration testing.

Worked extensively on performance tuning of programs and processes using concepts such as parallel data movement, data stores, and recovery mechanisms.

Involved in bug fixing.

Designed universes in BO XI 3 using the conformed dimensions across all the schemas.

Implemented aggregate awareness and aggregate navigation to meet the client’s requirements.

Managed and optimized the performance of the universes.

Developed reports using the Business Objects Rich Client and BO Web Intelligence tools.

Managed and optimized the performance of the Webi reports.
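The dynamic-delimiter file handling mentioned in the responsibilities above can be sketched as a small shell script. This is a minimal illustration, not the project's actual code; the file names, the candidate delimiters, and the pipe output delimiter are all assumptions.

```shell
#!/bin/sh
# Sketch: normalize files whose field delimiter varies (comma,
# semicolon, or tab) into one standard pipe-delimited format.
# File names below are hypothetical examples.

normalize() {
    infile=$1
    outfile=$2
    header=$(head -n 1 "$infile")
    # Pick whichever candidate delimiter occurs most often
    # in the header line; default to comma.
    best=','
    bestcount=-1
    for d in ',' ';' '\t'; do
        count=$(printf '%s' "$header" | tr -cd "$d" | wc -c)
        if [ "$count" -gt "$bestcount" ]; then
            best=$d
            bestcount=$count
        fi
    done
    # Re-emit every record with '|' as the standard output delimiter.
    awk -F "$best" 'BEGIN { OFS = "|" } { $1 = $1; print }' \
        "$infile" > "$outfile"
}

# Example (hypothetical file names):
# normalize source_feed.csv standardized.dat
```

Guessing the delimiter from the header keeps the script data-driven, so one script serves feeds whose layout changes between deliveries.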

Environment: Informatica 8.6, Informatica 9.1, Business Objects XI 3, Oracle 10g, Toad.

Project Name: Basic Statistical Return – 1

Client: Reserve Bank of India

Duration: Oct’2012 to Dec’2014 (Till date)

The Basic Statistical Returns (BSR) system, introduced in December 1972, has been in force for more than three and a half decades, with improvements effected from time to time. To provide guidance for filling in the BSR 1 and 2 returns, the Reserve Bank brought out the first Handbook of Instructions in September 1972. Consequent upon the improvements and revisions in the BSR system, the Handbook was revised in January 1978, January 1984, January 1990, March 1996, and March 2002. In the last revision of March 2002, a new occupation/activity coding system was introduced in BSR, in line with the National Industrial Classification (NIC) – 1998 (the present base year is 2008).

The definition and concept of Small Enterprises (SE), comprising small and micro enterprises engaged in manufacturing and services, has been introduced in place of Small Scale Industries (SSI). The present edition, seventh in the line, provides for these amendments in the system. The revision also aims at improving the quality of data reported by the banks. The periodicity of the BSR survey will remain yearly as hitherto, and the reference date of BSR-1 and BSR-2 will continue to be 31st March so as to coincide with the accounting year of the banks. However, in order to obtain more exhaustive and useful information, it has been decided to collect certain additional information through BSR-1.

Responsibilities:

As part of the Enterprise Data Warehouse team, coordinated technical requirements and solutions between the end users and the offshore team.

Studied the business process and ETL documents such as the design document and the technical design.

Designed enhancements and new processes in Informatica.

Analyzed requests raised by the client regarding the existing applications.

Reviewed mappings developed by the team.

Developed and enhanced ETL mappings to extract data from various sources and load it into the Oracle database.

Validated errors in data moving from various sources to targets.

Tested the mappings and checked the quality of the deliverables.

Wrote scripts for handling dynamic-delimiter files, sources, and targets.

Converted dynamic unstructured files into one standard format.

Worked on data validation of the source data stores being processed, using auditing features to ensure data accuracy, data consistency, and data cleansing and to verify data warehouse integrity.

Developed shell scripts for automating jobs through the command option in Informatica.

Created scripts for maintaining the control information of the sources/targets; used Normalizer and SQL transformations such as Bank_id, Data Cleanse, and Match.

Used the central repository to check jobs in and out and maintain version history.

Involved in integration testing.

Worked extensively on performance tuning of programs and processes using concepts such as parallel data movement, data stores, and recovery mechanisms.

Involved in bug fixing.

Designed universes in BO XI 3 using the conformed dimensions across all the schemas.

Implemented aggregate awareness and aggregate navigation to meet the client’s requirements.

Managed and optimized the performance of the universes.

Developed reports using the Business Objects Rich Client and BO Web Intelligence tools.

Formatted reports in Web Intelligence using features such as alerters, data tracking, and drill options.

Managed and optimized the performance of the Webi reports.
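Automating jobs "through the command option" in Informatica usually means wrapping the pmcmd command-line utility in a shell script. The sketch below builds such a call; the service, domain, user, folder, and workflow names are hypothetical placeholders, the command is printed rather than executed (a dry run), and in practice the password would come from an encrypted environment variable rather than the script.

```shell
#!/bin/sh
# Sketch: compose a pmcmd call to start an Informatica workflow.
# All connection values are hypothetical placeholders.

INFA_SERVICE="IS_DWH"      # integration service name (assumed)
INFA_DOMAIN="Domain_DWH"   # domain name (assumed)
INFA_USER="etl_user"       # run-time user (assumed)
FOLDER="DWH_LOADS"         # repository folder (assumed)

build_pmcmd() {
    # -wait blocks until the workflow finishes, so the exit status
    # of the real call would reflect workflow success or failure.
    printf 'pmcmd startworkflow -sv %s -d %s -u %s -f %s -wait %s\n' \
        "$INFA_SERVICE" "$INFA_DOMAIN" "$INFA_USER" "$FOLDER" "$1"
}

# Dry run: print the command instead of executing it; on a host
# with pmcmd installed, the printed line could be run directly.
build_pmcmd "wf_BSR1_LOAD"
```

A wrapper like this lets a scheduler (cron, for instance) trigger workflows and act on their exit status without opening the Informatica client.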

Project Name: Handbook of Statistics on Indian Economy

Client: Reserve Bank of India

Duration: Feb’2011 to Sep’2012

The dissemination of macroeconomic and financial data through the Handbook of Statistics on the Indian Economy is becoming increasingly user friendly. The first Handbook was brought out by the Bank in 1998, the CD-ROM version started in 1999, and subsequently, since December 2000, it has been released on the Bank’s website (www.rbi.org.in). The print version aims to provide data for the latest time periods, while longer historical data series are provided in the CD-ROM. In the Platinum Jubilee Year of the Reserve Bank, the Handbook was simultaneously released in electronic form on the “Database on Indian Economy (DBIE): Reserve Bank's Data Warehouse (http://dbie.rbi.org.in)” on September 15, 2009. Since then this publication has been updated nearly in real time and made available to Internet users. We are encouraged by the positive response from researchers, analysts, market participants, and other users, as reflected by the substantial increase in usage of the web version. The enthusiasm and expectations regarding more frequent data updating have been reflected in feedback from users in the teaching community, who use the DBIE platform for discussing trends in various economic indicators of the Indian economy.

The present volume of the Handbook of Statistics on the Indian Economy, for the year 2012-13 and the 14th in the series, contains 247 statistical tables in the print version and 269 statistical tables in the CD-ROM version. It covers time series statistics on a wide range of economic and financial indicators pertaining to national income aggregates, output, prices, money, banking, financial markets, public finances, foreign trade, the balance of payments, and select socio-economic indicators, released at various frequencies. This publication in print/CD-ROM form is brought out from the Bank's Data Warehouse by the Operational Analysis Division (OAD), Department of Statistics and Information Management (DSIM).

Responsibilities:

As part of the Enterprise Data Warehouse team, coordinated technical requirements and solutions between the end users and the offshore team.

Studied the business process and ETL documents such as the design document and the technical design.

Designed enhancements and new processes in Informatica.

Analyzed requests raised by the client regarding the existing applications.

Reviewed mappings developed by the team.

Developed and enhanced ETL mappings to extract data from various sources and load it into the Oracle database.

Validated errors in data moving from various sources to targets.

Tested the mappings and checked the quality of the deliverables.

Wrote scripts for handling dynamic-delimiter files, sources, and targets.

Converted dynamic unstructured files into one standard format.

Worked on data validation of the source data stores being processed, using auditing features to ensure data accuracy, data consistency, and data cleansing and to verify data warehouse integrity.

Developed shell scripts for automating jobs through the command option in Informatica.

Created scripts for maintaining the control information of the sources/targets; used Normalizer and SQL transformations such as Bank_id, Data Cleanse, and Match.

Used the central repository to check jobs in and out and maintain version history.

Involved in integration testing.

Worked extensively on performance tuning of programs and processes using concepts such as parallel data movement, data stores, and recovery mechanisms.

Involved in bug fixing.

Designed universes in BO XI 3 using the conformed dimensions across all the schemas.

Implemented aggregate awareness and aggregate navigation to meet the client’s requirements.

Managed and optimized the performance of the universes.

Developed reports using the Business Objects Rich Client and BO Web Intelligence tools.

Formatted reports in Web Intelligence using features such as alerters, data tracking, and drill options.

Managed and optimized the performance of the Webi reports.

Environment: Informatica 8.6, Informatica Hotfix 9, Business Objects XI 3, SAP BO Xcelsius 2008 SP3, Oracle 10g, Toad, UNIX.

Project Name: Anti-Money Laundering.

Client: Union Bank of India

Role: Sr. ETL Consultant

Duration: Dec’2010 – Feb ‘2011

There are 27 reports developed for AML, based on transaction details, accounts, customers, branches, and products. The highlighted reports include Transaction Details, Customer Details, Employee Details, and many more.

The architecture of the ETL framework is designed using Informatica Power Center. The Oracle MIS server is used for extracting the tables, and data is extracted from multiple source systems such as the Oracle test server and the Oracle staging area.

Responsibilities:

Studied and understood the SQL scripts.

Used various Informatica Power Center transformations, variables, built-in and custom functions, and scripts to implement the data flow design mapping.

Designed the ETL jobs to populate the staging and warehouse tables and to implement SCDs.

Implemented mapplets in the ETL data flow design.

Implemented performance tuning using data transfer methodologies.

Implemented post-session scripts to generate the output file according to the timestamp.

Validated errors in data moving from various sources to targets.

Tested the mappings and checked the quality of the deliverables.
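A post-session command of the kind mentioned above is typically just a small shell script that renames the session's output file with a run timestamp. This is a minimal sketch; the file-naming convention is an assumption, not the project's actual scheme.

```shell
#!/bin/sh
# Sketch: a post-session script that stamps a session's output
# file with the current date and time, e.g.
# aml_report.dat -> aml_report_<YYYYMMDD_HHMMSS>.dat
# The file name used here is a hypothetical example.

stamp_output() {
    src=$1
    ts=$(date '+%Y%m%d_%H%M%S')
    base=${src%.*}    # file name without extension
    ext=${src##*.}    # extension only
    dest="${base}_${ts}.${ext}"
    mv "$src" "$dest"
    echo "$dest"      # report the new name for downstream steps
}

# Example (hypothetical file name):
# stamp_output aml_report.dat
```

Timestamped names keep each run's extract distinct, which is what makes unattended daily loads auditable after the fact.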

Environment: Informatica 8.6, Informatica Hotfix 9, Business Objects XI 3, SAP BO Xcelsius 2008 SP3, Oracle 10g, Toad, UNIX.


