Dhruv Prabhakar
Email: ********@*****.***
Mobile: 408-***-****
Address: **** ******* *****, ***# *****, San Jose, CA 95129-1725
Experience Summary
. 6+ years of extensive experience working on Data Warehouse and
  Integration projects using Informatica PowerCenter (Repository
  Manager, Designer, Workflow Manager, Workflow Monitor), Data Marts,
  and PL/SQL.
. Hands-on experience with batch processing using SQL*Loader, Oracle
  external tables, bulk techniques/collections, and partitioning (a
  minimal SQL*Loader sketch appears at the end of this summary).
. Experience in developing Data Replication, Data Synchronization,
Mapping Configuration, Mapplets and scheduling jobs using Informatica
Cloud application.
. Experience in developing jobs using Talend Enterprise Data Integration
tool.
. Experience in developing jobs using CDC model in Talend.
. Experience in complete life cycle implementation of an Enterprise
  Data Warehouse and Data Mart.
. Strong experience in preparing design documents for loading data into
  Data Warehouse and Data Mart tables.
. Expertise in implementing complex business rules by creating reusable
  Mappings/Mapplets, reusable Transformations, and Tasks.
. Proficient in using Workflow Manager tools such as Task Developer,
  Workflow Designer, and Worklet Designer.
. Implemented Slowly Changing Dimension (SCD) Type 1, Type 2, and Type 3
  methodologies for preserving the full history of account and
  transaction information.
. Extensive ability to debug, troubleshoot, and solve complex PL/SQL
  issues, and to reverse engineer existing PL/SQL code into functional
  requirements.
. Experience in SQL tuning of medium/large database environments,
  including optimizer hints and de-normalization.
. Basic understanding of analyzing database performance using utilities
  such as Explain Plan and TKPROF.
. Hands-on experience developing Oracle PL/SQL procedures, triggers, and
  functions.
. Extensive experience in UNIX shell scripting (Ksh/Csh) and job
  scheduling using Control-M and Informatica Scheduler.
. Experienced in performance tuning of Informatica mappings and
  sessions.
. Strong experience working with clients on requirements gathering and
  functional documentation while performing Business Analyst/Team Lead
  roles.
. Strong experience in Data Warehousing concepts: Star Schema and
  Snowflake Schema methodologies used in relational, dimensional, and
  multidimensional modeling, along with Data Profiling and Data
  Cleansing.
. Strong experience in integrating data between different sources and
  targets such as Salesforce.com, JD Edwards, E-Business Suite,
  relational databases, and flat files.
. Strong experience in integrations that invoke web services using the
  Web Service Consumer transformation.
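As a minimal illustration of the SQL*Loader batch loading mentioned
above, a Ksh sketch along these lines stages a delimited file into a
table. The file, table, and account names (customers.dat,
stg_customers, etl_user) are hypothetical, not from any specific
engagement:

#!/bin/ksh
# Minimal SQL*Loader batch load: stage a delimited file into a staging
# table. All names are illustrative; the password comes from the
# environment.

cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, customer_name, city, load_date SYSDATE)
EOF

sqlldr userid=etl_user/"${ETL_PWD}" control=customers.ctl \
       log=customers.log bad=customers.bad errors=50 direct=true

if [ $? -ne 0 ]; then
    echo "SQL*Loader failed; see customers.log" >&2
    exit 1
fi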
Skills
Operating Systems            UNIX, Windows 7/2000/XP
Databases                    Oracle 11g/10g, SQL Server
Database Tools               Toad, SQL Developer, PL/SQL Developer, HPDM
ETL & BI Tools               Informatica PowerCenter 9.5/9.1/8.6,
                             OBIEE 10.1.3.x, Talend Enterprise Data
                             Integration 5.6.1, Informatica Cloud
Data Modeling & Integration  Star Schema, Snowflake Modeling,
                             Fact/Dimension Tables, Erwin
Scheduling                   Control-M, Informatica Scheduler
Project# 1
Client VMware
Date July 2014 - present
Role ETL Consultant
Tools/Technologies: Informatica PowerCenter, Informatica Cloud, Oracle
11g, SQL Server, UNIX (SSH shell), Windows XP, Oracle SQL Developer,
Toad, Control-M, Perforce, HP Project and Portfolio Management
Workbench, Force.com, SoapUI, Salesforce.com, JD Edwards
VMware, Inc. is a U.S. software company that provides cloud and
virtualization software and services. It virtualizes computing from the
data center to the cloud to mobile devices, and was the first company
to successfully virtualize the x86 architecture commercially.
Responsibilities:
1. Involved in multiple projects to integrate data between different
   systems such as EBS, Salesforce.com, SQL Server, and MongoDB.
2. Worked closely with Data Architects & Business Analysts on capturing
   and analyzing the requirements from the Business and Functional
   requirement documents.
3. Worked on the overall design of the Informatica workflow
   architecture and implemented it as jobs/batches.
4. Worked as team lead with a group of other Informatica developers to
   implement the end-to-end logic and functionality across multiple
   tracks. Managed the team of developers through daily sync-up calls
   and meetings to discuss development progress, new requirements,
   changes, and project timelines.
5. Worked on multiple technical design documents.
6. Worked on integrations using Salesforce.com, JD Edwards, Oracle EBS,
   SQL Server, flat files, MongoDB, Oracle Fusion Middleware, and
   others as both sources and targets.
7. Created mappings using various Transformations like Source
   Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored
   Procedure, Lookup, Update Strategy, Sequence Generator, Normalizer,
   Transaction Control, SQL, and Web Service Consumer.
8. Used Informatica workflows and UNIX scripts to control and automate
   various complex loads.
9. Developed reusable components like mapplets, worklets, and
   transformations.
10. Worked on reusable sources and targets by creating a shared folder
    and using shortcuts from the shared folder in the main folders.
11. Worked on the Informatica Cloud application; created Data
    Replication, Data Synchronization, and Mapping Configuration tasks.
12. Developed Data Synchronization tasks to sync data between different
    Salesforce.com objects and an Oracle database.
13. Developed incremental Data Replication tasks to replicate data from
    Salesforce.com to an Oracle database.
14. Created Mapping Configuration tasks using Mapping Integration
    templates. Also worked on creating mapplets on Informatica Cloud
    using an existing XML.
15. Developed reusable UNIX scripts to archive the staging tables and
    update the parameter files with the changing values in each batch
    run.
16. Maintained run-time parameters in the staging database and used
    scripts to generate the parameter file dynamically in each batch
    run, picking the latest values from the staging database (see the
    sketch after this list).
17. Used the pmcmd utility in command tasks to internally trigger
    another job.
18. Tuned performance of mappings and sessions by optimizing source and
    target bottlenecks, memory management, partitioning, and SQL
    overrides during load testing on the load-testing environments.
19. Performed code reviews to make sure the code adheres to the company
    coding standards.
20. Scheduled various daily and monthly ETL loads using Informatica
    PowerCenter, Informatica Cloud, and Control-M.
21. Worked on release notes and master deployment plans collectively
    with the release team and other dev teams involved.
22. Used version control tools like Perforce & P4V to promote code
    artifacts such as Informatica sources, targets, mappings, sessions,
    workflows, database DDL and DML scripts, UNIX shell scripts, and
    migration docs to the QA, UAT, LT, STAGE, and PROD environments.
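The dynamic parameter-file approach in item 16 can be sketched roughly
as below. This is an illustrative Ksh outline, not the production
script: the service, domain, folder, workflow, and table names
(INT_SVC, wf_load_orders, etl_params, and so on) are hypothetical.

#!/bin/ksh
# Build an Informatica parameter file from a staging control table,
# then trigger the dependent workflow with pmcmd.

PARM_FILE=/infa/parms/wf_load_orders.parm

# Pull the latest watermark for this job from the staging database.
LAST_RUN=$(sqlplus -s etl_user/"${ETL_PWD}"@STGDB <<'EOF'
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT TO_CHAR(MAX(last_run_date), 'YYYY-MM-DD HH24:MI:SS')
FROM etl_params WHERE job_name = 'WF_LOAD_ORDERS';
EOF
)

# Write the parameter file read by the workflow's session.
cat > "$PARM_FILE" <<EOF
[FOLDER_ORDERS.WF:wf_load_orders.ST:s_m_load_orders]
\$\$LAST_RUN_DATE=$LAST_RUN
EOF

# Start the workflow; with -wait, pmcmd returns non-zero on failure.
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" \
      -p "$INFA_PWD" -f FOLDER_ORDERS -paramfile "$PARM_FILE" \
      -wait wf_load_orders || exit 1

A scheduler job (Control-M, in this project) would typically call such
a script so that each batch run picks up the latest watermark
automatically.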
Project# 2
Client Rovi
Date July 2013 - June 2014
Role ETL Consultant
Tools/Technologies: Informatica PowerCenter, Talend Enterprise Data
Integration, Oracle 10g, UNIX (SSH shell), Windows XP, SQL Navigator,
Toad, ERwin Data Modeler, Control-M, PVCS
Rovi Corporation provides guidance technology, entertainment data,
algorithms for recommendations, data analytics and interactive advertising
solutions for digital entertainment devices and services.
Responsibilities:
1. Involved in Designing and customizing data models for Data warehouse
supporting data from multiple sources.
2. Worked closely with Data Architects & Business Analysts on designing
efficient Data Models.
3. Worked with several integration teams to gather source details,
   identify targets, and prepare the High-Level Design document and ETL
   Technical Specification document for this project.
4. Worked on Design and development of Informatica mappings, workflows to
load data into staging area, data warehouse and data marts in Oracle.
5. Created mappings using various Transformations like Source Qualifier,
Aggregator, Expression, Filter, Router, Joiner, Stored Procedure,
Lookup, Update Strategy, Sequence Generator, Normalizer, Java
transformation and XML transformation.
6. Developed reusable Mapplets and Transformations.
7. Used Informatica workflows, UNIX scripts to control and automate
various complex loads.
8. Developed and automated process to load data into various data marts
using Informatica Designer and Workflow manager.
9. Developed shell scripts for job control of Informatica workflows
   using the pmcmd command (see the sketch after this list).
10. Developed materialized views.
11. Performed code reviews to make sure the code adheres to the company
    coding standards.
12. Scheduled various daily and monthly ETL loads using Appworx.
13. Used the version control tools like Perforce & PVCS to promote the
code like Informatica mappings, sessions, workflows, reusable
transformations, Oracle DDL and DML scripts, Unix shell scripts,
Migration docs to QA, UAT and prod.
14. Developed jobs using the Talend Data Integration tool and its
    various components, such as tFileInputDelimited,
    tFileOutputDelimited, tFilterColumns, tFilterRow, tJoin, tMap,
    tNormalize, tReplace, tRules, tSortRow, tMsgBox, and tRowGenerator.
15. Worked on Talend Contexts, Repository schemas, and Metadata schemas
    for DB Connections, File Delimited, File XML, and Salesforce.
16. Developed jobs using the CDC model.
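A job-control wrapper of the kind described in item 9 might look
roughly like this; the service, folder, workflow, and mail-alias names
are hypothetical, and in practice the scheduler would invoke the
wrapper:

#!/bin/ksh
# Start an Informatica workflow, retry once on failure, then alert.

run_workflow() {
    pmcmd startworkflow -sv INT_SVC -d DOMAIN_PROD \
          -u "$INFA_USER" -p "$INFA_PWD" \
          -f FOLDER_DW -wait "$1"
}

WF=wf_daily_dw_load
MAX_TRIES=2
try=1

while [ $try -le $MAX_TRIES ]; do
    run_workflow "$WF" && exit 0      # -wait makes pmcmd block until done
    echo "Attempt $try of $MAX_TRIES failed for $WF" >&2
    try=$((try + 1))
done

echo "$WF failed after $MAX_TRIES attempts" |
    mailx -s "ETL FAILURE: $WF" oncall@example.com
exit 1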
Project# 3
Client Constellation Energy
Date Feb 2012 - July 2013
Role ETL Consultant
Tools/Technologies: Informatica PowerCenter, Oracle, UNIX Shell
Programming, Toad (Oracle), Control-M
Constellation, an Exelon company, is a leading competitive supplier of
power, natural gas, renewable energy and energy management products and
services for homes and businesses across the continental US. It also
provides demand-side management solutions that help customers strategically
buy, manage and use their energy.
Responsibilities:
1. Worked on the analysis, design, and development to ensure the
   successful implementation of the data loading processes.
2. Set up connection parameters and source/target paths using
   relational connections.
3. Involved in analysis and performance tuning of mappings/sessions.
   Increased performance by tuning the transformations and discussing
   database issues with the DBA.
4. Worked with data extraction from relational DBMSs and flat files.
5. Created and managed the global and local repositories and
   permissions using Repository Manager in the Oracle database. Created
   users and assigned privileges.
6. Created and monitored Sessions and Batches using Server Manager to
   load the data into the target database.
7. Managed versioning of folders in a repository.
8. Developed mappings using Active Transformations such as Aggregator,
   Filter, Joiner, Router, Sorter, and Union.
9. Also used Passive Transformations such as Expression, Lookup, and
   Sequence Generator in various mappings.
10. Created and used reusable Transformations for easier maintenance of
    the data.
11. Integrated data cleansing processes within the mappings. Performed
    debugging of failed mappings using the Debugger.
12. Created Workflows and Sessions to carry out the loading process
    into the target.
13. Created and used Workflows with different Tasks like Session,
    Command, and Timer.
14. Wrote UNIX shell scripts to automate running the Workflows. Worked
    with Workflow and Session log files to troubleshoot the errors
    encountered (see the sketch after this list).
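The log-based troubleshooting in item 14 can be assisted by a small
scan script along these lines; the log directory, session name, and
error keywords are assumptions for illustration:

#!/bin/ksh
# Scan the most recent session log for error lines and surface the
# first few for triage.

LOG_DIR=/infa/infa_shared/SessLogs
SESSION=s_m_load_customers

# Most recently modified log for this session.
LATEST=$(ls -t "$LOG_DIR/${SESSION}"*.log 2>/dev/null | head -1)

if [ -z "$LATEST" ]; then
    echo "No session log found for $SESSION" >&2
    exit 1
fi

# Show the first few error lines, then exit non-zero if any were found.
grep -n -E 'ERROR|FATAL' "$LATEST" | head -5
grep -q -E 'ERROR|FATAL' "$LATEST" && exit 1
exit 0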
Project# 4
Client World Fuel Services
Date May 2010 - Jan 2012
Role Data Warehouse Consultant
Tools/Technologies: Informatica PowerCenter, Oracle, UNIX Shell
Programming, Toad (Oracle), Control-M
World Fuel Services is a global leader in the downstream marketing and
financing of aviation, marine, and ground transportation fuel products and
related services. World Fuel Services is a provider of fuel procurement,
fuel management, credit and financing, and price risk management, as well
as specialized segment capabilities such as aviation trip planning, bunkers
quality control and technical support, and aviation and ground fuel bulk
supply.
Responsibilities:
1. Worked on Informatica PowerCenter 8.1/8.6 and used Source Analyzer
   and Warehouse Designer to import the source and target schemas, and
   the Mapping Designer to map the sources to the targets.
2. Involved in developing ETL processes with Informatica PowerCenter to
   extract data from the client's operational databases, transform and
   filter the data, and load it into the target database.
3. Worked directly on requirements gathering and analysis of the data.
4. Integrated data cleansing processes within the mappings. Performed
   debugging of failed mappings using the Debugger.
5. Used Informatica PowerCenter Workflow Manager to create sessions,
   batches, and workflows to run the logic embedded in the mappings.
6. Designed various Mappings, Mapplets, and Workflows using Informatica
   PowerCenter.
7. Developed mappings and loaded data into the relational database,
   demonstrating understanding of Dimension as well as Fact tables.
8. Extracted data from flat files of both types (fixed-width as well as
   delimited) and from MySQL and Oracle 9i/10g RDBMSs.
9. Developed and tested all the Informatica mappings involving Stored
   Procedure, Lookup, and Update Strategy Transformations.
10. Worked extensively on different types of Transformations like
    Source Qualifier, Aggregator, Expression, Filter, Joiner, Lookup,
    Normalizer, Rank, Sequence Generator, Stored Procedure, and Update
    Strategy.
11. Created complex mappings involving both Connected and Unconnected
    Lookup Transformations.
12. Led a team of 5 ETL developers and coordinated their tasks on a
    daily basis between the onsite and offshore teams.
13. Involved in writing UNIX shell scripts for the Informatica ETL tool
    to run the Sessions.
14. Involved in Session Partitioning, Performance Tuning of the
    database and Informatica, and scheduling.
15. Involved in analysis of Session, Event, and Error logs for
    troubleshooting Mappings and Sessions.
Project# 5
Client Cardinal Healthcare
Date Apr 2009 - May 2010
Role Informatica Developer
Tools/Technologies: Informatica PowerCenter, Oracle, UNIX Shell
Programming, Toad (Oracle), Control-M
Cardinal Healthcare is a leading company that improves the cost
effectiveness of healthcare. As the business behind healthcare, Cardinal
Health helps pharmacies, hospitals and ambulatory care sites focus on
patient care while reducing costs, improving efficiency and quality, and
increasing profitability. It also develops and delivers pharmacy software
solutions, consulting and management services for hospitals and health
systems.
Responsibilities:
. Designed the ETL plan to extract data from the source system, load it
  into the landing tables, and from there into the Data Mart tables.
. The ETL plan included populating the dimension and fact tables and
  sequencing the loads to maintain referential integrity, with
  consideration of data cleansing and data quality.
. Wrote the technical specification document and the implementation
  method documents and shared them with the integration team and other
  ETL project teams.
. Designed the dimension and fact tables required to satisfy the BO
  reporting rules by interacting with the reporting team, and delivered
  the DDL scripts to the database team.
. Designed the ETL specification documents to gather existing workflow
  information from different ETL teams and shared them with the
  integration and production maintenance teams.
. Designed SFDC ETL mappings for complex business rules such as
  incentives and hierarchical payee structures.
. Designed on-demand ETL mappings using web service calls to SFDC
  systems.
. Designed the validation Mapplets in IDQ and exported them to
  PowerCenter for validating the payee-related data.
. Prepared the data validation report queries, executed them after
  every ETL run, and shared the resulting values with business users in
  different phases of the project.
. Configured the ETL workflows with warehouse control tables to extract
  the data on an as-required basis.
. Wrote shell scripts at the workflow command level to archive the
  spreadsheets provided by business users (see the sketch after this
  list).
. Coordinated with the operations team on scheduling the workflows and
  processed the flat files date-wise.
. Tested the generated mappings by analyzing the data flow, evaluating
  transformations, and fixing bugs so that they conform to the business
  needs.
. Supported upcoming enhancements to the global data warehouse for
  corporate-wide data requests, including ETL modifications and
  next-generation reporting capabilities.
. Extensively used the IDQ concepts of data profiling, data matching,
  organizing record groups, data standardization, association rules,
  and address validators.
. Worked extensively with business users and gathered information
  during ETL development for reports such as the Eligibility Summary
  Report, Incentive Compensation Statement, Plan-to-Date Scorecard,
  Management Summary, and Awards and Performance Consistency Reports.
. Worked extensively with the production team in resolving processing
  difficulties with control tables and shared objects.
. Worked on database changes, created required indexes, and applied
  post-session commands in ETL.
. Implemented session-level parallel processing to speed up data
  extraction.
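The spreadsheet-archiving step mentioned above could be implemented as
a post-session command script roughly like this; all paths are
hypothetical:

#!/bin/ksh
# Post-session command: archive the business users' spreadsheets after
# a successful load, stamping each with the run timestamp.

SRC_DIR=/data/inbound/payee
ARC_DIR=/data/archive/payee
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$ARC_DIR"

for f in "$SRC_DIR"/*.csv; do
    [ -f "$f" ] || continue                  # nothing to archive
    base=$(basename "$f" .csv)
    # Compress and move so a rerun never picks up the same file twice.
    gzip -c "$f" > "$ARC_DIR/${base}_${STAMP}.csv.gz" && rm -f "$f"
done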
Project# 6
Client Apple
Date Sep 2008 - Mar 2009
Role PL/SQL Developer
Project iCustomer Database
Tools/Technologies: UNIX, Oracle PL/SQL, SVN, Autosys, Crontabs
The iCustomer database is a repository of customer information. It
maintains a single image of each customer through a complex process of
standardizing, matching, and de-duplicating data.
The project involves changes and enhancements to existing logic to
improve the customer experience.
Responsibilities:
. Continuously enhanced existing logic, including designing, testing,
  and implementing new changes.
. Wrote PL/SQL and UNIX scripts to run analysis across millions of
  records.
. Checked and optimized queries using indexing and partitioning.
. Used SQL*Loader to load large volumes of data into tables.
. Wrote UNIX/SQL scripts to monitor 8-10 inbound/outbound interfaces
  (see the sketch after this list).
. Worked on performance improvements using Explain Plan and
  indexing/partitioning.
. Wrote custom UNIX/SQL scripts to identify issues and alert
  stakeholders during yearly peaks.
. Coordinated with all inbound/outbound systems on cross-system
  functionality changes and was involved in testing and implementing
  them.
. Used lightweight triggers to implement small pieces of functionality.
. Worked on Advanced Queues for transferring data between systems and
  handled maintenance activity on the AQs.
. Worked on the migration from Advanced Queues to messaging tables for
  custom ETL.
. Created views to logically group data going to different downstream
  systems.
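An interface monitor of the kind described above, run from cron, might
look roughly like this; the directory, table, threshold, and alias
names are hypothetical:

#!/bin/ksh
# Interface monitor: flag inbound feeds whose files are stale and check
# the backlog in the custom messaging table.

ALERT=oncall@example.com

# 1. Any inbound files still unprocessed after 60 minutes?
stale=$(find /data/icustomer/inbound -name '*.dat' -mmin +60 2>/dev/null)
if [ -n "$stale" ]; then
    echo "$stale" | mailx -s "iCustomer: stale inbound files" "$ALERT"
fi

# 2. Backlog in the messaging table that replaced the advanced queues.
backlog=$(sqlplus -s etl_user/"${ETL_PWD}"@ICDB <<'EOF'
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT COUNT(*) FROM msg_outbound WHERE status = 'PENDING';
EOF
)
backlog=$(echo $backlog)                     # trim whitespace

if [ "${backlog:-0}" -gt 10000 ]; then
    echo "Outbound backlog: $backlog rows" |
        mailx -s "iCustomer: queue backlog" "$ALERT"
fi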