SHRADDHA NEEMA
Ph.: 281-***-****; Email: abmpij@r.postjobfree.com
PROFESSIONAL SUMMARY:
Over 8 years of experience in BI and Data Warehouse projects, in roles
including the following:
. Extensively worked with large databases in production environments.
. Experienced in designing and implementing Data Mart applications, mainly
transformation processes, using the ETL tool Informatica PowerCenter and
Business Objects for reporting.
. Proficient in understanding business processes/requirements and
translating them into technical requirements; exposure to the Business
Objects reporting environment.
. Extensively worked on Business Objects universe creation.
. Worked with the Informatica XML component, Web Services component, and
PowerExchange module, and with SAP R/3, Essbase cubes, SQL Server,
Oracle, flat files, and mainframe sources.
. Extensively used BO features to make reports more engaging and
understandable to users.
. ETL process design/development/deployment using Informatica and SQL.
. Performed ETL design/code reviews and made optimization suggestions for
Informatica and ETL processes.
. Data staging strategy/design with ETL tools such as Informatica.
. Extensive work in ETL processes consisting of data transformation, data
sourcing, mapping conversion, and loading.
. Consumed .NET web services from Informatica to perform specific tasks,
and used Informatica's own web services for basic workflow operations
such as login, logout, and starting a workflow.
. Extensively worked with the Informatica XML component to read
hierarchical XML source files and load the data into multiple Oracle
tables.
. Proficient in the XSD concepts needed to read XML source files.
. Worked on two-way data replication between AS/400 and Oracle using the
Informatica PowerExchange CDC (change data capture) component.
. Designed and extracted SAP cost hierarchies and cost center tables with
Informatica mappings; generated ABAP programs using Informatica-to-SAP
R/3 connectivity.
. Extracted data from SAP R/3 in streaming mode.
. Implemented ETL strategies that optimize mapping performance.
. Sound dimensional modeling concepts for data warehouses, including Star
Schema and Snowflake Schema methodologies.
. Scheduled ETL processes using UNIX scripting and the cron job scheduler.
. Executed sessions and sequential and concurrent batches for proper
execution of mappings, and sent e-mail notifications using Server Manager.
. Project Planning, Scheduling and Leadership.
. Proficient in analysis, design, development, and maintenance projects.
. Client liaison, requirement gathering, and impact analysis.
. Six Sigma trained, with knowledge of process rigor in software projects.
. Well-developed interpersonal, analytical, and communication skills.
EDUCATION:
Bachelor of Engineering in Computer Science
CERTIFICATION:
Informatica Certified Professional
. PowerCenter 6 Architecture and Administration
. PowerCenter 6 Mapping Design
. PowerCenter 6 Advanced Mapping Design
. PowerCenter 6 Advanced Administration
OCP Certifications
. Certification in Oracle 9i SQL (OCP Level 1, 1Z0-007)
. Certification in Oracle 9i SQL (OCP Level 2, 1Z0-031)
. Six Sigma Green Belt Certified.
SKILL SETS:
Technical: ETL Design, Architect, Development, Testing, Tuning and
Scheduling
Software & Tools: Informatica PowerCenter 8.x/7.x/6.x, Business
Objects 6.x, TOAD, SQL*Plus, PL/SQL Developer, Secure UNIX
Shell Script, Erwin
Methodologies: Star Schema, Snowflake Schema
RDBMS: Oracle 9i/8i (SQL), SQL Server 2005
Operating System: Windows 2000/XP, Windows NT
Professional Experience:
Litton Loan Servicing, Houston Tx Sept 2008 - March 2010
ETL Developer
As an ETL developer, I worked on multiple projects at Litton Loan
Servicing, such as the BPO (Broker Price Opinion) and Inspection data
loads using the Informatica XML component, a Data Push Utility for
technical liaisons, and data replication between AS/400 and Oracle using
the PowerExchange component.
Responsibilities:
. Collected requirements from business users for the Data Push Utility
(DPU) project, which gives business users the flexibility to load data
in very little time.
. Used the Tidal scheduler to schedule each DPU module, so that users can
drop flat files to a network location at any time and the data is loaded
to the respective Oracle tables.
. Created requirements and technical design documents for the BPO and
Inspection loads from the existing Java code.
. Implemented an ETL strategy that optimizes mapping performance for the
BPO and Inspection loads, as we received 5,000 to 6,000 XML files that
needed to be processed every day.
. Created BPO and Inspection data mapping documents that provide the
source-to-target mapping for the data extracted from XML files to Oracle
tables.
. Used a Java transformation to extract the multiple photo files attached
to BPO and Inspection XML files and convert JPG, GIF, and TIFF images to
PDF so that users can access them easily via the PEARL front end.
. Consumed .NET web services from Informatica to create CMA and Inspection
PDF forms for better analysis of a loan.
. Used Informatica's own web services to start workflows, log in, and log
out.
. Used the Informatica concurrent-execution feature to run a workflow in
multiple instances simultaneously, which allowed multiple XML files to
be loaded to Oracle tables in parallel.
. Designed and developed Informatica mappings to replicate data from
AS/400 to Oracle and vice versa using the PowerExchange change data
capture (CDC) module.
. Designed and developed complex mappings in Informatica to load data from
various sources using transformations such as Union, Source Qualifier,
Lookup (connected and unconnected), Expression, Aggregator, Update
Strategy, Sequence Generator, Joiner, Filter, and Router.
. Performed data analysis and modifications per user requests for the DPU,
BPO, and Inspection loads.
. Responsible for the migration of Informatica mapping to UAT
environment.
. Performed ETL design/code reviews for Informatica and ETL Processes.
. Performed integrated testing for various mappings. Tested the data and
data integrity among various sources and targets.
. Executed sessions and sequential and concurrent batches for proper
execution of mappings, and sent e-mail notifications using Server Manager.
. Identified source systems, connectivity, tables, and fields, and ensured
data suitability for mapping.
. Production deployment of Informatica mappings.
Environment: Informatica Power Center 8.x, Informatica XML component, Power
exchange component, Web service component, xml file, MS Excel, Flat files,
Tidal Scheduler, PL/SQL, Erwin Data Modeler.
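The hierarchical-XML load described above can be sketched roughly as
follows. This is an illustrative Python outline only (the standard-library
ElementTree standing in for the Informatica XML component; the <bpo>
structure and field names are invented for the example, not the actual
project schema):

```python
# Illustrative sketch: flatten a hierarchical XML document (as the
# Informatica XML component would) into parent/child row sets that
# could then be bulk-loaded into separate Oracle tables.
# The <bpo>/<address>/<photo> element names are hypothetical.
import xml.etree.ElementTree as ET

SAMPLE = """
<bpo loan_id="1001">
  <address street="12 Main St" city="Houston"/>
  <photo name="front.jpg"/>
  <photo name="rear.gif"/>
</bpo>
"""

def flatten(xml_text):
    root = ET.fromstring(xml_text)
    loan_id = root.get("loan_id")
    # One row for the parent table, one row per repeating child element,
    # each carrying the parent key so the tables can be joined later.
    bpo_row = {"loan_id": loan_id, **root.find("address").attrib}
    photo_rows = [{"loan_id": loan_id, "name": p.get("name")}
                  for p in root.findall("photo")]
    return bpo_row, photo_rows

bpo, photos = flatten(SAMPLE)
print(bpo["city"], len(photos))   # Houston 2
```

In the real pipeline the equivalent of `flatten` ran once per incoming
file, with concurrent workflow instances handling many files in parallel.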
Plant Operations Key Performance Indicator Dashboard - Reliant Energy,
Houston Tx Feb 2008 - Aug 2008
ETL Technical Lead
Developed and implemented a repeatable and sustainable Key Performance
Indicator scorecard in a Dashboard for Plant Operations that measures
performances against the Operations Excellence commitments in spend,
safety, environmental, and GCTWF. The Plant Ops KPI Dashboard
Implementation project was initiated by Reliant to enable the dissemination
of performance reporting data via a central dashboard to key decision
makers located across the organization. Before this dashboard, plant
reports were created in Excel or Word and distributed through email or
shared network folders, which caused delays in the sharing of information
and lacked the controls and security required for financial and key
performance information. A centralized dashboard for Plant Ops KPIs will
enable financial and operational information to be available more
efficiently and will also move Reliant closer to "one version of the
truth" in metric information.
Responsibilities:
. Collected requirements from business users and analyzed based on the
requirements.
. Created Requirement as well as technical design documents.
. Implemented ETL Strategy, which optimizes Mapping Performance.
. Created data mapping documents that provide the source-to-target mapping
for the data extracted to the ODS/ETL staging area and then loaded into a
dimensional model by subject area.
. Actively involved in establishing SAP R/3-to-Informatica connectivity.
. Actively involved in installing Hyperion Data Integration Management and
the ERP/CRM SAP and Hyperion Essbase connectors.
. Designed and developed Informatica mappings with SAP R/3 as one of the
source systems.
. Extracted SAP cost hierarchies and cost center tables with Informatica
mappings; generated ABAP programs using Informatica-to-SAP R/3
connectivity.
. Designed and developed a sustainable method of populating the dashboard
Essbase cube by creating Informatica mappings that use Hyperion DIM to
load data into the Plant Ops KPI Dashboard and the Wholesale Dashboard.
. Designed and developed complex mappings in Informatica to load data from
various sources using transformations such as Union, Source Qualifier,
Lookup (connected and unconnected), Expression, Aggregator, Update
Strategy, Sequence Generator, Joiner, Filter, and Router.
. Performed Data analysis and modifications as per user request.
. Responsible for the migration of Informatica mapping to UAT
environment.
. Performed ETL design/code reviews for Informatica and ETL Processes.
. Performed integrated testing for various mappings. Tested the data and
data integrity among various sources and targets.
. Executed sessions and sequential and concurrent batches for proper
execution of mappings, and sent e-mail notifications using Server Manager.
. Communicated heavily with clients for a better understanding of the
requirements.
. Identified source systems, connectivity, tables, and fields, and ensured
data suitability for mapping.
. Production deployment of Informatica mappings.
. Handled ad-hoc database change requests and transition to the
maintenance team.
Environment: Hyperion DIM (Data Integration Management) 9.3.1.1, Source
system - SAP R/3 (FI, CO modules), Hyperion Analytic Services (Essbase)
9.x, MS Excel, Flat files, Toad Data Modeling tool.
CFS Marketing Data Mart GE commercial finance, Norwalk CT Dec 2006
- Jan 2008
Sr Developer/ETL Technical Lead
Creation of a Marketing Data Mart which incorporates new and existing
Siebel, Workflow, LPC, and Oracle HR data. The Marketing Data Mart (SMART)
will be used by Marketing and Finance (Pricing) employees to improve sales
force effectiveness. In addition to the database and load procedures,
Business Objects universes will be created to provide users with an
extensive canned and ad-hoc reporting environment. Previously, users
created manual reports in MS Excel; users have now been trained in
Business Objects so they can effectively use the BO reporting environment
to create, save, and rerun their own reports to get updated data.
Responsibilities:
. Designed and developed complex mappings in Informatica to load data from
various sources using transformations such as Union, Source Qualifier,
Lookup (connected and unconnected), Expression, Aggregator, Update
Strategy, Sequence Generator, Joiner, Filter, and Router.
. Used Informatica 7 features such as object sharing across folders and
version control, so that older versions of any object can be retrieved
via queries when needed.
. Collected requirements from business users and analyzed based on the
requirements.
. Implemented ETL Strategy, which optimizes Mapping Performance.
. Used Informatica to test the mapping and fixed the bugs.
. Followed the client's internal template to create unit test plan
documents that validate transformation-to-transformation and
field-to-field values.
. Designed mapping documents, which includes details of field-to-field
mapping.
. Utilized standard documentations for transformations and Mapping
variables.
. Involved in data mapping and conversions using transformations and
data cleansing.
. Performed Data analysis and modifications as per user request.
. Responsible for the migration of Informatica mapping to UAT
environment.
. Performed ETL design/code reviews for Informatica and ETL Processes.
. Used the Informatica session partitioning concept to improve loading
time.
. Implemented a detailed error handling strategy to capture all errors
from within the mappings and reprocess the error records from the error
tables.
. Performed integrated testing for various mappings. Tested the data and
data integrity among various sources and targets.
. Executed sessions and sequential and concurrent batches for proper
execution of mappings, and sent e-mail notifications using Server Manager.
. Communicated heavily with clients for a better understanding of the
requirements.
. Identified source systems, connectivity, tables, and fields, and ensured
data suitability for mapping.
. Designed and developed Informatica mappings.
. Involved in Creating Business Object Universe.
. Utilized BO functionality to hide the inner complexity of the data
warehouse from users.
. Created the BO reports per the client requirements.
. Extensively used BO features to make reports more engaging and
understandable to users.
. Used the BO security feature to show only the facts required by the
respective users.
. Utilized BO pie charts and column charts to present details pictorially.
. Used drill-up/drill-down and slice-and-dice functionality, as well as
BO's customized list-of-values feature.
. Trained the client on creation of Business Object Reports.
. Production deployment of Informatica mappings and Business Object
universe.
. Handled ad-hoc database change requests and transition to the
maintenance team.
Environment: Informatica Power Center 8.0, MS Excel, Flat files, Business
Objects 6.5
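The capture-and-reprocess error handling mentioned above follows a common
pattern: rejected rows land in an error table along with a failure reason,
and a later pass retries them. A minimal sketch of the idea, with sqlite3
standing in for Oracle and all table/column names hypothetical:

```python
# Sketch of capture-and-reprocess error handling: bad rows go to an
# error table instead of failing the load; a later pass retries them.
# sqlite3 stands in for Oracle; table and column names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, amount REAL)")
con.execute("CREATE TABLE load_errors (id TEXT, amount TEXT, reason TEXT)")

def load_row(row):
    try:
        con.execute("INSERT INTO target VALUES (?, ?)",
                    (int(row["id"]), float(row["amount"])))
    except (ValueError, sqlite3.IntegrityError) as exc:
        # Capture the raw values plus the failure reason for later review.
        con.execute("INSERT INTO load_errors VALUES (?, ?, ?)",
                    (row["id"], row["amount"], str(exc)))

def reprocess():
    # Retry captured rows (e.g. after the data is corrected), clearing
    # each from the error table before the retry.
    for rid, amount, _ in con.execute("SELECT * FROM load_errors").fetchall():
        con.execute("DELETE FROM load_errors WHERE id = ?", (rid,))
        load_row({"id": rid, "amount": amount})

load_row({"id": "1", "amount": "100.5"})
load_row({"id": "2", "amount": "oops"})   # lands in load_errors
```

The key property is that one bad record never aborts the batch; it is
quarantined with enough context to be fixed and replayed.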
GLM Adhoc Reporting GE commercial finance, TCS - Mumbai Feb
2006 - Nov 2006
Data Architect/ETL Team Lead
Global Lease Management (GLM) is a single leasing system that all the GE
Commercial Finance businesses can use which will drive the standardization
of processes and data. This will replace the various leasing systems that
currently exist across GE Commercial Finance.
The UK business uses a legacy system called DEC Alpha, which GLM will
replace. BOB is currently used by the UK business as an ad-hoc reporting
tool and has become essential to the day-to-day management of various
departments. BOB is refreshed with data from the DEC Alpha system on a
monthly basis; the feed from DEC Alpha is FTP'ed in the form of flat files
and loaded into the BOB universe tables. The database was built up over
time to meet successive user needs and contains the historical data
present in the DEC Alpha system.
On rollout of GLM, DEC Alpha will no longer feed the BOB universe with the
data it currently receives, so the business would lose its current
reporting capabilities; GLM will be used for reporting purposes.
Responsibilities:
. Facilitated end-user training on the Business Objects universe and on
creating ad-hoc reports.
. Production deployment of Informatica mapping and Business Object
universes & reports
. Extracted data from heterogeneous data sources like Oracle, Flat files
etc.
. Extensively involved in Data Extraction, Transformation and Loading
(ETL process) from Source to target systems.
. Part of the Data Modeling Team and responsible for giving client
presentations about different strategies to implement the ETL routine.
. Developed new Builds and ETL jobs as per Business requirements.
. Designed and developed Informatica mappings.
. Responsible for Project management activities as the ETL team Lead to
include analyzing data, creating data plans, assigning task and
meeting with stakeholders.
. Implemented an ETL strategy that optimizes mapping performance, reducing
the time needed to load large amounts of data into the target system.
. Scheduled ETL processes using UNIX shell scripts run from crontab.
. Used the Debugger for testing and performance tuning of Informatica
mappings.
. Utilized the internal standard template to create unit test plan
documents that validate transformation-to-transformation and
field-to-field values.
. Designed complex builds involving incremental refreshes of huge fact
tables.
. Analyzed data and made modifications per user request.
. Migrated Informatica mapping to UAT for user to test the data.
. Performed ETL design/code reviews and made optimization suggestions for
Informatica and ETL processes.
. Implemented a detailed error handling strategy to capture all errors
from within the mappings; reprocessed the error records from the error
tables.
. Created a high-level overview document describing the shell scripts.
. Analyzed and implemented enhancements, operations and maintenance of
data warehouse.
. Handled ad-hoc database change requests and transition to the
maintenance team.
. Worked on different instances of the BO universe to test the reports.
. Used BO prompt syntax for interactive reports.
. Primarily responsible for resolving loops in the BO universe using
aliases and contexts.
. Involved in performance tuning of various queries.
Environment: Informatica 7.x, PL/SQL Developer, Oracle 9i, MS Excel, Flat
Files, Windows 2000, Business Objects 6.5, Erwin
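The incremental refreshes of huge fact tables mentioned above are
typically watermark-driven: each run loads only the source rows changed
since the last successful load, rather than rebuilding the table. A rough
sketch of that pattern (sqlite3 in place of Oracle; all table and column
names are hypothetical, not the project's actual schema):

```python
# Watermark-driven incremental refresh: each run loads only source rows
# modified after the last successful load, then advances the watermark.
# sqlite3 stands in for Oracle; names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_txn  (txn_id INTEGER, amount REAL, updated_at TEXT);
CREATE TABLE fact_txn (txn_id INTEGER, amount REAL);
CREATE TABLE etl_watermark (last_loaded TEXT);
INSERT INTO etl_watermark VALUES ('1900-01-01');
""")

def incremental_refresh():
    (wm,) = con.execute("SELECT last_loaded FROM etl_watermark").fetchone()
    rows = con.execute(
        "SELECT txn_id, amount, updated_at FROM src_txn "
        "WHERE updated_at > ?", (wm,)).fetchall()
    for txn_id, amount, _ in rows:
        con.execute("INSERT INTO fact_txn VALUES (?, ?)", (txn_id, amount))
    if rows:
        # Advance the watermark to the newest row we just loaded.
        con.execute("UPDATE etl_watermark SET last_loaded = ?",
                    (max(r[2] for r in rows),))
    return len(rows)

con.execute("INSERT INTO src_txn VALUES (1, 10.0, '2006-05-01')")
con.execute("INSERT INTO src_txn VALUES (2, 20.0, '2006-05-02')")
incremental_refresh()          # loads both rows
con.execute("INSERT INTO src_txn VALUES (3, 30.0, '2006-05-03')")
incremental_refresh()          # loads only the new row
```

Advancing the watermark only after a successful load is what makes the
refresh safely rerunnable after a failure.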
GE VFS CDF Gen -2 GE Commercial Finance, TCS - Mumbai India Dec 2005 -
Jan 2006
Project Lead
The CDF (Commercial Distribution Finance) Finance department had no way of
reporting the full picture on a customer/VBU (i.e., wing-to-wing reporting
showing risk, sales/opportunity, finance, and AIMS operational history
information) in order to develop a more accurate picture of a given
customer or VBU. For this purpose they created VFS (Vendor Finance
Services) CDF. VFS CDF Gen 2 is an enhancement of VFS CDF.
Responsibilities:
. Participated in maintaining the Business Objects universe and creating
reports.
. Trained users on the Business Objects universe and on creating ad-hoc
reports.
. Handled ad-hoc database change requests and transition to the
maintenance team.
. Worked on different instances of the BO universe to test the reports.
. Used BO prompt syntax for interactive reports.
. Production deployment of Informatica mappings and the BO universe.
. Responsible for user acceptance assistance.
. Assisted users in creating their own reports.
. Developed and implemented ETL processes using Informatica to move data
from different sources into the data mart.
. Designed the ETL flow strategy for mappings.
. Responsible for project management activities, including general project
oversight.
. Kept track of all meetings in the form of meeting minutes.
. Used TestDirector to create unit test plan documents for testing
purposes.
. Used TestDirector to create the unit test result documents for the
UTPs.
. Facilitated transition training to the Maintenance team for better
understanding and flow of mappings.
. Created run guides explaining the source and target systems and how they
are related.
. Performed Data analysis and modifications as per user request.
. Reviewed all Informatica mappings before migration to UAT according to
ETL standards.
. Performed ETL design/code reviews and Optimization suggestions for
Informatica & ETL Processes.
. Involved in Performance tuning of various queries.
. Scheduled ETL processes using UNIX shell scripting in Cron Job.
Environment: Informatica 7.x, PL/SQL Developer, Oracle 9i, MS Excel, Flat
Files, Windows 2000, Business Object 6.5, TestDirector
GE VFS CDF - SRS GE Commercial finance, TCS- Mumbai India Feb 2004 -
Oct 2005
Module Lead
The CDF Finance department has no way of reporting the full picture on a
customer / VBU (i.e. wing-to-wing reporting showing risk, sales /
opportunity, finance, and AIMS operational history information) in order to
develop a more accurate picture of a given customer or VBU. For this
purpose they came up with CDF SRS. CDF SRS one is the conversion project.
They have database in SQL server and primarily SRS data deals with North
America and Europe data and for that purpose they have two different
database one for North America and other one for Europe. And on top of that
they have two universes. Once we have migrated whole data from SQL server
to Oracle 9i, we have only one data base and on top of it one universe. By
doing this, we get integrated data for both regions. Also, privacy &
security issues are being considered, Europe associate has only access to
Europe's data. We can do that at Business object level.
Responsibilities:
. Production deployment of Informatica mappings and Business Object
universes & reports.
. Participated in Building Business Object Universe and creating Reports
per User requirement.
. Developed UNIX shell scripting for scheduling jobs.
. Handled Ad-hoc database change requests.
. Responsible for User Acceptance assistance.
. Used the Transformation Developer to create Filter, Lookup, Joiner,
Update Strategy, Expression, and Aggregator transformations.
. Managed repositories; created user groups and set up their privileges
and profiles.
. Designed mapping documents, which includes details of field-to-field
mapping
. Scheduled and monitored tasks, sessions, and workflows; performed
performance tuning for efficient loading of data from various sources.
. Executed sessions, sequential and concurrent batches for proper
execution of mappings.
. Used session partitions, dynamic cache memory, and index caches to
improve the performance of the Informatica server.
. Developed test specification documents in TestDirector and kept track of
the number of issues.
. Responsible for overall project management activities.
. Data analysis and modifications per user request.
Environment: Informatica 7.x, PL/SQL Developer, Oracle 9i, MS Excel, Flat
Files, Windows 2000, Business Object 6.5, Test Director
Sales Work Bench: Variable Compensation GE Health Care System, TCS - Mumbai
India
Oct 2003 - Jan 2004
Team Member
Callidus Software EIM System True-Comp is used by the GE Healthcare to
automate the sales compensation paid to sales representatives. GE Medical
Systems uses TrueComp Manager to calculate compensation for selling
medical equipment across the globe. TrueComp Manager enables compensation
professionals to efficiently model, implement, and monitor incentive
compensation programs with easy-to-create business rules. Data is loaded
into the different tables using Informatica.
Information is presented in Actuate reports, where the data is fetched
from Oracle using Actuate Basic scripting. The reports provide sales
representatives with various information related to their performance, and
use page-level security. All reports are connected to the SWB portal using
Java programs, and SWB is accessible to sales representatives via the web.
Responsibilities:
. Developed various Mappings, Transformations and Mapplets for migration
of data using Informatica Designer
. Worked extensively with complex mappings using expressions,
aggregators, filters, lookups, Update transformations in Informatica
. Followed Six Sigma process for every phase of the project.
. Increased the cache size of transformations depending upon the
requirements
. Designed mapping documents, which includes details of field-to-field
mapping
. Performed data migration, data cleansing and aggregation during the
transformation
. Configured Informatica Server Manager. Scheduled data integration
Sessions & Batches
. Responsible for Performance tuning in Informatica
. Performed Error Handling and debugging using Informatica debugger
. Responsible for data staging strategy/designing with ETL tools like
Informatica
. Implemented ETL Strategy, which optimizes Mapping Performance.
. Used internal Quality analysis tool for the mapping and related
documents.
. Developed test specifications documents in Test Director.
. Tested ETL mapping results and validations.
Environment: Informatica 6.2, PL/SQL Developer, Oracle 9i, Windows 2000,
TestDirector
Data Bypass, SVITS - Indore India Jan 2002-
Aug 2003
Team Member
This system uses serial communication to pass data from one computer to
another over a serial cable. It transfers data at very low cost, with no
unreliable floppy disks: you simply connect the cable between the
respective computers and transfer the data.
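A point-to-point transfer like this needs framing so the receiver knows
where each message ends and whether it arrived intact. As a rough
illustration of the idea only (the actual project was implemented in
VC++/C over a real serial port), a length-prefixed frame with a simple
checksum might look like:

```python
# Illustrative framing for a point-to-point serial transfer: a 2-byte
# big-endian length prefix plus a 1-byte additive checksum, so the
# receiver can detect truncated or corrupted frames.
# (Sketch only; the original project used VC++/C and real serial hardware.)
import struct

def encode_frame(payload: bytes) -> bytes:
    checksum = sum(payload) & 0xFF
    return struct.pack(">H", len(payload)) + payload + bytes([checksum])

def decode_frame(frame: bytes) -> bytes:
    (length,) = struct.unpack(">H", frame[:2])
    payload, checksum = frame[2:2 + length], frame[2 + length]
    if (sum(payload) & 0xFF) != checksum:
        raise ValueError("checksum mismatch: frame corrupted in transit")
    return payload

wire = encode_frame(b"hello")      # the bytes written to the serial port
assert decode_frame(wire) == b"hello"
```

On the wire, the sender writes `wire` byte by byte and the receiver reads
the two-byte length first, so it knows exactly how many bytes to wait for.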
Responsibilities:
. Part of the requirements analysis team; gathered requirements from the
client and mapped them to technical requirements.
. Analyzed the source and target systems.
. Designed the 9-pin and 25-pin connectors to interface with the computers
so that data could be transferred.
. Developed the code in VC++ for the sending and receiving modules.
. Applied serial communication concepts in the C language.
. Tested the functionality as well as the GUI.
. Created the UTPs and UTRs for testing purposes.
. Responsible for Quality activities including code review while
strictly adhering to the coding standards.
Environment: PL/SQL, VC++, Dynamic C, Oracle 8i, Windows 2000.