BHASKAR M
Email : **************@*****.***
Mob no : (M) +1-763-***-****
* Over 8.5 years of experience in the IT industry in development,
maintenance and enhancement projects for Data Warehousing and Client
Server Architecture applications.
* Extensively worked on creating Mappings, Transformations, Sessions and
Workflows to extract data from various sources and load into targets.
* Experienced in performance tuning of mappings, sessions and workflows
including partitioning.
* Extensively worked on Informatica PowerCenter transformations such as
Source Qualifier, Lookup, Filter, Expression, Router, Normalizer,
Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter,
Sequence Generator, XML Source Qualifier and External Procedure.
* Involved in full life cycle development of Data Warehousing.
* Excellent understanding of star schema and snowflake schema modeling,
and of fact, lookup and dimension tables.
* Extensive knowledge on various performance tunings for mappings,
sessions and stored procedures.
* Excellent skills in understanding business needs and converting them
into technical solutions.
* Extensive experience in error handling and problem fixing in
Informatica.
* Strong technical, communication, time management and interpersonal
skills.
Technical Skills
> ETL Tools : Informatica PowerCenter 7.x/8.x/9.x, Metadata
Manager, Business Glossary, Data Analyzer,
PowerExchange, PowerConnect
> RDBMS : Oracle, DB2, SQL Server, MS Access
> Programming Languages : Shell Script, Batch Script
> Scheduling Tools : WLM, CA7
> Operating Systems : Windows NT/2000/XP, MS-DOS, UNIX
> Data Modeling Tools : Erwin, Visio
Academic Profile
> Master of Computer Applications from Anna University.
> Bachelor of Computer Science from Acharya Nagarjuna University.
CERTIFICATIONS
> Informatica 8.x Designer Level
> Informatica 8.x Advanced & Architecture Level
> Informatica 8.x Advanced Designer Level
> HIPAA Certified
> Brainbench certified in Business Communication 2010
> Brainbench certified in SQL Server 2005
> Brainbench certified ISTQB Foundation 2008
Experience Summary
Project #1 DTCC(Depository Trust and Clearing Corporation Warehouse)
Client : Allianz
Role : Senior ETL Developer
Duration : Jan '14 to present
Location : Minneapolis MN
Environment : Informatica 9.6/9.5/9.1, Informatica ILM, PowerExchange,
PowerExchange Navigator, Oracle, DB2, Mainframe, UNIX, Linux, CA7,
OBIEE, SAP R/3, SAP AP, SSIS, SSRS, Erwin
The Depository Trust & Clearing Corporation (DTCC) is a US post-trade
financial services company providing clearing and settlement services to
the financial markets. It provides a way for buyers and sellers of
securities to make their exchange in a safe and efficient way. It also
provides central custody of securities. Through its subsidiaries, DTCC
provides clearance, settlement, and information services for equities,
corporate and municipal bonds, unit investment trusts, government and
mortgage-backed securities, money market instruments, and over-the-counter
derivatives. It also manages transactions between mutual funds and
insurance carriers and their respective investors.
As part of the DTCC project we replaced the legacy Mercator system with
Informatica and integrated multiple source systems, VS10 and ALIP. DTCC
sends new business information in the form of an APP feed, which is the
source used to issue policies. In addition to the APP feed, DTCC also
sends and receives COM, POV, FAR and AAP data.
Responsibilities:
o Involved in analysis, LLD, HLD and documentation of mapping specs as
per business rules.
o Involved in understanding test requirements
o Prepared test strategy and test directories
o Participated in test case development and execution
o Participated in test results document preparation for the project
o Participated in bug reporting and tracking for the project
o Extensively used PowerExchange to source web services data.
o Extensively used PowerExchange to source Salesforce APEX data.
o Created data maps and ran row tests in PowerExchange Navigator.
o Created data maps for Oracle sources in PowerExchange.
o Performed ETL data loading into the OBIEE environment.
o Familiar with the OBI data structures.
o Extensively used CDC on real-time data.
o Extensively used SQL Server databases as both source and target.
o Participated in review of the test document for the end-to-end testing
o Prepared project status reports for the work carried out
o Collected and reported metrics on end-to-end testing.
o Hands-on experience in Erwin, creating logical and physical models.
o Experience in the reverse engineering process using Erwin.
o Experience in dimensional modeling, creating star and snowflake
schemas.
o Experience implementing 1NF and 3NF in projects.
o Involved in low-level design; developed various jobs and performed
data loads and transformations using different transformations of
Informatica.
o Experienced in ETL auditing process.
o Used most of the transformations such as Source Qualifier, Filter,
Joiner, Sequence Generator, Expression, Aggregator, Connected & Dynamic
Lookup, Rank, Router, and Update Strategy, SQL Transformation, etc.
o Scheduled the Informatica Workflows as per user requirement by using
AUTOSYS scheduling tool.
o Created retention policies and assigned them to entities
o Created optimized file archive projects with retention policies
o Ran jobs to apply retention policies and viewed the applied retention
policies
o Performed performance tuning to increase throughput at both the
mapping and session level, as well as SQL query optimization.
o Experienced with the Informatica connector to SAP.
o Experience with SAP Accounts Payable.
o Experienced in deploying the code into releasing folder and placing it
into Build and troubleshooting the build issues.
o Involved in unit testing to test the processed data and user acceptance
testing.
o Experienced in troubleshooting and fixing various
Development/Production environment issues.
o Involved in environment creation of metadata manager.
o Involved in recycling the Repository and Integration Services and in
creating automated deployment scripts using the pmrep command.
o Involved in workflow and mapping deployment using deployment groups
from Dev to UAT and from UAT to the Quality and Production
repositories.
o Experienced in writing unit test cases with multiple scenarios.
o Generated reports in SSRS.
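The pmrep-based deployment automation mentioned above can be sketched as a small shell script. This is a minimal dry-run sketch, not the actual deployment script: the repository names, domain, deployment-group name and control file below are illustrative placeholders, and the pmrep invocations assume a configured PowerCenter command-line client.

```shell
#!/bin/sh
# Dry-run sketch of an automated deployment using pmrep.
# All names and credentials here are illustrative placeholders,
# not values from any real environment.
SRC_REPO="DEV_REPO"
TGT_REPO="UAT_REPO"
DEPLOY_GROUP="REL_2014_06"
CONTROL_FILE="deploy_control.xml"

# With DRY_RUN=1 the commands are printed instead of executed,
# so the sequence can be reviewed before a deployment window.
DRY_RUN=1
run() {
    if [ "$DRY_RUN" -eq 1 ]; then
        echo "WOULD RUN: $*"
    else
        "$@"
    fi
}

# Connect to the source repository, then push the deployment group
# to the target repository using a prepared control file.
run pmrep connect -r "$SRC_REPO" -d Domain_Dev -n "$PMUSER" -X PMPASS
run pmrep deploydeploymentgroup -p "$DEPLOY_GROUP" -c "$CONTROL_FILE" \
    -r "$TGT_REPO"
```

Setting DRY_RUN=0 would execute the commands for real; keeping it at 1 lets the sequence be reviewed first.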
Project #2 Bankline, GPL, RMP MI
Client : RBS
Role : Analyst
Duration : Jun'12 to Dec'13
Environment : Informatica 9.1, Informatica ILM, OBIEE, Oracle, DB2,
Mainframe, UNIX, Linux, CA7, SSRS, Erwin, PowerExchange.
Enterprise Data and Information is the RBS MI data warehouse. The
overall project objective is to populate the data layer, provide
concentration risk extracts to the risk modeling team, and populate the
information layer (dimensions and facts). The process flow extracts data
from heterogeneous source systems, applies business logic, and loads the
data into the warehouse.
Responsibilities:
o Extracting the data from the Enterprise Memory Sources
o Transforming the data as specified in the Business Rules
o Loading the data into the Data Layer through DWS
o Identifying/Analyzing root-cause for the bugs/defects raised by Users
and QA team in DWH model.
o Fixing bugs by modifying existing ETL code/logic or proposing new ETL
design to meet user requirements.
o Performed ETL data loading into the OBIEE environment.
o Familiar with the OBI data structures.
o Expert in CDC on real-time data.
o Extensively used SQL Server databases as both source and target.
o Identifying the reason for rejection records in production during ETL
data loads and fixing them by modifying/altering ETL code.
o Perform end-to-end impact analysis in case of any changes required in
existing ETL code.
o Extensively used PowerExchange to source third-party data.
o Experienced in creating logical and physical models with Erwin.
o Experienced in reverse engineering concepts using Erwin.
o Hands-on experience with the Erwin tool.
o Communicating/Coordinating with DB admins and Informatica admins to
resolve load issues.
o Created retention policies and assigned them to entities
o Created optimized file archive projects with retention policies
o Ran jobs to apply retention policies and viewed the applied retention
policies
o Communicating with various source teams to understand business
logic/purpose.
o Tuning long running sessions by changing ETL code and SQ/LKP queries.
o Involved in peer-code reviews.
o Monitored jobs through Workflow Monitor, Workflow Manager and
Designer.
o Monitored batch schedules in the CA7 tool.
o Created high-, normal- and low-priority incidents and changes using
Smile (a ticketing tool).
o Experience in report performance tuning in SSRS and MS SQL database
query optimization.
o Understood errors and found solutions to complete daily, weekly,
monthly, quarterly and yearly batches.
o Checked whether files had been placed in the UNIX directories.
o Handled complex mappings and issues and resolved them on time.
o Experience using MS SQL Server Reporting Services (SSRS).
o Communicated directly with clients and provided smooth support to the
business.
o Handled critical support activities on quarterly and yearly cycles.
o Estimating efforts required to close defects.
o Performed unit testing and integration testing thoroughly to ensure
there were no flaws in the code.
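The UNIX file-arrival check described above can be sketched as a small shell routine. This is an illustrative sketch only: the landing directory and file names are assumptions, not the actual RBS paths.

```shell
#!/bin/sh
# Sketch of a pre-load check: verify that the expected source files
# have been placed in the landing directory before the batch starts.
# Directory and file names are illustrative placeholders.
check_files() {
    dir="$1"; shift
    missing=0
    for f in "$@"; do
        if [ -s "$dir/$f" ]; then
            echo "OK      $f"
        else
            echo "MISSING $f"
            missing=$((missing + 1))
        fi
    done
    # The return status doubles as the count of missing files, so the
    # scheduler (e.g. CA7) can hold the batch when it is non-zero.
    return "$missing"
}

TODAY=$(date +%Y%m%d)
check_files /data/landing "bankline_${TODAY}.dat" "gpl_${TODAY}.dat" \
    || echo "batch held: source files missing"
```

The same routine can be reused per feed by passing a different directory and file list.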
Project #3 UNISYS_HORIZON_DMDWH_BI_ASM_SAU
Client : Unisys
Role : Sr. Software Engineer
Duration : Dec'10 to Jun'12
Environment : Informatica 8.6, Oracle, Salesforce, Apex Explorer,
SQL Server 2008.
Unisys is a worldwide information technology consulting services and
solutions company combining expertise in Consulting, Systems integration,
Outsourcing, Infrastructure and Server technology. Unisys serves five
primary vertical markets worldwide namely financial services, Public
Sector, Communications, Transportation and Commercial. Application support
model (ASM) for the Business Intelligence team requires enhancements and
maintenance of the existing Data warehouse, ETL loads and Reporting
requirements.
This project is the maintenance of the Unisys IT corporate DWH and its
supporting applications. Maintenance, minor and major enhancements, and
service request resolution are the main scope of work, with the goal of
building a highly integrated data warehouse using Informatica data
integration tools.
Responsibilities:
o Analyzed source data and gathered requirements from the business users.
o Developed approach paper according to the business rules.
o Had very good experience handling the SAP account processing data
structure.
o Prepared technical specifications to develop Informatica ETL mappings
to load data into various tables conforming to the business rules.
o Developed mappings with transformations conforming to the business
rules.
o Prepared documentation for each mapping according to the design logic
used in the mapping
o Involved in understanding test requirements
o Prepared test strategy and test directories
o Participated in test case development and execution
o Participated in test results document preparation for the project
o Participated in bug reporting and tracking for the project
o Participated in review of test document for the end-to-end testing
o Prepared project status reports for the work carried out
o Collected and reported metrics on end-to-end testing.
o Experience in connecting Informatica with SAP.
o Preparing Test Cases
o Processed session data using threads and various tasks (session,
command etc.), managed database connections and scheduled workflows.
o Performance tuning of mappings, transformations and sessions to
optimize session performance.
Project #4 Management by Metrics
Client : WellPoint
Role : ETL Developer
Duration : Nov'2009 to Dec'2010
Environment : Informatica 8.6, Oracle, SQL Server, Teradata, Erwin,
Shell Scripts, Windows NT, UNIX
MBM WAREHOUSE
As part of Management by Metrics (MbM) initiative, WellPoint has identified
a list of KPIs and supporting metrics for measurement and continuous
improvement framework. Each of the metrics will be measured against targets
set by WellPoint IT leadership. Apply the business rules on the source
data, build data warehouse, and load data to the data warehouse. Give
support to the metrics reporting.
Responsibilities:
o Analyzed source data and gathered requirements from the business users.
o Developed approach paper according to the business rules.
o Prepared technical specifications to develop Informatica ETL mappings
to load data into various tables conforming to the business rules.
o Developed mappings with transformations and mapplets conforming to
the business rules.
o Prepared documentation for each mapping according to the design logic
used in the mapping
o Performed performance tuning at the mapping and transformation level
o Interacted with the onshore team and the database team on any issues
o Preparing Test Cases
o Processed session data using threads and various tasks (session,
command etc.) and managed database connections and scheduled workflows.
o Performance tuning of mappings, transformations and sessions to
optimize session performance.
o Tuned performance of Informatica session for large data files by
increasing block size, data cache size, sequence buffer length and
target based commit interval.
o Extensively used stored procedures and also worked with parallel
processing.
o Created shared folders, local and global shortcuts to reuse metadata.
o Worked on issues with migration from development to testing.
o Worked with business analysts for requirement gathering, business
analysis, and testing and project- coordination.
Project #5 Central Repository
ESI is one of North America's leading drug store operators, with
more than 1,800 drug stores in Canada and the United States. At a high
level, ESI needs to offer its customers the capability of an electronic
repeat/refill prescription service.
At present, ESI wants to create a "Central Repository of all
ESI Group Customer/Patient Data for Electronic Prescription Refills". To
achieve this, we receive feeds in the form of flat files from each of
the 400 stores on a weekly frequency. In the initial processing we
cleanse the raw data and load it into the staging environment. Once the
data has been cleansed, the primary focus is loading it into the final
enterprise data warehouse based on business constraints. Eventually the
data is used to make decisions on new product improvements and failure
analysis of existing products, and to improve customer service through
different reports built with MicroStrategy reporting functionality.
Client : ESI
Role : ETL Developer
Duration : Jul '09 - Nov'09
Environment : Informatica 8.x, Oracle, Teradata, Windows XP, UNIX
Responsibilities:
o Involved in analysis and documentation of mapping specs as per
business rules.
o Acted as an Administrator for this project.
o Involved in low-level design; developed various jobs and performed
data loads and transformations using different transformations of
Informatica.
o Experienced in ETL auditing process.
o Used most of the transformations such as Source Qualifier, Filter,
Joiner, Sequence Generator, Expression, Aggregator, Connected &
Dynamic Lookup, Rank, Router, and Update Strategy.
o Performed performance tuning to increase throughput at both the
mapping and session level, as well as SQL query optimization.
o Automated the jobs through scheduling using the built-in Informatica
scheduler, which runs every week while maintaining the data
validations
o Experienced in batch scripting to archive source files and mail the
status of workflow runs.
o Used Debugger to troubleshoot the mappings
o Experienced in unit testing and user acceptance testing.
o Experienced in documenting all the processes like deployment, coding
standards, and UTP docs.
o Experienced in administering to maintain privileges of added users and
taking repository backups on weekly and monthly basis.
o Experienced in Informatica software installation, configuration for
the Informatica client and upgrades as needed.
o Create detailed project outlines and application design
specifications.
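The archive-and-notify housekeeping described above can be sketched in shell. This is an illustrative sketch only: the paths, file pattern and recipient address are assumptions, and the mailx call is guarded so the sketch degrades to a console message where no mail transport is configured.

```shell
#!/bin/sh
# Sketch of a post-run housekeeping step: compress processed source
# files into a dated archive directory and mail a status summary.
# Paths and the recipient address are illustrative placeholders.
archive_files() {
    src_dir="$1"
    arch_dir="$2/$(date +%Y%m%d)"
    mkdir -p "$arch_dir"
    count=0
    for f in "$src_dir"/*.dat; do
        [ -f "$f" ] || continue          # skip when no .dat files exist
        gzip -c "$f" > "$arch_dir/$(basename "$f").gz" && rm -f "$f"
        count=$((count + 1))
    done
    echo "archived $count file(s) to $arch_dir"
}

# Mailing is opt-in so the sketch also runs without a mail transport.
notify() {
    if command -v mailx >/dev/null 2>&1 && [ "${SEND_MAIL:-0}" = "1" ]; then
        echo "$1" | mailx -s "Workflow run status" etl-team@example.com
    else
        echo "NOTIFY: $1"
    fi
}
```

A typical invocation at the end of a workflow run might be `notify "$(archive_files /data/src /data/archive)"`.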
Project #6 MetLife
MetLife has taken up the task of conducting customer behavior
analysis by measuring the preference for, and response to, their
various insurance-linked policies. A particular pattern of behavior of
the employee segment vs. the business community was to be examined,
along with support for regular business operations. The historical data
from various sources was first staged for transformations and then
loaded to the target database using appropriate transformations. Built
a data store to support the MetLife project.
Client : MetLife, NA
Role : ETL Developer
Duration : July '06 - Jun '09.
Environment : Informatica 7x, Cognos, Oracle 9i, WinNT4.0
Responsibilities:
o Performed major role in understanding the business requirements and
designing and loading the data into data warehouse (ETL).
o Used Informatica client tools - Source Analyzer, Warehouse designer,
Mapping Designer, Mapplet Designer, and Transformation Developer for
defining Source & Target definitions and coded the process of data
flow from source system to data warehouse.
o Designed and developed complex aggregate, joiner, look up,
transformation rules (business rules) to generate consolidated
(fact/summary) data identified by dimensions.
o Extracted and transformed data from various sources such as flat
files and MS SQL Server, and transferred the data to the target data
warehouse (an Oracle 9i database) and flat files
o Performed target load plan to load data into the targets.
o Conducted unit testing for developed mappings.
o Created transformations, mappings and mapplets as per specifications.
o Involved in ETL performance assessment and tuning.
o Involved in unit testing to check the data consistency.