Santhoshi
ETL Consultant
aco927@r.postjobfree.com
Summary:
. Eleven-plus (11+) years of IT experience in the analysis, design, development, testing and implementation of business application systems for the Healthcare, Financial, Telecom and Timeshare sectors.
. Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and client/server applications.
. Strong experience with the Ralph Kimball and Inmon data modeling methodologies.
. Strong experience working with the ETL tools Informatica and SSIS.
. Strong Data Warehousing ETL experience using Informatica PowerCenter 9.5/9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
. Experience working with the data quality tools Informatica IDQ (9.1) and Informatica MDM (9.1).
. Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
. Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
. Worked in Agile and Waterfall project methodologies.
. Expertise in working with various sources such as Oracle 11g/10g/9i/8.x, SQL Server 2008/2005, DB2 8.0/7.0, UDB, Netezza, Teradata, flat files, XML, COBOL and mainframe.
. Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
. Utilized AUTOTRACE and EXPLAIN PLAN for monitoring SQL query performance (a sketch follows this list).
. Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica sessions, as well as performance tuning of mappings and sessions.
. Created pre-session, post-session, pre-SQL and post-SQL commands in Informatica.
. Worked with different Informatica transformations such as Aggregator, Lookup, Joiner, Filter, Router, Update Strategy, Transaction Control, Union, Normalizer and SQL in ETL development.
. Worked with Event Wait and Event Raise tasks, mapplets and reusable transformations.
. Worked extensively on NZLOAD and NZSQL scripts to read and write data with the Netezza database.
. Worked with parameter files for ease of managing connections across the Dev/QA/Prod environments (an illustrative fragment follows this list).
. Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing and documentation.
. Extensive experience in writing UNIX shell scripts and automation of
the ETL processes using UNIX shell scripting.
. Proficient in the integration of various data sources with multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files and flat files into the staging area, ODS, Data Warehouse and Data Mart.
. Experience in using automation/scheduling tools such as Autosys, Tidal, Control-M and Tivoli Maestro.
. Experience working with PowerExchange.
. Worked extensively with slowly changing dimensions, SCD Type 1 and Type 2 (see the SQL sketch after this list).
. Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users and developers across multiple disciplines.
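Illustrative AUTOTRACE/EXPLAIN PLAN usage referenced above, as a minimal SQL*Plus sketch (the emp table and its columns are sample names, not from any client system):

    -- Execute the query and show its plan plus run-time statistics
    SET AUTOTRACE ON
    SELECT ename, sal FROM emp WHERE deptno = 10;

    -- Capture a plan without executing the statement, then read it back
    EXPLAIN PLAN FOR
      SELECT ename, sal FROM emp WHERE deptno = 10;
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);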
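Illustrative Informatica parameter file fragment referenced above (the folder, workflow, session and connection names are hypothetical); each environment keeps its own copy, so mappings migrate across Dev/QA/Prod unchanged:

    [ETL_FOLDER.WF:wf_daily_load.ST:s_m_load_orders]
    $DBConnection_Source=ORA_SRC_DEV
    $DBConnection_Target=ORA_DWH_DEV
    $$Load_Date=2014-11-01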
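SCD Type 2 sketch referenced above, expressed as set-based SQL against a hypothetical dim_customer/stg_customer schema: expire the current row when a tracked attribute changes, then insert the new version:

    -- Expire the current version of changed customers
    UPDATE dim_customer d
       SET d.eff_end_date = SYSDATE, d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Insert new and changed customers as the current version
    INSERT INTO dim_customer
        (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d2
                        WHERE d2.customer_id = s.customer_id
                          AND d2.current_flag = 'Y');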
Educational Qualification:
Bachelor of Engineering in Electrical and Electronics.
Technical Skills:
Data Warehousing: Informatica PowerCenter 9.5/9.1.0/8.6.1/7.1/6.2 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica PowerConnect, DTS, SQL Server Integration Services (SSIS), Web Services, Informatica MDM, Informatica IDQ
Data Modeling: ERwin, Toad
Databases: Oracle 11g/10g/9i/8i/7.x, SQL Server 2005/2008, MS Access, Excel, salesforce.com
Business Intelligence: Hyperion, OBIEE, Cognos (Impromptu, Transformer, PowerPlay Reports, Scheduler, IWR, PPWR, Upfront, Access Manager), Business Objects XI/6.5 (Supervisor, Designer, Reporter), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), Crystal Reports 9/10/11
Languages: SQL, T-SQL, ANSI SQL, PL/SQL, Unix shell script, Visual Basic, SQL*Plus 3.3/8.0
Tools: Toad, SQL*Loader, Crystal Reports 9.0/8.x/7/6/5/4
Operating Systems: Windows 2007/2005/2003/XP/NT Server, Windows NT, UNIX, HP-UX, UNIX AIX 4.2/4.3, Sun Solaris 8
Analytical Tools: SQL Server Analysis Services (SSAS), PerformancePoint Server 2007, ProClarity Analytics Server
SQL Server Tools: Query Analyzer, SQL Server Profiler, SQL Server Mail Service, Enterprise Manager, SQL Server Agent, DTS, BCP, Microsoft Visual Studio
Versioning: Team Foundation Server (TFS), Visual SourceSafe (VSS)
Professional Experience
Genpact/Cox Communications Nov 2014 - Present
Role: Lead/Senior Informatica Developer
Atlanta, GA
Client Description:
As the third-largest cable provider in the nation, Cox Communications Inc.
is noted for its high-capacity, reliable broadband delivery network as well
as the company's ability to provide superior customer service.
The Enterprise Data Warehouse is built to understand and analyze customer satisfaction, daily work orders processed, and customers across different subject areas (Business/Residential/Medicaid). Informatica is the ETL tool used to pull data from various sources such as ICOMS, UNICA and MEDALLIA. OBIEE reports with dashboards for the different subject areas are used for business analysis.
Responsibilities:
. Worked as Lead for the projects, involved in all phases of the SDLC.
. Worked with the Data Architect in designing the data mart, and in defining, designing and building facts and dimensions using the Star Schema model.
. Worked with SCD Type 1, Type 2 and Type 3 to maintain history in dimension tables.
. Developed Informatica workflows/worklets/sessions associated with the mappings across various sources such as XML, COBOL, flat files, Web Services and Salesforce.
. Worked with cleanse, parse, standardization, validation, scorecard
transformations.
. Worked with the Source Qualifier, Update Strategy, XML, SQL, Web Services, Java and Lookup (connected and unconnected) transformations.
. Worked with pushdown optimization to improve performance.
. Worked on UNIX shell scripting for file processing to third-party vendors through SFTP, including the encryption and decryption process.
. Worked with different scheduling tools such as Tidal, Tivoli, Control-M and Autosys.
. Documented the ETL process for each project and provided knowledge transfer (KT) to the offshore team.
. Worked closely with DBAs, application, database and ETL developers, and change control management to migrate developed mappings across the Dev, QA and PROD environments.
. Provided production support for the Informatica processes; troubleshot and debugged any errors.
. Worked with the Informatica IDQ tools (Data Analyst, Developer), using various data profiling techniques to cleanse data and match/remove duplicates.
. Worked extensively with Netezza scripts to load data from flat files to the Netezza database.
. Used NZSQL scripts and NZLOAD commands to load the data.
. Used FIFO pipes to load flat files from source for faster data loads into the Netezza database.
. Worked with Informatica PowerExchange and Informatica Cloud to integrate Salesforce and load data from Salesforce to an Oracle database.
Environment: Informatica PowerCenter 9.5/9.1.0, Flat Files, Mainframe Files, T-SQL, Oracle 11g, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, SQL Server 2008, Teradata, Netezza, Aginity Workbench.
AMDOCS/AT&T May 2014 - Nov 2014
Role: Lead/Senior Informatica Developer
Atlanta, GA
Client Description:
The main objective of the project is to integrate the customers of AT&T Wireless into the Cingular systems by developing and maintaining the Customer Information Data Mart. It included extraction of data from different source systems such as relational databases, flat files and Excel files, and application databases like Access. AT&T Wireless has Orders, Billing, Remedy and Communications information, which is integrated with Cingular Wireless information and stored in the Data Mart.
Worked on projects for Enabler, where the data is loaded in phases from source to stage and then from stage to the EDW, which resides in Teradata.
Responsibilities:
. Led the design, development and implementation of ETL projects end to end.
. Responsible for ETL technical design discussions and prepared the ETL high-level technical design document.
. Involved in the analysis of source-to-target mappings provided by data analysts and prepared functional and technical design documents.
. Involved in designing a star schema based data model with dimensions and facts.
. Interacted with the onsite and offshore teams to assign development tasks and scheduled weekly status calls with the offshore team.
. Extracted data from flat files, GoldenGate, Oracle and SQL Server using Informatica ETL mappings and loaded it into the Data Mart.
. Utilized Informatica IDQ (Data Analyst, Developer) for data profiling, matching/removing duplicate data, fixing bad data and fixing NULL values.
. Created quality rules, development and implementation patterns with
cleanse, parse, standardization, validation, scorecard
transformations.
. Created complex Informatica mappings using Lookup (connected and unconnected), Joiner, Rank, Source Qualifier, Sorter, Aggregator and Router transformations to extract, transform and load data into the data mart area.
. Used the Teradata utilities FastLoad, MultiLoad and TPump to load data from various source systems.
. Developed scripts in BTEQ to import and export data (a minimal sketch follows this list).
. Developed CTL scripts to load data to and from the Teradata database.
. Worked extensively on shell scripting for file management.
. Created reusable transformations/mapplets and used them across various mappings.
. Wrote complex PL/SQL scripts/functions/procedures/packages.
. Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
. Worked extensively with Netezza scripts to load data from flat files to the Netezza database.
. Used NZSQL scripts and NZLOAD commands to load the data.
. Created Tivoli Maestro jobs to schedule Informatica workflows.
. Performed performance tuning of Informatica code using standard Informatica tuning steps.
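As referenced above, a minimal BTEQ export sketch; the logon string, database, table and file names are placeholders, not client systems:

    .LOGON tdpid/etl_user,password
    .EXPORT REPORT FILE = /data/out/orders_extract.dat
    SELECT order_id, order_amt
    FROM   edw.orders
    WHERE  load_dt = CURRENT_DATE;
    .EXPORT RESET
    .LOGOFF
    .QUIT 0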
Environment: Informatica PowerCenter 9.5/9.1.0/8.6.1, Flat Files, Mainframe Files, Oracle 11g, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, T-SQL, SQL Server 2008, Teradata, Netezza, Aginity Workbench.
Aviana Global Technologies LLC Jan 2014 - Apr 2014
Role: Lead/Senior Informatica Developer Brea, CA
Client Description:
Aviana has built a solid reputation as a preferred custom solutions and
consulting partner for clients in healthcare, entertainment, finance,
insurance, state and local government, retail, gaming and manufacturing.
Domain: Healthcare (Dialysis Treatment)
Responsibilities:
. Worked as Informatica Lead for ETL projects to design and develop Informatica mappings.
. Worked with Informatica IDQ (Data Analyst, Developer), using various data profiling techniques to cleanse data and match/remove duplicates.
. Worked with cleanse, parse, standardization, validation, scorecard
transformations.
. Involved in analyzing, defining, and documenting data requirements by
interacting with the client and Salesforce team for the Salesforce
objects.
. Worked with Informatica PowerExchange as well as Informatica Cloud to load data into salesforce.com.
. Worked on Informatica Cloud to create source/target SFDC connections and to monitor and synchronize data in SFDC.
. Worked on SFDC session log error files to investigate errors and debug issues.
. Created and edited custom objects and custom fields in Salesforce and checked the field-level security.
. Created Web Services mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
. Worked extensively with Netezza scripts to load data from flat files to the Netezza database.
. Used NZSQL scripts and NZLOAD commands to load the data.
. Involved in all phases of the SDLC, i.e., designing, coding, testing and deploying ETL components of the data warehouse and the integrated Data Mart.
. Extensively worked with the Teradata database using BTEQ scripts.
. Worked with the FastLoad, MultiLoad and TPump utilities to load data into Teradata.
. Created Informatica mappings using various transformations such as XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter and Router in Informatica Designer.
. Created pre-session, post-session, pre-SQL and post-SQL commands in Informatica.
. Used UNIX scripts for file management as well as in FTP processes.
. Worked closely with DBAs, application, database and ETL developers, and change control management to migrate developed mappings to PROD.
. Provided production support for the Informatica processes; troubleshot and debugged any errors.
Environment: Informatica PowerCenter 9.5/9.1.0/8.6.1, Informatica Data Quality 9.1.0/9.5.1, Flat Files, Mainframe Files, Oracle 11g, Netezza, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, SQL Server 2008, Salesforce.com, Web Services.
Wyndham Vacation Ownership Jan 2012 - Dec 2013
Role: Lead/Senior Informatica Developer Orlando, FL
Client Description:
Wyndham Vacation Ownership develops, markets and sells vacation ownership
interests and provides consumer financing to owners through its three
primary consumer brands, Wyndham Vacation Resorts, WorldMark by Wyndham,
and Wyndham Vacation Resorts Asia Pacific.
Responsibilities:
. Worked with the offshore/onsite team, led the project and assigned tasks appropriately to the team members.
. Responsible for project estimates, design documents, and resource utilization and allocation.
. Assigned development work to offshore developers and guided them in implementing logic and troubleshooting the issues they were experiencing.
. Involved in extracting, transforming and loading data for the Accounts, Contracts, Reservations and Owner Interactions tables from various source systems to Salesforce.com, as well as the reverse data feed from Salesforce for CRM telesales.
. Involved in analyzing, defining, and documenting data requirements by
interacting with the client and Salesforce team for the Salesforce
objects.
. Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.
. Worked with Informatica Cloud to create source/target connections and to monitor and synchronize data in SFDC.
. Created Web Services mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
. Utilized Informatica IDQ (Data Analyst, Developer) for data profiling, matching/removing duplicate data, fixing bad data and fixing NULL values.
. Created quality rules, development and implementation patterns with
cleanse, parse, standardization, validation, scorecard
transformations.
. Involved in all phases of the SDLC, i.e., designing, coding, testing and deploying ETL components of the data warehouse and the integrated Data Mart.
. Worked with the data loading utilities BTEQ, TPump, FastLoad and MultiLoad to load data into the Teradata database (a FastLoad sketch follows this list).
. Created Informatica mappings using various transformations such as XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter and Router in Informatica Designer.
. Worked on shell scripting for file management and in FTP processes.
. Worked closely with DBAs, application, database and ETL developers, and change control management to migrate developed mappings to PROD.
. Provided production support for the Informatica processes; troubleshot and debugged any errors.
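As referenced above, a FastLoad script sketch of the flat-file load pattern; the server, database, table, columns and file path are placeholders, not client names:

    LOGON tdpid/etl_user,password;
    DATABASE edw_stage;
    BEGIN LOADING stg_orders ERRORFILES stg_orders_e1, stg_orders_e2;
    SET RECORD VARTEXT "|";
    DEFINE order_id  (VARCHAR(10)),
           order_amt (VARCHAR(18))
    FILE = /data/in/orders.dat;
    INSERT INTO stg_orders (order_id, order_amt)
    VALUES (:order_id, :order_amt);
    END LOADING;
    LOGOFF;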
Environment: Informatica PowerCenter 9.1.0/8.6.1, Informatica Data Quality 9.1.0, Flat Files, Mainframe Files, Oracle 11g, Netezza, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, SQL Server 2008, Salesforce.com, Web Services, Teradata.
Bayview Financial Aug 2007 - Dec 2011
Role: Lead/Senior Informatica Developer Miami, FL
Client Description:
Bayview Financial/Bayview asset Management is a mortgage investment
company focused on providing capital and servicing solutions to
banks and financial companies.
Responsibilities:
. Led the team and interacted with Business Analysts to understand the business requirements; involved in analyzing requirements to refine transformations.
. Analyzed the existing mappings and understood the data flow process.
. Fixed existing issues by introducing data cleansing techniques into the mappings rather than cleaning the source data files manually.
. Responsible for mentoring developers and for code review of mappings developed by other developers.
. Extensively used Toad for analyzing the queries in the existing mappings to better understand the business logic implemented.
. Created complex mappings in PowerCenter Designer 8.6 using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.
. Performed performance tuning by identifying the bottlenecks in Informatica mappings and sessions, and by using EXPLAIN PLAN in Oracle via Toad.
. Wrote SQL, PL/SQL, stored procedures, triggers and cursors for implementing business rules and transformations.
. Created pre-session, post-session, pre-SQL and post-SQL commands, sent email notifications with the Email Task, and updated target tables after the data was loaded (an example follows this list).
. Created and used tasks such as the Email Task, Command Task and Control Task in Informatica Workflow Manager, and monitored jobs in Workflow Monitor.
. Developed UNIX scripts for file management, such as zipping and unzipping files.
. Developed the automated and scheduled load processes using the Tidal Scheduler.
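As referenced above, a minimal sketch of the kind of pre-/post-SQL configured on a session; the table names and batch id are illustrative only:

    -- Pre-SQL: clear the staging table before the session loads it
    TRUNCATE TABLE stg_orders;

    -- Post-SQL: stamp the audit table after the load completes
    UPDATE etl_audit
       SET load_end = SYSDATE, status = 'OK'
     WHERE batch_id = 1001;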
Environment: Informatica PowerCenter 8.6.1, Flat Files, Mainframe Files, Oracle 11g, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, SQL Server 2008.
Mutex Systems Oct 2006 to July 2007
Role: Informatica Developer Denver, Colorado
Project: Data Mart/Data Warehouse
The purpose of the project is to design a data warehouse for the company with all the employee information and contact details, and to maintain records of employee salaries, benefits and packages. Informatica PowerCenter is the ETL tool used to pull data from the front-end application, which is stored in different sources such as Oracle, SQL Server and flat files.
Responsibilities:
. Worked closely with the Business analyst to understand the various
source data.
. Involved in designing logical and physical models for staging and transition of the data.
. Involved in designing a star schema based data model with dimensions and facts.
. Designed the ETL mapping document to map source data elements to the target based on the Star Schema dimensional model.
. Designed and developed Informatica mappings for data load and data cleansing.
. Created stored procedures, functions and triggers as per the business requirements.
. Used the Update Strategy and Lookup transformations to implement the Change Data Capture (CDC) process.
. Partitioned sources to improve session performance.
. Developed several complex mappings, mapplets and reusable transformations to facilitate one-time and monthly loading of data.
. Utilized the Aggregator, Joiner, Router, Lookup and Update Strategy transformations to model various standardized business processes.
. Worked with the scheduler to schedule Informatica sessions on a daily basis and to send an email after the completion of loading.
. Created design documents and performed unit testing on the mappings.
. Created complex SCD Type 1 and Type 2 mappings using Dynamic Lookup, Joiner, Router, Union, Expression and Update Strategy transformations.
. Worked on identifying bottlenecks in sources, targets and mappings, and improved performance.
. Extensively used Workflow Manager to create connections, sessions, tasks and workflows.
. Performance-tuned stored procedures, transformations, mappings, sessions and SQL queries.
. Worked on the Database Triggers, Stored Procedures, Functions and
Database Constraints.
Environment: Informatica 7.1/6.x, Oracle 10g, SQL Server 2005, Autosys,
Business Objects 6.5
Institute Of Embedded Technologies July 2005 to Oct 2006
Role: Software Programmer/Oracle Developer Hyderabad, India
Project: Client Communication
The purpose of the project is to store the information of all the clients the company provides services to. In order to maintain the data, Oracle is primarily used as the storage database. The information is stored by pulling data from the front-end applications, flat files, etc., and creating the corresponding Oracle tables and views. Reports are generated from this data to analyze the business.
Responsibilities:
. Involved in creating Functional and Program Specification documents.
. PL/SQL Development and Implementation.
. Extensive performance tuning (SQL tuning, PL/SQL tuning).
. Involved in ETL development using native Oracle tools (SQL*Loader, Oracle PL/SQL).
. Involved in the creation of partitioned tables and indexes.
. Involved in the creation and modification of Packages, Stored
Procedures and Triggers.
. Involved in writing complex SQL Queries to implement the business
requirements.
. Involved in loading data into Database using SQL*Loader.
. Data Migration using PL/SQL stored Procedures.
. Involved in data modeling using ERwin.
. Created stored procedures using EXECUTE IMMEDIATE and REF CURSORS (Native Dynamic SQL); a sketch follows this list.
. Involved in cleaning and maintaining migrated data.
. PL/SQL Collections were extensively used for high performance of
stored procedures.
. Involved in Index Monitoring for identifying the Unused Indexes.
. Involved in analyzing schemas, tables and indexes as part of optimization.
. Used Data Pump to refresh the development and test database environments.
. Worked with AUTONOMOUS TRANSACTIONS in triggers and functions in order to include logging (a sketch follows this list).
. Involved in creating UNIX shell Scripts for automating various routine
database tasks.
. Made use of AUTOTRACE and EXPLAIN PLAN for monitoring individual query performance.
. Toad and SQL*Plus were used for PL/SQL development.
. Created namespaces, query subjects, calculated fields, filters, joins and packages in Cognos Framework Manager.
. Created user classes and users in Access Manager.
. Installed ReportNet in an IIS, Windows and Oracle environment.
. Created standard and ad hoc reports using Query Studio and Report Studio.
. Created user groups and roles to implement security in ReportNet.
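As referenced above, a small PL/SQL sketch of the EXECUTE IMMEDIATE / REF CURSOR (Native Dynamic SQL) pattern; the procedure, parameter and table names are hypothetical:

    CREATE OR REPLACE PROCEDURE open_table_cursor (
        p_table IN  VARCHAR2,        -- assumed to be a validated table name
        p_rows  OUT NUMBER,
        p_rc    OUT SYS_REFCURSOR) AS
    BEGIN
      -- EXECUTE IMMEDIATE: run a statement assembled at run time
      EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || p_table INTO p_rows;
      -- REF CURSOR: hand the caller a cursor over a dynamically built query
      OPEN p_rc FOR 'SELECT * FROM ' || p_table;
    END open_table_cursor;
    /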
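Likewise, a minimal sketch of the autonomous-transaction logging pattern, assuming a hypothetical app_log table; the logger commits its own insert without committing the caller's in-flight work:

    CREATE OR REPLACE PROCEDURE log_msg (p_msg IN VARCHAR2) AS
      PRAGMA AUTONOMOUS_TRANSACTION;  -- runs in its own transaction
    BEGIN
      INSERT INTO app_log (log_time, message)
      VALUES (SYSDATE, p_msg);
      COMMIT;                         -- commits only this insert
    END log_msg;
    /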
Environment: Oracle 9i, HP-UX 11.0, SQL Developer, Toad, Cognos ReportNet (Report Studio, Query Studio, Cognos Connection), Framework Manager, Cognos Series 7 (Impromptu Administrator, PowerPlay Enterprise Server, PowerPlay Transformer, Access Manager), Microsoft IIS
Sri Sai Electronic Design Centre August 2003 to June 2005
Role: Software Programmer/Oracle Developer Hyderabad, India
Project: Automation Products
The project is based on automation machines which are used for residential and commercial purposes. The day-to-day activities of the production of these machines have to be stored in a database. Oracle is used as the main database for storing the information on the business activities, and reports are generated from this data.
Responsibilities
. Involved in creating Functional and Program Specification documents.
. PL/SQL Development and Implementation.
. Extensive performance tuning (SQL tuning, PL/SQL tuning).
. Involved in ETL development using native Oracle tools (SQL*Loader, Oracle PL/SQL).
. Involved in the creation of partitioned tables and indexes.
. Involved in the creation and modification of Packages, Stored
Procedures and Triggers.
. Involved in writing complex SQL Queries to implement the business
requirements.
. Involved in loading data into Database using SQL*Loader.
. Data Migration using PL/SQL stored Procedures.
. Involved in data modeling using ERwin.
. Created stored Procedures using EXECUTE IMMEDIATE and REF CURSORS
(Native Dynamic SQL).
. Involved in cleaning and maintaining migrated data.
. PL/SQL Collections were extensively used for high performance of
stored procedures.
. Involved in Index Monitoring for identifying the Unused Indexes.
. Involved in analyzing schemas, tables and indexes as part of optimization.
. Used Data Pump to refresh the development and test database environments.
. Worked with AUTONOMOUS TRANSACTIONS in triggers and functions in order to include logging.
. Involved in creating UNIX shell Scripts for automating various routine
database tasks.
. Made use of AUTOTRACE and EXPLAIN PLAN for monitoring individual query performance.
. Toad and SQL*Plus were used for PL/SQL development.
. Created namespaces, query subjects, calculated fields, filters, joins and packages in Cognos Framework Manager.
. Created user classes and users in Access Manager.
. Installed ReportNet in an IIS, Windows and Oracle environment.
. Created standard and ad hoc reports using Query Studio and Report Studio.
. Created user groups and roles to implement security in ReportNet.
Environment: Oracle 9i, HP-UX 11.0, SQL Developer, Toad, Cognos ReportNet (Report Studio, Query Studio, Cognos Connection), Framework Manager, Cognos Series 7 (Impromptu Administrator, PowerPlay Enterprise Server, PowerPlay Transformer, Access Manager), Microsoft IIS