Anand
Email: *****.*****@*****.*** Phone: 360-***-****
• Overall 7+ years of Software Life Cycle experience in System Analysis, Design, Development,
Implementation, Maintenance, and Production Support of Data Warehouse applications.
• Extensive experience with Informatica Power Center 9.x, 8.x, 7.x and SSIS.
• Expertise in working with Informatica Designer, Workflow Manager, Workflow Monitor, Source
Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer,
Workflow Designer, Task Developer, Worklet Designer, Gantt Chart, Task View, Mappings,
Workflows, Sessions, Reusable Transformations, Shortcuts, and Import and Export utilities.
• Good knowledge of Data Warehouse concepts and principles (Kimball/Inmon) - Star Schema,
Snowflake, Data Vault, Oracle Designer and modeling tools like Erwin and Visio, SCD, Surrogate
keys, Normalization/Denormalization.
• Experienced in analysis, estimation, design, construction, unit and system testing, and implementation.
• Extensive knowledge in designing functional and detailed design documents for data
warehouse development.
• Experience in the integration of various data sources such as Oracle, SQL Server, Sybase,
Teradata, DB2, XML Files and Flat files.
• Vast experience in designing and developing complex mappings using varied transformation
logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression,
Aggregator, Joiner, Update Strategy, Data Transformation Services, etc.
• Hands-on experience in tuning mappings, identifying and resolving performance bottlenecks at
various levels: sources, targets, mappings, and sessions.
• Experience in writing daily batch jobs and developing complex UNIX shell scripts to automate
ETL processes.
• Knowledge in designing Dimensional models for Data Mart and Staging database.
• Worked in a variety of business domains, including the Finance, Telecom, Healthcare, and Retail
industries, with experience in the offshore delivery model.
• Expertise in Data Warehousing, Data Migration, Data Integration using Business Intelligence
(BI) tools such as Informatica Power Center, Power Exchange CDC, B2B Data Transformation,
Informatica Data Quality, OBIEE, Cognos etc.
• Expertise in defining and documenting ETL Process Flow, Job Execution Sequence, Job
Scheduling and Alerting Mechanisms using command line utilities.
• Extensive experience in implementing Error Handling, Auditing, Reconciliation, and Balancing
mechanisms in ETL processes.
• Skilled in developing Test Plans, Creating and Executing Test Cases.
• Involved in every phase of the development life cycle, including feasibility studies, design, and
coding, for large and medium business intelligence projects, and continually provided value-added
services to clients.
• Worked with both Waterfall and Agile SDLC methodologies.
• Excellent Analytical, Written and Communication skills.
TECHNICAL EXPERTISE
ETL Tools: Informatica Power Center 9.x/8.x/7.x/6.x (Repository Admin Console, Repository
Manager, Designer, Workflow Manager, Workflow Monitor), Informatica B2B Data Exchange
Data Modeling: Erwin 4.0/3.5, MS Visio, Oracle Designer 2000
DBMS: Oracle 11g/10g/9i, MS SQL Server 2005/2000, Microsoft Access, Excel, Teradata, ODBC
Programming Languages: SQL, PL/SQL, UNIX Shell Scripting
Operating Systems: Windows XP/2008/2003/2000/NT/98/95, UNIX, LINUX
Other Tools: VISIO, ERWIN, TOAD, CITRIX, Autosys (Batch Scheduling)
Work Experience:
Kaiser Permanente, Denver, CO April 2014 - Present
Sr. Data warehouse Developer
Kaiser Permanente is made up of three distinct groups of entities: the Kaiser Foundation Health Plan and
its regional operating subsidiaries; Kaiser Foundation Hospitals; and the autonomous regional
Permanente Medical Groups. Kaiser Permanente provides care throughout seven regions in the United
States.
Responsibilities:
• Involved in the complete end-to-end flow of SDLC.
• Worked closely with business analysts and data analysts to understand and analyze the
requirements and come up with robust designs and solutions.
• Involved in data standardization, such as changing a reference data set to a new standard.
• Ensured that data validated by a third party was still checked for accuracy (DQ) before being
provided to the internal transformations.
• Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
• Created Technical Specification Documents and Solution Design Documents to outline the
implementation plans for the requirements.
• Involved in testing of Stored Procedures and Functions, and unit and integration testing of
Informatica Sessions, Batches, and the Target Data.
• Involved in massive data cleansing prior to data staging from flat files.
• Responsible for the development, support, and maintenance of ETL (Extract, Transform and
Load) processes using Informatica Power Center 9.5, using various transformations like
Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, and Connected
and Unconnected Lookup.
• Implemented Type 2 slowly changing dimensions (SCD) using Informatica (see the SQL sketch after this list).
• Created the Informatica components required to operate Data Quality within Power Center.
• Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming
Convention, and Version Control.
• Created Use-Case Documents to explain and outline data behavior.
• Developed scripts for creating tables, views, synonyms, and materialized views in the data mart.
• Involved in designing and developing logical and physical data models to best suit the
requirements.
• Utilized dimensional and star-schema modeling to come up with new structures to support drill
down.
• Converted business requirements into highly efficient, reusable and scalable Informatica ETL
processes.
• Created mapping documents to outline source-to-target mappings and explain business-driven
transformation rules.
• Ensured that data sourced from databases with enforced NOT NULL columns did not undergo
the DQ completeness check.
• Created macros in Teradata to enable Change Data Capture (CDC), identifying the delta and
keeping the data mart in sync with the source system (a macro sketch follows this list).
• Designed mappings to read data from various relational and file source systems such as
Teradata, Oracle, flat files and XML files.
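For illustration, a minimal SQL sketch of the Type 2 SCD pattern referenced above, as it might look outside Informatica (the table, column, and sequence names are hypothetical stand-ins, not the actual schema):

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE member_dim d
       SET d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM member_stg s
                    WHERE s.member_id = d.member_id
                      AND (s.plan_cd <> d.plan_cd OR s.region_cd <> d.region_cd));

    -- Step 2: insert a new current version with a fresh surrogate key.
    -- New members and just-expired members both lack a current row, so both are picked up.
    INSERT INTO member_dim (member_key, member_id, plan_cd, region_cd,
                            effective_start_dt, effective_end_dt, current_flag)
    SELECT member_dim_seq.NEXTVAL, s.member_id, s.plan_cd, s.region_cd,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM member_stg s
     WHERE NOT EXISTS (SELECT 1 FROM member_dim d
                        WHERE d.member_id = s.member_id
                          AND d.current_flag = 'Y');

In Power Center itself, this compare-expire-insert flow is typically realized with Lookup, Expression, and Update Strategy transformations.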
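Similarly, a hedged sketch of the kind of Teradata macro used for CDC delta identification (database, table, and column names are hypothetical):

    /* Returns rows that are new or changed relative to the data mart,
       i.e. the delta to be applied downstream. */
    CREATE MACRO stg.member_delta AS (
      SELECT s.member_id, s.plan_cd, s.region_cd
        FROM stg.member_src s
        LEFT JOIN dm.member_tgt t
          ON t.member_id = s.member_id
       WHERE t.member_id IS NULL            /* new row     */
          OR s.plan_cd   <> t.plan_cd       /* changed row */
          OR s.region_cd <> t.region_cd;
    );

    EXEC stg.member_delta;

A production version would also need NULL-safe comparisons on the compared columns.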
Environment: Informatica Power Center 9.5.1, Oracle 11g, SQL, PL/SQL, TOAD 9.0.1.8, SQL*Loader,
IBM AIX, Windows NT/XP, XML file sources, ERWIN 7, MS Visio, COGNOS, REMEDY.
Dignity Health, Phoenix, AZ Sept 2012 - April 2014
Sr. ETL Developer
Dignity Health is a California-based not-for-profit public benefit corporation that operates hospitals and
ancillary care facilities in 17 states. Dignity Health is the fifth largest hospital system in the nation and the
largest not-for-profit hospital provider in California.
Responsibilities:
• Conducted JAD sessions with business users and SMEs for a better understanding of the
reporting requirements.
• Designed and developed end-to-end ETL processes from various source systems to the staging
area, and from staging to the Data Marts.
• Developed high-level and detailed technical and functional documents, consisting of detailed
design documentation, functional test specifications with use cases, and unit test documents.
• Analyzed source systems and worked with business analysts to identify, study, and understand
requirements and translate them into ETL code.
• Handled technical and functional calls across the teams.
• Responsible for the Extraction, Transformation and Loading (ETL) Architecture & Standards
implementation
• Responsible for offshore Code delivery and review process
• Used Informatica to extract data from DB2, UDB, XML, flat files, and Excel files and load the data
into Teradata.
• Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target
Database.
• Worked on Informatica Power Center tool – Source Analyzer, Warehouse designer, Mapping and
Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow
Manager and Workflow Monitor
• Involved in Design Review, code review, test review, and gave valuable suggestions.
• Worked with different caches such as Index cache, Data cache, Lookup cache (Static, Dynamic,
and Persistent), and Join cache while developing the mappings.
• Created partitions for parallel processing of data and also worked with DBAs to enhance the
data load during production.
• Performance-tuned Informatica sessions for large data files by increasing block size, data cache
size, and the target-based commit interval.
• Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
• Extracted, transformed, and loaded data using the Teradata BTEQ, FastLoad, and MultiLoad
utilities (see the FastLoad sketch after this list).
• Involved in writing a procedure to check whether the statistics on tables were up to date.
• Used Informatica command task to transfer the files to bridge server to send the file to third party
vendor.
• Took part in the migration of jobs from UIT to SIT and on to UAT.
• Created FTP scripts and Conversion scripts to convert data into flat files to be used for
Informatica sessions
• Involved in Informatica Code Migration across various Environments.
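As an illustration of the FastLoad usage mentioned above, a minimal control script of the kind typically run from the command line (the logon string, file path, and table names are hypothetical):

    .LOGON tdpid/etl_user,etl_password;
    DATABASE stg;
    /* FastLoad requires an empty target table and two error tables. */
    BEGIN LOADING stg.claims_ld
          ERRORFILES stg.claims_err1, stg.claims_err2
          CHECKPOINT 100000;
    SET RECORD VARTEXT "|";
    DEFINE claim_id  (VARCHAR(18)),
           member_id (VARCHAR(18)),
           claim_amt (VARCHAR(18))
    FILE = /data/inbound/claims.dat;
    INSERT INTO stg.claims_ld (claim_id, member_id, claim_amt)
    VALUES (:claim_id, :member_id, :claim_amt);
    END LOADING;
    .LOGOFF;

Because FastLoad only loads empty tables, it suits staging loads; MultiLoad covers incremental maintenance of already-populated tables.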
Environment: Informatica Power Center 9.5/9.1, Oracle 11g, Teradata V13.0, FastLoad, MultiLoad,
Teradata SQL Assistant, MS SQL Server 2012, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, Erwin, AIX, Shell
Scripts, Autosys, Unix
ConAgra Foods, Omaha, NE May 2010 - Sept 2012
Informatica Developer
ConAgra Foods is an American packaged foods company headquartered
in Omaha, Nebraska. ConAgra makes and sells products under various brand names that are
available in supermarkets, as well as in restaurants and food service establishments.
Responsibilities:
• Coordinated with business users for requirement gathering and business analysis to understand
the business requirements and prepare Technical Specification Documents (TSD) to code ETL
mappings for new requirement changes.
• Performed estimation, requirement analysis, design of the mapping document, and planning for the
Informatica ETL.
• Analyzed sources, requirements, and the existing OLTP system, and identified the required
dimensions and facts from the database.
• Responsible for the development, support, and maintenance of ETL (Extract, Transform and Load)
processes using Informatica Power Center 8.6.1.
• Created complex mappings in Power Center Designer using Aggregate, Expression, Filter,
Sequence Generator, Update Strategy, Rank, Joiner and Stored procedure transformations.
• Involved in performance tuning of the ETL processes.
• Involved in testing of Stored Procedures and Functions, and unit and integration testing of
Informatica Sessions, Batches, and the Target Data.
• Created Workflows, Worklets, and Assignment, Decision, Event Wait, Event Raise, and Email tasks,
and scheduled tasks and workflows based on client requirements.
• Set up batches and sessions to schedule the loads at required frequency using Power Center
Workflow manager.
• Extensively worked on Autosys to schedule the jobs for loading data.
• Involved in promoting the folders from Development to Test and Test to Production Environment.
• Developed standards and procedures for the transformation of data as it moves from source systems
to the data warehouse.
• Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for
performance in pre- and post-session management (see the PL/SQL sketch after this list).
• Configured and scheduled Pre and Post Session commands with Shell Scripts.
• Used Informatica Debugger to troubleshoot data and error conditions.
• Used mapping parameters and variables to pull incremental loads from the source (see the parameter sketch after this list).
• Identified and fixed bottlenecks and tuned the mappings and sessions to improve performance;
tuned both the ETL processes and the databases.
• Defined the program specifications for the data migration programs, as well as the necessary test
plans used to ensure the successful execution of the data loading processes.
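A minimal PL/SQL sketch of the pre/post-session index-management pattern mentioned above, assuming a hypothetical SALES_FACT target (the real procedures were project-specific):

    -- Pre-session: drop the target's indexes so the bulk load runs faster.
    CREATE OR REPLACE PROCEDURE drop_fact_indexes IS
    BEGIN
      FOR ix IN (SELECT index_name FROM user_indexes
                  WHERE table_name = 'SALES_FACT') LOOP
        EXECUTE IMMEDIATE 'DROP INDEX ' || ix.index_name;
      END LOOP;
    END drop_fact_indexes;
    /

    -- Post-session: rebuild the indexes after the load completes.
    CREATE OR REPLACE PROCEDURE rebuild_fact_indexes IS
    BEGIN
      EXECUTE IMMEDIATE
        'CREATE INDEX sales_fact_dt_ix ON sales_fact (sale_dt)';
    END rebuild_fact_indexes;
    /

These would be invoked from the session's pre- and post-session commands or from Stored Procedure transformations.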
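And a hedged sketch of the incremental-load pattern: a Source Qualifier SQL override driven by a mapping parameter, with the value supplied from a parameter file (the folder, workflow, session, and column names are hypothetical):

    -- Source Qualifier SQL override: pull only rows changed since the last run.
    -- $$LAST_EXTRACT_DATE is expanded from the parameter file before the SQL runs.
    SELECT o.order_id, o.customer_id, o.order_amt, o.updated_dt
      FROM orders o
     WHERE o.updated_dt > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')

    -- Parameter file entry read by the session at run time:
    [ETL_Folder.WF:wf_orders_inc.ST:s_m_orders_inc]
    $$LAST_EXTRACT_DATE=2012-01-31 00:00:00

After each successful run, the parameter value is advanced so the next session picks up only newer rows.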
Environment: Informatica Power Center 8.6.1, Oracle 10g, MS SQL Server, UNIX (Sun Solaris 5.8/AIX),
SSRS, UltraEdit-32, Erwin Data Modeler 4.1, FTP, MS Excel, MS Access, Autosys.
Rabobank, Sacramento, CA March 2009 - May 2010
ETL Developer
Rabobank is a Dutch multinational banking and financial services company. It is a global leader in
food and agri financing and sustainability-oriented banking, with a large number of specialized
international offices and subsidiaries. Food & Agribusiness is the prime international focus of the
Rabobank Group.
Responsibilities:
• Involved in Business analysis and requirements gathering.
• Assisted in creating fact and dimension table implementation in Star Schema model
based on requirements.
• Preparation of technical specifications and Source to Target mappings.
• Extensively used Informatica Power Center for the extraction, transformation, and loading
process.
• Created mappings for dimensions and facts.
• Extracted data from various sources like Oracle, flat files and DB2
• Worked extensively on Source Analyzer, Mapping Designer, Mapplet designer and
Warehouse Designer and Transformation Developer.
• Developed several Mappings and Mapplets using corresponding Sources, Targets
and Transformations.
• Designed and created complex mappings using SCD Type II, involving
transformations such as Expression, Joiner, Aggregator, Lookup, Update
Strategy, and Filter.
• Optimizing/Tuning mappings for better performance and efficiency.
• Migrated mappings from Dev to Test and Test to Production repositories.
• Created sessions and workflows to run the logic embedded in the mappings
built in Power Center Designer.
• Worked on issues with migration from development to testing.
• Designed and developed UNIX shell scripts as part of the ETL process to
automate the loading and pulling of data.
• Refreshed reports using Scheduler.
• Preparation of UTP (Unit Test Plan) with all required validations and test cases.
• Responsible for testing and validating the Informatica mappings against the
predefined ETL design standards.
• Created various tasks such as Session, Decision, Timer, and Control to design the workflows
based on dependencies.
• Used workflow manager for session management, database connection
management and scheduling of jobs.
• Involved in production support.
Environment: Informatica Power Center 7.1, Oracle 8, Windows 2000, TOAD, Erwin 4.0
Apollo Health Street, Hyderabad, India May 2007 - Feb 2009
Jr. Oracle Developer
This project included extensive development of Packages, Grouping the Procedures and Functions that
were already written into respective packages to increase the system performance, and to create Indexes
on tables with large data volumes.
Responsibilities:
• Involved in gathering business requirements, logical modeling, physical database design, data
sourcing and data transformation, data loading, SQL and performance tuning.
• Involved in creating Technical Specification Document (TSD) for the project.
• Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
• Used Informatica for loading the historical data from various tables for different departments.
• Used Informatica Designer to develop mappings using transformations, including
aggregation, update, lookup, and summation.
• Involved in the development of Data Mart and populating the data marts using Informatica.
• Developed sessions using Server Manager and improved session performance.
• Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and
Stored procedure transformations.
• Created reusable transformations and mapplets and used them across different mappings
where the same transformation logic was needed.
• Created sessions to run the mappings.
• Created Stored Procedures to transform the data, and worked extensively in PL/SQL on the
various transformation needs while loading the data (see the sketch below).
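As a minimal, hypothetical sketch of the kind of PL/SQL transform-and-load procedure described above (the table and column names are illustrative only):

    CREATE OR REPLACE PROCEDURE load_clean_patients IS
    BEGIN
      -- Standardize names and filter incomplete records while loading.
      INSERT INTO patient_clean (patient_id, full_name, birth_dt)
      SELECT p.patient_id,
             INITCAP(TRIM(p.first_name)) || ' ' || INITCAP(TRIM(p.last_name)),
             p.birth_dt
        FROM patient_raw p
       WHERE p.birth_dt IS NOT NULL;
      COMMIT;
    END load_clean_patients;
    /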
Environment: Oracle RDBMS 9i, Informatica, JAVA, SQL*Plus Reports, SQL*Loader, XML, Toad