7+ years of experience in Data Warehousing and ETL using Informatica PowerCenter 9.5/9.1/8.6.
Experience in the complete software development life cycle (SDLC), with a strong background in design/modeling, database development, and implementation of business intelligence and data warehouse/data mart (ODS) projects, covering business requirements gathering, development, implementation, and documentation such as technical designs and source-to-target mappings.
Experience in Insurance, Banking, Mining and Telecom industries.
Experience in Data Warehouse/Data Mart Development Life Cycle and employed ETL procedures to load data from different sources into data warehouse using Informatica Power Center tools (Repository Manager, Designer, Workflow Manager and Workflow Monitor).
Designed and developed mappings using diverse transformations like Unconnected/Connected Static/Dynamic Lookup, Expression, Router, Rank, Joiner, Sorter, Aggregator, Stored Procedure, Normalizer, Transaction control, SQL, XML, Source Qualifier transformations.
Experience with advanced Informatica concepts such as concurrent workflows, indirect loads (file lists), pmcmd, parameter files, and audit tables. Excellent grasp of concepts such as bulk load, normal load, constraint-based load, test load, incremental aggregation, commit intervals, and data recovery.
Expertise with various scheduling tools – Autosys, UNIX Crontab and Informatica scheduler.
Worked with various data sources such as relational tables, flat files (fixed-width/delimited), COBOL files, and XML files.
Experience in performance tuning of Informatica Sources, Targets, Mappings, and Transformations & Sessions.
Sound knowledge of Star and Snowflake schema design in dimensional data modeling using Erwin, Fact tables (snapshot/Type 2) and Dimension tables, and SCDs (Type 1, Type 2, and Type 3).
Implemented Informatica Velocity Naming standards and Best Practices for the full life cycle of data warehouse projects, right from design to development through go-live and support.
Expertise in Oracle, MySQL, SQL Server DB, developing PL/SQL code, SQL*Loader, UTL_FILE, Toad, Query Optimization.
Provided production support and handled production queries/defects (PCRs).
Analytical and Technical aptitude with ability to work in a fast paced, highly flexible environment where in-depth knowledge of technology, smart work and ingenuity are highly appreciated.
Tools: Informatica PowerCenter 9.6/9.5/9.1/8.6/8.5, PowerExchange, Informatica Developer, Informatica BDE, Erwin, SQL Developer, Toad, SVN
Databases: Oracle 11g/10g/9i, MySQL 5.5/5.1, SQL Server 2008, Hive
Operating Systems: Windows, UNIX, Oracle Linux
Data Modeling: Ralph Kimball methodology, Bill Inmon methodology, Star schema, Snowflake schema, dimensional data modeling, Fact tables, Dimension tables
Master's in Systems Administration, India
Bachelor of Computer Science, India
Supply Chain Management, Canada
Accenture, Toronto, ON Apr’16– Mar’17
This project consolidated two strong loyalty brands of Loblaws; data from various interfaces had to be integrated into a Hadoop Data Hub.
Interacted with Business Analysts to analyze the requirements.
Turned functional designs into concise frameworks, then translated them into code.
Integrated data between multiple systems using Informatica PowerCenter 9.6.
Gathered requirements and designed frameworks accordingly with the client.
Ensured all coding met industry standards.
Handled basic project management responsibilities, including presenting concepts to clients, consulting, and providing status updates.
Participated in daily scrums and team meetings to provide development updates.
Worked directly with clients on business analysis activities.
Developed ideas to strengthen the code and the coding process.
Evaluated functional requirements and mapping documents, and troubleshot development processes.
Assisted development teams in maintaining data sources and data storage options.
Designed test cases to support all systems and performed unit tests.
Environment: Hadoop, Hive, Linux, Autosys, Shell Scripting, HiveQL, Informatica PowerCenter 9.6, Informatica BDE
Kinross Gold Corp, Toronto, ON Jan’15 – Mar’16
Kinross Gold is a senior gold mining company with a diverse portfolio of mines and projects internationally. The company is focused on delivering value through operational excellence, financial discipline, and responsible mining. The project was to analyze the source data and the business behind it so that new data sources could be seamlessly integrated into the existing data warehouse.
Analyzed source data coming from different sources, such as trading and mining systems, and worked with business users and developers to develop the model.
Defined reusable components such as PL/SQL scripts, Oracle packages, Mapplets, transformations, sessions, worklets, and ETL audit scripts.
Created various artifacts such as understanding documents and high/low-level (technical) designs.
Designed Informatica mappings by translating business requirements, and worked extensively with Lookup, Update Strategy, and Router transformations to implement complex rules and business logic.
Loaded data into the data warehouse using Informatica mappings (from Transient to Staging to the data warehouse and ODS).
Extensively worked with various lookup caches like Static Cache, Dynamic Cache and Persistent Cache.
Extensively worked with both Connected and Unconnected Lookup Transformations.
Developed Re-Usable Transformations and Re-Usable Mapplets.
Developed mapping logic for SCD Types 1 and 2, Change Data Capture (CDC), and delta loads.
Designed and developed complex aggregate, join, lookup transformation rules to generate consolidated (facts/summary) data identified by dimensions using Informatica PowerCenter tools.
Used Workflow Manager for creating, validating, testing and running the sequential and concurrent batches and sessions.
Monitored the workflows in the workflow monitor and resolved the issues in the process using the session logs.
Worked with Shell scripts and pmcmd to interact with Informatica server from command mode.
Scheduled the ETL jobs daily, weekly and monthly based on the business requirement.
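The pmcmd-and-shell interaction described above can be sketched roughly as follows; the domain, service, folder, and workflow names are hypothetical placeholders, not taken from the actual project, and the pmcmd invocation is only assembled and printed since pmcmd exists only inside an Informatica installation:

```shell
#!/bin/sh
# Minimal sketch: driving an Informatica workflow from the command line
# with pmcmd, as a scheduler (e.g. Autosys) would. All names below are
# placeholders.
DOMAIN="Domain_DW"          # Informatica domain (placeholder)
INT_SVC="IS_DW"             # Integration Service (placeholder)
FOLDER="DW_LOADS"           # repository folder (placeholder)
WORKFLOW="wf_daily_load"    # workflow to run (placeholder)

# -wait blocks until the workflow completes, so the script's exit code
# can gate downstream scheduler jobs; -uv/-pv read the user name and
# password from environment variables instead of hard-coding them.
CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -uv PM_USER -pv PM_PASS -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
```

In production this string would be executed from an Autosys job or cron entry, with `$?` checked afterwards to distinguish a successful run from a failed one.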
Environment: Informatica PowerCenter 9.5, Oracle 11GR2, Erwin 8.2, SQL Developer, PL/SQL, SQL*PLUS, UTL_FILE, Shell Scripting, Autosys, Unix-AIX.
Contus, India Jun’13 – Nov ’13
Contus provides IT services to different clients. This project involved developing ETL processes for a client to extract data from multiple feeds into downstream AML systems based on the business requirements.
Worked Closely with Business Analysts in Requirement Gathering, Analysis, Process Design, Data Design, Development, Testing and Implementation of load processes and Data transformation processes.
Analyzed source data coming from different sources: flat files, XML files, and COBOL files.
Prepared technical documentation of transformation components and Participated in design and development reviews.
Designed data mapping documents and detail design documents for ETL Flow.
Created Informatica mappings to build business rules to load data into different Target Systems.
Extensively used Informatica Power Center Designer, Workflow Manager and Workflow Monitor to develop, manage and monitor workflows and sessions.
Designed most logic for reuse through Mapplets, reusable transformations, and reusable sessions, optimized at the mapping and session levels. Parameterized workflows for automatic environment selection.
Extensively worked on mapping variables, mapping parameters and session parameters.
Scheduled Batches and sessions at required frequency using WF Scheduler and Crontab.
Involved in Writing Shell scripts to automate the Pre-Session and Post-Sessions Processes.
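A pre-session automation script of the kind described above might look like this minimal sketch; the directory and feed names are illustrative only, and a sample feed is created so the script is self-contained:

```shell
#!/bin/sh
# Pre-session check sketch: confirm the source flat file has arrived and
# is non-empty before the Informatica session starts. Paths and file
# names are placeholders.
SRC_DIR="${SRC_DIR:-/tmp/etl_presession_demo}"
FEED="customer_feed.dat"

mkdir -p "$SRC_DIR"
printf '1|Alice\n2|Bob\n' > "$SRC_DIR/$FEED"   # stand-in for the real feed

# Abort (non-zero exit) if the feed is missing or empty, so the
# scheduler never starts the session against a bad source.
if [ ! -s "$SRC_DIR/$FEED" ]; then
    echo "PRE-SESSION FAIL: $FEED missing or empty" >&2
    exit 1
fi

ROWS=$(wc -l < "$SRC_DIR/$FEED")
echo "PRE-SESSION OK: $FEED with $ROWS rows"
```

Scheduled from crontab, an entry such as `0 2 * * * /path/to/pre_session.sh` would run the check nightly ahead of a 2 a.m. load window.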
Environment: Informatica PowerCenter 9.1, DB2, WinSQL, MS SQL Server 2008, pmcmd scripting, Sybase.
VDSI, India Apr ’12 – Jun ’13
Verizon is one of the largest telecommunications holding companies in the United States. Involved in creating a data mart for customers, providing visibility into key data for analysis and tracking across products and geographies. Data from operational source systems was extracted, transformed, and loaded into the EDW using Informatica.
Involved in Designing and developing multi-dimensional model (Star Schema).
Created ETL scripts that transfer data from Oracle source systems to the Data Mart.
Involved in extraction, transformation and loading of data using PL/SQL stored procedures.
Used Aggregator, Lookup, Expression, Stored Procedure, and Filter transformations in the mappings.
Used ETL to extract and load data from Oracle, SQL Server, and flat files to Oracle.
Involved in writing complex functions and stored Procedures.
Created various transformations such as Update Strategy, Look Up, Joiner, Filter and Router Transformations.
Developed mapping to load data in slowly changing dimensions.
Performance Tuning of Sessions and Mappings.
Created and Monitored Workflows using Workflow Manager and Workflow Monitor
Involved in Unit testing of the mappings.
Created relational connections and migrated mappings from Dev to Test and from Test to Production.
Created tables, indexes, and triggers in different environments.
Created sessions, batches and designed worklets and master workflows.
Environment: Informatica Power Center 8.6.1/8.1, Oracle 10g/11g, MS SQL Server 2005, TOAD 10, PL/SQL, pmcmd scripting, Shell Scripting, UNIX, MS Visio, WS_FTP9.01, Autosys
HP, India Mar’ 09 – Mar’ 12
Description: The Hewlett-Packard Company was an American global information technology company. The aim of the project was to develop and implement a data warehouse as an integral part of the processing services for managing different types of insurance policies.
Involved in all phases including Requirement Analysis, Client Interaction, Design, Coding, Testing and Documentation.
Created Workflows, Tasks, database connections, FTP connections using Workflow Manager.
Responsible for dimensional data modeling and for populating business rules into the repository via mappings.
Created Informatica mappings to produce the input feed, in the standard work-file format, for processing by the accounting engine.
Extensively used transformations for heterogeneous data joins, complex aggregations, and Stored Procedure transformations.
Created various mappings, Mapplets, transformations, and lookups to validate and derive fields based on the input, converting raw input data into the standard accounting data format.
Wrote several stored procedures for recycling and other extraction purposes.
Populated the warehouse and documented the existing mappings as per standards.
Performed Tuning of Informatica Mappings and Sessions for optimum performance.
Wrote PL/SQL stored Procedures and Functions for data Migration and Stored Procedure transformations.
Used Workflow Manager for Creating, Validating, Testing and running the workflows and Sessions and scheduling them to run at specified time.
Identified and resolved the bottlenecks in source, target, transformations, mappings and sessions to improve performance.
Worked on creating staging tables, indexes, sequences, and views, and on performance tuning, such as using SQL*Loader direct/parallel loads, analyzing tables, and proper index creation and dropping.
Performed unit testing to validate that data was mapped correctly, providing a qualitative check that data flowed through the process and was deposited correctly in the target tables.
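The SQL*Loader staging loads mentioned above can be sketched as follows; the table, columns, and file names are hypothetical, and the sqlldr invocation is only assembled and printed so the sketch works without an Oracle installation:

```shell
#!/bin/sh
# Sketch of a direct-path SQL*Loader staging load. The control file and
# table/column names are illustrative placeholders; sqlldr is not run.
CTL="${TMPDIR:-/tmp}/stg_policy.ctl"

# Control file: pipe-delimited feed appended into a staging table.
cat > "$CTL" <<'EOF'
LOAD DATA
INFILE 'policy_feed.dat'
APPEND INTO TABLE STG_POLICY
FIELDS TERMINATED BY '|'
(POLICY_ID, POLICY_TYPE, PREMIUM_AMT, EFFECTIVE_DT DATE 'YYYY-MM-DD')
EOF

# direct=true selects the direct-path engine; for large loads, indexes
# are typically dropped first, rebuilt afterwards, and tables analyzed.
CMD="sqlldr userid=\$ORA_USER/\$ORA_PASS control=$CTL direct=true log=stg_policy.log"
echo "$CMD"
```

Credentials are left as environment-variable references rather than being embedded in the script.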
Environment: Oracle 10g, Informatica Power Center 8.6.1/8.1, SQL Developer, PL/SQL, SQL*PLUS, SQL*Loader, UTL_FILE, Shell Scripting, Autosys, UNIX
Airtel, India May’08 – Feb’09
Description: Airtel Limited is a leading telecommunications company with global operations. This project involved developing a data warehouse from different data feeds and other operational data sources.
Involved in development, testing, and production rollout scripting using Unix Shell scripting.
Involved in the development of ad-hoc SQL scripts for different reports.
Designed new database table structures and created back-end stored procedures, triggers, packages, and functions using SQL and PL/SQL. Prepared a full set of DDL and DML scripts.
Involved in bug fixing and Post Implementation Support.
Wrote SQL queries, Stored Procedures, Functions, and logical/materialized Views.
Worked with users, clients, business stakeholders, and testers to analyze and resolve CRs (Change Requests) and defects.
Researched and developed Unix Solutions for automation of Queries.
Environment: Oracle 9i, Toad, PL/SQL, SQL*PLUS, SQL*Loader, UTL_FILE, Shell Scripting, Linux