
ETL/Data Warehouse Developer

Location:
Toronto, ON, Canada
Posted:
March 06, 2010


VINIL KUMAR

Phone: 647-***-****

email: fab20x@r.postjobfree.com

Professional Summary

• Over 6 years of IT experience in the design, development and implementation of business intelligence solutions using ETL, business intelligence reporting and OLAP applications.

• Good working experience in designing and developing ETL solutions using Informatica PowerCenter 8.x/7.x/6.1/5.1 (Repository Manager, PowerCenter Designer, Server Manager, Workflow Manager and Workflow Monitor), OLAP and OLTP.

• Working experience in dimensional data modeling using Star and Snowflake schemas.

• Extensive experience in Business Intelligence in generating various complex reports using Cognos Series 7.1/6.5.

• Solid experience in coding using SQL, PL/SQL stored procedures/functions, triggers and packages.

• Experience in Databases like Oracle 10g/9i/8i, MS SQL Server 2008/2005/2000/7.0/6.5 and MS Access 2000/2003.

• Providing effective and efficient support in the analysis, design, development and maintenance of the systems.

• Very good experience in gathering business requirements, converting them into functional and technical specifications, and developing mapping documents to map source systems to the data warehouse model.

• Designed and developed Informatica ETL mappings/mapplets to read source data and load it into the target warehouse.

• Experience in Performance tuning of Informatica (sources, mappings, targets and sessions) and tuning the SQL queries.

• Experience in Performance Tuning and SQL Tuning and creation of indexes for faster database access and better query performance.

• Experience in documenting ETL mappings, creating unit test plans and developing deployment plans.

• Expertise in utilizing Oracle utility tools such as SQL*Loader, Import and Export.

• Thorough knowledge of database management concepts like conceptual, logical and physical data modeling, and data definition, population and manipulation.

• Highly skilled in developing test plans, test scripts and test matrices, and executing test cases.

• Excellent analytical, problem solving, and communication skills, with ability to interact with individuals at all levels.

Technical Skills Summary:

Data Warehousing: Informatica PowerCenter 8.6.1/8.1.1/7.1.2/6.2/5.1 (Repository Manager, Designer, Server Manager, Workflow Manager, Workflow Monitor)

Data Modeling: Dimensional data modeling (Star Schema, Snowflake, Fact and Dimension tables, physical and logical data modeling, ER diagrams), Erwin 4.5/4.0/3.5

Business Intelligence: Cognos Series 7.1/6.5 (Impromptu Administration, Access Manager, Report Administration, IWR, PowerPlay Transformer, PowerPlay for Windows), Crystal Reports, MicroStrategy

Programming/Tools: SQL, PL/SQL, Unix shell scripting, TOAD 9.x/7.x, SQL*Loader, SQL Developer, MS SQL Server Management Studio Express, Putty, WinSCP

Databases: Oracle 10g/9i/8i, MS SQL Server 2000/2005/2008, MS Access 2000

Environment: Win 95/98/2000/XP, Win NT 4.0, Unix (AIX, Sun Solaris), MS DOS, Ubuntu

Professional Experience:

Cancer Care Ontario (CCO), Toronto, Canada Feb’ 2009 – Till Date

Enterprise Data Warehouse Integration Project.

ETL/Data Warehouse Developer.

In November 2004, the Ontario Ministry of Health and Long-Term Care (MOHLTC) launched a provincial Wait Time Strategy. One of the primary objectives of the Ontario Wait Time Strategy was to create a central reporting system to track, measure, and reduce wait times for health care services.

The project involved optimizing the existing Informatica ETL processes and MicroStrategy reports to improve the overall performance of iPort Access and correct any known issues that might impact project success, and developing new ETL processes to automatically extract, transform and load data into the EDW.

Responsibilities:

• Designed ETL design documents and Informatica load processes and handled document walkthroughs.

• Extensively participated in functional and technical meetings for designing the architecture of ETL load process (mappings, sessions, workflows from source systems to staging to DW).

• Developed mapping document indicating the source tables, columns, data types, transformation required, business rules, target tables, columns and data types.

• Created several Informatica mappings to populate data into dimension and fact tables.

• Created mappings to Extract, Transform and Load (ETL) data from source (SQL Server) to the Data Warehouse by using Informatica ETL tool.

• Created parameter files and defined mapping and mapplet parameters in the parameter file.

• Defined the Source, Target structures, and connections and imported them into the Informatica source analyzer and warehouse designer.

• Developed complex mappings in Informatica to load data from various sources using different transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router transformations.

• Worked on performance tuning of the complex transformations, mappings, sessions and SQL queries to achieve faster data loads to the data warehouse.

• Created session tasks, event wait and event raise tasks, command tasks, worklets, etc. in the Workflow Manager and deployed them across the DEV, QA, UAT and PROD repositories.

• Exclusively worked with Views, Synonyms, Procedures/Functions and Packages in Oracle PL/SQL to build business rules.

• Ran MicroStrategy reports and improved their performance by analyzing the underlying SQL generated by the reports, using Oracle optimization methods such as partitioning tables, creating bitmap join indexes and flushing the buffer cache.

• Gained hands-on experience with MS SQL Server Management Studio Express (2005/2008).

• Wrote Unix scripts for scheduling Maestro jobs to automate the processes.

• Wrote Unix scripts for unit testing the ETL code (sessions, workflows, log files).

• Created unit test plan master document with appropriate test cases and test results for ETL Code.

• Involved in creating the Environment Document, which provides instructions to implement and test the project in QA and production environments.

• Extensively used Microsoft Access to compare data within databases using ODBC.

• Created Run Book document for Production Support.

• Actively participated in middle-level management and technical meetings to discuss and resolve problems.

• Responsible for ETL Code walkthrough and Knowledge Transfer at all stages.

• Implemented Type I and Type II slowly changing dimension tables.

• Performed unit testing to detect defects in the code and tracked them using the HP Quality Center tool.

• Involved in ETL code migration from Informatica 8.1.1 to 8.6 using XML.

• Involved in pre- and post-production support to resolve issues.
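The Unix unit-test scripts mentioned above can be sketched roughly as follows; the log contents and file names here are illustrative placeholders, not the actual project's, and real PowerCenter session logs vary by version and configuration:

```shell
#!/bin/sh
# Minimal sketch of an ETL unit-test helper: scan an Informatica session
# log for error lines and print a pass/fail status for the checklist.
# (Hypothetical example; the real scripts also checked workflows and row counts.)

check_session_log() {
    log="$1"
    # Any line containing "error" (case-insensitive) fails the check.
    if grep -qi "error" "$log"; then
        echo "FAIL: errors found in $log"
    else
        echo "PASS: no errors in $log"
    fi
}
```

A wrapper would loop over the night's session logs and collect the FAIL lines into the unit-test results document.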

Environment: Informatica PowerCenter 8.6.1/8.1.1, Oracle 10g, SQL, PL/SQL, SunOS 5.10, TOAD/SQL Developer, MicroStrategy, Tortoise SVN, HP Quality Center 9.2, Microsoft SQL Server Management Studio Express (2000/2008), Putty, WinSCP, Microsoft Access.

CANADIAN WHEAT BOARD (CWB), Winnipeg, Canada May’ 2008–Nov’ 2008

Supply Chain Management.

ETL Developer.

The CWB's supply chain is key to the business. This project involved building a data warehouse solution to meet the various requirements of the CWB supply chain management process. Involved in analysis, design, development, implementation and quality assurance of DW applications.

Responsibilities:

• Designed and Developed the mapping technical design specifications before creating mappings

• Analyzed the data and mapped it using Informatica PowerCenter. Loaded the data into Oracle and flat files.

• Documented the functional specifications, technical specifications and Informatica Load processes and handled document walkthroughs.

• Extensively participated in functional and technical meetings for designing the architecture of ETL load process (mappings, sessions, workflows from source systems to staging to DW) and offshore team management.

• Created several Informatica mappings with business rules to load data.

• Created complex mappings and mapplets using several transformations such as Source Qualifier, Filter, Aggregator, Expression, Joiner, Lookup, Router, Update Strategy, Normalizer and Sequence Generator. Used the Debugger to pinpoint errors in code and take corrective action.

• Exclusively worked with Views, Synonyms, Procedures/Functions and Packages in Oracle PL/SQL to build business rules.

• Defined the Source, Target structures, and connections and imported them into the Informatica source analyzer and warehouse designer.

• Improved mapping performance using source, lookup and session overrides and by fine-tuning the queries.

• Developed Re-Usable Tasks, Sessions and Workflows.

• Documented the Informatica load processes and unit tested the developed Informatica mappings.

• Improved the source to target load performance.

• Unit tested the developed mappings and prepared a checklist.

• Wrote UNIX scripts to automate the load process.

Environment: Informatica PowerCenter 7.1.2, Oracle 10G, PL/SQL, AIX 5.2, ERWIN 4.1, TOAD/SQL Developer, Visio, IBM Tivoli, Tortoise SVN, Win XP

Foresters, Toronto, Canada Sep’ 07–April’ 08

ETL Consultant

Involved in analysis, design, development, implementation and quality assurance of DW applications and the reporting system. The data warehouse solution provides the claims department with reports.

Responsibilities:

• Designed the procedures for getting the data from source systems to Data Warehousing system. The data was standardized to store various Business Units in tables.

• Developed the mapping technical/functional specifications before creating mappings.

• Created Informatica mappings with business rules to load data.

• Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level. Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy and Sequence Generator.

• Prepared technical specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables, defined ETL standards, designed and created mappings and mapplets using Informatica Designer, and used the Debugger to test the data flow and fix mapping errors.

• Built re-usable ‘Mapplets’ using Informatica Designer.

• Developed Sessions and Workflows.

• Deployed the developed code across the DEV, QA, UAT and PROD repositories.

• Created parameter files and defined mapping parameters in the parameter file.

• Used Incremental Aggregation to improve the performance of the fact table load

• Improved the source to target load performance.

• Extensively used ETL to load data from flat file to Oracle database.

• Involved in preparing Backup and Recovery strategy for Production and Development servers.

• Wrote UNIX scripts and automated the load process.
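A PowerCenter parameter file of the kind described above typically looks like the following sketch; the folder, workflow, session and parameter names are placeholders, not the actual project objects:

```
[FolderName.WF:wf_load_claims.ST:s_m_load_claims]
$$LoadDate=2008-01-31
$$SourceSystem=CLAIMS
$DBConnection_Target=Oracle_DW_Conn
$PMSessionLogFile=s_m_load_claims.log
```

The section header scopes the values to one session; `$$` entries override mapping parameters, and `$` entries override session-level settings such as connections and log files.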

Environment: Informatica PowerCenter 7.1, Oracle 10G, PL/SQL, AIX 5.2, ERWIN 4.1, TOAD 8.6, Win XP

Apotex Pharmaceuticals, CA Jun’06 to Aug’07

Data Warehouse Developer

This project involved building a data warehouse solution to meet the various requirements of Apotex Pharmaceuticals sales depots, which are located across the country. The basic requirements for Apotex Pharmaceuticals with respect to sales and distribution are stock availability across locations and products, sales reports and analysis, inventory management, order management, customer accounting and sales.

Responsibilities:

• Involved in designing of Informatica mappings by translating the business requirements.

• Created Informatica mappings to load dimension tables such as Customer, Account and Product.

• Built re-usable ‘Mapplets’ using Informatica Designer.

• Created various tasks and Workflows.

• Monitored workflows and collected performance data to maximize the session performance.

• Used Informatica’s features to implement Type I and Type II changes in slowly changing dimension tables.

• Defined a parameter file and specified connection information in it.

• Unit tested the Developed Mappings.

• Read flat files (customer orders, sales, etc.) and loaded them into the database.

• Improved Performance at mapping level by using SQL override, mapping variables.

• Reviewed the user requirements and developed a database-to-Cognos-catalog folder mapping document.

• Designed and developed Cognos Impromptu catalog

• Created user classes and configured them to access Cognos Access Manager security information

• Developed reports in Impromptu based on customer and sales distribution area

• Created various list, cross tab and drill through reports

• Implemented cascading prompts

• Designed and created the PowerPlay Transformer Model for the Sales Cube

• Created IQD files

• Acted as a Cognos Administrator and published the reports on the Impromptu Web Reports Server

• Improved the performance of the Reports by weighting and by writing custom SQL queries

• Demonstrated to the end business users the usage of the Cognos Impromptu web reports and the cube in PowerPlay for Windows

• Involved in identifying the final reports to be exported to the Cognos web based Reporting tool

• Wrote test cases and tested the reports


Environment: Informatica PowerCenter 7.1, Oracle 10G, Cognos Impromptu 7.1, PowerPlay Transformer, PowerPlay for Windows, IWR, Oracle 9i, Windows 2003

Projects worked through SoftSol India Limited

Client: City Bank, INDIA Feb’05 - Apr’06

Data Warehouse Developer

City Bank is a financial services organization. We built a data warehouse for all sales divisions. This DW provides information on customers and their loans from various source applications. The data warehouse implementation was accomplished using the Informatica PowerCenter ETL tool on an Oracle 9i database.

Responsibilities:

• Worked as an Informatica developer executing technical design and development activities related to ETL (Extract, Transform and Load) between specific data sources and the data warehouse.

• Created ETL mappings according to business logic.

• Developed mappings for Sales, Orders and Orders Received metrics.

• Developed reusable Transformations.

• Used lookup, router, filter, joiner, stored procedure, source qualifier, aggregator and update strategy transformations extensively.

• Performed tuning of the mappings.

• Developed mappings to load data into Type II slowly changing dimension tables.

• Created sessions and workflows in the workflow manager.

• Created users and folders and assigned access permissions to repository users.

• Unit tested the mappings using SQL scripts.

• Wrote SQL overrides in the Source Qualifier to retrieve the data efficiently.

• Used debugger to identify the errors in the mappings.

• Monitored the running workflows and sessions in the workflow monitor.

• Suggested creating Oracle indexes on some of the source tables

• Scheduled the UNIX scripts from crontab
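Crontab scheduling of the kind mentioned in the last bullet can be sketched as follows; the script path, log path and schedule are illustrative placeholders, not the actual project's:

```shell
# Illustrative crontab line: run the load script at 02:00 every day and
# append both stdout and stderr to a log file (paths are placeholders):
#   0 2 * * * /home/etl/bin/nightly_load.sh >> /home/etl/logs/nightly_load.log 2>&1

# Small helper that assembles a daily cron entry from minute, hour and
# command, so the line can be reviewed before installing it with `crontab`.
make_daily_cron_entry() {
    echo "$1 $2 * * * $3"
}
```

The five leading fields are minute, hour, day of month, month and day of week; `* * *` in the last three positions means the job runs every day.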

Environment: ETL-Informatica PowerCenter 7.0, Oracle 9i, PL/SQL, AIX, TOAD, Windows XP

Client: PROCTER & GAMBLE, OH Mar’ 04– Nov’ 04

Data Warehouse Developer

Procter & Gamble Pharmaceuticals is a division within Procter & Gamble Health Care. P&G Health Care includes over-the-counter medications, oral care products and other health care products. This data mart is an investors data mart. The project aims at consolidating customer data received from different source systems and creating a central data warehouse to analyze business performance and trends over time and build forecasts, in addition to generating various standard and ad hoc/canned reports. The ETL process was carried out using Informatica, with Business Objects for reporting and analysis.

Responsibilities:

• Went through the requirement documents and developed Functional and implementation specifications.

• Involved in physical and logical models development.

• Extensively used Informatica PowerCenter for the extraction, transformation and loading process.

• Used Informatica Repository Manager to create Users and to give permissions to users.

• Used Informatica Designer to create mappings using different transformations to move data from source to a Data Warehouse.

• Designed and developed mappings in Informatica using different transformations such as Source Qualifier, Expression, Lookup, Aggregator, Router, Rank, Filter and Sequence Generator transformations.

• Created Update Strategy and Stored Procedure transformations to populate targets based on business requirements.

• Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Informatica Workflow Manager.

• Scheduled the sessions to extract, transform and load data into warehouse database as per Business requirements. Improved the session performance by pipeline partitioning.

• Optimized the performance of the mappings by various tests on sources, targets and transformations.

• Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Environment: Informatica PowerCenter 6.2, ERWIN 4.0, Windows 2000, Oracle 9i, TOAD, PL/SQL

Client: Allergan, USA. Oct’ 02– Feb’ 04

Data Warehouse Developer

Allergan is the leading maker of eye care, skin care and prescription-based pharmaceutical products. The company’s eye care products include medications, surgical equipment, contact lens cleaners, and intraocular lenses (used to replace the eye’s natural lens during cataract surgery). The company’s skin care products include treatments for acne and psoriasis. Sales outside the U.S. make up nearly half of Allergan’s revenues. I was involved in developing data extract modules for the Enterprise Data Warehouse, which enable Allergan to have an integrated view of the amount and type of business with its trade customers worldwide, identifying the partners that do significant business and analyzing profitability on the basis of customer, product, region, etc.

Responsibilities:

• Involved in developing mapping document indicating the source tables, columns, data types, transformation required, business rules, target tables, columns and data types.

• Worked with multiple sources such as Relational Databases, Flat files for extraction using Source Qualifier and Joiner.

• Developed various Mappings & Mapplets to load data from source database using different Transformations like Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Expression and Sequence Generator to store the data in target table.

• Extensively used ETL to load data from Flat files and relational database into relational target.

• Created sessions, batches for carrying out test loads in the Server Manager.

• Analyzed business requirements and system specifications and wrote test plan for the unit, functional, performance and integration testing.

• Extensively used PL/SQL and TOAD for creating views, stored procedures and indexes on the tables.

• Created Cognos Impromptu catalog

• Set up the user classes and users in the Access Manager

• Created Cognos Impromptu canned reports

• Involved in analyzing the required source definitions for PowerPlay Transformer model

• Involved in maintaining Cognos impromptu reports and supported the production from offshore

Environment: Informatica PowerCenter 5.1, Cognos Impromptu 6.5, Access Manager, PowerPlay Transformer, Windows 2000, Oracle 8i, TOAD

Education

Bachelor’s in Computer Applications

References: Available upon request


