
Data Manager

Location:
Mumbai, MH, India
Posted:
February 19, 2016


Resume:

Samreen H

Phone#: 213-***-****

Email: actlz6@r.postjobfree.com

SUMMARY:

Over 7+ years of work experience in ETL (Extraction, Transformation and Loading) of data from various sources into EDW, ODS and Data Marts using the data integration tool Informatica PowerCenter 9.5.1/8.x/7.x/6.x in the Insurance, Retail, Banking, Telecom and Health Care domains.

Experience in full-lifecycle implementation of Data Warehouses, Operational Data Stores (ODS) and business Data Marts with dimensional modeling techniques (Star Schema and Snowflake Schema) using Kimball methodologies.

Experience in design, development and maintenance of software applications in Information Technology, Data warehouse and RDBMS concepts.

Expertise in Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.

Experience in Performance Tuning of Targets, Sources, Sessions, Mappings and Transformations.

Worked on Exception Handling mappings for Data Quality, Data Profiling, Data Cleansing and Data Validation.

Good working knowledge of Informatica Data Quality (IDQ).

Experience in configuring Informatica MDM Data Director, Hierarchy Manager and the Match/Merge process.

Hands-on experience with Informatica MDM Hub configurations - Data Modeling, Data Mappings (Landing, Staging and Base Objects), Data Validation, Match and Merge rules, writing and customizing user exits, and customizing and configuring Informatica Data Director (IDD) / Business Data Director (BDD) applications.

Experience in developing Informatica Reusable components for using them across the projects.

Extensively worked with Informatica Mapping Variables, Mapping Parameters and Parameter Files.

Worked on Slowly Changing Dimensions - Type 1, Type 2 and Type 3 - in different mappings as per the requirements.
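
The Type 1 and Type 2 techniques mentioned above differ in how history is kept: Type 1 overwrites attributes in place, Type 2 expires the current row and inserts a new version. A minimal Python sketch of the two update styles, for illustration only (the actual work was done in Informatica mappings; the `cust_id`/`city` column names here are hypothetical):

```python
from datetime import date

def scd_type1_update(dim_row, new_attrs):
    """Type 1: overwrite attributes in place -- no history kept."""
    dim_row.update(new_attrs)
    return dim_row

def scd_type2_update(dim_rows, business_key, new_attrs, today=None):
    """Type 2: expire the current version of the row, then append a new one."""
    today = today or date.today()
    for row in dim_rows:
        if row["cust_id"] == business_key and row["current_flag"]:
            row["current_flag"] = False
            row["end_date"] = today
    dim_rows.append({
        "cust_id": business_key, **new_attrs,
        "start_date": today, "end_date": None, "current_flag": True,
    })
    return dim_rows

dim = [{"cust_id": 1, "city": "Peoria", "start_date": date(2015, 1, 1),
        "end_date": None, "current_flag": True}]
dim = scd_type2_update(dim, 1, {"city": "Dearborn"}, today=date(2016, 2, 1))
```

After the Type 2 update, the dimension holds two versions of the customer: the expired Peoria row and a new current Dearborn row.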

Worked with databases such as Oracle, DB2, SQL Server and Microsoft Access, and integrated data from flat files (fixed-width/delimited), XML files and COBOL files.

Experience in writing Stored Procedures, Functions, Triggers, Materialized Views and T-SQL on Oracle 10g/9i/8i, DB2 and SQL Server 2008/2005/2000.

Extensively worked on monitoring and scheduling of jobs using UNIX Shell Scripts.

Worked with PMCMD to interact with the Informatica server from the command line and execute shell scripts.
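
PMCMD drives workflows from scripts by passing the integration service, domain, credentials, folder and workflow name as flags. A small Python sketch that assembles (but does not execute) a `pmcmd startworkflow` command line; the service/domain/folder/workflow names are hypothetical, and the exact flag set should be verified against your PowerCenter version:

```python
def build_pmcmd_start(service, domain, user, password, folder, workflow,
                      wait=True):
    """Assemble a pmcmd startworkflow command line as an argv list."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service,       # integration service name
           "-d", domain,         # Informatica domain
           "-u", user, "-p", password,
           "-f", folder]         # repository folder
    if wait:
        cmd.append("-wait")      # block until the workflow finishes
    cmd.append(workflow)
    return cmd

cmd = build_pmcmd_start("IS_DEV", "Domain_Dev", "etl_user", "secret",
                        "DW_LOADS", "wf_daily_load")
print(" ".join(cmd))
```

In a real scheduler script the list would be handed to `subprocess.run`, with the exit code checked to decide success or failure of the load.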

Good knowledge of migrating environments from Informatica 7.x and 8.x to 9.x.

Experience with ER data modeling tools like Erwin, ER/Studio and Visio in developing Fact and Dimension tables and Logical and Physical models.

Expertise on tools like Toad, Autosys, and SQL Server Management Studio. Involved in Unit testing, Functional testing and User Acceptance testing on UNIX and Windows Environment.

Good exposure to the Software Development Life Cycle (SDLC) and OOAD techniques.

Completed documentation including detailed work plans and mapping documents.

Experience in managing onsite-offshore teams and coordinating test execution across locations.

Excellent communication, documentation, team problem-solving, analytical and programming skills in a high-speed, quality-conscious, multi-tasking environment.

TECHNICAL SKILLS:

Data Warehousing

PowerCenter 9.x/8.x/7.x/6.x (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor, Worklets), IDQ, Informatica MDM, Data Profiling, Data Cleansing, OLAP, OLTP, DataStage 7, DW Concepts, DAC, Data Quality, Data Migration, Data Movement, HEDIS.

Data Modeling

Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.1/3.5.2.

Database

Visual Studio 2012/2010/2005, Oracle 12c/11g/10g/9i/8i, SQL Server 2000/2005/2008/2012/2014, Teradata, Sybase, Greenplum, HIPAA, Netezza, AWS Redshift.

DB Tools

TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, Teradata SQL Assistant, SSIS, SSRS, OBIEE, SSAS, RDBMS, DTS Packages, SAS, SPSS.

Reporting Tool

Cognos 8.0/7.0, Tidal Scheduler.

Programming

SQL, PL/SQL, T-SQL, C, Linux, UNIX Shell Scripting, AS/400, AIX, XML, MS SQL, Java, jQuery, JavaScript, Agile, Ajax, XSD, B2B.

Environment

Windows NT/98/XP/2000, HP-UX 10.20, MS DOS.

Others

AutoSys, BODS 4.0, MS Excel, MS Word, MS PowerPoint, MS Outlook, MS Office products, Metadata, Performance Tuning, .NET Framework.

EDUCATION:

Bachelors in Computer Science.

PROFESSIONAL EXPERIENCE

Client: Truven Health Analytics, Ann Arbor, MI Aug 2014 – Present

Role: Sr Informatica Developer

Description: The main objective of this project, a shared data repository, is to capture new Vitality program customers' data, policies, group policies and Medicare plans. Data comes from various sources such as SQL Server and Mainframe, and is loaded into the EDW at different frequencies as per the requirement. The entire ETL process consists of source systems, a staging area, the data warehouse and the data mart.

Truven Health Analytics is a multinational health care company. The company delivers unbiased information, analytic tools, benchmarks, research and services to the healthcare industry, including hospitals, government agencies, employers, health plans, clinicians, and pharmaceutical, biotech and medical device companies.

Responsibilities:

Responsible for gathering project requirements by directly interacting with the client, and performed analysis accordingly.

Coordinated the work flow between onsite and offshore teams.

Defined various Facts and Dimensions in the data mart, including Factless Facts, Aggregate and Summary Facts.

Extracted, scrubbed and transformed data from Flat Files, Oracle, SQL Server, DB2 and Teradata, then loaded it into the Oracle database using Informatica.

Worked on optimizing the ETL procedures in Informatica 9.5.1/8.6.1.

Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Implementing logical and physical data modeling with STAR and SNOWFLAKE techniques using Erwin in Data warehouse as well as in Data Mart.

Used Type 1 and Type 2 mappings to update Slowly Changing Dimension Tables.

Involved in the performance tuning process by identifying and optimizing source, target, and mapping and session bottlenecks.

Configured incremental aggregation to improve the performance of data loading. Worked on database-level tuning and SQL query tuning for the data warehouse and OLTP databases.
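
Incremental aggregation speeds up loads by folding only newly arrived rows into aggregates persisted from the previous run, instead of re-aggregating the whole source. A minimal Python sketch of the idea (illustrative only; Informatica does this internally with its aggregate cache, and the `product`/`amount` fields here are hypothetical):

```python
def incremental_aggregate(cache, new_rows):
    """Fold only the new rows into persisted running aggregates."""
    for row in new_rows:
        key = row["product"]
        agg = cache.setdefault(key, {"total": 0.0, "count": 0})
        agg["total"] += row["amount"]   # running sum per key
        agg["count"] += 1               # running row count per key
    return cache

cache = {}  # stands in for the persisted aggregate cache
incremental_aggregate(cache, [{"product": "A", "amount": 10.0},
                              {"product": "B", "amount": 5.0}])
# next load: only the delta is processed, prior totals are reused
incremental_aggregate(cache, [{"product": "A", "amount": 2.5}])
```

The second call touches one row instead of three, which is where the performance gain comes from on large fact loads.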

Used Informatica repository manager to create folders and add users for the new developers.

Configured the Informatica server to generate control and data files to load data into the target database using the SQL*Loader utility.
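
SQL*Loader is driven by a control file that names the data file, target table, field delimiter and column list. A sketch of generating such a control file as text (the table and column names are hypothetical; a real control file often adds options like TRAILING NULLCOLS or datatype clauses):

```python
def make_ctl(datafile, table, columns, delimiter=","):
    """Render a minimal SQL*Loader control file for a delimited data file."""
    return "\n".join([
        "LOAD DATA",
        f"INFILE '{datafile}'",
        "APPEND",                                  # add to existing rows
        f"INTO TABLE {table}",
        f"FIELDS TERMINATED BY '{delimiter}'",
        "(" + ", ".join(columns) + ")",
    ])

ctl = make_ctl("cust.dat", "stg_customer", ["cust_id", "cust_name", "city"])
print(ctl)
```

The resulting file would be passed to the utility as, for example, `sqlldr user/pwd control=cust.ctl`.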

Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removal of duplicate data.

Informatica Data Quality (IDQ 8.6.1) was used here for data quality measurement.

Used the ActiveBatch scheduling tool for scheduling jobs.

Checked session and error logs to troubleshoot problems, and used the Debugger for complex troubleshooting.

Negotiated with superiors to acquire the resources necessary to deliver the project on time and within budget, bringing resources onsite when required to meet deadlines.

Delivered projects working in Onsite-Offshore model. Directly responsible for deliverables.

Developed UNIX Shell scripts for calling the Informatica mappings and running the tasks on a daily basis.

Wrote Oracle PL/SQL procedures and functions whenever needed.

Created & automated UNIX scripts to run sessions on desired date & time for imports.

Environment: Informatica PowerCenter 9.5.1/8.6.1, IDQ, PL/SQL, Oracle 9i, TOAD, Erwin 7.0, UNIX, SQL Server 2000, AutoSys, Windows Server 2003, Visio 2003.

Client: Caterpillar Inc, Peoria, IL Apr 2013 -Jul 2014

Role: Sr ETL Developer

Description: Caterpillar Inc. is an American corporation which designs, manufactures, markets and sells machinery, engines, financial products and insurance to customers via a worldwide dealer network. Caterpillar is the world's leading manufacturer of construction and mining equipment, diesel and natural gas engines, industrial gas turbines and diesel-electric locomotives.

The primary objective of this project is to capture customer, policy, product and financial data from multiple OLTP systems and flat files. Data was extracted, transformed and loaded into the data warehouse using Informatica PowerCenter, and various reports were generated on a daily, weekly, monthly and yearly basis. These reports give details of the various Caterpillar Inc products sold, and are used for identifying agents for rewards, awards and performance, and for risk analysis reports for business development managers.

Responsibilities:

Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.

Involved in Requirement Analysis and ETL Design, and in massive data cleansing prior to loading data into staging tables from Oracle, flat files, DB2, SQL Server and MySQL.

Developed complex Informatica PowerCenter Mappings with transformations like Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator and Update Strategy.

Created Mapplets, reusable transformations and used them in different mappings.

Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.

Implemented Star schema logical, physical dimensional modeling techniques for data warehousing dimensional and fact tables using Erwin tool.

Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects.

Created Slowly Changing Dimension (SCD) Type 2 mappings for developing the dimensions to maintain the complete historical data.

Worked on handling performance issues of Informatica Mappings, evaluating current logic for tuning possibilities, and created PL/SQL procedures, triggers and views for better performance.

Tuned SQL Queries in Source qualifier Transformation for better performance.

Tuned the ETL Informatica code at the mapping and session levels.

Developed design and transformation techniques for the implementation of Master Data Management.

Wrote test plans and executed them in unit testing, and supported system testing, volume testing and user testing.

Provided production support by monitoring the processes running daily, and provided data to the reporting team for their daily, weekly and monthly reports.

Participated in weekly and bi-monthly team status meetings.

Environment: Informatica PowerCenter 8.6, PowerExchange, Oracle 10g, SQL Server 2008, MySQL, Toad, SQL, PL/SQL (Stored Procedures, Triggers, Packages), Erwin, MS Visio, Tidal, Windows XP, AIX, UNIX Shell Scripts.

Client: Ford Motor Company, Dearborn, MI May 2012 – Mar 2013

Role: Informatica Developer

Description: The Ford Motor Company (commonly referred to as simply Ford) is an American multinational automaker. The company sells automobiles and commercial vehicles under the Ford brand and most luxury cars under the Lincoln brand. Ford introduced methods for large-scale manufacturing of cars and large-scale management of an industrial workforce using elaborately engineered manufacturing sequences. This position required implementing a data warehouse for forecasting, marketing and sales performance reports. The data is obtained from relational tables and flat files. I was involved in cleansing and transforming the data in the staging area and then loading it into Oracle data marts. These data marts/the data warehouse form an integrated data mine that provides the feed for extensive reporting.

Responsibilities:

Extensively involved in writing ETL Specs for Development and conversion projects.

Involved in analysis of business requirements from the Data Management group for the newly identified external and internal source data for various business units.

Involved in analyzing the source data coming from different Data sources such as Oracle, SQL Server, and identifying data anomalies in the source data.

Coordinated with Business Users and the Offshore Development Team for solving design issues and requirements understanding for the coding phase.

Extensively created complex mappings using transformations like Filter, aggregator, update strategy, lookup, router, stored procedure, sequence generator, and joiner using Informatica Power Center 9.5.

Implemented Slowly Changing Dimension Type 2 using effective dates.

Scheduled jobs to run at a particular time/interval through the Scheduler using Workflow Manager.

Created tasks like Session, Decision, Email, Command, Event Wait, Event Raise and Control tasks in preparing the Workflow.

Successfully moved Sessions and Batches from the development to production environment.

Wrote several stored procedures for recycling and other extraction purposes.

Created reusable Mapplets and Transformations to reduce the complexity in mappings and to handle repetitive tasks such as setting and getting Dimension details.

Involved in Performance tuning at various levels including Target, Source, Mapping, Session for large data files.

Designed and Developed UNIX Shell scripts to enhance the functionality of ETL application.

Redesigned some of the mappings and Mapplets in the system to meet new functionality.

Enabled incremental loading in fact table mappings and made required changes to the mappings to populate the production data.

Worked in a 24/7 on-call production support environment, on a weekly on-call rotation basis among the production support developers in the team.

Prepared project documentation, Program Specifications, Test Cases and Participated in the complete SDLC.

Environment: Informatica Power Center 9.5, Oracle 11g, SQL, COBOL, UNIX Shell Scripts.

Client: Bank of America, Charlotte, NC Jan 2011 - Apr 2012

Role: Informatica Developer

Description: Bank of America provides corporate finance and investment banking services. The objective of the project was to build a data warehouse for customer investment deposits, funding accounts and corporate services. Customer, account and transactional data was extracted from multiple sources, transformed and loaded into the target database using the ETL tool.

Responsibilities:

Extensively worked with the data modelers to implement logical and physical data models to create an enterprise-level data warehouse.

Created and Modified T-SQL stored procedures for data retrieval from MS SQL SERVER database.

Automated mappings to run using UNIX shell scripts, which included pre- and post-session jobs, and extracted data from the transaction system into the staging area.

Extensively used Informatica Power Center 6.1/6.2 to extract data from various sources and load in to staging database.

Extensively worked with Informatica Tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Repository server and Informatica server to load data from flat files, legacy data.

Created mappings using the transformations like Source qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, Update Strategy, Joiner and stored procedure transformations.

Designed the mappings between sources (external files and databases) to operational staging targets.

Involved in data cleansing, mapping transformations and loading activities.

Developed Informatica mappings and Mapplets and also tuned them for Optimum performance, Dependencies and Batch Design.

Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.

Performed ETL and database code migrations across environments.

Environment: Informatica 6.1/6.2, PL/SQL, MS Access, Oracle 8i/7i, DB2, Windows 2000, UNIX.

Client: HSBC India Jan 2009 – Dec 2010

Role: Informatica Developer

Description: This project aims to help the employees of HSBC GLT refer candidates for open positions in the organization. Through this application, the admin can post new referral schemes for open positions, and employees can accordingly refer candidates and get the benefit of cash rewards or gifts available under a particular referral scheme. The project also has various reports, which help the admin do a detailed analysis per his criteria.

Responsibilities:

Experienced in working with telecom network information for mapping data from legacy systems to target systems.

Extracted Data from legacy Sources by using Informatica Power Center.

Extensively used the Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations.

Cleansed the source data, standardized vendor addresses, extracted and transformed data with business rules, and built Mapplets using Informatica Designer.

Extracted data from different sources of databases. Created staging area to cleanse the data and validated the data.

Responsible for analyzing and comparing complex data from multiple sources (Teradata, Oracle, flat files, XML files, COBOL files).

Worked on debugging using session Log messages.

Used Informatica Power Center Workflow manager for session management, and scheduling of jobs to run in batch process.

Implemented Informatica partitioning to increase performance.

Developed Data Migration document and ETL mapping documents for every mapping for smooth transfer of project from development to testing environment and then to production environment.

Generated various reports using Business Objects functionalities like Queries, Slice and Dice, Drill down, Functions, Cross Tab, Master/Detail and Formulas as per client requirements.

Designed and developed complex Aggregate, Expression, Filter, Join, Router, Lookup and Update transformation rules.

Developed schedules to automate the update processes and Informatica sessions and batches.

Analyzed, designed, constructed and implemented ETL jobs using Informatica.

Environment: Informatica PowerCenter 5.1.1, Cognos, Windows NT, PL/SQL, Excel, SQL Server 7.0, Erwin.
