Data Manager

Location:
Farmington, Michigan, United States
Posted:
January 19, 2017

JEMON ABRAHAM THOMAS  Email: acydmf@r.postjobfree.com  Mobile: +17472429028

Data Warehouse Analyst with 11.6 years of progressive ETL, MDM, and IDQ experience in the Banking, Financial Services, Insurance, Retail, Telecom, and Healthcare domains. In-depth knowledge of all phases of IT projects, including planning and design, business requirements gathering, analysis, development, testing, data modeling, data governance, and implementation, across both Waterfall and Agile SDLC methodologies.

Profile Summary:

11.6 years of IT industry experience with a wide range of skill sets, roles, and responsibilities in consulting and architecture.

Involved in various projects related to Data Modeling, System/Data Analysis, Design and Development for both OLTP and Data warehousing environments.

10+ years of experience in Informatica.

Successfully performed architect, techno-functional, and development roles focusing on MDM, IDQ, and Informatica PowerCenter.

Expertise with Informatica Data Quality (IDQ) tools for data analysis, data profiling, and IDQ development.

Strong exposure to data warehouse concepts such as Star schema and Snowflake schema. Experience implementing Slowly Changing Dimensions (Type I and Type II) in dimension tables as per requirements using Informatica PowerCenter.
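
For illustration, the Type II pattern amounts to expiring the current version of a changed dimension row and inserting a new current version. Below is a minimal, Oracle-flavored SQL sketch of that logic with hypothetical table and column names; in PowerCenter this is built with Lookup and Update Strategy transformations rather than hand-written SQL.

    -- Expire the current dimension row when a tracked attribute changed.
    UPDATE customer_dim d
       SET d.effective_end_date = CURRENT_DATE,
           d.current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stage s
                    WHERE s.customer_id = d.customer_id
                      AND s.address    <> d.address);

    -- Insert the new current version of the row.
    INSERT INTO customer_dim
           (customer_key, customer_id, address,
            effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM customer_stage s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.address      = s.address);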

Expertise in Master Data Management concepts, Methodologies and building MDM solutions.

Extensive experience in providing technical leadership for the team on Informatica Powercenter, Informatica MDM (Siperian), IDD (Informatica Data Director), IDQ (Informatica Data Quality), Address Doctor and Identity Matching.

Knowledge of Core Java.

Knowledge of Big Data and the Hadoop ecosystem.

Trained in Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Oozie, and Flume.

Hands-on experience handling huge data volumes and performance tuning. Successfully worked in global delivery models with offshore and onsite teams.

Experience working on multiple domains – Healthcare, BFSI (Banking, Financial Services and Insurance), Telecom, and Retail.

Performed business request analysis, business request response creation, data profiling, source system analysis, data modeling, trust/match strategy design, MDM performance/load count monitoring, UAT, third-party issue resolution, and SIF API implementation.

Skill Set:

Databases

Oracle 11g, DB2, IMS DB

Languages

SQL, PL/SQL, UNIX, COBOL, Core Java

Operating System

Windows Server 2012/2008 R2, Windows 7/XP

Tools

SQL Developer, MS Visio, Control-M, ERwin, TOAD

ETL Tools

Informatica PowerCenter 8.x/9.x, IDQ 9.5, MDM 9.7.1, MDM 10.1, Informatica Data Director (IDD)

Other Skills

Mainframe (COBOL, JCL)

Education

Bachelor of Engineering (Computer Science), Karunya Institute of Science and Technology (2000-2004)

Professional Experience

Employer: ISGF, April 2016 - Present

Client: HAP, Michigan

Role: Senior Data Analyst/Tech Lead (Informatica DQ, MDM)

Research, retrieve, design, troubleshoot, and deliver organized analytics regarding accounts receivable, utilization data, claims data, and financial accounting data.

Design and develop IDQ solutions to support MDM projects.

Analyze and extract data from the Enterprise Data Warehouse (EDW) for reporting to internal and external customers.

Design, develop, test, document, and maintain database queries for ad hoc reporting used in complex conceptual analyses.

Manage reporting and analysis elements on all business initiative projects.

Develop innovative approaches to achieve deliverables and increase efficiencies.

Develop and publish global data standards, master data dictionary, data governance policies and processes required to effectively manage enterprise master data. Modify these standards, policies and processes as needed when business requirements and/or systems change

Collaborate functionally across business processes, data, and system domains to ensure all needs are met with respect to the assigned master data domain.

Generate and record ideas for improvement (e.g., integrity checks, maintenance processes, metadata, services).

Work with Data Stewards to define terms in the Enterprise Glossary and Data Dictionary.

Actively contribute to team results and work toward achieving team goals and objectives.

Develop KPIs to monitor and measure actual data quality and process effectiveness in ways that point to tangible business value.
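
For context, a data quality KPI of the kind described above can be computed directly in SQL. The table, column, and metric below (ZIP code validity on a hypothetical member_master table) are illustrative assumptions only.

    -- Percentage of member rows whose ZIP code is populated and well-formed.
    SELECT ROUND(100 * SUM(CASE
                             WHEN REGEXP_LIKE(zip_code, '^[0-9]{5}$')
                             THEN 1 ELSE 0
                           END) / COUNT(*), 2) AS zip_validity_pct
      FROM member_master;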

Build profiling, cleansing and validation plans.

Analyze profiling results and make recommendations for improvements.

Identified and eliminated duplicate datasets and performed column, primary-key, and foreign-key profiling using IDQ.
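
The duplicate and key profiling that IDQ automates can also be expressed in plain SQL; a sketch against hypothetical source tables (src_member, src_claim):

    -- Candidate primary-key check: any value occurring more than once
    -- disqualifies the column as a primary key.
    SELECT member_id, COUNT(*) AS occurrences
      FROM src_member
     GROUP BY member_id
    HAVING COUNT(*) > 1;

    -- Candidate foreign-key (orphan) check: claims whose member_id has
    -- no matching row in the member table.
    SELECT c.claim_id, c.member_id
      FROM src_claim c
     WHERE NOT EXISTS (SELECT 1
                         FROM src_member m
                        WHERE m.member_id = c.member_id);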

Profile data from the current source systems and identify data quality rules for cleansing and standardization.

Gather requirements for customized solutions and provide alternative solutions.

Monitor all functions for utilization of enterprise master data policies and procedures.

Perform root cause analysis of process and system problems pre- and post-implementation and for existing production processes and systems; recommend solutions to prevent problem recurrence.

Create documentation of business processes, training procedures, and standard operating procedures.

Responsible for data accuracy, performing and reporting on periodic audits and data integrity checks

Support the strategic vision for the business use and management of corporate data to achieve increased efficiency, effectiveness, and profitability.

Be part of the modernization of data management and analytics capabilities, processes and practices

Support the development and implementation of capabilities to improve, monitor, and audit data quality within the reporting and analytics environment.

Establish and maintain data management best practices and standards, and ensure consistent application of those practices across the organization.

Analyze existing business processes and identify opportunities for improvement; propose alternative business process solutions.

Validate and serve as the point of escalation for change requests.

Assist in maintaining master data governance processes from definition through execution and continuous improvement.

Analyze data loads and exceptions to find patterns in data and enhance business rules.

Correct data issues and ensure consistent standardization in line with master data policies.

Generate ideas for improvement (e.g., integrity checks, maintenance processes).

Perform acceptance tests on delivered software components

Manage risk by implementing effective scope management process

Analyze all aspects of the Enterprise Data Warehouse, including refreshes, programming, and system functionality; troubleshoot data integration; and provide support for the quality improvement team.

Collect, synthesize, and research complex data and design workflows and procedures.

Troubleshoot client inquiries about reported data while providing regular progress updates and documenting steps taken.

Serve as analytic contact/resource for external business partners to support ongoing data exchange, analysis, reporting and interpretation of performance results; establish and maintain effective ongoing client relationships

Client: Florida Blue

Role: Senior Informatica DQ MDM Developer/Analyst

Florida Blue is the oldest and largest health plan insurer in the state of Florida as it transitions into a health solutions company. Florida Blue and its family of subsidiaries serve more than 15.8 million people in 16 states. In its primary health business, Florida Blue serves more than 4.3 million members, representing a 30 percent share of the overall Florida health insurance market.

Projects:

Full File Audit Phase 1

This project identifies an enterprise solution for the error reporting process and ensures data undergoes a strict data governance process. It involves identifying existing issues using Informatica Data Quality profiling and creating a master data solution.

NASCO IU65

Florida Blue is pursuing a new strategic initiative, and in support of it wants to implement a Data Quality solution based on Informatica Data Governance Edition 9.6.1.

The biggest overall impact for GuideWell comes from enforcing governance from beginning to end.

Quality in the business process needs to be understood, with governance guidance in defining business rules and more energy invested in doing the right thing at the right place.

The new enrollment process from NASCO was closely monitored, and business solutions were delivered to create master data solutions.

Responsibilities:

Created and implemented the Data Governance framework that meets the organization's data objectives.

Was responsible for managing tasks and deadlines for the ETL teams.

Was the point of contact on the ETL team for other teams such as Reporting, Testing, QA, and Project Management regarding project status and issues.

Obtained the Functional Requirement Specifications from the customer and analyzed them.

Created development standards (naming conventions, folder structure, etc.), reusable code (mapplets, shortcuts, worklets, etc.), and deployment groups in Repository Manager.

Broke the work down into a task list and estimated it using a simple/medium/complex methodology.

Involved in business meetings and interacted with Clients, Data Stewards to analyze the business requirements and developed a design plan and solution approach.

Coordinated with business analysts to analyze the business requirements in depth from an end-to-end perspective.

Designed and reviewed the implementation plan

High-level and low-level ETL flow design.

Involved in designing the MDM data model; created base objects and mappings, defined trust settings for sources, customized user exits, and customized IDD applications.

Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.

Was involved in conducting the review of Informatica Code, Unit Test Cases & Results with the Developers.

Worked in an Agile methodology to successfully accommodate changes and delivered high-quality user stories within stringent timelines.

Normalized data models across four different source systems and successfully delivered projects within timelines.

Created DTD documents based on a thorough understanding of the FRD and BRD documents.

Resolved functional and technical queries with the help of functional SMEs or client users.

Designed the complete IDQ application, from source extraction through mapping design and master workflows.

Designed a complex application for duplicate/discrepancy data handling using Exception and Human Task components, routing bad data to analysts so that data stewards can work on it and a manager can review the stewards' work.

Assisted in other areas relating to the use of data quality processes, such as unit testing and integration deployment.

Assisted in the identification and definition of business glossary terms as well as general business glossary management.

Created a hybrid process in IDQ by combining the IDQ Developer and Analyst tools through LDOs (Logical Data Objects).

Worked in IDQ Analyst on profiling, creating rules from profiles, and building scorecards.

Worked with management to create the requirements and estimates for the project.

Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.

Exported the IDQ mappings and mapplets to PowerCenter and automated the scheduling process.

Assisted in code testing, migration, deployment, enhancements, and bug fixes.

Provided high-level design walkthroughs of how the solution would be built for each requirement.

Company: UST Global, Feb 2010 - April 2016

Client: WellPoint, Woodland Hills, CA

Role: Technical Lead/Architect/ System Analyst

WellPoint, Inc. is the largest for-profit managed health care company in the Blue Cross and Blue Shield Association. It was formed when WellPoint Health Networks, Inc. merged into Anthem, Inc., with the surviving Anthem adopting the name WellPoint, Inc. and beginning to trade its common stock under the WLP symbol on December 1, 2004.

Projects:

EPDS V2 - Migration

Providers from regional systems are migrated to a common database that serves as a single source of truth and is used by downstream applications for claims processing. The project is centered on the Informatica MDM platform and utilizes the capabilities of Informatica data quality and data integration tools.

Responsibilities:

Provided architectural designs and created solutions for customers to showcase the functionality of Informatica Master Data Management (MDM) and other products integrated with the MDM hub.

Understood business requirements and data, and created MDM data models and IDD configurations for domains per the requirements and data provided by customers.

Designed and configured the MDM hub and IDD. Created staging and landing tables, mappings and data flows to the base object tables, Match & Merge rules, trust settings, complex hierarchies, and workflows for Informatica MDM.

Performed data profiling and data quality analysis of the customer-provided data to find anomalies and highlight the effectiveness of the Informatica products.

Analyzed the source systems for integrity issues in the data residing there and defined the findings in Systems and Trust.

Used Address Doctor extensively for Address validation of Providers from different legacy Source Systems. Built several reusable components on IDQ using Parsers, Standardizers and Reference tables which can be applied directly to standardize and enrich Address information.

Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.

Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.

Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.

Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages and used the entity types as subject areas in IDD.

Defined the Trust and Validation rules and setting up the match/merge rule sets to get the right master records.

Configured match rule set property by enabling search by rules in MDM according to Business Rules.

Performed match/merge and ran match rules to check the effectiveness of MDM process on data.

Worked on performance tuning of complex transformations, mappings, sessions, and SQL queries to achieve faster data loads into the data warehouse.
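
One common tuning of this kind is pushing filters and column pruning into the source query rather than filtering inside the mapping. An Oracle-flavored sketch with hypothetical names:

    -- Before: full-table extract; every row and column crosses the network
    -- and is filtered inside the ETL engine.
    SELECT * FROM provider_stage;

    -- After: filter and prune at the source so only recently changed rows
    -- and the needed columns are extracted.
    SELECT provider_id, npi, last_update_ts
      FROM provider_stage
     WHERE last_update_ts >= TRUNC(SYSDATE) - 1;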

Implemented Business Change Requests (BCRs) at the client side.

Analyzed the various project requirements and transferred the knowledge to the offshore team.

Ensured the team had the required technical expertise to execute the project; this involved identifying key technical areas related to the project and conducting training sessions to plug any gaps.

Clarified issues for offshore team members and helped them execute the project properly.

Worked on real-time implementation using SIF.

Worked on hub customization and user exits.

Environment: Informatica Multidomain MDM 9.7.1, Informatica 10.1, JBoss 5.1, DB2, Address Doctor 5, IDD, Informatica Data Quality 9.5.1 HotFix 3, Informatica PowerCenter 9.6.1.

Company: UST Global, Aug 2009 - Feb 2010

Client: Safeway

Domain: Retail

Role: Project Lead

Safeway is an American supermarket chain. It is the second largest supermarket chain in North America.

PPI (Promotional Pricing Integration) is a project initiative to remove manual intervention from Safeway's various promotional activities. All promotional prices were entered manually into Safeway's execution systems, even though items, prices, and effective dates were all planned in applications like RTPP. The PPI system provides the ability to automate this manual entry using committed promotion prices generated by the PPO application.

Responsibilities:

Designed and developed programs based on client requirements.

Profiled data from the current source systems and identified data quality rules for cleansing and standardization.

Worked with the DQ Architect to understand the current state of the data.

Worked on performance tuning of complex transformations, mappings, sessions, and SQL queries to achieve faster data loads into the data warehouse.

Provided support as onsite coordinator and acted as the quality gate for offshore deliverables.

Communicated with the client regarding project progress, met delivery milestones, and resolved issues related to project delivery.

Environment:

Informatica PowerCenter 9.6.1, Informatica Data Quality 9.5.1 HotFix 3

Company: WIPRO, Jan 2008 - Aug 2009

Client: Bank of New York Mellon

Domain: Banking

Role: Senior Developer/Team Lead

The Bank of New York Mellon Corporation, commonly referred to as BNY Mellon, is an American multinational banking and financial services corporation formed on July 1, 2007 as a result of the merger of The Bank of New York and Mellon Financial Corporation. It is the oldest banking corporation in the United States.

The objective of the cash and custody integration exercise between legacy BNY and heritage Mellon was to make available a Client Information Warehouse (CIW) to support the workbench reporting requirements. As part of this, the CIW database was built and hosted on the mainframe infrastructure used by the heritage Mellon CRS system. CIW was initially populated with custody data sourced from the legacy CDW system and will eventually be extended to become the custody data warehouse for BNYM, integrating data from both the legacy BNY CDW and the heritage Mellon custody and accounting systems, CMS (Custody Management System) and IAS (Institutional Accounting Systems).

Responsibilities:

Responsible for Business Analysis and Requirements Collection.

Worked with the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Parsed high-level design specification to simple ETL coding and mapping standards.

Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Created mapping documents to outline data flow from sources to targets.

Involved in Dimensional modeling (Star Schema) of the Data warehouse

Extracted data from flat files and other RDBMS databases into the staging area and populated it into the data warehouse.

Maintained stored transformation rules and target definitions using Informatica Repository Manager.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Developed mapping parameters and variables to support SQL overrides.
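
As an illustration of this pattern, a Source Qualifier SQL override can reference a mapping variable that PowerCenter expands at session run time; the table, columns, and variable name below are hypothetical.

    -- $$LAST_EXTRACT_TS holds the high-water mark persisted by the
    -- previous successful run, enabling incremental extraction.
    SELECT acct_id,
           acct_name,
           balance,
           last_update_ts
      FROM src_account
     WHERE last_update_ts > TO_DATE('$$LAST_EXTRACT_TS', 'YYYY-MM-DD HH24:MI:SS')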

Created mapplets to use them in different mappings.

Developed mappings to load into staging tables and then to Dimensions and Facts.

Used existing ETL standards to develop these mappings.

Worked on different Workflow tasks, such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Worklet, Assignment, and Timer, and on workflow scheduling.

Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.

Modified existing mappings for enhancements of new business requirements.

Used Debugger to test the mappings and fixed the bugs.

Involved in Performance tuning at source, target, mappings, sessions, and system levels.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Environment:

Informatica PowerCenter, DB2, Mainframe

Company: T-Systems, Dec 2006 - Jan 2008

Domain: Telecommunication

Position: Developer

T-Systems is a German global IT services and consulting company headquartered in Frankfurt. Founded in October 2000, it is a subsidiary of Deutsche Telekom AG.

This project relates to the telecom industry. KONTES is a customer-oriented reorganization of subscriber services involving data processing systems. KONTES is broken down into two parts: 1. Andi, which deals with orders and customer data; 2. Orka, which deals with the local network, such as telephone lines.

Responsibilities:

Involved in analyzing functional specifications provided by the data architect and in creating technical specification documents for all the mappings. Involved in developing logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.

Involved in converting data marts from logical design to physical design and involved in defining data types, constraints, indexes, generating schema in the database.

Involved in creating automated scripts and defining storage parameters for the objects in the database.

Involved in defining various facts and dimensions in the data mart, including factless facts, aggregates, and summary facts.

Involved in reviewing the source systems and proposing a data acquisition strategy.

Involved in gathering of business scope and technical requirements and created technical specifications.

Developed complex mappings, including SCD Type I, Type II, and Type III mappings, in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL. Created complex mapplets for reuse.

Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.

Created synonyms for copies of time dimensions, used the Sequence Generator transformation to create sequences for generalized dimension keys, used the Stored Procedure transformation for encoding and decoding functions, and used the Lookup transformation to identify slowly changing dimensions.
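
For context, the synonym and sequence techniques mentioned above look roughly like this in Oracle SQL; the names are hypothetical, and in the mappings themselves the Sequence Generator transformation plays the sequence's role.

    -- A synonym gives the shared time dimension a second logical name, so
    -- role-playing dates (order date, ship date) can reference one table.
    CREATE SYNONYM ship_date_dim FOR time_dim;

    -- A sequence supplies generalized surrogate keys for dimension rows.
    CREATE SEQUENCE dim_key_seq START WITH 1 INCREMENT BY 1 CACHE 100;

    INSERT INTO customer_dim (customer_key, customer_id)
    VALUES (dim_key_seq.NEXTVAL, '4711');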

Fine-tuned existing Informatica maps for performance optimization.

Worked with the Informatica Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer) and Server Manager to create and monitor sessions and batches.

Involved in the development of Informatica mappings and also tuned for better performance.

Debugged mappings by creating logic that assigns a severity level to each error and sends error rows to an error table so they can be corrected and reloaded into the target system.

Tools: Informatica PowerCenter, Informatica PowerExchange, DB2

Company: HSBC-GLT, June 2005 - Dec 2006

Client: HSBC

Domain: Banking

Role: Developer

HSBC is a British multinational banking and financial services company headquartered in London, United Kingdom. It is one of the world's largest banks.

Auto Data Warehouse is a system that gives business users the ability to perform analysis over time. It also helps business users understand customer behavior, customer profitability over time, product usage and profitability, retention of profitable customers, and marketing promotions.

Responsibilities:

Gathered business requirements from Business Analyst.

Supported mainframe tasks on both the source and target sides.

Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.

Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load it into the data warehouse.

Installed and configured the Informatica client tools.

Worked on loading data from several flat files to XML targets.

Designed the procedures for getting data from all source systems into the data warehousing system.

Created the staging area environment and loaded the staging area with data from multiple sources.

Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.

Used Workflow Manager for session management, database connection management, and job scheduling.

Created UNIX shell scripts for Informatica ETL tool to automate sessions.

Monitored scheduled, running, completed, and failed sessions using Workflow Monitor; debugged mappings for failed sessions.

Environment: Informatica PowerCenter, DB2, COBOL, JCL


