Post Job Free

Data Analyst Informatica MDM

Location:
Elgin, IL
Posted:
May 24, 2024

Contact this candidate

Resume:

Hema

Sr. Data Analyst

PROFESSIONAL SUMMARY:

Possess 8+ years of experience in ETL with Informatica PowerCenter, with good business understanding and knowledge of extraction, transformation, and loading of data from heterogeneous source systems such as flat files, Excel, XML, Oracle, and SQL Server.

Hands-on ETL experience using Informatica 10.x/9.x/8.x (PowerCenter/Power Mart): Designer, Workflow Manager, Workflow Monitor, and Server Manager.

Experience working on Informatica MDM (Siperian): designing, developing, testing, reviewing, and optimizing MDM implementations.

Expertise in creating Mappings, Trust and Validation rules, Match Paths, Match Columns, Match rules, Merge properties, and Batch Groups.

Experience in creating and maintaining entity objects, hierarchies, entity types, relationship objects, and relationship types using the Hierarchy tool to enable Hierarchy Manager (HM) in MDM Hub implementations.

Hands-on experience in the design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages.

Designed, Installed, Configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, Data Modeling.

Hands-on experience in optimizing SQL scripts and improving warehouse load performance.

Implemented data warehousing methodologies for Extraction, Transformation and Loading using Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor.

Designed and developed Referential Integrity, Technical, and Business data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality environment.

Deployed workflows as an application to run them and tuned the mappings for better performance.

Used Informatica mapping variables/parameters and session variables where necessary.

Worked with dimensional data warehouses in Star and Snowflake schemas; created Slowly Changing Dimension (SCD) Type 1/2/3 mappings using the Ralph Kimball methodology.

Experience in coding using Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, SQL, PL/SQL procedures, functions, triggers and exceptions.

Developed batch jobs using UNIX shell scripts to automate the process of loading, pushing, and pulling data between different servers.

Good knowledge in OLAP/OLTP System Study, Analysis and E-R Modeling, developing database Schemas like Star Schema, Snowflake Schema, Conforming Dimensions and Slowly Changing Dimensions used in relational, dimensional and multi-dimensional modeling.

Excellent communication and interpersonal skills. Ability to work effectively as a team member as well as an individual.

TECHNICAL SKILLS:

ETL Tools

Informatica PowerCenter 10.x/9.x/8.x, Informatica Data Quality, Power Exchange

RDBMS

Oracle 11g/10g/9i, SQL Server 2000/2005/2008, Teradata

Job Scheduling Tools

AutoSys, Control-M

Languages

SQL, PL/SQL, Unix Shell Scripting

Database frontend tools

PL/SQL Developer, TOAD for Oracle.

PROFESSIONAL EXPERIENCE:

Bank of the West, San Francisco, CA Aug 2021 – Present

Data Analyst

Description:

Bank of the West is one of the largest banks in the US; through Enterprise Data Management, it has brought multiple domains into MDM from 60+ source systems. We validate and calculate measures, schedule daily jobs, and load the results to the database. With Informatica MDM, the Enterprise Customer Master enables a complete view of the customer by creating and maintaining a customer 'Golden Record' based on account number and customer volume: a single instance of core customer and prospect data loaded into Bank of the West.
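The match-and-merge consolidation behind a 'Golden Record' can be sketched as follows. This is an illustrative stand-in, not Informatica MDM itself; the source names, columns, and trust ranking are hypothetical.

```python
# Consolidate source records into a "golden record" using an exact-match
# rule on a normalized account number; when two sources describe the same
# account, the record from the more trusted source survives.
SOURCE_TRUST = {"CORE_BANKING": 3, "CRM": 2, "WEB": 1}  # higher wins

def normalize(acct):
    # Cleanse step: strip separators and whitespace, fold case.
    return acct.replace("-", "").strip().upper()

def golden_records(rows):
    """rows: list of dicts with 'source', 'account_no', 'name'."""
    merged = {}
    for row in rows:
        key = normalize(row["account_no"])
        best = merged.get(key)
        if best is None or SOURCE_TRUST[row["source"]] > SOURCE_TRUST[best["source"]]:
            merged[key] = row
    return merged

rows = [
    {"source": "WEB", "account_no": "10-22", "name": "J. Smith"},
    {"source": "CORE_BANKING", "account_no": "1022", "name": "John Smith"},
]
# Both rows match on the normalized key; the core-banking record wins.
print(golden_records(rows)["1022"]["name"])
```

Real MDM trust frameworks weigh trust per attribute and per source, with decay over time; the per-record ranking above is the simplest useful form of the idea.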

Responsibilities:

Involved with Data Steward Team for designing, documenting, and configuring Informatica Data Director for supporting management of MDM data.

Worked with the Reltio Connected Data Platform to provide a complete and accurate customer profile, bringing all internal and external data sources together in one easy-to-use business interface.

Worked extensively on configuring the Reltio data model and match-and-merge rules; proficient in working with Reltio APIs.

For data integration, worked with Reltio Integration Hub and managed Reference Data Management (RDM), user management, and UI configuration; also handled LCA and workflows.

Involved in the design, analysis, and development for bringing in customer, account, product, and employee data along with the hierarchies and relationships among them.

Worked with ETL developers in creating external batches to execute mappings and mapplets using the Informatica Workflow Designer, integrating client data from varied sources such as Oracle, DB2, flat files, and SQL databases into the landing tables of the Informatica MDM Hub.

Involved in requirement gathering, data Analysis, and user meetings, discussing the issues to resolve and translated the user inputs into ETL design documents.

Involved in the data analysis for source and target systems and good understanding of Enterprise Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema.

Coordinated with the offshore team and the client; provided daily status updates to the client.

Validated the data in source and target systems using PL/SQL queries.

Wrote SQL queries to retrieve information from the source database per requirements and validated the master data in MDM against the source.

Part of the DCA team, which identified gaps between the source and the MDM repository; planned and executed ETL transformations with ETL developers to correct more than 5 million records across various facts and dimensions.

Designed and collaborated on the development of user exits in Java and of external front-end applications built over the Entity 360 layouts for consumption of MDM data.
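The source-versus-target validation step can be sketched with a reconciliation query. This is a minimal illustration using Python's stdlib sqlite3 in place of Oracle so it is self-contained; the table and column names are hypothetical.

```python
# Reconciliation check: find rows present in the source system that never
# made it into the MDM target (an anti-join via LEFT JOIN ... IS NULL).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customer (cust_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE mdm_customer (cust_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_customer VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cal")])
cur.executemany("INSERT INTO mdm_customer VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob")])

cur.execute("""
    SELECT s.cust_id
    FROM src_customer s
    LEFT JOIN mdm_customer m ON m.cust_id = s.cust_id
    WHERE m.cust_id IS NULL
""")
missing = [r[0] for r in cur.fetchall()]
print(missing)  # -> [3]
```

The same anti-join pattern (or MINUS in Oracle) also works in the other direction, flagging target rows with no surviving source record.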

Environment: Informatica PowerCenter 10.4 (PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica MDM 10.4 HF2, Power Mart, Informatica Power Exchange, Oracle 19c, PL/SQL, Data Warehouse, Autosys, SQL, UNIX Shell Scripts, Linux, Java, Scala, Hadoop, AWS.

Prudential Financial, Newark, NJ July 2019 – July 2021

Sr. ETL/Informatica Developer

Description:

Prudential Financial companies serve individual and institutional customers worldwide and include The Prudential Insurance Company of America, one of the largest life insurance companies in the U.S. These companies offer a variety of products and services, including mutual funds, annuities, real estate brokerage franchises, relocation services, and more. Involved in the development and implementation of goals, policies, priorities, and procedures relating to financial management, budget, and accounting; analyzed monthly actual results versus plan and forecast.

Responsibilities:

Designed, developed, and implemented ETL processes to extract, transform, and load data from inbound flat files and various source systems into the database using Informatica PowerCenter.

Worked extensively on different types of transformations: Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Connected and Unconnected Lookup, Sorter, Normalizer, SQL transformation, and Sequence Generator.

Involved with Data Steward Team for designing, documenting, and configuring Informatica Data Director for supporting management of MDM data.

Worked with ETL developers in creating external batches to execute mappings and mapplets using the Informatica Workflow Designer, integrating client data from varied sources such as Oracle, DB2, flat files, and SQL databases into the landing tables of the Informatica MDM Hub.

Deployed new MDM Hub for portals in conjunction with user interface on IDD application.

Configured match rule set property by enabling search by rules in MDM according to Business Rules.

Involved in Data modeling and design of data warehouse in star schema methodology with conformed and granular dimensions and FACT tables.

Using Informatica Repository Manager, maintained all the repositories of various applications, created users, user groups, security access control.

Analyzed existing transactional database schemas and designed star schema models to support user reporting needs and requirements.

Developed the mappings to move data from Landing to Stage using various cleanse functions.

Experienced in writing SQL queries for retrieving information from the database based on the requirement.

Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.

Interaction with direct Business Users and Data Architect for changes to data warehouse design on an on-going basis.

Extracted data from Oracle and SQL Server then used Teradata for data warehousing.

Extracted data from multiple relational databases, e.g., Oracle 11g and Teradata, to implement business requirements.

Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.

Created sessions, extracted data from various sources, transformed data per requirements, and loaded it into the data warehouse.

Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (extract, transform, and load data).

Designed the ETL processes using Informatica to load data from Oracle, DB2 and Flat Files to staging database and from staging to the target database.

Implemented the best practices for the creation of mappings, sessions, workflows and performance optimization.

Developed mappings and workflows as per business logic, quality and coding standards prescribed for the module by using Informatica PowerCenter.

Worked with Informatica Power Exchange tools to give on demand access to business users.

Involved in migration of mappings and sessions from development repository to production repository.

Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.

Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.

Created Unix Shell Scripts for ETL jobs, session log cleanup, dynamic parameter and maintained shell scripts for data conversion.

Used Autosys Scheduler to Create, Schedule and control the batch jobs.

Involved in the process design documentation of the Project
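The SCD Type 2 loading pattern mentioned above can be sketched as follows: close out the current dimension row, then insert a new version with fresh effective dates. This is an illustrative sketch using sqlite3 in place of the warehouse; the dim_customer layout (effective/end dates, current flag) is hypothetical.

```python
# Minimal SCD Type 2 sketch: history is preserved by versioning rows
# rather than overwriting them (which would be Type 1).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")
cur.execute("INSERT INTO dim_customer "
            "VALUES (1, 'Newark', '2020-01-01', '9999-12-31', 1)")

def apply_change(cust_id, new_city, load_date):
    # Step 1: close out the currently active row.
    cur.execute("""UPDATE dim_customer
                   SET end_date = ?, is_current = 0
                   WHERE cust_id = ? AND is_current = 1""",
                (load_date, cust_id))
    # Step 2: insert the new version as the active row.
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                (cust_id, new_city, load_date))

apply_change(1, "Jersey City", "2021-06-01")
cur.execute("SELECT city FROM dim_customer "
            "WHERE cust_id = 1 AND is_current = 1")
print(cur.fetchone()[0])  # the active version now carries the new city
```

In PowerCenter the same two steps are typically driven by an Update Strategy transformation (DD_UPDATE for the expiring row, DD_INSERT for the new version) fed by a Lookup against the dimension.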

Environment: Informatica PowerCenter 10.1/9.6.1 (PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica MDM, Power Mart, Informatica Power Exchange, Oracle 11g/10g, PL/SQL, Data Warehouse, Autosys, SQL, UNIX Shell Scripts, Linux.

Client: CVS Health, Richardson, TX. April 2017 – June 2019

Role: ETL Developer.

Project Description:

The objective of this project's shared data repository is to capture new Vitality program customer data, policies, and group policy plans. Data comes from various sources such as SQL Server and Mainframe and is loaded into the EDW at different frequencies per requirements. The entire ETL process consists of source systems, a staging area, the data warehouse, and data marts.

Studied, developed, and implemented EHR and EMR systems, and collaborated with clinical service providers and administrators to test systems for efficacy. Configured, maintained, and enhanced the EHR (Athena Health), monitored daily issue resolution, and supported the clinical information systems.

Responsibilities:

Involved in requirement gathering, data Analysis, and user meetings, discussing the issues to resolve and translated the user inputs into ETL design documents.

Extraction, Transformation and Load was performed using Informatica Power Center to build Data warehouse. Worked on Informatica PowerCenter tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets and Transformation Developer.

Experienced in creating and maintaining entity objects, hierarchies, entity types, relationship objects and relationship types using Hierarchy tool to enable Hierarchy Manager (HM) in MDM HUB implementation and Informatica Data Director (IDD).

Participated in the development and implementation of the MDM decommissioning project using Informatica PowerCenter that reduced the cost and time of implementation and development.

Worked on data cleansing using the cleanse functions in Informatica MDM.

Translated business requirements into Informatica mappings to build the data warehouse using Informatica Designer, which populated the data into the target star schema.

Extracted data from various source systems like Oracle, SQL Server, XML and flat files and loaded it into relational data warehouse and flat files.

Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.

Analyzed PL/SQL code such as procedures, packages, and records.

Worked with Informatica PowerCenter 8.6 tools: Designer, Workflow Manager, Repository Manager, Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer.

Developed Informatica mappings using source qualifier, joiner, Lookups (connected and unconnected), expression, filter, router, and aggregator, sorter, update strategy and normalizer transformations.

Involved in making the changes to the existing data models to accommodate the new requirements.

Responsible for Optimizing the ETL loads and redesigning ETL Interfaces which were not providing the accurate data for the business.

Worked with mapping parameters and variables, session parameters, PMCMD commands, email tasks.

Worked with SQL overrides, Lookup overrides, Lookup caches, and Index and Data caches while designing mappings. Also worked with TOAD and SQL*Loader for loading data from external files into RDBMS tables, and fine-tuned SQL queries using Explain Plan and TKPROF to speed up session runs.

Handled Informatica administration work like migrating the code, creation of users, creating folders, worked with Shortcuts across shared, non-shared folders and wrote Autosys scripts for scheduling the workflows.

Responsible for fixing PL/SQL procedures to produce accurate data.

Created and executed unit test plans based on system and validation requirements. Worked closely with Business during the testing phase and fixed bugs that were reported.

Developed reports which used Conditional blocks/Variables and associated them with queries to show data only when the conditions were fulfilled.
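The cleanse step between landing and staging can be sketched with a small standardization function. This is an illustrative stand-in for Informatica MDM's built-in cleanse functions; the placeholder list and rules below are hypothetical.

```python
# Typical cleanse rules: collapse whitespace, fold case, and null out
# placeholder values so they don't pollute match keys downstream.
PLACEHOLDERS = {"", "N/A", "UNKNOWN", "NULL"}

def cleanse_name(value):
    if value is None:
        return None
    v = " ".join(value.split()).upper()   # trim + collapse internal whitespace
    return None if v in PLACEHOLDERS else v

print(cleanse_name("  john   smith "))  # -> "JOHN SMITH"
print(cleanse_name("n/a"))              # -> None
```

Running such functions before the stage load (rather than at match time) keeps the staging tables consistent for every downstream consumer, which is why the mappings above apply them on the Landing-to-Stage hop.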

Environment: Informatica MDM, Informatica PowerCenter 8.6, Oracle, TKPROF, TOAD, SQL*Loader, PL/SQL, Autosys, UNIX, Azure Cloud, and Python.

Qwest Communications International, Denver, CO April 2015 – March 2017

ETL/Informatica Developer

Responsibilities:

Developed Data Flow diagrams to create Mappings and Test plans. Specifically, these data flow diagrams ranged from OLTP systems to staging to Data warehouse.

Designed and developed Informatica Mappings from OLTP to Stage and Stage to DWH and effectively used Informatica Transformations like Source Qualifier, Joiner, Expression, Router, Aggregator, Connected and Unconnected Lookup, Normalizer, Update Strategy etc.

Designed relational data warehouse with constraints for importing data. Implemented multiple types of database objects such as primary keys, foreign keys, stored procedures, triggers, views and functions.

Created Data Flow Mappings to extract data from source system and load it to staging area.

Loaded the newest data into the data warehouse using a Slowly Changing Dimension (Type 2) transformation.

Developed complex mappings and mapplets using the Informatica Workflow Designer to integrate data from varied sources such as Teradata, Oracle, SQL Server, and flat files, and loaded it into the target.

Worked on performance tuning by identifying bottlenecks in sources, targets, and mappings; enhanced performance for Informatica sessions with large data files by using partitions.

Worked with Static cache, Persistent, Dynamic cache, Index cache, Data cache and target-based commit interval in order to improve the performance at session level.

Developed strategies for incremental data extraction as well as data migration to load into target databases, and assisted in designing logical/physical data models.

Good knowledge of data masking performed to preserve the referential integrity of user data.

Built several UNIX shell scripts wrapping PL/SQL programs to schedule them in Control-M.

Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.

Created Unix Shell Scripts for ETL jobs, session log cleanup, dynamic parameter and maintained shell scripts for data conversion.

Built UNIX scripts and Autosys scheduler jobs to perform PNE batch jobs and file and data archival jobs.

Prepared unit test plans and created unit test documentation along with unit test cases for the developed code.

Validated the data in source and target systems using PL/SQL queries.

Involved in migration of mappings and sessions from development repository to production repository.
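The incremental-extraction strategy noted above usually rests on a watermark: each run pulls only rows changed since the last successful run, then advances the watermark. A minimal sketch, with sqlite3 standing in for the source database and hypothetical table/column names:

```python
# Watermark-driven incremental extract: select rows whose change timestamp
# is later than the last saved watermark, then advance the watermark to the
# newest timestamp seen so the next run starts where this one ended.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, updated_at TEXT)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "2016-01-01"), (2, "2016-02-01"), (3, "2016-03-01")])

def incremental_extract(last_watermark):
    cur.execute("SELECT order_id, updated_at FROM orders "
                "WHERE updated_at > ? ORDER BY updated_at",
                (last_watermark,))
    rows = cur.fetchall()
    new_watermark = rows[-1][1] if rows else last_watermark
    return rows, new_watermark

rows, wm = incremental_extract("2016-01-15")
print([r[0] for r in rows], wm)  # -> [2, 3] 2016-03-01
```

In PowerCenter the same effect is commonly achieved with a mapping variable holding the watermark and a filtered Source Qualifier; the shell scripts and Control-M jobs above then persist the value between runs.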

Environment: Informatica PowerCenter 9.5.1, Power Exchange 9.5.1, AS/400 (DB2), Oracle 11g, SQL/PL-SQL, SQL Server 2005/2000, AIX, Windows XP, Linux, UNIX, Control-M.
