Anjan Banerjee

(MDM Architect)

Phone: 310-***-**** Email: addnex@r.postjobfree.com

Professional Summary

** ***** ** ** ********** in analysis, design, development, testing and implementation of Data Modelling, Data Governance, Data Warehouses, Data Migration, Data Integration, Master Data Management, Data Quality and Cloud Computing as a Sr. Developer/Consultant and Architect.

Highlights:

Extensive experience in Informatica MDM, IDQ, ActiveVOS, IDD, Provisioning, PowerCenter, AXON, TDM, ILM, IICS, Java, TIBCO, MuleSoft and AWS.

Worked in various business domains like Insurance, Pharmaceuticals, Auto Finance, Supply Chain, Health Care and Telecommunications.

Experience leading teams of 10-15 people, with excellent customer service and delivery team leadership experience.

Working around constraining factors such as key challenges, environmental impact and project budget.

Playing a part in project and team management.

Working closely with a team of other professionals and architectural technologists.

In-depth understanding of the domain and pertinent technologies.

Identify and address architectural challenges. Create models and assess alternative approaches.

Understand what technical issues are key to success and prototype/experiment/simulate towards the optimal solutions.

Prepare architectural documents and presentations.

Influence business strategy and translate it into technical vision and strategy.

Understand customer and market trends, and capture customer, organizational and business requirements for the architecture.

Work with developers against the defined architecture and implement quality solutions within the defined timeframe.

Extensive experience with the IDQ Address Validator to validate and standardize addresses, and Informatica DaaS services to validate phone numbers and email addresses.

Installed MDM and IDQ on AWS.

Configured Axon for Unison search and data quality scorecards based on completeness, conformance, validity, accuracy and timeliness.

Worked on Business glossary and metadata manager.

Technical Skill Set

Infrastructure: AWS

Data Modelling Tools: Erwin, ER/Studio, MS Visio, Toad Data Modeler

Informatica Products: Informatica PowerCenter, PowerExchange, Metadata Manager, IDQ, Analyst, IDD, MDM, ILM, TDM, AXON, ActiveVOS, Provisioning Tool, IICS

Other ETL Tools: SSIS, Oracle Warehouse Builder, Mainframe with COBOL, PL/1, AS/400, IICS

Reporting Tools: Tableau, Cognos, OBIEE, SSRS

Databases: MarkLogic, Oracle, VSAM, DB2, PS Files, MS Access, Netezza, SQL Server

Tools and Utilities: TOAD, SQL Developer, SQL Navigator, SQL Server Management Studio, SQL*Plus, SQL*Loader, Autosys, CA, UC4, Changeman, SCLM, FileAid, Fault Analyzer, TIDAL

Change Management Tools: JIRA, SCLM, ServiceNow, HPSM, Remedy Force

Languages and Web Technologies: HTML, XML, XSD, JavaScript, SQL, PL/SQL, C, UNIX Shell Script, Perl Script, COBOL, PL/1

Operating Systems: Windows, UNIX, z/OS

Experience Summary

Client: Change Healthcare, GA Oct 2019 – Present

Role: Sr. MDM Architect

Change Healthcare is a provider of revenue and payment cycle management and clinical information exchange solutions, connecting payers, providers, and patients in the U.S. healthcare system. My engagement was specifically to build the Customer master.

I'm working as an MDM Architect, and my primary role is to design and architect end-to-end MDM solutions. I help the customer build centralized master data across the organization, along with a roadmap for business operations and business analytics, and application development related to data quality, master data management and data governance. Building inbound and outbound integrations is another key area of focus. We use a cloud-hosted MDM platform to build the master data.

Responsibilities:

Design and Develop MDM data model.

Real-time and batch integration using TIBCO BW plug-ins for different SFDC and CRM applications.

Building D&B hierarchy, Sales Hierarchy and Billing Hierarchy.

Define data quality roadmap, data quality rules.

Define match rules, trust and validation.

Design work items and data stewardship screens in MDM.

Design the MDM persistent ID (one common derivation approach is sketched after this list).
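
For the persistent ID item above, the following is a minimal sketch of one common approach: deriving a stable identifier by hashing the source system code and the source key, so the same source record always maps to the same ID. The class name, prefix, and sample values are illustrative only; the actual design on the project (for example, sequence-based IDs that survive merge/unmerge) may differ.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class PersistentIdGenerator {

    // Derives a stable, re-creatable ID from the source system code and the
    // source primary key, so the same source record always maps to the same ID.
    public static String persistentId(String sourceSystem, String sourceKey) {
        try {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            byte[] hash = digest.digest(
                    (sourceSystem + "|" + sourceKey).getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : hash) {
                hex.append(String.format("%02x", b));
            }
            // Truncated for readability; collision risk stays negligible at this length.
            return "CUST-" + hex.substring(0, 20).toUpperCase();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 not available", e);
        }
    }

    public static void main(String[] args) {
        // Hypothetical source system and key, for illustration only.
        System.out.println(persistentId("SFDC", "0015000000XyZaB"));
    }
}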

Client: Charter Communications, MO March 2019 – Oct 2019

Role: Sr. MDM Consultant/Architect

Charter Communications, Inc. is an American telecommunications and mass media company that offers its services to consumers and businesses under the Spectrum brand. They took the initiative to create master data for their B2B customers.

I'm working as an MDM Architect, and my role is to design and architect end-to-end MDM solutions for business operations, business analytics, and application development related to data quality, master data management and data governance, as well as MDM integration in both real-time and batch mode. My job is to help business leaders use master data and build a centralized repository of the golden copy of customer information that is shared across the organization. I'm also responsible for MDM integration with other applications across the organization. We use Informatica MDM Customer 360 (10.3 HF1), IDQ and DaaS services for data quality and master data management.

Responsibilities:

Design and Develop MDM data model.

Real-time and batch integration using MDM HUB and Business entity Services.

Design Data Stewardship and user interface screen from C360 UI.

Design one-step and two-step approval processes from the Customer 360 UI.

Define system and trust.

Design MDM persistent ID for Customer and Address as Global ID from MDM to share with outbound application.

Design BES REST calls to integrate source data into MDM real-time.

Publish MDM data in real time to an outbound data lake through a JMS queue (a minimal publishing sketch follows this list).

Execute SIF services to call different MDM functionalities from cron jobs.

Built Informatica data quality DaaS services (REST API) to validate email addresses and phone numbers.

Integrate Pitney Bowes for Address validation from IDQ.

Design data profiling rules, generate scorecards and trend charts for source data and MDM data.

Design error handling modules in IDQ and MDM.

Extensive work on MDM performance tuning.

Worked directly with Informatica to implement best practices.
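
One of the items above publishes MDM changes to an outbound data lake over a JMS queue. Below is a minimal sketch of that pattern using the standard javax.jms API; the JNDI names and the message property are assumptions for illustration, not the project's actual configuration.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class MdmJmsPublisher {

    // Hypothetical JNDI names; the real ones depend on the application-server setup.
    private static final String CF_JNDI = "jms/MdmConnectionFactory";
    private static final String QUEUE_JNDI = "jms/CustomerOutboundQueue";

    public static void publish(String customerPayload) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory cf = (ConnectionFactory) ctx.lookup(CF_JNDI);
        Queue queue = (Queue) ctx.lookup(QUEUE_JNDI);

        Connection conn = cf.createConnection();
        try {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            TextMessage msg = session.createTextMessage(customerPayload);
            msg.setStringProperty("entityType", "Customer"); // helps downstream routing
            producer.send(msg);
        } finally {
            conn.close();
        }
    }
}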

Client: Mayo Clinic, MN Sep 2018 -March 2019

Role: Sr. MDM Consultant/Architect

The Mayo Clinic is a nonprofit academic medical center based in Rochester, Minnesota, focused on integrated clinical practice, education, and research. I'm working on a project for Mayo Medical Laboratory, part of Mayo Clinic. Mayo Medical Laboratory is building its Lab Patient Master, and as part of this exercise we are focusing on what the solution must do to meet their business needs. Our goal is to align the business vision for the solution, the accompanying capabilities needed to enable that vision, and the conceptual technology approach. We are concentrating on data profiling, defining business process maps, functional requirements, the conceptual architecture, the logical model, and analytical needs and prototypes.

Responsibilities:

Building a data governance roadmap and helping the client build centralized master data for patients. The scope includes, but is not limited to:

Build Master Data Management Architecture for Patient Domain.

Designing real-time data ingestion from internal and external laboratories, hospitals, clinics, etc.

Implementing and configuring Informatica IDQ 10.x to perform data validation, data profiling, data cleansing, and scorecard and metrics creation.

Built Informatica data quality DaaS services (REST API) to validate email addresses and phone numbers (a client-side sketch follows this list).

Used IDQ address validator to validate and standardize addresses.

Current business process maps, data flow and overall system design.

Data source profiling and format validations.

Mastering requirements.

Integration and real-time needs.

Current validation process.

Current and future data throughput and volume.

Reporting and analysis needs.

Critical institutional roadmap items

‘Quick win’ opportunities.

The design includes:

Conceptual and physical system architecture.

Design of all data flows and process maps.

Volume needs and related system sizing.

Component selection (inclusion/exclusion of Informatica system components).

Match/merge business rules and process.

Real-time integration between the different sources and MDM, and between MDM and outbound applications.

Hierarchy design for partners' labs, external patients and MML patients.
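
The email/phone validation item above exposes data quality checks as REST services. The sketch below shows how a caller might invoke such a service with Java's built-in HttpClient; the endpoint URL and the JSON request/response shape are hypothetical stand-ins for whatever the actual IDQ/DaaS service exposes.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EmailValidationClient {

    // Hypothetical endpoint; the real service URL and payload differ per deployment.
    private static final String VALIDATE_URL = "https://dq.example.org/api/v1/validate/email";

    public static String validate(String email) throws Exception {
        String body = "{\"email\":\"" + email + "\"}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(VALIDATE_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.body(); // e.g. {"status":"VALID","reason":null} (illustrative shape)
    }

    public static void main(String[] args) throws Exception {
        System.out.println(validate("patient@example.com"));
    }
}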

Client: LA Care, CA Dec 2017-Sep 2018

Role: MDM Architect

L.A. Care Health Plan, the top-ranked health insurer in L.A., is an independent public agency created by the state of California to provide health coverage to low-income Los Angeles County residents. They have chosen Informatica products to create master data, comprising the processes, governance, policies and standards needed to consistently define and manage the organization's critical data about providers, patients, hospitals, plan partners, marketing strategy and so on, creating a single point of reference. Informatica MDM, IDQ, IDD and PowerCenter are widely used in these projects. They also use a big data platform, MarkLogic, as a source system and MuleSoft as middleware for real-time integration.

Responsibilities:

Worked on moving MDM into the cloud (AWS).

Worked on microservices and integration with MuleSoft.

Created conceptual, logical and physical data models for mastering provider data such as hospitals, physicians, mid-levels, PPGs (contracting partners), non-physicians and ancillary providers.

Created hierarchies and relationships based on an understanding of the business needs and how they can be set up in the user interface screens (IDD).

Built cleansing and standardization rules, mapplets in IDQ.

Created IDQ user interface screens from the IDQ Analyst tool for user interactions like data profiling, scorecard generation, trend chart generation, etc.

Designed end-to-end integration with different sources, both real-time and batch. Designed the real-time and batch publishing layer from MDM.

Requirement gathering, requirement analysis, analysis of different source data, data profiling, and generation of scorecards and trend charts.

Configured base object, staging and landing tables.

Configured Delta detection, RAW table, History tables.

Defined System Trust and Validation rules for base object columns for Golden Records.

Providing solutions to Realtime and Batch Integration with Inbound and Outbound sources.

Working on SIF for Realtime and Batch Inbound Integration.

Developed user exits for data customization during the post-stage and post-load processes using Java (a stand-in sketch follows this list).

Configured User Interface from IDD application.

Configured Entity 360 from the Provisioning tool.

Developed BES (Business Entity Services) for outbound applications to pull data from MDM.

Created custom services from provisioning tool to serve approval process generated from external applications.

Configured enterprise catalog and Unison search from Axon.

Configured data quality score and rule types from Axon based on completeness, conformance, validity, accuracy and timeliness.
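
For the Java user exits mentioned above, here is a stand-in sketch of a post-load customization. The interface is defined locally purely for illustration; the real hook comes from the Informatica MDM user-exit JAR and has its own package, interface name and method signature, so treat every name below as an assumption.

import java.sql.Connection;
import java.sql.PreparedStatement;

// Local stand-in for the real Informatica MDM post-load user-exit interface.
interface PostLoadExit {
    void process(Connection hubConnection, String baseObjectTable, String rowidObject) throws Exception;
}

public class ProviderPostLoadExit implements PostLoadExit {

    @Override
    public void process(Connection hubConnection, String baseObjectTable, String rowidObject) throws Exception {
        // Example customization: stamp a derived status column on the record just loaded.
        // PROVIDER_STATUS is a hypothetical column used only for this sketch.
        String sql = "UPDATE " + baseObjectTable
                + " SET PROVIDER_STATUS = ? WHERE ROWID_OBJECT = ?";
        try (PreparedStatement ps = hubConnection.prepareStatement(sql)) {
            ps.setString(1, "ACTIVE");
            ps.setString(2, rowidObject);
            ps.executeUpdate();
        }
    }
}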

Environment: ER/Studio, MarkLogic, Informatica MDM 10.2, IDQ 10.1, Power Center 10.1, Mulesoft, ActiveVOS, IDD, Provisioning Tool 10.2, AXON, Oracle, Unix.

Client: St. Joseph Health, Anaheim, Orange County, CA Feb 2016-Dec 2017

Role: Sr. MDM Consultant

St. Joseph Health is a group of hospitals spread across three geographic regions in Northern California, Southern California and Texas. St. Joseph Health (SJH) is an integrated Catholic health care delivery system sponsored by the St. Joseph Health Ministry. St. Joseph Health created centralized master data, or golden data, for providers, patients and payers across all of its hospitals. They are also creating golden records of all the items used for patients, along with manufacturer and vendor information, as a supply chain project. Informatica MDM, IDQ, IDD and PowerCenter are widely used in these projects.

Responsibilities:

Working as a Senior Informatica consultant and MDM Architect in Data Governance and Supply Chain projects.

Requirement gathering, requirement analysis, analysis of different source data, data profiling, scorecard generation and creation of the MDM data model.

MDM installation on the JBoss application server; MDM version upgrade.

Hands-on Informatica IDQ experience in data profiling, name standardization, address validation, name parsing, date field validation and phone number standardization.

Created IDQ error reports with the help of Informatica IDQ and integrated IDQ with MDM and ETL.

Create conceptual, Logical and Physical data Model for MDM Base object tables.

Provide solution of the ETL process to load Source data from various source applications to MDM Landing tables.

Created DQ mapplets for name standardization, name parsing, error report generation and address validation using Address Doctor in IDQ.

Configured Staging tables, foreign key relationships, lookup tables, queries, query groups, packages and custom cleanse functions in MDM.

Defined System Trust and Validation rules for base object columns for Golden Records.

Configured SIF services and ran them using web service calls from PowerCenter.

Worked with user exits and message queues, and configured the Batch Viewer and batch groups.

Unmerge, Batch unmerge, Soft Delete and Hard Delete in MDM.

IDD configuration for Hierarchy, Smart Search, REST APIs, Composite Services, ActiveVOS.

Configuration of IDD for Data Steward for manual merge and unmerge.

Created different views of golden records for downstream applications and the EDW.

Worked on Business Glossary and Metadata Manager for business definitions and metadata management.

Environment: Erwin, Informatica MDM 10.1/9.7, IDQ 9.6, Power Center 9.6, TIBCO, ActiveVOS, IDD, Provisioning Tool 10, AXON, SQL Server, Oracle, SSIS, SSRS.

Client: Mattel, Inc. El Segundo, Los Angeles, CA May 2015-Jan 2016

Role: Senior Informatica Consultant (MDM Specialist)

Mattel Inc. is one of the oldest and leading global toy manufacturing companies, and its business model runs on the principles of supply chain management: the movement and storage of raw materials, work-in-process inventory, and finished goods from point of origin to point of consumption. The company's data comes from different operational sources in CSV, AS/400 and DB2 formats and is loaded into the data warehouse built on Oracle and Netezza using Informatica. They also create master data for all suppliers, vendors and manufacturers in their centralized MDM hub.

Responsibilities:

Working as a Senior Informatica consultant in Master Data Management and Data Migration projects.

Requirement analysis, analysis of different source data, and data profiling.

As a part of the migration project, generated reports from different source systems and data marts and loaded them into the ODS. The new supply chain tool was fed with the ODS data for gap analysis between the existing system and the new system.

Developed complex Informatica mappings to load the data from various sources using different transformations.

Discovered data anomalies in source data through data profiling, standardized them with IDQ mapplets, and integrated the mapplets with MDM and ETL.

MDM version upgrade.

Involved in developing the mappings for moving data from sources (Flat Files, AS400 data) to Staging (STG) and then to DWH and datamart.

Worked in the MDM Hub Console, configured base objects and developed custom MDM mappings with cleanse functions for the supply chain project.

Configuration of user exits and IDD.

Developed Slowly Changing Dimensions mappings (Type1, Type 2) as per the business requirements.

Wrote various UNIX scripts, including workflow triggering based on various criteria, FTP of files, control files, etc. (a workflow-trigger sketch follows this list).

Used Debugger to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.

Performance tuning and Error handling both in ETL and MDM sides.

Involved in unit testing and integration testing.

Experienced with the FTP process and scheduling tools like CA and UC4 to set up jobs and their dependencies.
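
The workflow-triggering item above was implemented as UNIX shell scripts; the sketch below just illustrates the same idea of invoking PowerCenter's pmcmd from Java. The service, domain, credentials, folder and workflow names are placeholders, and the pmcmd flags shown should be verified against the installed PowerCenter version.

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class WorkflowTrigger {

    public static int startWorkflow(String workflow) throws Exception {
        // Placeholder connection details; real values come from the environment.
        ProcessBuilder pb = new ProcessBuilder(
                "pmcmd", "startworkflow",
                "-sv", "IS_DEV",        // integration service
                "-d", "Domain_Dev",     // domain
                "-u", "etl_user",
                "-p", "etl_password",
                "-f", "SUPPLY_CHAIN",   // repository folder
                "-wait",                 // block until the workflow finishes
                workflow);
        pb.redirectErrorStream(true);
        Process process = pb.start();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // echo pmcmd output into the job log
            }
        }
        return process.waitFor(); // 0 means the workflow succeeded
    }

    public static void main(String[] args) throws Exception {
        System.exit(startWorkflow("wf_LOAD_VENDOR_MASTER")); // hypothetical workflow name
    }
}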

Client: Toyota Financial Services, Torrance, CA Feb 2010- May 2015

Role: Technical Lead/Senior Developer (Data Warehouse, MDM and Data Governance)

Toyota Financial Services is an auto finance company that focuses on providing loans and leases to its customers to buy or use new and used cars. The company's data comes from different operational sources such as flat files, Oracle and DB2 and is loaded into the data warehouse built on Oracle using Informatica.

Responsibilities:

Worked as an ETL lead on data warehousing, data migration and data integration projects.

Worked as a data quality and master data management developer.

Created IDQ mappings to bring the source data from heterogeneous sources.

Used COBOL layouts in Informatica via the PowerExchange adapter.

Configured Base objects, created system and trust for MDM Hub console.

Developed conceptual Data Model for MDM and worked as MDM admin.

Extensively used Informatica Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer.

Developed complex Informatica mappings to load the data from various sources using different transformations like source qualifier, connected & unconnected lookup, expression, aggregator, joiner, filter, normalizer, update strategy, transaction control, union, sorter, XML generator, XML parser, rank and router transformations.

Did data profiling, data cleansing and data standardization in IDQ.

Created IDQ mapplets and integrated them with ETL and MDM.

Experience in defining and configuring landing tables, staging tables, base objects, lookups, query groups, queries/custom queries, packages, hierarchies and foreign-key relationships in MDM.

Experience in configuring Entity Base Objects, Entity Types, Relationship Base Objects, Relationship Types, Profiles using Hierarchy tool.

Involved in developing the mappings for moving data from disparate sources (like XML, flat files, mainframe, Teradata and Oracle) to Staging (STG) and then to the DWH.

Created reusable transformations and mapplets based on the business rules to ease the development process, and was responsible for documenting the changes.

Developed Slowly Changing Dimension mappings (Type 1, Type 2) as per the business requirements (the Type 2 expire-and-insert pattern is sketched after this list).

Preparation of interactive dashboards, ad hoc queries, notifications and alerts, enterprise and financial reporting, scorecard and strategy management with the help of OBIEE reporting tool.

Extracted and transformed data from high volume data sets of fixed width, delimited and relational sources to load into target systems.

Extensive work experience in error handling and performance improvement in database, ETL and MDM.

Extracting Incremental data (Change Data Capture) from Source Systems using Informatica Power Exchange.

Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.

Performed constraint based loading and checking target table before inserting rows to avoid primary key violation.

Involved in unit testing and integration testing.

Involved in FTP process to transfer the files from one server to other.

Worked with Autosys team to setup jobs and their dependencies.
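
The Type 2 slowly changing dimension logic above was built in Informatica mappings; the JDBC sketch below only illustrates the equivalent expire-and-insert pattern outside the tool. The table and column names (DIM_CUSTOMER, EFF_DATE, END_DATE, CURRENT_FLAG) are hypothetical.

import java.sql.Connection;
import java.sql.Date;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.time.LocalDate;

public class ScdType2Loader {

    // Hypothetical dimension: DIM_CUSTOMER(CUSTOMER_ID, CUSTOMER_NAME, EFF_DATE, END_DATE, CURRENT_FLAG)
    public static void applyChange(Connection conn, String customerId, String newName) throws SQLException {
        Date today = Date.valueOf(LocalDate.now());

        // 1. If the current version already matches, there is nothing to do.
        try (PreparedStatement same = conn.prepareStatement(
                "SELECT 1 FROM DIM_CUSTOMER WHERE CUSTOMER_ID = ? AND CURRENT_FLAG = 'Y' AND CUSTOMER_NAME = ?")) {
            same.setString(1, customerId);
            same.setString(2, newName);
            try (ResultSet rs = same.executeQuery()) {
                if (rs.next()) {
                    return;
                }
            }
        }

        // 2. Expire the existing current version, if any.
        try (PreparedStatement expire = conn.prepareStatement(
                "UPDATE DIM_CUSTOMER SET END_DATE = ?, CURRENT_FLAG = 'N' WHERE CUSTOMER_ID = ? AND CURRENT_FLAG = 'Y'")) {
            expire.setDate(1, today);
            expire.setString(2, customerId);
            expire.executeUpdate();
        }

        // 3. Insert the new version as the current row.
        try (PreparedStatement insert = conn.prepareStatement(
                "INSERT INTO DIM_CUSTOMER (CUSTOMER_ID, CUSTOMER_NAME, EFF_DATE, END_DATE, CURRENT_FLAG) VALUES (?, ?, ?, NULL, 'Y')")) {
            insert.setString(1, customerId);
            insert.setString(2, newName);
            insert.setDate(3, today);
            insert.executeUpdate();
        }
    }
}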

Environment: Informatica Power Center 9.5.1, Power Exchange, Flat files, XML, Mainframes, Oracle 11i, DB2, PL/SQL, TOAD, WINSQL, UNIX,Netezza.

Client: Abbott Laboratories, Libertyville, IL Mar 2008 – Jan 2010

Role: Sr. Informatica Developer

Abbott's Pharmaceutical Products Group aims to create a top-tier global pharmaceutical business, one dedicated to discovering, developing and marketing breakthrough drugs that improve patient health. The Abbott data warehouse tracks Abbott Laboratories customer details, sales management and chargebacks, maintaining the information in an Oracle target database and generating reports based on the requirements.

Responsibilities:

Coordinated with business users to understand the needs and implement the same into a functional data warehouse design.

Interpreted logical and physical data models for Business users to determine common data definitions.

Worked with various Mainframe files like VSAM, DB2 and COBOL copybooks.

Developed various mappings to load data into ODS using different transformations and mapplets.

Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner and Rank transformations.

Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance in Informatica Designer.

Used Sequence Generator to create Dimension Keys and Update Strategy to insert records into the target table in staging and Data Mart.

Used the Debugger in debugging some critical mappings to check the data flow from instance to instance.

Optimized/tuned mappings for better performance and efficiency.

Preparation of various reports, dashboards, ad hoc queries with the help of OBIEE reporting tool.

Developed UNIX shell scripts to format the session log files and to extract information from error logs (an equivalent log-scanning sketch follows this list).

Worked closely with reporting team and generated reports using Business Objects.

Wrote complex SQL queries to test the data generated by ETL process against target data base.

Checked session and error logs to troubleshoot problems, and also used the Debugger for complex troubleshooting.

Developed unit test cases and performed unit testing for all the developed mappings.
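
The log-handling item above was done with UNIX shell scripts; the Java sketch below shows the same idea of scanning a session log and pulling out error lines. The log path and the keywords it matches are assumptions for illustration.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;

public class SessionLogScanner {

    // Collects lines that look like errors; the keywords are illustrative, not a fixed log format.
    public static List<String> errorLines(Path sessionLog) throws IOException {
        return Files.readAllLines(sessionLog).stream()
                .filter(line -> line.contains("ERROR") || line.contains("FATAL"))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical log location.
        for (String line : errorLines(Path.of("/var/infa/logs/s_m_load_sales.log"))) {
            System.out.println(line);
        }
    }
}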

Environment: Informatica Power Center 8.6/8.1.1, Oracle 11g, PL/SQL, SQL developer, SAP ECC, Business Objects XIR2, MS-Access, Windows NT, HP UNIX

Client: MetLife June 2006 – Feb 2008

Role: Informatica Developer

MetLife, Inc. is a leading global provider of insurance, annuities and employee benefit programs. The Informatica platform was used to provide a solution that consolidates data from different platforms and transforms it, as required, into a presentable form that is easy to use and understand for decision-making.

Responsibilities:

Created, designed, and implemented specifications and developed ETL processes to support the development and implementation of data warehouse projects using Informatica.

Prepared technical specifications to develop Informatica ETL mappings to load data into various tables conforming to the business rules.

Mapplets and Reusable transformations were used to prevent redundancy of transformation usage and maintainability.

Wrote SQL overrides in Source Qualifier according to business requirements.

Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.

Created stored procedures in PL/SQL (a JDBC call sketch follows this list).

Used debugger to test the data flow and fix the mappings.

Scheduled Sessions on the Informatica Server using Workflow Manager and Workflow Monitor.

Wrote and modified Unix Korn Shell scripts to handle dependencies between workflows and log the failure information.

Involved in performing Unit testing and Integration testing.
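
The PL/SQL stored procedures mentioned above can be invoked from the integration layer; below is a minimal JDBC sketch of such a call. The procedure name, its parameters, and the connection details are placeholders, not the procedure actually built on this project.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class StoredProcCaller {

    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL and credentials; the Oracle JDBC driver must be on the classpath.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/DWPROD", "etl_user", "etl_password");
             // Hypothetical procedure: UPDATE_POLICY_STATUS(policy_id IN, rows_updated OUT)
             CallableStatement cs = conn.prepareCall("{call UPDATE_POLICY_STATUS(?, ?)}")) {
            cs.setString(1, "POL-123456");
            cs.registerOutParameter(2, Types.INTEGER);
            cs.execute();
            System.out.println("Rows updated: " + cs.getInt(2));
        }
    }
}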

Environment: Informatica Power Center 7.1.1, Korn Shell Scripting, Toad, Oracle 9i, SQL Server 2000, XML, Windows XP, Unix(AIX), MS-Visio.

Education

Bachelor of Engineering


