
Data Salesforce

Location:
Dayton, OH
Posted:
March 02, 2021


Professional Summary

Certified MDM & ETL Consultant with over 11 years of extensive experience in Design, Development, Testing and Maintenance of Data Warehouse & Data Mastering Applications

Extensive experience in using Informatica Power Center, Informatica Data Quality (IDQ), Informatica Multi Domain MDM, Informatica Data Director (IDD), HM, Entity360, Active VOS, Informatica Cloud (ICS & IICS) and Salesforce Integration

Designed and Modified MDM Data Models to incorporate changes in the system, including landing tables, staging tables, mappings, defined trust settings for sources, customized user exits, and customized IDD applications for participating source systems

Worked on basic Administrative tasks: Managing Users, Data Management, Domain Management, Adhoc Reports, configuration, installation and upgrade of Informatica MDM, ActiveVOS, Elastic Search and RDM, and Dashboards over the Salesforce UI to map and validate data propagated from stage environments

Experienced in upgrading Informatica MDM environments from lower versions (9.x) to the latest 10.x (10.4) and installing Hotfixes and Emergency Bug Fixes (EBF) on top of existing environments

Experience in defining the development methodology to perform Merge and Match Rules on sourced data after a detailed analysis & experience in configuration of Informatica Data Director (IDD) to enable Data Governance by business users, IT Managers & Data Stewards

Designed various mappings and Mapplets using different transformations such as key generator, match, labeler, case converter, standardize, Address Validator, parser and lookup using Informatica Data Quality (IDQ) toolset

Experience working in projects implementing cloud integration solutions using Salesforce CRM & Financial Services Cloud Packages using Salesforce Native Client Tools and Third-Party Tools for Data Migration and integration processes

Good understanding of both dimensional and relational modeling concepts such as Star-Schema Modeling, Snowflake Schemas, and ER Modeling at logical, physical and presentation levels

Worked with Data Modelers in redesigning the existing system using tools like ERWIN, ER Studio and Oracle Designer to create Logical and Physical Database Designs and ER Diagrams

Experience in implementing error routines, incremental loads, and Change Data Capture (CDC) and proven capabilities in integration mappings for Slowly Changing Dimensions
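
As a rough illustration of file-based Change Data Capture of the sort described above, the sketch below diffs two sorted daily extracts to produce an incremental delta; all file names and sample rows are hypothetical:

```shell
#!/bin/sh
# Hypothetical CDC sketch: compare yesterday's and today's extracts
# (key|value rows) and emit only new or changed rows as the delta.
set -e

prev=prev_extract.txt   # yesterday's snapshot
curr=curr_extract.txt   # today's snapshot

printf 'C100|Alice|OH\nC200|Bob|MD\n'              > "$prev"
printf 'C100|Alice|OH\nC200|Bob|NC\nC300|Eve|PA\n' > "$curr"

# comm requires sorted input
sort -o "$prev" "$prev"
sort -o "$curr" "$curr"

# Rows present only in today's file: inserts plus changed rows,
# i.e. the incremental load to feed downstream (e.g. an SCD mapping).
comm -13 "$prev" "$curr" > delta.txt
cat delta.txt
```

In a real pipeline the delta file would then drive the incremental load rather than a full reload.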

Hands on experience in Performance Tuning Informatica code and Database by identifying and resolving performance bottlenecks in various levels of data flow

Experience in Informatica Administration on Windows - Creating Domains, Repositories, Users, Folders (Shared & Unshared) and Deployment Groups, with proven experience integrating Database systems with the cloud

Extensive experience in the development and maintenance of Integrations between Salesforce and other Internal business applications

Designed, Developed, Tested and Debugged various Data Synchronization and Mapping Configuration Tasks to fetch data from the Staging Environment into desired target systems, including Salesforce and other Cloud Applications

Experience in creating PL/SQL Blocks, Stored Procedures, Functions, Triggers, Views, Materialized Views and Packages in different databases, and in UNIX Shell scripting for file manipulation, scheduling, and text processing
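
The kind of shell text processing mentioned above can be sketched as follows; this is a minimal, hypothetical example (file names and layout are illustrative, not from any specific project) that splits a pipe-delimited feed into good and reject files by column count:

```shell
#!/bin/sh
# Illustrative sketch: validate a pipe-delimited feed with awk,
# routing well-formed rows and rejects to separate files.
# feed.dat and its layout are hypothetical.
set -e

printf 'C100|Alice|OH\nBADROW\nC200|Bob|MD\n' > feed.dat

# Rows with exactly 3 fields are accepted; everything else is rejected.
awk -F'|' 'NF == 3 { print > "feed.good" }
           NF != 3 { print > "feed.rej"  }' feed.dat

echo "good: $(wc -l < feed.good)  rejected: $(wc -l < feed.rej)"
```

A scheduler entry (cron, Autosys, Tidal) would typically wrap a script like this and alert on a non-zero exit code.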

Excellent Communication and Interpersonal skills with ability to work independently and in a group.

Technical Skills

Data Warehousing ETL

Informatica PowerCenter, Snowflake, Informatica Multi Domain MDM, IDD, Informatica Product Information Management (PIM), Informatica ActiveVOS, Informatica Data Quality (IDQ), Informatica Cloud & Salesforce Data Integration, Information Builders OMNI Gen Health Data, SAP Data Services

Data Modeling

Physical Modeling, Logical Modeling, Dimensional Modeling (Star Schema, Snowflake, Fact, Dimensions), Entities, Attributes, ER Diagrams, Erwin

Databases

Snowflake, PostgreSQL, Teradata, Oracle, MS SQL Server, Netezza, MS Access, IBM DB2

Programming & Reporting Tools

SQL, PL/SQL, Unix, HTML, Tableau, OBIEE, Business Objects, Oracle Analytics

Issue Tracking Tools

Rational ClearQuest, Quality Center

Cloud Interface Tools

Salesforce CRM, Salesforce Financial Services Cloud, AWS Lambda, GIT Lab

Certifications

Informatica Developer Specialist, Informatica MDM 10.3 Developer Specialist

Other Tools

TOAD, SQL Server, SQL*Plus, Autosys, CRONTAB and Tidal, SQL Assistant, BTEQ, TPT, FastLoad, MultiLoad, FastExport, TPump, IDQ, DB Visualizer, Salesforce Data Loader

Professional Experience

T. Rowe Price, Owings Mills, MD Aug’20 – Present

ETL Tech Lead

Responsibilities (ETL, Data Modernization & MDM Areas):

Coordinate with multiple business units to brainstorm and evaluate the cloud migration solution and seamlessly integrate the existing architecture into the cloud

Organized a series of meetings with user groups to understand the requirements and propose desired solutions

Work with the enterprise data architect to understand the existing ETL Framework and implement the Data Modernization solution on Snowflake & AWS

Analyze the existing Mainframe Datasets via the Power Exchange & Mainframe interface to gather storage information on incoming data from vendors/clients and estimate Storage Capacity for the S3 Buckets and the EC2 instances processing them

Worked on translating and migrating the current ETL solution in Informatica PowerCenter to Informatica Intelligent Cloud Services, PySpark & AWS Serverless (Lambda) computing services

Lead the process of data migration using the utilities in IICS (Synchronization, Replication, etc.)

Design the data integration processes for AWS Aurora DB (PostgreSQL DB) & Salesforce by replacing existing OnPrem-Salesforce Solution

Work with IBM Admins to understand the existing data storage on Mainframe to determine the process of Data Migration into AWS S3

Advise the programmer group on data migration strategies and provide solutions in line with existing logic

Work closely with Salesforce users and address data integration issues between both AWS & SF worlds before integrating the solution in MDM System

Work with the Data Stewards & Developers to understand data issues and implement fixes at both Data ingestion as well as MDM IDD and e360

Led the effort of migrating existing code base from IDQ 9.x versions to 10.x, addressed the issues with code compliance, lineage & compatibility for a precise data delivery

Successfully decoupled IDQ from the current code base in Informatica PowerCenter to set it up as a self-reliant, self-sustaining application

Worked on setting up the IDQ server to establish a precise communication channel between Front End applications and the in-house data HUB

Perform key column profiling to understand data patterns and tune Match Rules as needed

Work as an Individual & Team player to drive a successful solution

Winsupply Inc., Dayton OH Aug’19 – Jul’20

MDM Architect/Sr. MDM Developer

Responsibilities (Data Ingestion, MDM)

Work with the team on End to End Informatica MDM Implementation

Administer the existing MDM Application for configuration enhancements and upgrade the application to the latest client-acquired software version

Worked on configuring Informatica MDM Hub server, Informatica ActiveVOS, Cleanse server and Address Doctor in Development, QA & Production Environments

Applied changes to existing MDM Model to incorporate new source systems and map attributes to existing MDM Model post making necessary changes to incoming data

Implemented Stage, Load, and Match and Merge Jobs using the Batch Viewer and Automation Processes, and created Batch Groups in the projects to facilitate parallel execution of Stage loads per source system

Changed the existing Data Model to fit in data from new Source system(s) and adjusted the Match Rules/Match Rule Sets to enhance the Consolidation process

Worked on designing/modifying the MDM data model to include landing tables, staging tables, mappings, defined trust settings for sources, customized user exits, and customized IDD applications for new source systems

Worked with Data Stewards to understand data issues for new Source system(s) and resolved those issues to fit the data into the existing model

Worked on design and implementation of subject area IDD solution and deployment procedures for full/incremental releases and automating the change list promotions to different environments

Configured Active Directory connections to let in-house Business Users sign in with their AD Credentials to MDM HUB, IDD and Entity360 and make necessary changes to data based on requirements

Worked on enhancing data quality using data handling capabilities in MSSQL Server and addressed issues over accumulated data

Designed & Enhanced existing Match and Merge settings of the base table by creating Match Path Components, Match Columns and Rule sets to handle data updates

Worked on defining MDM batch load solution (Landing Table, Staging Table, Mappings, User Exits, Hard Delete Detection, Cleanse Function, and Batch Group Definition, Match Rules Definition) and automating the execution

Designed & Configured landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages

Performed rule-based Profiling on the client business data to understand the degree of accuracy and duplication, which is needed for MDM implementation

Implemented Post Landing, Pre-Staging, Save Handler User Exits using Eclipse Mars and integrated those user exits with MDM Hub Console using User Object Registry module

Worked on resolving functional as well as technical queries with the help of functional SMEs or client users

Interacted with business users in configuring Match setup and tool proposition for Data Quality Address Doctor

Altered Trust scores and set up Validation rules as per the business requirements

Worked on creating an MDM Installation Run Book to make it easy to keep up with activities across successive enhancements to the application

TrialCard Inc., Morrisville, NC Apr’18 – Jul’19

Sr. MDM Consultant

Responsibilities (DI & Master Data Management)

Work with the Source groups over the data from sourced environments and refine the data as per requirements for Data Mastering

Performed extensive Profiling over Sourced Data using iWay OMNI Gen Data Profiler to identify the Mastering keys and define Data Quality, Accuracy & Completeness

Work with the Data Integration group over Mastering Requirements and validate Mastering Areas for Health Check-up over Incoming Data

Work with Data Quality team (Offshore) daily over system requirements for Qualifying the Sourced Data into iWay Mastering Suite

Performed Data Remediation (Match/Merge & Consolidate) activities and created Base Tables, Stage Tables, defined foreign key relationships & configure lookup tables in data mastering suite

Worked on cleanse and match rule tuning & analyzed Match key distribution for potential hot spots by subject area

Worked with the SAP Data Services application to migrate data from the Legacy Source area; designed & developed DI mappings for Address, Email & Phone Number validations; integrated Loqate (Address Doctor for OMNI Health) services into OMNI Health Data

Worked with Data Integration Team to loopback Mastered Data into traditional Data Warehouse

Performed real-time integration with Frontend systems using iWay OMNI Health Web Services applications

Carried out unit testing & prepared test cases for system testing

T. Rowe Price, Owings Mills, MD May’16 – Mar’18

ETL/MDM Consultant

Responsibilities (ETL & MDM Areas):

Worked with the analyst team to refine the requirements and transform them into Technical Specifications for ETL Development, providing an acceptable solution for the Project

Worked on developing various Migration as well as Integration ETL Data flows from the legacy Source Systems to accommodate user specifications

Used Informatica PowerCenter to process data from Legacy Systems and populate the stage tables to facilitate data consumption into Salesforce and MDM systems

Used Informatica IDQ to create Data Profiles and assess the Quality, Accuracy & Completeness of data

Designed and Developed IDQ Mapplets for Address, Email and Phone number validation, integrated Address Doctor service with MDM

Worked with data stewards in creation of ActiveVOS manual tasks to get all the tasks displayed in ActiveVOS Central

Performed Configuration Management to migrate Informatica mappings/sessions/workflows from Development to Test to Production environments and was involved in Production Support of Code Migrations from Lower to Higher Environments

Worked on implementation of Land Process to load customer data set into MDM Stage tables from heterogeneous source systems

Used various transformations available in Informatica PowerCenter designer suite to create simple to complex mappings to derive desired data solution and addressed data manipulation in Informatica PowerCenter

Created and reviewed scripts to create new tables, views, queries to enhance the application using DB Visualizer

Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application

Managed Scheduling of Tasks to run any time without any operator intervention using Autosys

Salesforce Data Integration Developer

Performed the role of Data Integration developer, integrating legacy sourced data with Salesforce.com

Involved in requirement gathering and Salesforce Data Model Design to map business needs and resources from the sourcing Environment to fit business needs

Worked on integrating PostgreSQL with Salesforce Enterprise CRM as well as Financial Service Cloud (FSC) using Informatica Cloud as an integration tool

Actively coordinated with Salesforce Data Architect Team in development of a working solution to allow data flow into Enterprise CRM & FSC Cloud

Discussed the possibilities of implementing Financial Services Cloud (FSC) Rollup by Lookup (RBL) functions through the ETL Process to avoid time-consuming Salesforce Backend Calculations and save time supplying data to customers

Managed Data Quality over Salesforce loads by addressing data fallouts and communicating issues over Apex classes as well as Data Fallouts into the Cloud with the respective (ETL) teams

Worked with Data team over data loads into CRM using Salesforce Native Clients (Salesforce Data Loader) as well as Third Party Tools (Workbench for Minor Test Loads) and Salesforce Import Wizard

Actively communicated with the team over APEX design flaws and Source data failures when attempting to load data into Salesforce FSC Cloud, and worked on rectifying issues and reprocessing the data flow to the Cloud

Worked on writing Windows batch scripts as a mode of data replication into PostgreSQL DB Tables and maintained them through multiple environments

Excellus BCBS, Rochester, NY Sep’15 – Apr’16

Lead Informatica MDM Consultant

Responsibilities (ETL & MDM Areas):

The PBMS project dealt with Customer information on the Plans customers were registered with at Excellus, enabling the system to populate their respective Claim, Deductible and Out-of-Pocket Amount information

Supported the Change Management process in addressing UAT and Production failures and sent out fixes on time

Used Informatica PowerCenter as a Main ETL tool to extract data from FACETS as well as Source systems and populate the Landing tables for MDM Processes

Supported the Functional and Integration batches with Defect resolution by troubleshooting Data Integration bugs, analyzing the reasons for failure, and implementing and documenting timely solutions as needed

Worked on data cleansing and standardization using Graphical Functions and Cleanse Lists in Informatica MDM Hub while loading the data from Landing to Stage tables

Analyzed the key functionalities of the Security master data and performed Data Analysis and abstracted the transactional nature of data

Used Parameter files to store multiple DB connections to the sources and to share arguments between different transformations

Used parallel processing capabilities, Database & Session-Partitioning and Target Table partitioning utilities

Liaised with ETL Designers/Developers, conducted meetings with the ETL team to identify Gaps in understanding the requirements, and distributed the work between the MDM and ETL teams

Performed Data Validation and integration of the data by writing complex SQL queries and by verifying the data extracted in reports using Cognos interface

Provided on call Production Support with critical client escalations and status to senior management

Comcast Corporation, West Chester, PA Nov’14 – Aug’15

Lead BI DW Developer (Informatica, Teradata and IDQ Areas)

Responsibilities:

This project mainly involved integrating the data and architecture of the to-be-merged agencies, which specialized in the same area of the business

Interacted with the user teams of all the agencies to get a proper understanding of incoming data, which provided a base to proceed with development activities

Used Informatica PowerCenter and Teradata Utilities to perform data transformation and manipulation activities in the system

Developed UNIX shell scripts and used BTEQ, FastLoad, Multiload, and Fast Export utilities extensively to load to target database
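
A minimal sketch of the shell-driven Teradata loading described above is shown below; the logon values, database, table and file names are placeholders, and the script only generates the BTEQ control file (a real environment would then pipe it into `bteq`):

```shell
#!/bin/sh
# Hedged sketch: generate a BTEQ import script of the shape used when
# loading a Teradata stage table from a delimited file via shell.
# All identifiers and credentials here are hypothetical.
set -e

cat > load_stage.bteq <<'EOF'
.LOGON tdprod/etl_user,password;
.IMPORT VARTEXT '|' FILE = feed.dat;
.QUIET ON;
.REPEAT *
USING (cust_id VARCHAR(10), cust_name VARCHAR(50))
INSERT INTO stage_db.stg_customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF;
.QUIT;
EOF

# In a real environment the wrapper would then run:
#   bteq < load_stage.bteq > load_stage.log 2>&1
echo "generated $(wc -l < load_stage.bteq) lines"
```

FastLoad and MultiLoad wrappers follow the same pattern, with their own control-script syntax.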

The development activities involved writing simple to complex Shell scripts and making changes to existing scripts to manage the data flow from Source to Target

Designed and developed IDQ solution to support MDM Projects and provided high-level design walkthrough on how the solution will be built for each requirement

Worked with various developer modules like profiling, standardization and matching

Incorporated the Data Quality Mapplets into PowerCenter Mappings and Implemented PC DQ mappings to test the standardized and cleansed new data

Worked on creating the Base Layer to support user activities for generating reports, and addressed issues or concerns raised by Users as well as the QA teams whenever the need arose

Automated the jobs using UC4 Scheduler that helped in reducing manual intervention

CareFirst (Blue Cross Blue Shield of Maryland, Virginia, and Washington DC) Jan'14 – Oct’14

Informatica Developer

Responsibilities:

The project was to extract Plan data, commissions data and claims data from the source OLTP system and create an enterprise data warehouse and Integrated Data mart

The plan data had Dental, Vision and Medical plan details. They were on the same source systems and were extracted to load plan tables in the EDW and the data mart. Views were created to separate the reporting business layers based on the plan types

Claims data was integrated and analyzed against customers cross-referenced by International Classification of Diseases (ICD) codes. This facilitated analysis of reversed ICD code transactions

Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL processes to be consistent across all environments

Loaded the OLTP data into the OLAP Warehouse and built Re-usable Mappings using Informatica PowerCenter

Extensively used pre-session and post-session variable assignment for simulating multi-threading scenarios for Load balancing and performance improvement

Extensively used Parameter file to override Mapping parameter, Mapping Variables, Workflow Variables, Session Parameters, FTP Session Parameters and Source-Target Application Connection parameters
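
As an illustration of the parameter-file usage described above, the sketch below generates a PowerCenter parameter file of the standard `[Folder.WF:workflow.ST:session]` shape; the folder, workflow, session and connection names are hypothetical:

```shell
#!/bin/sh
# Hedged sketch: emit a PowerCenter parameter file overriding a mapping
# parameter and source/target connections. All names are placeholders.
set -e

run_date=2014-01-31   # would normally come from `date` or a control table

cat > wf_load_claims.param <<EOF
[EDW.WF:wf_load_claims.ST:s_m_load_claims]
\$\$RUN_DATE=$run_date
\$\$SRC_SYSTEM=FACETS
\$DBConnection_Src=ORA_OLTP
\$DBConnection_Tgt=ORA_EDW
EOF

cat wf_load_claims.param
```

The workflow then picks the file up via its "Parameter Filename" session/workflow property, so the same mapping runs against different dates and connections per environment.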

Created various Documents such as Source-To-Target Data mapping Document, Unit Test Cases and Data Migration Document

Generated Dashboards with Quick filters, Parameters and sets to handle views more efficiently

Published Workbooks by creating user filters so that only appropriate teams can view it

Generated trending analysis reports to the BCBS management

Actively coordinated with testing team in the testing phase and helped the team to understand the dependency chain of the whole project

Provided post-implementation support on user queries and provided fixes to issues identified

Freddie Mac, McLean, VA Apr’13 - Dec'13

BI DW Developer

Responsibilities:

The project was to extract Plan data and claims data from the source OLTP system and create an enterprise data warehouse and Integrated Data mart

Involved in reviewing the existing design with the business partners and architectural team to incorporate new business requirements

Converted business requirement document into technical specification documents based on user requirements

Worked on migrating the existing source data on DB2 and Flat Files to Oracle environment using Informatica after transforming data using multiple transformations available in Informatica Suite

Responsible to design, develop, test and maintain the warehouse objects

Applied best practices such as creating reusable objects and mapplets for use in multiple mappings involving complex designs

Migrated repository objects, scripts from development environment to successive environments to perform testing on developed design after unit testing the same in development environment

Worked on maintaining warehouse standards and designing the data as per user specifications

Redesigned the existing mapping document to incorporate user requirements

Reduced the runtime of the data flow by writing data into temporary staging tables, enhancing performance

Created and modified shell scripts to automate jobs and refresh materialized views on the Linux Box using Autosys

Used parallel processing capabilities, Database & Session-Partitioning and Target Table partitioning utilities

Set up sessions to schedule loads at the required frequency using PowerCenter Workflow Manager, PMCMD, and scheduling tools such as Autosys, and automated the jobs
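
A wrapper of the sort scheduled from Autosys or cron can be sketched as below; the integration service, domain, folder, workflow and credential variable names are placeholders, and the script only builds and prints the command (a live wrapper would execute it and alert on a non-zero exit code):

```shell
#!/bin/sh
# Hedged sketch: build a pmcmd startworkflow invocation of the kind a
# scheduler would run. All names and credential variables are hypothetical.
INT_SVC=IS_PROD
DOMAIN=Domain_PROD
FOLDER=EDW
WF=wf_load_claims

# -uv/-pv reference environment variables holding the repository
# user and password, so no credentials appear on the command line.
cmd="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN \
-uv PMUSER -pv PMPASS -f $FOLDER -wait $WF"

# Dry-run: print the command instead of executing it.
echo "$cmd"
```

The `-wait` flag makes pmcmd block until the workflow completes, so the scheduler sees the workflow's success or failure as the script's exit status.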

State Farm Insurance Bloomington, IL July’12 to Mar’13

The project involved collecting data at the associate (Agent) level and calculating the commissions of agents associated with State Farm Insurance.

DirecTV Los Angeles, CA Dec’11 to Jun’12

Role: ETL Developer

On a daily basis, worked on listing customers that purchased different network plans available at DirecTV and generating reports to submit the customer information to DirecTV's vendors.

Blue Cross Blue Shield of Michigan, Detroit, MI Mar’10 to Nov’11

Role: ETL Developer

New Sales Compensation System (NSCS): The NSCS project dealt with calculations for compensations, premiums and commissions using Callidus TrueComp after pre-staging in Informatica PowerCenter 9.1/8.6.

Wilmington Savings Fund Society Bank, Wilmington, DE Jan’09 to Feb’10

ETL Developer

The aim of the project was to develop and implement a Data Warehouse called WSFS SPOE, which would be an integral part of WSFS processing services for managing all bank transactions and processes.


