
Data Informatica

Location: Columbus, OH
Posted: October 22, 2020


Pukar brings a wealth of cross-industry information technology experience in the enterprise data integration, master data management, data quality, data transformation, data masking, data profiling, data analysis and data governance domains. He has extensive Data Warehouse experience using Informatica PowerCenter, relational databases such as Oracle 11g/10g and DB2, and the Netezza/Teradata Data Warehouse Appliances. Pukar possesses a unique aptitude for understanding the business challenge placed in front of him and translating those specifications into a technical solution.

He is well versed with Database and Data Warehouse concepts such as dimensional data modeling, OLTP, OLAP, Star and Snowflake Schemas, RDBMS, Multidimensional Modeling, Views, and Physical and Logical models.

Master Data Management (MDM)

Data Transformation (ETL)

Data Standardization & Enrichment

Data Modeling

Requirements Gathering and Validation

Source to Target Mapping

Business Rule Transformation

System & Process Documentation

Data quality rules

MDM Hub Configuration

Informatica Data Director (IDD) & Business Entity (BE) Configuration

Data Governance, Human task

Data Profiling and Scorecards

Functional and Technical Test Case Creation, Execution, and Management

Requirements and Design documentation

Data Masking, Data Migration

Experience

Nationwide Insurance

Contractor- Sr. Application Developer August 2018- Present

Distribution Partner Information Management (DPIM) – This application maintains all the information regarding agencies and/or agents that write policies through Nationwide Insurance or other 3rd-party carriers. There are about twelve different source systems and an equal number of downstream consumers. Agency/agent information goes through a Match and Merge process to maintain the golden copy of each record.
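The match-and-merge flow described above can be illustrated with a minimal sketch. The field names, match rule, and source trust order below are illustrative placeholders; the real DPIM rules live in the MDM hub configuration.

```python
# Minimal match-and-merge sketch: group agent records by a match key,
# then merge each group into one "golden" record using source trust order.
# Field names and the trust ranking are illustrative, not the DPIM config.
from collections import defaultdict

TRUST_ORDER = ["policy_admin", "crm", "third_party"]  # most to least trusted

def match_key(rec):
    # Simplistic match rule: normalized name + ZIP
    return (rec["name"].strip().lower(), rec["zip"])

def merge(group):
    # Survivorship: for each field, take the value from the most trusted
    # source that actually supplies one
    golden = {}
    for field in ("name", "zip", "phone"):
        for src in TRUST_ORDER:
            val = next((r[field] for r in group
                        if r["source"] == src and r.get(field)), None)
            if val:
                golden[field] = val
                break
    return golden

def build_golden(records):
    groups = defaultdict(list)
    for r in records:
        groups[match_key(r)].append(r)
    return [merge(g) for g in groups.values()]
```

Records with the same normalized key collapse to one golden row; a trusted source with a blank field falls through to the next source, which is the usual survivorship behavior.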

Agency Locator Feed:

DPIM is responsible for providing the Locator feed to Nationwide Agency Search (https://agency.nationwide.com/). This XML feed includes both independent and exclusive agencies. Agencies can be searched by agency name or location.
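A feed of this shape can be sketched with the standard library. The element and attribute names below are invented for illustration; the actual feed follows Nationwide's schema.

```python
# Sketch of an agency-locator XML feed. Tag names are illustrative only.
import xml.etree.ElementTree as ET

def build_locator_feed(agencies):
    root = ET.Element("AgencyFeed")
    for a in agencies:
        # "type" distinguishes independent vs. exclusive agencies
        agency = ET.SubElement(root, "Agency", type=a["type"])
        ET.SubElement(agency, "Name").text = a["name"]
        loc = ET.SubElement(agency, "Location")
        ET.SubElement(loc, "City").text = a["city"]
        ET.SubElement(loc, "Zip").text = a["zip"]
    return ET.tostring(root, encoding="unicode")
```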

Contact Management:

CTM is responsible for managing all the communications that go out to NW policyholders. Email, text, voice calls, and direct mail are some of the major types of communication. All interactions, along with contact points, are managed and securely shared with other applications and business partners as needed.

Worked as Tech Lead for all ETL related effort.

Worked on Design Solution for current ETL Process.

Worked closely with Business leads to gather the requirement and proposed technical solution.

Worked heavily to move the on-prem application to AWS.

Designed approach for Cloud Migration with POC.

Managed Splunk, which reads service logs and raises alerts.

Performed performance tuning on ETL processes.

Provided On-Call support during the Production implementation and 45 days’ warranty period.

Shared knowledge across the team and helped team members to resolve data issues reported by business.

Environment: Informatica PowerCenter 10.2, IBM MDM, Informatica Developer 10.1, Informatica Cloud, Informatica Analyst, DB2, Oracle, Teradata, Netezza, Oracle SQL Developer, ESP Scheduler, Ruby Test Automation Suite, Perl Scripting, API, EFTS, AWS, Docker, Kubernetes, Splunk.

Kforce June 2018- August 2018

Sr. Consultant

Kforce is an award-winning professional staffing firm that provides strategic partnership in the areas of Technology and Finance & Accounting services.

C360 Configuration-

Mastering customer data using the Customer 360 data model and data enhancement using the Dun & Bradstreet data source.

Installed MDM HUB 10.2 & IDQ 10.1 HF2

Configured C360 data model

Configured DUNS, NetSuite and Salesforce as Source System and set their trust settings

Developed mapping from Landing to Stage and configured batch groups

Configured IDD application and Provisioning tool

Configured MDM's hierarchy manager to align with the new data process

Developed IDQ mapping to create Data quality rules

Used Informatica Analyst to profile data and build scorecards

Used Informatica Administrator to Execute / Monitor Jobs/Mapping.

Created IDQ workflow to automate job execution.

Used Informatica DAAS service to enrich email/phone data

Delivered MDM operational guide and Technical design document

Environment: Informatica MDM Multidomain 10.2 HF2, Informatica Developer 10.1, Oracle, Shell Scripting, Informatica Analyst, Oracle SQL Developer

Infoverity LLC

Sr. Consultant November 2016- June 2018

Infoverity is an information technology consulting firm. We enable entities around the world to optimize business processes and implement enterprise solutions.

OCLC

Mastering Customer & Product Data-

Mastering the OCLC customer data coming from seven different source systems, along with data standardization and data enhancement using DAAS. Also mastered product data coming from a legacy system.

Responsible for all aspects of MDM hub configuration, including:

oLanding, Staging, Base Objects

oHierarchies, Security Access Manager, Batch Groups

oDesign of all aspects of IDD configuration

Completed technical configuration, match tuning and unit testing

Completed address validation using Address Doctor and email enrichment using DAAS Process.

Designed and configured the IDD interface (Subject Areas, Users/Roles, Custom Saved Queries) and the Provisioning tool (Smart Search, Business Entity, Business Entity Services)

Configured MDM Hub and user interface to meet client requirements

Designed batch process for updating 3rd party enriched data

Configured MDM's hierarchy manager to align with the new data process

Delivered MDM operational guide and Technical design document.

Worked closely with Informatica support for any issue resolution and bug reporting

Trained OCLC resources to continue support of the MDM HUB and build further customized enhancements.

Environment: Informatica MDM Multidomain 10.2 HF2, Informatica Developer 10.1, DB2, Oracle, Control-M Scheduler, Shell Scripting, Informatica Analyst, Oracle SQL Developer

Oriental Trading Company

Sr. Informatica Consultant –

Installed Informatica PowerCenter 10.2

Performed performance tuning on ETL processes including Informatica code and SQL.

Enriched Email and Phone number using DAAS Service.

Used PowerExchange for Hadoop to load data from RDBMS to new Hive tables.

Read from and wrote to data files on HDFS using the PowerExchange for Hadoop adapter.

Delivered Technical design document

Environment: Informatica PowerCenter 10.2, MSSQL, Shell Scripting, Informatica Analyst, HDFS, HIVE, PowerExchange, Linux, Informatica Developer Client 10.1.

Dovers

Informatica Administrator-

Installed Informatica PowerCenter 10.2 to Production and QA environment.

Verified that all services were up and running.

Monitored the services’ usage.

Provided Installation and monitoring guide.

Environment: Informatica PowerCenter 10.2, SQL Server 2012 R2, MSSQL.

Intertek

Test Data Management (TDM) Consultant

Customer Information Masking – Sensitive customer data needed masking in the Dev and Test environments for corporate security compliance. This project involved connecting to production tables, fetching the data, masking the values identified as sensitive, and loading the masked values either into the same table as the source or into a different table in the same or a different production region. Data profiling identifies sensitive data by column name or by the values in the column.
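The fetch-mask-load flow described here can be sketched as below. The column list, key, and hashing scheme are illustrative placeholders; the project itself used TDM's built-in and custom masking rules rather than hand-written code.

```python
# Deterministic masking sketch: replace sensitive column values with a
# repeatable surrogate so referential joins still line up after masking.
# Column names and the HMAC key are illustrative placeholders.
import hashlib
import hmac

MASK_KEY = b"not-a-real-key"           # assumption: key management out of scope
SENSITIVE = {"ssn", "email", "phone"}  # columns flagged by profiling

def mask_value(value):
    # Same input always yields the same surrogate (deterministic masking)
    digest = hmac.new(MASK_KEY, value.encode(), hashlib.sha256).hexdigest()
    return digest[:12]

def mask_row(row):
    # Mask only the flagged columns; pass everything else through untouched
    return {col: (mask_value(val) if col in SENSITIVE and val else val)
            for col, val in row.items()}
```

Determinism is the point of the design: masking the same SSN in two tables produces the same surrogate, so test-environment joins still work while the real value never leaves production.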

Responsible for PowerCenter and Test Data Management installation and deployment

Configured Model Repository, PowerCenter Integration, Domain Repository and Data Integration Services.

Configured PowerCenter to mask consumer data at rest via PowerCenter mappings

Provided data masking policy approach for all customers’ data types

Used Informatica TDM to connect to source data, mask it, and write it back to the same source or to different targets.

Built custom rules for masking data depending on the data type and nature of the data.

Generated workflows, monitored them, and collected job session execution statistics.

Debugged installation and development issues by checking logs generated during execution.

Created and updated all the knowledge transfer documents.

Worked with the team directly and provided guidance wherever needed.

Used Informatica TDM out-of-the-box masking and generation rules.

Created custom masking and generation rules and applied them to the project.

Environment: Informatica PowerCenter 9.6.1, Informatica TDM 9.7.1, MSSQL, Oracle, SQL Server 2012, T-SQL, SQL, MSSQL Management Studio.

Nationwide Financial

Sr. IT Application Developer- Application Development Center June 2015- November 2016

Actuarial Modernization is our strategy for addressing the ever-changing and rapidly increasing demands of actuarial services that support our business growth and long-term strategy within the financial services division of Nationwide. It simplifies business processes through a robust data infrastructure and standardized processes, and enables realization of actuarial potential by allowing more focus on analytical activities.

Performed performance tuning on ETL processes including Informatica code and SQL.

Provided On-Call support during the Production implementation and 45 days’ warranty period.

Worked on Version Control and Migration of code across DEV-TEST-PT-PROD Environment.

Created the deployment groups to move the code across environments.

Performed Unit Testing and Regression testing.

Implemented the project using Data Integration Framework (DIF).

Used Informatica Analyst to perform data profiling to gather the technical requirements of the existing system.

Used Perl Scripting to write the parameter files for data load.

Wrote SQL scripts and PL/SQL scripts to extract data from a database and for testing purposes.

Environment: Informatica PowerCenter 9.x, Informatica Data Quality IDQ, Data Integration Framework DIF, Linux, Perl Scripting, PL/SQL, Ruby, Teradata, Netezza, Oracle, AQT, Toad for Oracle, ESP, CICS, CA WA Workstation, CA Software Change Management Workbench, SAP BO.

Nationwide Financial

Sr. IT Application Developer January 2014- June 2015

The RADS Data Warehouse team is responsible for on-going production support, walk-up requests and operational activities to support the NF Data Warehouse. The Data Warehouse suite includes the life ODS, the Marketing data mart and more than 20 databases. The team also owns the CPPF (Customized Portfolio & Performance) High Availability Database, which is a key data source for many websites and NF applications. In addition to run support, the team maintains the "Data Quality Framework" leveraged by other RADS teams, and partners with the ETL Build team to deliver projects.

Studied and Supported current warehouse and made enhancements to existing application.

Created mappings to clean data and populate staging tables, transform the data to business needs to populate the Enterprise Data Warehouse, and load only the required information into the Data Mart.

Provided EDW production support and resolved the issues in timely manner. Also, worked on maintenance items.

Developed PL/SQL procedures/packages to kick off the SQL loader control files/procedures to load the data into Oracle.

Designed and implemented the process to get the various feeds from OLTP system into the staging area.

Worked with the DBA team and Data modeler to put database changes in place for the ETL processes.

Performed performance tuning on ETL processes including informatica code and SQL.

Environment: Informatica PowerCenter 9.x, Linux, Shell Scripting, PL/SQL, Ruby, Teradata, Netezza, Oracle, AQT, Toad for Oracle, ESP, CICS, CA Workstation, CA Software Change Management Workbench, SAP BO, SharePoint, Service Center

HCR-ManorCare Inc April 2011 – December 2013

IT Application Developer- Data warehouse team

Corporate Data Warehouse (CDW) is a corporate data repository that makes historical and current business-relevant data accessible. The CDW's primary purpose is to centralize corporate data from disparate data sources residing within Freddie Mac's transaction processing and end-user systems into one relational database for decision support, management reporting, ad-hoc and trend analysis, modeling and data mining purposes. It supports the collection, integration and transformation of high volumes of data, with data structures ranging from simple to highly complex.

Created mappings to clean data and populate staging tables, transform the data to business needs to populate the Enterprise Data Warehouse, and load only the required information into the Data Mart.

Involved in creating staging Tables, Indexes, Views.

Analyzed, designed, built, tested and implemented change requests for certain areas of the production data warehouse, per the requirements.

Involved in Developing OLAP models like facts, measures and dimensions.

Created Reusable Transformations and Mapplets for use in multiple mappings, and worked with shortcuts.

Implemented various Slowly Changing Dimensions - Type I & II, as per the requirements.

Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional and performance Testing.
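The Type I and Type II slowly changing dimension handling listed above can be sketched in miniature. The keys, flags, and field names are illustrative; the actual logic was implemented in Informatica mappings.

```python
# Miniature SCD sketch. Type 1 overwrites in place (no history); Type 2
# expires the current row and inserts a new versioned row. Field names
# (eff_date, end_date, current) are illustrative conventions.
from datetime import date

def scd_type1(dim, key, attrs):
    # Overwrite: the latest value wins, no history kept
    for row in dim:
        if row["key"] == key and row["current"]:
            row.update(attrs)
            return dim
    dim.append({"key": key, **attrs, "eff_date": date.today(),
                "end_date": None, "current": True})
    return dim

def scd_type2(dim, key, attrs, as_of):
    # Version: expire the current row, then insert the new version
    for row in dim:
        if row["key"] == key and row["current"]:
            if all(row.get(k) == v for k, v in attrs.items()):
                return dim  # no change, nothing to version
            row["end_date"] = as_of
            row["current"] = False
            break
    dim.append({"key": key, **attrs, "eff_date": as_of,
                "end_date": None, "current": True})
    return dim
```

Type 2 preserves the full change history (each version carries its effective date range), which is why it is the usual choice when reporting needs "as of" views of a dimension.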

Environment: Informatica Power Center 8.1, Power Exchange 8.1, Linux, Perl/Shell Scripting, Oracle 9i, PL/SQL, SharePoint 2010, Toad for Oracle

Ninth Wave Technologies May 2010 – April 2011

Informatica Developer

This application is mainly used for managing and utilizing the resources effectively and project allocations in the sector. It provides interfaces for Resource Management that offers cross-project hierarchical pool of people to assign to projects by skill sets for a specific month, Program Management that streamline program initiation and identifying of sub-projects and Skill capacity analysis to identify available skill set for a specific period.

Handled development, testing, implementation, and training of users.

Prepared ETL Specifications and design documents to help develop mappings.

Performed requirements gathering and Gap analysis for the ETL load.

Worked as a key player in Database Modeling, Design and Normalization

Created mappings for Historical and Incremental Loads.

Prepared and maintained mapping specification documentation.

Used Debugger to test the mapping and fixed the bugs.

Imported data from different sources to Oracle tables using Oracle SQL*Loader

Wrote Oracle PL/SQL, stored procedures and triggers to populate data.

Created technical design documents and deployment documents for production migration.

Environment: Informatica Power Center 8.1, Power Exchange 8.1, Linux, Shell Scripting, Oracle 9i, PL/SQL, SharePoint 2007, Toad for Oracle, Informatica Scheduler

Education

MS in Computer Science – Bowling Green State University, Bowling Green, OH


