Manager Data

Location:
Wilmington, DE
Posted:
February 08, 2021


Krishnaveni Murugaiah

PROFESSIONAL SUMMARY

** ***** ** ** ********** with extensive Data Warehousing implementations across various industries.

Over 12 years of ETL and data integration experience developing ETL mappings and scripts using Ab Initio, Talend, Hadoop, Informatica PowerCenter 9.x/8.x, IBM DataStage, IBM Cognos, Crystal Reports, Data Masking, and Test Data Management.

Good experience in gathering requirements by interacting with the Business and documenting them as User Stories in Jira.

Experience in preparing end-to-end process flow diagrams.

Good exposure to Agile development projects and Agile processes.

Good exposure to preparing process flow diagrams and business flows using MS Office tools.

Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.

Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, Cache Management.

Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.

Strong knowledge of Entity-Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema).

Experience in integrating various data sources such as Oracle, Teradata, DB2, SQL Server, and MS Access, and non-relational sources such as flat files, into a staging area.

Expertise in full life cycle Business Intelligence implementations and have an understanding of all aspects of an implementation project using OBIEE.

Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.

Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Assignment, Worklets, Control).

Experienced in UNIX work environment, file transfers, job scheduling and error handling.

Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Experienced in developing Dimensional Hierarchies, Level Based Measures, and adding multiple sources to business model objects.

Experienced in configuring and setting up BI security using LDAP and external database tables, and in configuring object-level and database-level security.

Experience in writing, testing and implementation of the PL/SQL triggers, stored procedures, functions, packages.

Involved in Unit testing and System testing to verify that data loads into the target are accurate.

Proficient in interacting with business users and conducting client meetings during the Requirements Analysis phase.

Extensive functional and technical exposure. Experience working on high-visibility projects.

Assigned work and provided technical oversight to onshore and offshore developers.

Excellent analytical and communication skills; a good team player.

EDUCATION

M.Tech in VLSI Design, Sathyabama University, Tamil Nadu, India, April 2007

Bachelor of Engineering (B.E.), EEE, Anna University, Tamil Nadu, India, April 2005

PROFESSIONAL EXPERIENCE

Health Net Inc, Los Angeles, CA & Chennai, India

Role: ETL Tech Lead

Feb 2015 – Present

Interacted with Business and gathered the requirement in Jira as User Stories for Agile Development.

In charge of the Data Engineering team, leading innovation in the expert rules system and involved in the solution and migration strategy.

Involved in the Design phase and in documenting the project details.

Developed solutions for the complex business requirements that include developing new systems, migration/reverse engineering of existing systems and other non-functional application improvements.

Migrated data from the legacy system to an Oracle database and performed ETL operations using various tools (Hadoop, Ab Initio, Informatica).

Worked with Business Analysts to understand the Data Profile requirements for the claims processing system.

Interacted with Business users to identify process metrics and was involved in the complete lifecycle of the project.

Implemented an end-to-end data masking project, covering the strategy, requirement analysis, design, development, and testing phases.

Performed big data processing using Hadoop, MapReduce, Sqoop, Oozie, and Impala.
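
A Sqoop pull from Oracle into HDFS, as part of a processing flow like the one above, might be shaped as follows. This is only a sketch: the connection string, table, and target directory are invented for illustration, not taken from any actual project.

```shell
#!/bin/sh
# Shape of a Sqoop import from Oracle into HDFS; all names and paths
# below are placeholders for illustration.
SQOOP_CMD="sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521/CLAIMSDB \
  --username etl_user --password-file /user/etl/.pw \
  --table CLAIMS_HDR \
  --target-dir /data/raw/claims_hdr \
  --num-mappers 4"

# Echo rather than execute, since a live cluster is assumed elsewhere.
echo "$SQOOP_CMD"
```

The imported files would then typically be exposed to Impala or consumed by Oozie-scheduled MapReduce jobs.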

Worked with the Test Data Management (TDM) workbench for the Data Masking transformation and loaded masked data into the target tables using custom-defined rules in Ab Initio.

Made use of transformations such as Data Masking, Transaction Control, and SQL Transformation to handle complex logic in Informatica Power Center.

Developed complex Informatica TDM masking rules for PHI attributes and followed a hybrid approach, masking data using both TDM and Informatica Power Center.
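
The masking idea described above can be illustrated in miniature with a deterministic shell recipe. This is a hedged toy only: the real PHI masking was done with Informatica TDM rules, and the file layout and cksum-based scramble here are invented for the example.

```shell
#!/bin/sh
# Toy stand-in for TDM-style masking: deterministically scramble an
# SSN-like PHI column so values are repeatable but not real.
cat > members.csv <<'EOF'
M1,123-45-6789
M2,987-65-4321
EOF

# Same input always yields the same masked value (preserves joins);
# pad to 9 digits to keep the original field width.
mask_ssn() {
    printf '%s' "$1" | cksum | awk '{ printf "%09d\n", $1 % 1000000000 }'
}

while IFS=, read -r id ssn; do
    echo "$id,$(mask_ssn "$ssn")"
done < members.csv > members_masked.csv
```

Determinism is the point: masked keys still join consistently across tables, which is what production masking rules also guarantee.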

Performed ETL Operations using Ab initio Tools.

Also designed the dimensional model for building the entities in the database.

Created entity relationship diagrams and dimensional modeling for the Data warehousing systems.

Worked with Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.

Worked on developing Slowly Changing Dimensions Type 1, Type 2, and Type 3 as per the business criteria.

Incorporated star schema dimensional modeling for faster retrieval of data, so users can generate ad hoc reports from the data warehouse.

Used the existing Policy Packs such as PCI, PHI, and PII as part of data masking using the data domain sets.

Defined custom policies and multiple domains, each tied to multiple rules, and integrated them for synthetic data sets as part of data masking.

Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin.

Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
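
Forward engineering a logical model typically emits DDL along these lines. The table and column names below are invented for illustration, not taken from the actual model.

```shell
#!/bin/sh
# Example of the kind of DDL forward engineering produces from a
# dimensional model: one dimension, one fact, with a foreign key.
cat > star_schema.sql <<'EOF'
CREATE TABLE DIM_MEMBER (
    member_key   NUMBER       PRIMARY KEY,
    member_id    VARCHAR2(20) NOT NULL,
    plan_tier    VARCHAR2(10),
    eff_date     DATE,
    end_date     DATE
);

CREATE TABLE FACT_CLAIM (
    claim_key    NUMBER PRIMARY KEY,
    member_key   NUMBER REFERENCES DIM_MEMBER (member_key),
    service_date DATE,
    paid_amt     NUMBER(12,2)
);
EOF

grep -c 'CREATE TABLE' star_schema.sql
```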

Developed several reusable transformations, including one for threshold maintenance of the date attributes to be masked in Informatica Power Center.

Developed User, Role, Menu, Permission entities and their Referential Foreign Key Integrity in the SQL Server Database to achieve the User Role methodologies.

Data from the Test Data Repository is moved into the remaining regions of the application as a part of Deployment process.

Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Extensively worked on partitioning the pipeline for parallel data processing in Informatica Power Center.

Created and managed schema objects such as tables, views, indexes, stored procedures, and triggers, and maintained referential integrity.

Automated the batch job process by scheduling workflows using PMCMD commands in Informatica.

Alerted users during the batch process by sending out email notifications from Informatica.
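
The scheduling step described above might be wrapped in a small shell script like this sketch. The integration service, domain, folder, workflow, and credentials are placeholders, not from any real environment.

```shell
#!/bin/sh
# Sketch of automating a batch run with Informatica's pmcmd CLI;
# all names and credentials below are placeholders.
INT_SVC="IS_DEV"
DOMAIN="Domain_ETL"
FOLDER="FLD_CLAIMS"
WORKFLOW="wf_load_claims"

# -wait blocks until the workflow finishes, so a scheduler (e.g. cron)
# can act on the exit status and trigger alerts on failure.
build_pmcmd_cmd() {
    echo "pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -u etl_user -p etl_pass -f $FOLDER -wait $WORKFLOW"
}

# Example cron entry for a nightly 2 AM run:
#   0 2 * * * /opt/etl/run_wf.sh >> /var/log/etl/wf_load_claims.log 2>&1
build_pmcmd_cmd
```

Email notification on success or failure would then be configured on the workflow's Email tasks rather than in the shell wrapper.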

Worked on the technical design documentation covering the detailed aspects of the data flow.

Scheduled and maintained packages daily, weekly, and monthly using SQL Server Agent in SSMS.

Wrote complex SQL queries in SQL Server to validate the masked data during execution of the workflows via the Workflow Manager.

Prepared Logical Mapping Document (LMD) which depicts the Project level scope & type of data masking applied to the individual attributes with detailed information.

Environment: Informatica Power Center 9.6.1 (Designer, Workflow Manager, Monitor, Repository Manager), Test Data Management (TDM), SQL Server Integration Services (SSIS), SQL Server, Data Modeler, Microsoft Visual Studio (SSDT), PowerShell scripts, Stored Procedures, SQL, Visio, Microsoft Suite, flat files.

Express Scripts Inc, St. Louis, MO & Chennai, India

Sr. ETL Developer.

May 2011 – Feb 2015

Responsibilities:

Interacted with Business users to identify process metrics, gathered requirements, and documented them in JIRA as user stories.

Prepared End-to-End process flow diagrams to trace the requirement change and received Business Signoff.

Designed the high-level ETL structure for retrieving and loading data from 25 different partner systems.

Worked on Informatica Power Center Designer tool - Source Analyzer, Target designer, Mapping & Mapplet Designer and Transformation Designer.

Implemented parallel concurrent execution of sessions in order to process the data.

Worked with SQL, PL/SQL procedures and functions, stored procedures, and packages within Informatica Designer mappings.

Extensively used Informatica to extract and transform data from heterogeneous source systems and load it into the target database.

Implemented Slowly Changing Dimensions Type 1 and Type 2, temporal dimensions, and facts according to the requirements.

Developed reusable transformations and Mapplets for data loads to the data warehouse database (Oracle).

Extensively worked on partitioning tables for better handling of huge volumes of data.

Prepared ETL mapping documents in data warehouse for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.

Designed a STAR schema for the detailed data marts and plan data marts involving conformed dimensions.

Created and maintained the Data Model repository as per company standards.

Involved in Unit and System testing, as well as Regression testing, to verify that data loaded into targets from different source systems was accurate according to user requirements.

Migrated repository objects, services, and scripts from the development environment to the production environment. Extensive experience troubleshooting and resolving migration and production issues.

Created Mapplets, reusable transformations and used them in different mappings.

Tuned Informatica mappings and sessions for optimum performance.

Assisted other ETL developers in solving complex scenarios and coordinated with source system owners on day-to-day ETL progress monitoring.

Converted PL/SQL procedures to Informatica mappings and, at the same time, created procedures at the database level for optimum mapping performance.

Investigated and fixed bugs occurring in the production environment and provided on-call support.

Performed Unit testing and maintained test logs and test cases for all the mappings.

Reviewed and applied Structured Query Language (SQL) fixes to the data in the CalWIN Oracle production databases.

Highly skilled in performance tuning: analyzing query plans and SQL tuning using EXPLAIN PLAN and hints.

Developed SQL*Loader scripts to create a temporary table and load the data.
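
A SQL*Loader setup of the kind described, a control file plus an invocation, might look like this sketch. The table, file, and column names are invented for the example.

```shell
#!/bin/sh
# Sketch of staging a flat file into a temporary table with SQL*Loader;
# names are placeholders for illustration.
cat > load_stage.ctl <<'EOF'
LOAD DATA
INFILE 'partner_feed.dat'
TRUNCATE
INTO TABLE STG_PARTNER_TMP
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(CLAIM_ID, MEMBER_ID, SERVICE_DT DATE 'YYYY-MM-DD', PAID_AMT)
EOF

# With a database available, the load would be started as:
#   sqlldr userid=$DB_USER/$DB_PASS control=load_stage.ctl log=load_stage.log
grep -c 'STG_PARTNER_TMP' load_stage.ctl
```

TRUNCATE keeps the temporary table idempotent across reruns; APPEND would be used instead for incremental feeds.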

Extensively worked on Reference Table (RT) change implementation projects, validating the records to make sure the temp functionality remained intact.

Ensured the feasibility of the logical and physical design models.

Used the Erwin tool to design database changes, adding and updating database objects, and processed database builds for these changes into environments such as system test/UAT and training.

Performed database refresh activities using Data Pump utilities (expdp/impdp) and SQL*Loader.
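
A Data Pump refresh of the sort mentioned above can be outlined as follows. The directory object, schema, and remap target are placeholders, not taken from a real environment; the commands are built and echoed rather than executed.

```shell
#!/bin/sh
# Outline of a schema refresh with Oracle Data Pump (expdp/impdp);
# all object names below are placeholders.
DUMP_DIR="DATA_PUMP_DIR"
SCHEMA="CALWIN_APP"
DUMPFILE="${SCHEMA}_refresh.dmp"

# \$PW is left unexpanded on purpose: credentials come from the caller.
export_cmd="expdp system/\$PW schemas=$SCHEMA directory=$DUMP_DIR dumpfile=$DUMPFILE logfile=exp_${SCHEMA}.log"
import_cmd="impdp system/\$PW directory=$DUMP_DIR dumpfile=$DUMPFILE table_exists_action=replace remap_schema=$SCHEMA:${SCHEMA}_TEST"

echo "$export_cmd"
echo "$import_cmd"
```

remap_schema is what makes this usable for downstream refreshes: production data lands in a test schema without touching the source.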

Developed a UNIX shell script that reads the sqlldr information, grants, and synonym scripts.

As part of the SQL review, worked extensively on SQL tuning and optimization, drastically improving response times using appropriate hints where needed.

Worked on immediate fixes in production via SQL packs, carefully reviewing and applying those critical to business needs.

Served as the technical lead in decommissioning the SQL Station software and replacing it with an SVN repository for check-in and check-out of files.

Also performed data refreshes from production to the downstream instances for system testing or integration testing.

Worked with the system testing team and the BI team within the CalWIN account to provide access to instances and to create test scenarios for testing-related tasks.

Environment: Informatica Power Center 9.6.1 (Designer, Workflow Manager, Monitor, Repository Manager), Oracle 12c/11g, SQL/PL-SQL, Erwin Data Modeler, CalWIN application, TOAD, SVN repository, UNIX, WinSCP, CUBE-D, PVCS Dimensions, HP Service Manager, CA Service Desk, HP PPM, HP ALM

The Hanover Insurance Group, Inc, Worcester, MA & Chennai, India

ETL Developer

Sep 2007 – Apr 2011

Responsibilities:

Interacted with the Business, documented the requirements as User Stories, and received signoff.

Performed the gap analysis exercise to identify the gaps.

Extensively involved in Data Extraction, Transformation and Loading (ETL process) from Source to target systems using Informatica.

Worked with heterogeneous sources to extract data from Oracle databases, XML, and flat files, and loaded it into a relational Oracle warehouse.

Created data mappings to extract data from different source files, transform the data using Filter, Update Strategy, Aggregator, Expression, Joiner Transformations and then loaded into data warehouse.

Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple homogeneous and heterogeneous information sources.

Set and followed Informatica best practices, such as creating shared objects in shared folders for reusability and standard naming conventions for ETL objects; designed complex Informatica transformations, Mapplets, mappings, reusable sessions, worklets, and workflows.

Evaluated business requirements to produce Informatica mapping designs that adhere to Informatica standards.

Implemented the Slowly Changing Dimension Type 2 methodology for accessing the full history of accounts and transaction information.

Used the Update Strategy transformation to apply Type 2 updates to the target dimension tables: insert the new record and end-date the old record in the target so that changes can be tracked over time.
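
The Type 2 insert-and-expire logic can be shown in miniature outside Informatica. This toy awk version over CSV files uses invented columns (key, attribute, effective date, end date) and an open-ended sentinel date; the real implementation lived in Update Strategy transformations.

```shell
#!/bin/sh
# Toy SCD Type 2 merge: expire the current dimension row and insert a
# new version when an incoming attribute has changed. Data is invented.
cat > dim_member.csv <<'EOF'
M1,Gold,2020-01-01,9999-12-31
M2,Silver,2020-01-01,9999-12-31
EOF
cat > incoming.csv <<'EOF'
M1,Platinum
M2,Silver
EOF

awk -F, -v today=2021-02-08 '
    NR==FNR { new[$1]=$2; next }     # first file: incoming key -> attribute
    {
        if ($4=="9999-12-31" && ($1 in new) && new[$1]!=$2) {
            print $1","$2","$3","today                # expire current row
            print $1","new[$1]","today",9999-12-31"   # insert new version
        } else print $0                               # unchanged: pass through
    }' incoming.csv dim_member.csv > dim_member_new.csv
```

M1 changed tier, so it gets an end-dated old row plus a new current row; M2 is unchanged and passes through untouched.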

Developed various mapplets that were then included into the mappings as part of data warehouse.

Used Workflow Manager to read data from sources, and write data to target databases and manage sessions.

Developed SSIS packages to extract, transform, and load data using transformations such as Lookup, Derived Column, Conditional Split, Aggregate, and Pivot.

Developed mappings, sessions and workflows in Informatica Power Center.

Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.

Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter.

Performed tuning of SQL queries and stored procedures for speedy extraction of data, to resolve and troubleshoot issues in the OLTP environment.

Troubleshot long-running sessions and fixed the issues related to them.

Worked with variables and parameters in the mappings to pass values between sessions.
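
Values are typically handed to sessions through a parameter file. In this sketch the folder, workflow, session, and parameter names are placeholders; only the file's general shape is the point.

```shell
#!/bin/sh
# Sketch of an Informatica parameter file; the [folder.WF:workflow.ST:session]
# header scopes the $$ parameters below it. All names are placeholders.
cat > wf_load.param <<'EOF'
[FLD_DW.WF:wf_load_claims.ST:s_m_load_claims]
$$LOAD_DATE=2021-02-08
$$SRC_SYSTEM=PARTNER_01
EOF

# The workflow would then be started with the file attached:
#   pmcmd startworkflow ... -paramfile wf_load.param wf_load_claims
grep -c '^\$\$' wf_load.param
```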

Involved in the development of PL/SQL stored procedures, functions and packages to process business data in OLTP system.

Worked with the testing team to resolve bugs related to day one ETL mappings before production.

Created weekly project status reports, tracked task progress against the schedule, and reported risks and contingency plans to management and business users.

Involved in meetings with production team for issues related to Deployment, maintenance, future enhancements, backup and crisis management of DW.

Worked with production team to resolve data issues in Production database of OLAP and OLTP systems.

Resolved issues related to the Enterprise Data Warehouse (EDW) and stored procedures in the OLTP system, and analyzed, designed, and developed ETL strategies.

Environment: Informatica Power Center 9.6 (Designer, Workflow Manager, Monitor, Repository Manager), SSIS, SSMS, SQL/PL-SQL, SQL Server, Oracle 11g, Windows Server 2003, TOAD, Erwin Data Modeler, WinSCP, Shell Scripting, PuTTY.

Technical Skills:

ERP

Health Care and Insurance Product Claims Processing system and Policy Servicing systems

ETL / BI

Ab initio, Hadoop, Talend, Informatica Power Center (Designer, Workflow Manager, Workflow Monitor, Repository manager), Test Data Management (TDM), SQL Server Integration Service (SSIS), DAC, OBIEE 11.x/10.x, Dashboards, Answers, Delivers, BI Publisher.

Databases

Oracle 10g/12c, MongoDB, SQL Server, DB2, MySQL, MS Access; editors: SQL Navigator, TOAD.

Data Modeling

ERWIN, Visio.

Languages

SQL, PL/SQL, Perl, UNIX Shell Scripting (K shell, C shell), PowerShell Scripting, JavaScript, J2EE.

Web Technologies

Siebel Analytics, HTML, JavaScript, ASP, and PHP

Tools

OBIEE 10g/11g (Admin/Desktop/Plus), Forms 9i/10g, Reports 9i/10g, TOAD, SQL Navigator, SQL*Plus, SQL*Loader, JDeveloper, Discoverer 9i/10g, Developer 2000, XML Publisher, SVN, TFS, Perforce, Panaya, ClearQuest, ClearCase, More4apps, Kintana, PVCS, Quality Center.

Operating System

UNIX, Linux (Red Hat Enterprise), and the Windows family.


