
Data Integration Warehouse

Location:
Voorhees, NJ, 08043
Posted:
August 25, 2024


Vipin Lakshminilayam

Technical Architect / Technical Lead

Technical Lead/Architect professional with more than 16 years of experience in Data Warehouse and Data Integration technologies, serving clients in the Telecommunications, Banking, Insurance, and Manufacturing industries. Extensive industry experience in designing, implementing, supporting, and performance-tuning data integration/warehousing solutions. Continuously aims to reduce the total cost of solutions and improve availability and redundancy.

Core Competencies

Design and Development of Data Integration solutions

Design and develop data models and data marts

Design and implement data flow from multiple source systems

Data Warehousing and Data Migration

Requirements gathering, analysis, and providing optimal technical solutions

Develop technical and design specifications.

Development and testing review

Leadership and communication

Performance tuning

Process improvement and process modeling

Executing onsite-offshore team structures

Managing and expanding cross-functional teams

Data Analytics

Tools and Technologies

Informatica Intelligent Cloud Services (IICS)

IDMC

Informatica PowerCenter 7.1, 8.6, 9.1, 10.2, 10.5

Informatica DataQuality 10.1.1

SAP BODS

Business Objects XI 3.1

Apache Airflow

Databricks

Snowflake

Oracle

PL/SQL

Teradata

Teradata - BTEQ

SQL Server

DB2

SAP ECC

HDFS

UNIX

Spark

JIRA

Agile

ServiceNow

HP ALM

Docker

Kubernetes

Git

Certifications: AWS Certified Solutions Architect - Associate

Responsibilities

More than 16 years of experience in Data Integration/Data Warehouse technologies, with a strong background in designing and developing data loading strategies, extracting data from multiple sources, and implementing transformations and loads.

Designed and developed multiple data flows from heterogeneous sources such as data lakes, HDFS, flat files, and Oracle to data warehouses/data marts by leveraging Informatica, Teradata BTEQ, Python/shell scripting, Spark, and IICS.

Analyze, optimize, and redesign long-running ETL jobs, queries, and scripts.

Design database views and models for optimal performance.

Extensively worked on projects with Informatica PowerCenter, Informatica Data Quality, SAP BODS, and Talend, and with reporting tools such as Business Objects and OBIEE.

Extensive experience with databases such as Teradata and Oracle, and with procedural languages such as PL/SQL and BTEQ

Solid knowledge of performance tuning, query optimization, and industry best practices

Hands-on experience guiding teams to successful project delivery. Maintains high standards of ETL solution development through coaching and mentoring. In-depth knowledge of managing large teams while ensuring best practices, backed by strong analytical skills

Good knowledge of scheduling tools such as Airflow, Cybermation, and Control-M

Define technical requirements and document plans for project lifecycle deployment including the scheduling of project deliverables, budgets and timelines.

Direct all IT deployments and job scheduling with oversight for vendor and consultant management.

Coordinate internal IT projects ensuring that implementations are ready as required by client deadlines.

Proven ability to resolve complex problems, quickly diagnosing and identifying issues and determining the proper resolutions.

Excellent stakeholder management skills and experience working across the Business, Data, Application, and Technology domains

Entry-level expertise in Informatica MDM

Experience in complete project execution under Agile methodology: estimating user stories (requirements), mentoring, and providing technical support to the team.
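The heterogeneous-source-to-warehouse flows described above can be sketched minimally in Python. This is an illustration only, not any client's actual implementation: SQLite stands in for Teradata/Oracle, and the file layout and table names are hypothetical.

```python
import csv
import io
import sqlite3

# Minimal extract-transform-load sketch: a flat-file extract is parsed,
# typed/cleansed, and bulk-loaded into a warehouse table.
# (Hypothetical sample extract; a real flow would read from HDFS/flat files.)
FLAT_FILE = io.StringIO("rep_id,channel,amount\n101,Retail,250.00\n102,Online,99.50\n")

def extract(handle):
    """Read delimited records from the source extract."""
    return list(csv.DictReader(handle))

def transform(rows):
    """Apply typing and simple cleansing before load."""
    return [(int(r["rep_id"]), r["channel"].upper(), float(r["amount"])) for r in rows]

def load(conn, rows):
    """Bulk-insert transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales_fact (rep_id INT, channel TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(FLAT_FILE)))
total = conn.execute("SELECT SUM(amount) FROM sales_fact").fetchone()[0]
print(total)  # 349.5
```

The same extract/transform/load separation applies whether the transport is Informatica mappings, BTEQ scripts, or Spark jobs.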

Educational Qualifications

Bachelor of Technology in Computer Science and Engineering, National Institute of Technology Silchar (formerly known as REC)

Projects Worked on at Virtusa

Comcast – EBI Data Operations - Sales and Compensation: August 2021 – Till Date

Role

ETL Architect/Technical Lead

Technology

Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter 10.5.2, Teradata, BTEQ, PySpark, Unix Scripting, Python, JIRA, ServiceNow, Airflow, Databricks, HDFS, AWS, MinIO, Data Lake, Snowflake, Docker and Kubernetes

Description

The EBI Sales and Compensation team is responsible for managing sales and commission data for sales reps across different sales channels. Data from multiple channels hosted on different platforms such as the data lake, AWS, HDFS, Oracle, and flat files is loaded into a Teradata-hosted data warehouse. Data extracts are sent from the warehouse to different business vendors, and Tableau reporting is performed by multiple business divisions.

Responsibilities

Designed and developed multiple data flows from heterogeneous sources such as data lakes, HDFS, flat files, and Oracle to the Teradata data warehouse by leveraging Teradata BTEQ, Python/shell scripting, Spark, and IICS.

Analyze, optimize, and redesign long-running BTEQ jobs and scripts.

Design database views and models for Tableau reporting, tuned for optimal performance.

Work with the Data Stewardship and data architect teams to gather requirements, estimate, and design optimal solutions.

Schedule and develop data pipelines in Apache Airflow

Lead sprint planning and daily scrum calls.
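The dependency-ordered pipelines scheduled in Airflow above can be sketched in plain Python; the task names below are hypothetical placeholders, and a real DAG would declare the same edges with Airflow operators and `upstream >> downstream` syntax.

```python
from graphlib import TopologicalSorter

# Dependency-ordered pipeline sketch (what an Airflow DAG expresses):
# two extract tasks feed a warehouse load, which feeds a view refresh.
ran = []

def task(name):
    def run():
        ran.append(name)  # a real task would invoke BTEQ / IICS / Spark here
    return run

tasks = {name: task(name) for name in
         ["extract_hdfs", "extract_oracle", "load_warehouse", "refresh_views"]}

# Each entry maps a task to the set of tasks that must finish first,
# mirroring Airflow dependency declarations.
deps = {
    "load_warehouse": {"extract_hdfs", "extract_oracle"},
    "refresh_views": {"load_warehouse"},
}

for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(ran)  # both extracts appear before load_warehouse, refresh_views last
```

`graphlib.TopologicalSorter` (Python 3.9+) guarantees every task runs only after all of its declared upstreams, which is the same ordering contract Airflow enforces at scale.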

M&T Bank: Enterprise Party Hub – Master Data Management – Sep 2020 – June 2021

Role

ETL Architect/Technical Lead

Technology

Informatica Power Center 10.1.1, Informatica Data Quality 10.1.1, Informatica MDM, Unix Scripting, Oracle, PL/SQL, JIRA, ServiceNow, Atomic Scheduler

Description

M&T Enterprise Party Hub (EPH) is a Master Data Management system that sources Golden Party records derived from different source systems. EPH receives party information from around 30 source systems, including real-time feeds, and determines the Golden Party based on a trust hierarchy. EPH is the source of truth for multiple applications, and new sources are continually added to provide more value to customers.

Responsibilities

Perform the ETL architect role to add new source systems to the EPH hub

Analyze backlog tickets and allocate them to the Kanban board for implementation

Create PL/SQL programs for ETL jobs

Optimize existing SQL scripts for better performance

Analyze source system data and incorporate it into the EPH hub

Gather new business requirements, then design and implement the solution

Support MDM technical team to provide data to EPH Staging layer

Support SIT and UAT with test data and production deployment

Provide technical solution to production support team for daily batch failures

Technical reviews and development support to offshore team
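The trust-hierarchy survivorship described for EPH can be illustrated with a minimal sketch. SQLite and all schema, source names, and trust ranks here are hypothetical stand-ins for illustration, not the actual EPH/Informatica MDM implementation.

```python
import sqlite3

# Golden-record selection sketch: for each party, keep the attributes from
# the most trusted source system. Schema and ranks are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE party_staging (party_id INT, source TEXT, full_name TEXT);
INSERT INTO party_staging VALUES
  (1, 'CRM',  'J. Smith'),
  (1, 'CORE', 'John A. Smith'),
  (2, 'CRM',  'M. Jones');
CREATE TABLE trust (source TEXT, rank INT);  -- lower rank = more trusted
INSERT INTO trust VALUES ('CORE', 1), ('CRM', 2);
""")

rows = conn.execute("""
    SELECT s.party_id, s.source, s.full_name, t.rank
    FROM party_staging s JOIN trust t ON t.source = s.source
""").fetchall()

# Survivorship: the staged row with the best (lowest) trust rank wins per party.
golden = {}
for party_id, source, name, rank in rows:
    if party_id not in golden or rank < golden[party_id][2]:
        golden[party_id] = (source, name, rank)

print(golden[1][1])  # John A. Smith
```

Party 1 exists in both sources, so the more trusted CORE record survives as the golden record; party 2 has only a CRM record, which survives by default.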

BlueCross BlueShield of Western New York: HN - EIS Data Integration and Delivery - Insurance – Apr 2018 – Aug 2020

Role

Associate Architect/Technical Lead

Technology

Informatica Power Center 10.2, Unix Scripting, Oracle, PL/SQL, Snowflake

JIRA, ServiceNow, FACET, CAVE Market Basket System

Description

HN-EIS Data Integration and Delivery is a multi-year program in which the ETL team is responsible for enhancing and maintaining the existing ETL interfaces that load data from multiple source systems, such as FACET, into data warehouses and upstream ETL interfaces, along with vendor files. The ETL team is also responsible for new projects, developing ETL solutions that provide data to new vendors

Responsibilities

Analyze business requirements from the DA and design the ETL solution

Develop ETL interfaces to support business requirements, including vendor data

Create stored procedures and functions in PL/SQL for ETL mappings

Maintenance and enhancement of existing ETL mappings

Technical reviews and development support to offshore team

Groom backlog tickets and allocate them to the offshore team

Experience Prior to Virtusa - Wipro Limited

Marsh & McLennan Companies: CANSYS OF12 Integration - Insurance – Jun 2017 - Feb 2018

Role

Technical Lead

Technology

Informatica PowerCenter 9.1, Oracle, Business Objects 4.2, PL/SQL, Unix Scripting

Description

The Marsh Canada financial system is being migrated to Oracle Financials 12 (OF12), which will be the source for multiple interfaces and financial reporting. A data mart is created to stage data from OF12 and will be used as the source for multiple reports and interfaces.

The ETL team is responsible for creating Informatica interfaces that load data from the OF12 application into the data mart, and for Business Objects reports displaying daily transactions and monthly revenue.

Responsibilities

Analyze business requirements and design the ETL solution

Develop the data mart, including DDL and DML scripts

Create Informatica ETL jobs to load the data mart from OF12

Collect requirements for Business Objects reporting

Responsible for unit, QA, and UAT testing of the CANSYS ETL and BO applications

Responsible for production changes and post-go-live support

Corning Incorporated: HCM Replacement – Manufacturing – June 2016 – May 2017

Role

Technical Lead

Technology

Informatica PowerCenter 9.1, Oracle, SQL Server, PL/SQL, Unix Scripting, PeopleSoft, SuccessFactors

Description

Project HR2.0 is a global implementation project that replaces Corning's human resources system, PeopleSoft, with a new HR management system called SuccessFactors. The first phase of the project replaces the Human Resources module; the second phase is planned to replace the Payroll module and the remaining business functional areas.

PeopleSoft supplies data to 74 Human Resources-related interfaces that use Informatica PowerCenter as the ETL tool. The ETL interfaces team is responsible for accommodating this source system change across all 74 interfaces and ensuring the business can operate as usual after the change.

Responsibilities

Work closely with the solution architect, data modeler, and business analyst to understand business requirements, providing expert knowledge and solutions on data warehousing and ensuring delivery of business needs in a timely, cost-effective manner.

Source-to-target mappings, ETL design/development, solution tuning and optimization, and definition and implementation of the testing strategy

Provide reviews of deliverables and identify gaps/areas for improvement

Provide/review detailed technical specifications and ETL documentation required in each phase of the SDLC, and comply with internal standards and guidelines

Identify opportunities to optimize the ETL environment, implement monitoring, quality and validation processes to ensure data accuracy and integrity

Provide post-implementation support, respond to problems quickly and provide both short and long-term resolutions
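The monitoring and validation processes mentioned above can be sketched as a simple post-load check. This is a minimal illustration; the rule names, fields, and thresholds are hypothetical, not Corning's actual checks.

```python
# Post-load validation sketch: row-count and required-field checks of the
# kind used to verify data accuracy and integrity after an ETL run.
def validate(rows, expected_min_rows, required_fields):
    """Return a list of human-readable validation failures (empty = pass)."""
    failures = []
    if len(rows) < expected_min_rows:
        failures.append(f"row count {len(rows)} below expected {expected_min_rows}")
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing required field '{field}'")
    return failures

# Hypothetical sample batch: the second record is missing its department.
batch = [{"emp_id": "1001", "dept": "HR"}, {"emp_id": "1002", "dept": ""}]
issues = validate(batch, expected_min_rows=2, required_fields=["emp_id", "dept"])
print(issues)  # ["row 1: missing required field 'dept'"]
```

In practice such checks run as a final pipeline step, and a non-empty failure list blocks downstream loads or raises an alert instead of silently loading bad data.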

Sealed Air: BI/BW support - Manufacturing – March 2015 - May 2016

Role

Technical Lead

Technology

Talend, Business Objects 6.5 and XI 3.1, SAP, Linux, Oracle

Description

Sealed Air BI/BW support is a multi-year program in which the Wipro BI team is responsible for supporting different Talend ETL interfaces and Business Objects reporting. Talend connects multiple production data sources and supplies data to different interfaces, through both real-time feeds and batches, ensuring that business users get accurate data for their processes.

Business Objects is used as the reporting tool, and the support team is responsible for all technical issues and user access management. Issues are addressed through service tickets, and projects requiring fresh development are also part of this program.

Responsibilities

Working as a technical lead in this program

Resolve end users' technical issues related to data

Resolve Business Objects report issues and manage user access control

Develop new interfaces for projects arising from new business requirements

Resolve issues proactively and ensure quality of service.

Lead technical team for all developments and issues

Schneider Electric: Installed Base Management - Manufacturing – May 2014 - Feb 2015

Role

Technical Lead

Technology

Talend, Jaspersoft, MySQL, Linux, AWS

Description

Complexities in the supply chain make tracking assets across countries and distribution centers challenging. Low data quality in heterogeneous data sources and significant partner-led sales and distribution restrict Schneider's ability to trace assets. Diversity in cultures, operations, processes, and systems across countries limits a standardized approach to asset traceability.

This application provides a solution to standardize asset tracking and generate a pipeline of opportunities in all countries for different offers such as Modernization, Service Plan, and Preventive Maintenance, based on technical cleansing. The Installed Base data mart helps identify potential leads and the corresponding business opportunities; to achieve this, it requires information from several different source systems.

Responsibilities

Worked as a data analyst during the initial stage of the project

Identified the source systems and scope of data for the solution

Identified the relation between source systems and subject areas

Designed the data mart for this solution

Performed database tuning for better performance

Designed the logical and physical models of the data mart

Prepared the ETL requirement specification document and the ETL design

Developed ETL mappings to load data into the data mart

Led the technical team for all development and issues.

Outotec: Outotec MDM Near term Solution – Manufacturing- Oct 2013 - Apr 2014

Role

Technical Lead

Technology

SAP BODS, MS SQL Server 2008, Unix, Oracle

Description

To bring the required level of optimization to the running of plants, Outotec plans to leverage information technology to build an Operations & Maintenance Platform (O&M Platform). The O&M Platform is envisaged to operate through an Information Hub. The Hub will collect readings from plant equipment instrumentation and store them for analysis by engineers and maintenance experts. The Hub will also take note of failure events that trigger notifications for action. In alignment with the prescribed master data strategy, it was decided that the near-term solution's architecture will be based on the capabilities of ETL and data quality tools.

Responsibilities

Understood business requirements from the onsite client

Proposed a technical solution based on the business requirements

Delivered a proof of concept based on the business requirements

Presented the proof of concept to the client; the solution was well appreciated

Designed the technical solution for the business requirements

Managed a small team of three

Electrolux: Compass SAP Global Rollout – Manufacturing- Sep 2011 - Sep 2013

Role

Technical Lead

Technology

Informatica DataQuality, Informatica Power Center, LSMW, SAP

Description

Compass SAP Global Rollout is a multi-country data migration project that helps the customer implement a global solution for business in multiple countries. A global standard Meta Data Model is used for all conversion extracts during the Compass roll-outs. The Meta Data Model is based on the SAP data model, containing those elements where the project expects input from legacy systems and/or the local business. The Meta Data Model is divided into a number of conversion objects, and each conversion object can be split across several extract files.

Responsibilities

Onsite role coordinating between the onsite and offshore teams

Collected requirements from the business and processed CPEs

Ensured deliverables were on time and met client requirements

Helped the offshore team with technical solutions

Set up the data migration environment for each mock load and production load

Served as Data Quality lead for the project

Johnson and Johnson: Crossroads Migration Track - Manufacturing -Dec 2009 - Sep 2011

Role

Senior ETL Developer

Technology

Informatica DataQuality, Informatica Power Center, LSMW, SAP

Description

The Crossroads Migration Track program helped Johnson and Johnson create a global ERP system that consolidates data from the different companies acquired by JnJ. Data is extracted from multiple legacy source systems and loaded into SAP. Almost 25 objects were developed for migrating legacy data into SAP.

Responsibilities

Wrote the Technical Design Specification (TDS) for the three objects assigned.

Developed the code.

Wrote unit test cases and performed unit testing for all three objects.

Performed TUT, TAT, cutover rehearsal, integration test cycles, iterative loads, and system testing for all three objects.

Provided coding and design guidelines in the project.

Mahindra Satyam

Johnson and Johnson: Crossroads Migration Track – Manufacturing – Feb 2008 – Sep 2009

Role

ETL Developer

Technology

Informatica DataQuality, Informatica Power Center, DB2, LSMW, SAP

Description

The Crossroads Migration Track program helped Johnson and Johnson create a global ERP system that consolidates data from the different companies acquired by JnJ. Data is extracted from multiple legacy source systems and loaded into SAP. Almost 25 objects were developed for migrating legacy data into SAP.

Responsibilities

Developed mappings, workflows and sessions.

Prepared mapping design documents.

Worked on Test Cases and Test Results.

Contact Information:

Name

Vipin Lakshminilayam

Primary Phone

+1-856-**-*****

Primary Email Id

*******************@*****.***

Current Address

1138 Bibbs Rd, Voorhees

New Jersey 08043

Visa Status

H1-B (i-140 Approved)
