Data Project

Location:
United States
Posted:
July 11, 2016

Resume:

Kapil Khandelwal

Phone: 408-***-**** acvnj6@r.postjobfree.com

PROFESSIONAL SUMMARY

* ***** ** ** ********** in Analysis, Design, Development, Implementation, Testing and Support of OLTP and OLAP systems.

9 years of experience using Informatica Power Center (8.1/8.6/9.1/9.5/9.6) tools: Mapping Designer, Repository Manager, and Workflow Manager/Monitor.

Have worked extensively on developing ETL programs supporting data extraction, transformation, and loading using Informatica Power Center.

Experience in Hadoop and its ecosystem (Hive).

Experience working with Hive SQL to move data in and out of Hadoop/HDFS clusters.

Expertise in Data Warehouse/Data Mart and OLAP implementations, covering project scoping, analysis, requirements gathering, data modelling, effort estimation, ETL design, development, system testing, implementation, and production support.

Expertise in creating and managing data pipelines using ETL workflows, and in performance tuning.

Strong PL/SQL programming skills including development of Stored Procedures, Functions, Cursors, Triggers and Packages using PL/SQL.

Experience implementing a data platform for marketing campaigns.

Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP applications.

Involved in project planning, scheduling, System design, Functional Specification, Design specification, coding and system test plan.

Have extensive experience in Onsite / Offshore Client Handling.

Able to understand business rules fully from high-level specification documents and implement the corresponding data transformation methodologies.

Proficiency in developing SQL with various relational databases like Oracle, Teradata.

Experience in development of database objects such as tables, indexes, materialized views, procedures, and synonyms.

Experience in RDBMS development, including coding with SQL*Plus, PL/SQL Developer, SQL*Loader, and TOAD, and a good understanding of the Oracle data dictionary.

Experienced in providing production support, investigating and resolving system defects, and fixing bugs.

Excellent analytical and problem-solving skills, a systematic approach to programming tasks, and good interpersonal skills.

Working experience with Tableau Desktop; able to design and deploy rich graphical visualizations with drill-down and drop-down menu options and parameterization.

Experience preparing dashboards using calculations and parameters in Tableau.

Knowledge of various reporting objects in Tableau, such as facts, attributes, hierarchies, transformations, filters, prompts, calculated fields, sets, groups, and parameters.

Understanding of Agile methodology.

Education Qualifications:

Bachelors in Computer Science

Certifications:

Teradata Basics V2R5 certified with a 100% score.

Skill set:

Data warehousing /BI Tools

Informatica Power Center 8.x/9.x, Informatica Developer 9.x/10.x, Tableau 8.x/9.x, knowledge of the Hadoop ecosystem (HDFS, Hive, Sqoop, Impala), OBIEE, Business Objects, ODI.

Languages

C/C++, SQL, PL/SQL, Shell Scripting

CASE Tool and Other tools

PVCS, JIRA, Quality Center, Remedy, Dollar Universe scheduler

Database

Oracle 9i/10g/11g, Teradata, Hadoop

Operating Systems

Windows XP/7, UNIX

Professional Experience

Client: Western Union, San Francisco, USA

Role: Data Engineer

Location: San Francisco, CA

Duration: Feb 2015 to date

Environment: Informatica Power Center 9.x, Hadoop, Hive, Netezza, UNIX, Tableau Desktop and server 8.x/9.x

Project 1: Customer Journey Marketing Campaigns.

Designed and implemented a Hadoop data platform to further streamline click-through and transactional data and enhance user engagement for Western Union. Leveraging big data technology, automated data collection, transformation, and representation for marketing campaigns, making it simpler for business users to derive insights from large volumes of data.

Responsibilities:

Worked with application end clients on requirements gathering.

Worked with business owners to develop key business questions and to build data sets that answer those questions. Handled the offshore team and assigned tasks to team members.

Prepared Design Docs and Project Plans.

Analyzed the data streams/sources to ingest the required data into the data platform to serve business needs.

Assist team in data management concerns when needed.

Perform design, creation, interpretation and management of large data-sets to achieve business goals.

Attended story meetings and ensured the project followed Agile methodology norms.

Designed and Developed complex mappings and workflows using Informatica Designer, Workflow manager for implementation of Data Platform.

Built scripts for encryption and decryption of business files and their FTP transfer to third-party vendor servers.

Worked with transformations like Lookup, Expression, Router, Filter, Update strategy, Stored Procedure etc.

Involved in Evaluation of data-sets for accuracy and quality along with the ETL code Testing.

Designed and built the ROI model demonstrating the performance of the launched campaigns.

Built the Tableau data source model representing the key metrics for campaign performance.

Built data visualization views representing the key metrics for campaign performance in the form of graphs and charts using Tableau Desktop.

Used Tableau Desktop features such as hierarchies to slice and dice metrics by time and other dimensions, quick filters to filter data by dimension, and extracts for better report performance.

Published the Reports and dashboards with scheduled extracts on the Server.

Involved in Unit and UAT testing for the Metrics generated.

Managed a 6-member offshore development team, discussed requirements with them, reviewed the ETL code they developed, and helped them through technical and functional bottlenecks.

Identified and solved issues concerning data management to improve data quality.

Project 2: KPI Scorecard

Designed and developed a BI solution to fulfill the needs of the Analytics team. The solution paints a clear picture of the customer drop-out ratio in their journey with Western Union, helping the business analyze end goals and trends and make key decisions from them.

Responsibilities:

Worked with application clients to gather requirements.

Met with business owners on a daily basis, updating them on project status and understanding future requirements.

Handled an offshore team of four, assigning tasks according to the team's availability.

Attended story meetings and coordinated with other teams and MassMutual vendors.

Responsible for driving the project in agile methodology.

Designed and Developed complex mappings and workflows using Informatica Designer, Workflow manager.

Worked with transformations like Lookup, Aggregator, Expression, Router, Filter, Update strategy, Stored Procedure etc.

Designed and developed ETL components to move data back and forth between Hadoop/Hive tables and the warehouse.

Created Hive queries to build aggregate tables that roll up metrics at the date level.
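A date-level rollup in Hive typically follows the pattern sketched below; the table and column names are illustrative placeholders, not the actual project schema:

```sql
-- Illustrative Hive rollup: aggregate event-level rows into a daily metrics table.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT OVERWRITE TABLE campaign_metrics_daily PARTITION (txn_date)
SELECT
    campaign_id,
    COUNT(*)                    AS txn_count,
    COUNT(DISTINCT customer_id) AS distinct_customers,
    SUM(txn_amount)             AS total_amount,
    TO_DATE(txn_ts)             AS txn_date    -- dynamic partition column goes last
FROM campaign_transactions
GROUP BY campaign_id, TO_DATE(txn_ts);
```

Rolling metrics up once in Hive keeps downstream Tableau extracts small: reports then scan one row per campaign per day instead of the raw event data.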

Created Jobs & Job plans for the Workflows, SQL jobs & UNIX scripts.

Tuned performance of Informatica sessions for large data files by implementing pipeline partitioning and increasing block size, data cache size, sequence buffer length and target based commit interval and resolved bottlenecks.

Optimized the mappings by changing the logic and reduced running time and encouraged the usage of reusable objects.

Exclusively used ETL to load data from flat files into the Oracle database.

Involved in Development and Testing.

Developed Tableau sheets/reports using calculated fields and parameters.

Developed reports to slice and dice business metrics.

Published reports and scheduled extracts on the server.

Performed unit and integration testing to validate reports and worked closely with users to resolve ongoing production issues.

Reviewed offshore developed ETL scripts and provided feedback along with technical and functional help.

Managed a 6-member offshore development team.

Client: Cisco Systems Inc, San Jose, USA

Role: ETL Lead

Location: Pune, India/San Jose, CA

Duration: June-2007 to Jan-2015

Environment: Informatica Power Center 8.x/9.x, Oracle 9i/10g/11g, PL/SQL, OBIEE, Dollar Universe, Kintana, Quality Center, Tableau Desktop 8.x

Project 1: Product Quality Management System

Cisco Systems, a leader in networking for the Internet and a major supplier of Internet/networking equipment and solutions, sells hardware products that inevitably result in returns/failures. The Quality Management System project deals with Cisco's returned products and showcases the key failure metrics for each and every returned product. It enables Cisco's Failure Analysis Operations team to conduct monthly operations reviews to track performance, identify areas for improvement, and prioritize high-priority issues so that product failures are fixed.

Responsibilities:

Worked with the Application Clients for Gathering Requirements.

Handled an offshore team of eight, assigning tasks according to the team's availability.

Attended story meetings and coordinated with other teams and MassMutual vendors.

Designed and Developed complex mappings and workflows using Informatica Designer, Workflow manager.

Worked with transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, SQL Transformation, Union, and Stored Procedure.

Populated the tables with an initial full load (inserts), then maintained them through daily incremental loads using the Update Strategy transformation.

Worked extensively with Slowly Changing Dimensions: Type 1, Type 2, and Type 3.
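For reference, a Type 2 dimension load (history preserved via effective dates) can be sketched in plain SQL as below; the staging/dimension table and column names are hypothetical, and in Informatica the same logic is typically expressed with Lookup and Update Strategy transformations:

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE dim_customer d
SET    d.eff_end_date = CURRENT_DATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND EXISTS (SELECT 1
            FROM   stg_customer s
            WHERE  s.customer_id = d.customer_id
            AND    s.address <> d.address);

-- Step 2: insert a fresh current row for new and changed customers
-- (after step 1, changed customers no longer have a current row).
INSERT INTO dim_customer
       (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT  dim_customer_seq.NEXTVAL, s.customer_id, s.address,
        CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM    stg_customer s
WHERE   NOT EXISTS (SELECT 1
                    FROM   dim_customer d
                    WHERE  d.customer_id = s.customer_id
                    AND    d.current_flag = 'Y');
```

A Type 1 change would instead overwrite the attribute in place, and Type 3 would retain the prior value in a dedicated "previous" column.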

Created jobs and job plans for the workflows, SQL jobs, and UNIX scripts using the Dollar Universe tool.

All the Job plans are scheduled through Dollar Universe tool.

Tuned performance of Informatica sessions for large data files by implementing pipeline partitioning and increasing block size, data cache size, sequence buffer length and target based commit interval and resolved bottlenecks.

Optimized the mappings by changing the logic and reduced running time and encouraged the usage of reusable objects.

Exclusively used ETL to load data from flat files into Teradata and Oracle databases.

Involved in Development, Testing and Production support.

Reviewed ETL Codes developed by offshore team and provided technical and functional help.

Managed a 6-member offshore development team.

Project 2: ETL behavioral system

This project delivers a framework that highlights ETL behavior and performance trends through Tableau reports and charts, with which users can flag bottleneck ETL code and take corrective action. The reports are built entirely on ETL log data and can be sliced and diced along the time dimension.

Responsibilities:

Played key role in requirement gathering, analysis, configuring, designing and implementation for the project.

Prepared High Level and Low Level Design documents with respect to the gathered requirements.

Built data pipelines to extract and parse ETL logs, ingest the data into the target schema, and implement business calculations and defined rules.

Developed Tableau reports representing key metrics using the Tableau Desktop tool.

Involved in unit testing and system integration testing of the developed code.

Packaged the developed components for deployment in the production environment.

Project 3: Change Request Automation Tool

The project provides a framework, the CRA (Change Request Automation) tool, that serves end users by automating change-request impact analysis. The tool reduces the manual effort involved in analyzing change requests with respect to their applications, and automatically notifies users by email when their application is impacted by a change request submitted by the release team.

Responsibilities:

Involved in Gathering the requirement, analyzing the requirement and finalizing them.

Met with stakeholders to explain the pros and cons of their requirements.

Prepared the project execution and implementation plan.

Involved in preparation and review of the project documents.

Drove the project as lead and took responsibility for delivering it without failure.

Ensured that the quality of the project was up to the mark.

Attended user design sessions, studied user requirements, completed detailed design analysis, wrote design specs, and regularly reported updates to management.

Coordinated with offshore and onsite team and worked as a module lead.

Developed ETL job workflow with error/exception handling and rollback framework.

Designed mappings that loaded data from flat-files to the staging tables.

Developed Informatica mappings, sessions and workflows.

Extensively worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor.

Used Informatica variables to implement complex logic for Informatica mappings.

Worked with flat files, Oracle Sources and Targets.

Scheduled the jobs using Dollar Universe.

Designed the Migration Plan document to migrate the Mappings and work-flows to the Testing and Production environments.

Conducted peer-to-peer code review meetings.

Interacted with key users and assisted them with various data issues, understood data needs and assisted them with Data analysis.

Project 4: Defect Monitoring System

The project aimed to provide metrics that globally highlight the quality and performance of Cisco's software products. It tracks defects raised by QE or end customers against software products and showcases the key metrics related to those defects. It essentially represents the entire lifecycle of a defect raised for any kind of software product failure, from the time it is created until it is resolved and the product is released back to the market.

Responsibilities:

Informatica Power Center 8.1.6 was used to extract data from various sources and load into Data Mart.

Used transformations such as source qualifier, aggregator, expression, lookup, router, filter, update strategy, joiner, transaction control and stored procedure.

Worked with pre- and post-session tasks and extracted data from the defect tracking tool into the staging area.

Identified fact and dimension tables.

Tuned sources, targets, mappings and sessions to improve the performance of data load.

Migrated Informatica mappings, sessions, and workflows to production instances.

Created ETL jobs via job scheduler tool.

Developed and tuned Informatica mappings and Mapplets for optimum performance, dependencies and batch design.

Created several Informatica mappings to populate data into dimensions and fact tables.

Worked with team members to identify and resolve various Informatica and database related issues.

Developed PL/SQL scripts.

Involved in preparation and review of project and deployment plans, and in preparation of High Level and Low Level Design documents with respect to the gathered requirements.

Resolved defects raised by users within the defined SLA, troubleshooting mappings, ETL code, and logs as part of the process.

Prepared test cases for change requests after the Oracle team completed CR development.

Managed the offshore team, discussed requirements with them, reviewed the ETL code they developed, and helped them through technical and functional bottlenecks.

Mentored new team members.

Project 5: HW-Quality DM

The project provides a decision support system that helps the business analyze actual manufacturing trends and perform collaborative planning and analysis, with the end goal of tracking results. It also helps the business bring together metrics that reflect the globally coordinated marketing and quality-control activities for Cisco's hardware products.

Responsibilities:

Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the DataMart.

Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.

Developed mappings to extract data from SQL Server, Flat files, RDBMS and load into DataMart using the Power Center.

Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to DataMart.

Developed Slowly Changing Dimension mappings.

Used mapplets in mappings, thereby saving valuable design time and effort.

Used Informatica Workflow Manager to create, execute and monitor sessions, Worklets and workflows.

Involved in writing PL/SQL packages, procedures, and functions.

Scheduled ETL jobs via the Dollar Universe job scheduling tool.

Implemented various Performance Tuning techniques on Sources, Targets, Mappings, Workflows and database tuning.

Created reports using OBIEE for end users.

Prepared checklists, identified test cases, and prepared test plans.

Integrated and deployed in production.

Involved in production support and troubleshooting Data Quality and Enhancement Issues.

Performed unit testing for all development activities and was involved in requirement analysis during the design phase of the project.

Identified tables/Scripts required for full load, daily delta-load and monthly load and documented ETL process for each load.

Project 6: Customer Experience Dashboard

The project delivers metrics that measure the customer's overall end-to-end experience, in areas such as customer relationship/ satisfaction, operational performance, and quality.

It specifically provides

• Consolidated internal snapshots of how Cisco is performing with Service Provider and Enterprise customers,

• Trends to identify corrective actions before they become more serious issues, and

• Comparison of metrics across accounts to help drive more consistent and positive behavior

Responsibilities:

Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.

Imported data from various Sources transformed and loaded into Data Warehouse Targets using Informatica.

Developed Informatica mappings and tuned them when necessary.

Performed unit and system testing of the developed mappings.

Prepared documentation describing program development, logic, coding, and testing.

Involved in production support and maintenance.


