
Data Project

Location:
India
Posted:
August 05, 2016


Resume:

Yashwant Prajapati

Email: *********.********@*****.***, Phone: (M) 91-735*******

Professional Summary

11 years 6 months of experience in solution design, analysis, data modeling, development, testing, implementation, and support of various data warehousing applications.

Comprehensive understanding of the full development lifecycle, with active involvement in all phases.

Successfully implemented data warehousing and ODS projects in batch, real-time, and near-real-time modes.

Applied performance tuning techniques at various stages while handling huge volumes of data.

Informatica PowerCenter 8 certified developer with over 10 years of experience building various kinds of interfaces and integrations for the different layers of a data warehouse.

Knowledge of Informatica Data Transformation for handling Excel and XML formats.

Good knowledge of UNIX shell scripting; also worked with awk and Perl scripting.

Knowledge of Teradata architecture and utilities (BTEQ, FastLoad, MultiLoad).

Experienced in managing teams efficiently and delivering projects on time.

Provide effort estimations at various stages of the development life cycle.

Six Sigma: implemented Lean methodology in projects and improved performance.

Good knowledge of Agile methodology gained while working on Agile projects.

Strong work ethic: taking ownership of all duties and responsibilities and meeting management expectations.

Technical Experience

Data Warehouse ETL Tools - Informatica 9.5.1, 9.1, 8.6.1, 8.5.1, 8.1.1

Programming Languages - UNIX shell scripting, Awk & Perl scripting, SQL

Databases – Oracle 9i, 10g, 11g, Teradata V13, V12, DB2 UDB 8.1

Data Transmission Servers - AS2 server (TCM-3.4), Sterling Commerce (Ordernet)

Scheduler - Maestro Scheduler (Job Scheduling Console 1.4), TWS, Control M

Operating Systems - UNIX (HP-UX B.11.23), Windows XP

Other Tools - Toad 9.5, Aqua Data 4.7, SQL*Plus, PuTTY, Visio, PL/SQL Developer

Professional Experience

Project# 1: DUCO Framework

Client: Deutsche Bank

Duration: Dec 2015 – present

Role: AVP, ETL Designer

Domain: Global Markets

Location: Deutsche Bank of India, Pune

Description:

Building a framework that automates the reconciliation process for the Markets Clearing division. Informatica Data Transformation and PowerCenter are used to perform the ETL that converts files from any format into CSV and applies all types of transformations.

Responsibilities:

Perform file format conversion using Informatica Data Transformation to convert Excel/XML files to CSV format.

Review business requirements, clarify all queries and provide effort estimations.

Perform disaster recovery tests for the DUCO application and publish the test scripts.

Design and develop the interface that validates and transforms source feeds of any format into the desired format.

Perform unit testing of the code. Prepare the ETL design and unit test documents.

Deploy the code into higher environments using Nexus.

Schedule and monitor jobs in Control-M.

Environment: Linux

Tools / DB: Informatica 9.5.1, Oracle 11g, Linux, Control M

Project# 2: MASSPAY DWH and ODS

Client: Barclays Bank

Duration: Jan 2014 – Dec 2015

Role: ETL Designer and Lead

Domain: Corporate Banking

Location: Barclays Technology Centre India, Pune

Description:

Data warehouse

Building a data warehouse that processes SEPA payment transactions for Barclays. The project loads payment data into the DWH through four layers:

The “acquire and publish” layer extracts static and transactional data from the MASSPAY system and publishes it to an Oracle layer, where it can be used by any downstream system or project: data is extracted once and made available to all.

The SOR layer processes this data into the DWH, where sparse history is maintained for every entity.

The MI layer populates the reporting data model to provide data for report generation.

The SDA layer archives the data to meet the bank's regulatory requirements.
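The sparse-history (SCD2) pattern used in the SOR layer can be sketched outside Informatica roughly as follows; the row shape, field names, and keys here are illustrative assumptions, not the project's actual mapping:

```python
from datetime import date

def scd2_merge(dimension, incoming, today=None):
    """Sparse-history SCD2 merge (illustrative sketch): a new version row
    is written only when a tracked attribute actually changes; unchanged
    records write nothing, which is what keeps the history sparse."""
    today = today or date.today()
    # Index the currently-open version of each entity by business key.
    current = {row["key"]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec["key"])
        if old is None:
            # New entity: open its first version.
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif old["attrs"] != rec["attrs"]:
            # Changed entity: close the old version, open a new one.
            old["valid_to"] = today
            old["is_current"] = False
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        # Unchanged entity: sparse history, nothing is written.
    return dimension
```

In the real project this logic lives in Informatica SCD2 mappings; the sketch only shows the open/close-version bookkeeping the layer performs.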

ODS

Building a corporate-banking-neutral operational data store that processes corporate data in real time and provides status updates to the business in near real time (within 15 minutes).

Real-time data processing from source to the staging layer using Informatica Data Replicator (IDR).

Execute micro-batches in run-continuous mode to update the ODS layer in near real time.

Implement report logic in ETL so that the business can query and generate reports at any time from the ODS database.

Perform auditing and reconciliation at different stages to avoid data loss.
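The stage-to-stage reconciliation mentioned above can be illustrated with a minimal record-count comparison across consecutive handoffs; the stage names and counts are hypothetical:

```python
def reconcile(stage_counts):
    """Compare record counts handed off between consecutive ETL stages
    and report any handoff where records were lost (illustrative sketch;
    real reconciliation would also compare sums/checksums per feed)."""
    issues = []
    stages = list(stage_counts.items())
    for (src, src_n), (dst, dst_n) in zip(stages, stages[1:]):
        if dst_n < src_n:
            issues.append(f"{src} -> {dst}: {src_n - dst_n} records lost")
    return issues
```

An empty result means every stage received at least as many records as the previous one, so no data was dropped in flight.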

Responsibilities:

Led the ETL team while holding complete delivery responsibility.

Review FRD and IRD documents and collect requirements; identify data granularity, data volume, scope, dependencies, and criticality of each source system.

Perform ETL estimations for Stage 0 and various CRs across projects.

Design an effective ETL process that meets all NFRs and delivers data for Excel and Cognos reports on time.

Prepare the detailed ETL design document relating business requirements to technical components and describing the complete data flow, assumptions, and risks.

Develop Informatica SCD1 and SCD2 jobs for batch processing of data into the different layers.

Process CLOB and BLOB data; de-bulk BLOB data into individual XML transactions.

ODS – Design and develop ETL jobs in IDR for real-time data processing from source to stage.

ODS – Design and develop ETL jobs that process real-time updates from staging into the ODS layer through micro-batches.

Perform unit testing of the code. Prepare the test documents.

Prepare code deployment sheet, DDL scripts, checklist for acceptance and production migration of the jobs.

Schedule jobs in Tivoli Workload Scheduler (TWS).
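The BLOB de-bulking responsibility above can be sketched as splitting one bulk XML payload into individual single-transaction documents; the element names (`Batch`, `Transaction`) are hypothetical, not the project's actual schema:

```python
import xml.etree.ElementTree as ET

def debulk(blob_xml, txn_tag="Transaction"):
    """Split a bulk XML payload (e.g. text read from a BLOB column) into
    individual XML documents, one per transaction element. Element names
    are illustrative assumptions."""
    root = ET.fromstring(blob_xml)
    # Serialize each transaction element back out as its own document.
    return [ET.tostring(txn, encoding="unicode") for txn in root.iter(txn_tag)]
```

Each returned string can then be processed as an independent transaction by the downstream ETL jobs.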

Environment: Linux

Tools / DB: Informatica 9.1, Oracle 11g, Teradata, Linux, SVN

Project# 3: RAPID

Client: Barclays Bank

Duration: July 2012 – Dec 2013

Role: ETL Designer and Lead

Domain: Corporate Banking

Location: Barclays Technology Centre India, Pune

Description:

A corporate banking project that builds an operational data store (ODS) to report the progress of the loan cycle. It integrates data from the disparate source systems Salesforce, Zeus, Siebel, and BED LDAP, which mark the different stages of a loan cycle. The reports give business users a single view of the corporate loan process, reducing the manual effort spent querying each system for a loan's status.

The reports provide a near-real-time view (refreshed every 15 minutes) of Salesforce and Siebel data. An MQ Series setup delivers the real-time updates from Siebel.

The Informatica LDAP connector has been used to pull hierarchical data directly from the Barclays LDAP directory.

The Salesforce plugin has been used to fetch customer and user data from Salesforce.

Responsibilities:

Review the IRD and NFRs. Collect requirements from the business analyst; identify data granularity, data volume, scope, dependencies, and criticality of each source system.

Design a concrete data model defining the relationships among various attributes.

Design the ETL process to load the database effectively.

Prepare the ETL detailed design document and mapping document describing the complete flow of data, assumptions and risks.

Perform effort estimation for ETL build & unit testing at various stages of development cycle.

Create Informatica jobs for batch and real-time processing of data.

Apply business logic using SCD1.

Perform unit testing of the code. Prepare the test documents.

Prepare code deployment sheet, DDL scripts, checklist for acceptance and production migration of the jobs.

Schedule jobs in Tivoli Workload Scheduler (TWS).

Environment: Linux

Tools / DB: Informatica 9.1, 8.6.1, Oracle 11g, MQ series, LDAP connector, Salesforce plugin

Project# 4: KPN BI Innovations

Client: KPN, The Netherlands

Duration: Oct 2010 – June 2012

Role: ETL Designer & Sr. Developer, Tech lead

Domain: Communications

Location: KPN Netherlands. Cognizant Technology Solutions, Pune

Description:

A re-engineering project that aims to retire the old data warehouse and build a new Teradata CLDM data warehouse to empower business users in analysis and decision making. The project follows Agile methodology. Source systems provide data in flat-file and relational (Oracle) formats, which is loaded into the CLDM database using SCD1 and SCD2 through pushdown optimization. This data is aggregated in the semantic layer to build the OBIEE reports. On top of that, the presentation layer creates application-specific data marts.

Responsibilities:

Analyse business requirements, perform effort estimation, and build the ETL logic.

Apply performance tuning techniques at different stages while handling billions of records.

Prepare Informatica jobs for the ETL steps that load data into the different layers of the data warehouse (CLDM, SL layer, and data marts).

Apply business logic (insert, SCD1, SCD2, etc.) using pushdown optimization, FastLoad, update strategy, etc.

Perform unit testing of the code. Prepare the test documents.

Prepare code deployment sheet, DDL scripts, checklist for acceptance and production migration of the jobs.

Prepare job dependency sheet for UC4 scheduling team.

Environment: Linux, UNIX bash shell scripting

Tools / DB: Informatica 8.6.1, Teradata V13, Oracle 9i, SVN

Project# 5: GE Commercial Finance Teradata Offshore

Client: GE

Duration: Nov 2009 – Oct 2010

Role: ETL Designer & Sr. Developer, Tech lead leading a team of 5 people

Domain: Finance

Location: Tata Consultancy Services Ltd., Mumbai (India)

Description:

A pure development project that uses Informatica, Teradata utilities, and UNIX to perform various types of tasks (insert, upsert, change data capture, logical delete, etc.).

Responsibilities:

Understand business requirements, perform effort estimation and build the logic based on the ETL specs.

Prepare hybrid jobs that improve performance by combining mappings with Teradata utilities (BTEQ, FastLoad, MultiLoad, etc.).

Perform data validations for these jobs.

Test the code and prepare checklist for integration and production migration of the jobs.

Environment: UNIX bash shell scripting

Tools / DB: Informatica 8.6.1, Teradata V12, Oracle 9i, StarTeam, PL/SQL Developer

Project# 6: Walgreens Billing Payment Offshore

Client: Walgreens, Deerfield, IL

Duration: July 2009 – Oct 2009

Role: ETL Designer

Domain: Retail

Location: Tata Consultancy Services Ltd., Noida (India)

Description:

A re-engineering project that rebuilds Walgreens' billing and payment process using the ETL tool DataStage and UNIX.

Currently, accounting users at Walgreens perform most billing and payment functions manually. Implementing this project will reduce that manual dependency and speed up cycle processing.

This project also introduces prompt-pay logic into the Walgreens B&P system, after which prompt-pay pharmacies will receive their vouchers much sooner.

Responsibilities:

Understanding business and technical requirements provided by the client.

Preparing high-level and low-level design documents.

Coding the ETL logic in DataStage.

Environment: Unix Korn Shell scripting

Tools/DB: DataStage 8.0.1, CVS, Oracle 9i

Project# 7: Paramount Data Services EAI

Client: Paramount Pictures Corporation, USA

Duration: July 2006- June 2009

Role: ETL developer, Module leader leading a team of 6 people

Domain: Media & Entertainment

Location: Tata Consultancy Services Ltd, Gandhinagar, Gujarat, India

Description:

A supply chain management project that uses EDI standards to exchange data with various trading partners (TPs).

The Paramount EAI team receives data from various TPs in EDI X12, EDIFACT, XML, and text formats. It delivers the data to TPs and to various internal clients such as SAP, VMI, FIS, PeopleSoft, Studio, Hyperion, and the data warehouse.

To deliver data to these systems, EAI transforms it into each target's required format: for SAP, data is transformed into IDOC format; for VMI, into text format; and so on.

EAI is responsible for successful posting of data from the given source to the required target.

It uses Informatica, UNIX, Oracle, Constellar Hub, and Ascential DataStage for data transformation.

Responsibilities:

Handled development, enhancement, and production support tasks.

Development / Enhancement –

1. Troubleshoot job abends, propose permanent resolutions, implement new logic using scripting and the ETL tool (Informatica), perform testing, and migrate the fix into production with proper documentation after client approval.

2. Create new interfaces for new business requirements.

3. Re-engineer interfaces from legacy ETL tools (DataStage, Constellar Hub) into Informatica and UNIX.

4. Write logic to automate the reconciliation process.

5. Modify existing interfaces based on business requirements.

Support –

1. Responsible for the successful completion of business interfaces and for ensuring correct, on-time data transmission from the given source to the right target using EDI standards.

2. Data transmission uses EDI logic to extract, transform, and load data to the target. Data comes in X12, EDIFACT, XML, and flat-file formats.

3. Resolve issues relating to any job failure and communicate with the respective trading partner to confirm successful posting of data.

4. Handle outages independently during night shifts: attending meetings and group chats with clients and taking appropriate action during the outage.

Environment: UNIX Shell scripting, AWK, Perl scripting, SQL, Oracle 9i

Tools: Informatica 8.6.1, Informatica 8.5.1, Informatica 8.1.1, IBM Maestro Scheduler 1.4, Toad 9.5, Aqua data 4.7, SQL*Plus, Putty, Visio

Project# 8: Prudential GIDW

Client: Prudential, USA

Duration: April 2005 – Jun 2006

Role: SQL Developer

Domain: Insurance

Location: Tata Consultancy Services Ltd, Chennai, India

Description:

A data warehousing application, interlinking different group insurance systems within Prudential.

It is a group insurance data warehousing (GIDW) project. Data arrives in flat files, which are processed into the data warehouse through a standard ETL process.

Data is cleansed in the staging area, and the validated data is loaded into the warehouse tables. DB2 stores the data at all stages.

The GIDW team generates data warehouse reports, receiving business requirements for new report logic from Prudential clients.

The SQL programs generate the monthly snapshot reports that business users need on priority to make decisions that improve the business cycle.

Responsibilities

Analyzing Business and technical requirements.

Discussing doubts and any newly added processes with the team and clients.

Coding the programs in DB2 SQL to generate the monthly snapshot reports for the users.

Preparing the UTP and performing testing of the DB2 code.

Migrating programs into production after proper reviews from both onsite and offshore teams.

Bug fixing: working on business tickets involving data issues.

Environment: UNIX Shell scripting, DB2 UDB 8.1

Certifications

Informatica (R - PowerCenter 8 Architecture and Administration) – Feb 2010

Informatica (U - PowerCenter 8 Advanced Mapping Design) – March 2009

Informatica (S - PowerCenter 8 Mapping Design) – December 2008

IBM DB2 UDB V8.1 700 Exam – October 2006

Qualifications

BE (Electrical) – M.B.M. Engg. College, J.N.V. University, Jodhpur (Raj.) – 2004 – 66%

Senior Secondary Examination – Board of Secondary Education, Rajasthan, Ajmer – 1999 – 80%

Secondary Examination – Board of Secondary Education, Rajasthan, Ajmer – 1997 – 86%

Career Profile


28-Dec-2015 – present: Deutsche Bank

2-Jul-2012 – 24-Dec-2015: Barclays Bank

25-Oct-2010 – 29-Jun-2012: Cognizant Technology Solutions Pvt. Ltd.

17-Feb-2005 – 22-Oct-2010: TATA Consultancy Services Ltd.

Personal Details

Date of Birth: 21-05-1981

Nationality: INDIAN

Sex: Male

Marital Status: Married

Permanent Address: 42, Kumharwara, Nayakwari, Udaipur (Raj.) - 313001


