Data Customer Service

Location: Dubai, DU, United Arab Emirates
Salary: 15000
Posted: October 24, 2013


Resume:

Priyanka

Mobile No: +971-*********

Alternate No: +91-529****** Email: acafk7@r.postjobfree.com

Residence: Dubai, UAE.

Visa Status: Spouse Visa

Data Warehousing Developer/Tester

Banking ~ Capital Markets ~ ETL ~ Informatica ~ Oracle ~ 3.8+ Years’ Experience

Performance-driven and committed data warehouse professional with 3.8+ years of experience in data loading and software quality and testing for risk calculation and risk analysis. Effective use of Informatica 8.x and Oracle to derive data from legacy systems and to manage the cleansing, processing and loading of information into data marts and the data warehouse. Critical analysis of trade data.

In-depth knowledge of data mining and data warehousing concepts. Worked closely with clients to understand their requirements, analyse the business data and test complex designs implemented in Informatica.

Leverage Informatica PowerCenter 8.x for architecting, developing and testing mappings and mapplets. Formulate workflows for complex requirements. Contribute extensively during production releases. Effective use of Oracle to create databases for loading trade data. Worked on RAC architecture.

Utilize UNIX shell scripting to manage workflows, archive source files and send mails during contingencies. Complete knowledge of SIT/QA/UAT testing environments.

Proven strengths in handling projects from initiation to planning to final execution. Demonstrate leadership in overcoming challenges during project implementation. Highly committed to quality and customer service. Continuous learner keeping track of new and cutting-edge concepts and technologies. Dependable team player with excellent relationship-building skills. Logical and analytical bent of mind with the ability to quickly grasp and debug issues.

AREAS OF PROFICIENCY

Oracle ~ SQL ~ PL/SQL ~ SQL Server ~ Unix ~ Autosys ~ Informatica ~ Informatica Performance Tuning ~ Informatica B2B Data Transformation ~ ETL Testing ~ Defect Tracking Tools (QC, JIRA) ~ All SDLC Stages ~ Data Warehousing Concepts ~ Data Mining ~ Requirement Understanding ~ Business Analysis ~ Functioning of Capital Markets ~ Effective Client Interaction ~ Positive Client Experience ~ Communication Skills ~ Meeting Stringent Delivery Deadlines ~ Defect-Free Execution ~ Personnel Management ~ Analytical/Problem-Solving Skills

TECHNICAL PROFICIENCY

Data Warehousing: Informatica, Oracle, Unix Shell Scripting, SQL and Autosys.

Workflow Manager Tools: Task Developer, Workflow Designer and Worklet Designer.

Concepts: Data Warehousing and Data Mining.

Tools: Ab Initio, Cognos, Perforce Versioning and Autosys.

QA Tools: QC, JIRA

Languages: C, C++ and Java.

CERTIFICATIONS

Certified Informatica Level 1 Mapping Developer.

ACADEMIC CREDENTIALS

B.E., ECE, Sri Venkateswara College of Engineering.

PROFESSIONAL WORK EXPERIENCE

Associate Consultant, Capgemini, March 2010 to Present

PROJECTS HANDLED

Project 1: RFDW (Risk and Finance Data Warehouse)

Client: Barclays Bank

Employer: Capgemini India Pvt. Ltd.

Environment: Informatica 8.6.2, Oracle, Unix, Autosys, Perforce

Duration: June 2010 to September 2012 & March 2013 to October 2013

Role: Senior ETL Analyst/Developer/Tester

Barclays Capital, the investment banking division of Barclays Bank PLC, has developed a unique business model for delivering an entire gamut of solutions catering to the strategic advisory, financing and risk management requirements of its clients (conglomerates, government agencies and institutions).

Description

RFDW presents a unified repository of front office data to the downstream systems with regard to the risk and finance business functions. RFDW is configured within the strategic architecture program to facilitate the provision of EOD data for the firm's risk and finance teams. Risk and Profit and Loss (P&L) information is accumulated from upstream front office systems to ensure that common Risk and P&L information is rendered to downstream risk and finance systems.

The objective of the project is to leverage upstream front office systems to ensure the availability of data files on the RFDW server for the next trading day. Data files are loaded into the LHS main tables based on the business logic elucidated in the mapping document. Then, using the unstructured data transformation, the RHS data is derived to formulate multiple itypes and ievents. The data is validated against the LHS tables and entered into snap tables, after which the validated ievent notifications are delivered to the message queue. The EVM (Event Manager) receives the messages and provides downstream applications access to them.
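A minimal Oracle SQL sketch of the validate-and-snap step described above; the table and column names (rhs_ievents, lhs_trades, ievent_snap, trade_id) are hypothetical placeholders rather than the project's actual schema.

  -- Hypothetical illustration: validate RHS-derived ievents against the LHS
  -- main table and record only the matching rows in a snap table.
  INSERT INTO ievent_snap (ievent_id, trade_id, event_type, snap_date)
  SELECT r.ievent_id, r.trade_id, r.event_type, TRUNC(SYSDATE)
  FROM   rhs_ievents r
  WHERE  EXISTS (SELECT 1
                 FROM   lhs_trades l
                 WHERE  l.trade_id = r.trade_id);
  -- Only the validated ievents would then be published to the message queue for EVM.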

Key Responsibilities

Gain a clear understanding of the requirements from the client.

Analyse the impact of each requirement.

Created logic to calculate RISK for Explain, Valuations and Positions on a monthly basis.

Effectively worked on creating databases for the given systems using the RAC model.

Implemented partitions and hints as part of performance tuning (see the SQL sketch after this list).

Created procedures and functions for snapping records and generating unique IDs (also illustrated in the sketch after this list).

Build effective logic to implement the given requirement and ensure the logic has minimal impact on the business.

Assume responsibility for managing development work related to upgraded requirements and ad hoc defect fixes (CRs).

Use various Informatica transformations such as the unstructured data, union and stored procedure transformations.

Used the middleware messaging service (JMS).

Oversee effective usage of mapplets in mappings. Develop sessions and workflows for batch and concurrent processing.

Generate instruments using the Bloomberg ticker for the RNN system.

Deliver high-quality, reliable software as per the client's requirements. Develop generic code that can be used by multiple users without alteration.

Handle unit testing, smoke testing and performance testing. Ensure adherence to best-practice methodology during project implementation.

Prepare test plans and analyse the testing requirements.

Prepared test scenarios by creating mock data based on the different test cases.

Ensure each test case passes.

Act as a key point of contact for interfacing with clients and for resolving contingencies.

Assume responsibility for gaining expertise in an automation tool, recommending methods to improve its efficiency and ensuring the client and developers use it for testing, resulting in considerable cost savings for the client.

Attend daily client and PROD calls to update the status and ensure the daily PROD run is successful.
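A minimal Oracle SQL/PL-SQL sketch of the partitioning, hint and record-snapping techniques referenced in the list above; the object names (risk_positions, snap_risk, risk_seq) and the partition scheme are hypothetical and serve only to illustrate the approach.

  -- Hypothetical range-partitioned table for monthly RISK data.
  CREATE TABLE risk_positions (
    position_id NUMBER,
    cob_date    DATE,
    risk_value  NUMBER
  )
  PARTITION BY RANGE (cob_date) (
    PARTITION p_2013_09 VALUES LESS THAN (DATE '2013-10-01'),
    PARTITION p_2013_10 VALUES LESS THAN (DATE '2013-11-01')
  );

  -- Optimizer hint steering a full scan of the pruned partition.
  SELECT /*+ FULL(rp) */ SUM(rp.risk_value)
  FROM   risk_positions rp
  WHERE  rp.cob_date >= DATE '2013-10-01';

  -- Sequence and procedure that snap records with a generated unique id.
  CREATE SEQUENCE risk_seq START WITH 1 INCREMENT BY 1;

  CREATE OR REPLACE PROCEDURE snap_risk_positions (p_cob_date IN DATE) AS
  BEGIN
    INSERT INTO snap_risk (snap_id, position_id, cob_date, risk_value)
    SELECT risk_seq.NEXTVAL, position_id, cob_date, risk_value
    FROM   risk_positions
    WHERE  cob_date = p_cob_date;
    COMMIT;
  END snap_risk_positions;
  /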

Project 2: DFS (Data Feed Service)

Client: Barclays Capital

Employer: Capgemini India Pvt. Ltd.

Environment: Informatica 8.6.2, SQL Server, Unix

Duration: October 2012 to February 2013

Role: Offshore Technical Lead

Description

DFS is an enhancement of RFDW that notifies RISK information directly to the downstream Valkyrie system, bypassing EVM. The objective of the project is to use a SQL Server database to load RISK data directly into the Valkyrie machine and send a notification to the Valkyrie web service. This project also focuses on performance tuning of the RISK data load.

Key Responsibilities

Manage offshore resources, allocate tasks to resources and maintain the task tracker.

Gain a clear understanding of the requirements from the client.

Analyse the impact of each requirement.

Created logic to calculate RISK for Explain, Valuations and Positions on a monthly basis.

Effectively worked on creating databases for the given systems using the RAC model.

Implemented partitions and hints as part of performance tuning.

Created procedures and functions for snapping records and generating unique IDs.

Build effective logic to implement the given requirement and ensure the logic has minimal impact on the business.

Analyse, design and develop an ETL model to efficiently snap the RISK data into the SQL Server database.

Effectively worked on performance tuning.

Handle unit testing, smoke testing and performance testing. Ensure adherence to best-practice methodology during project implementation.

Prepare test plans and analyse the testing requirements.

Prepared test scenarios by creating mock data based on the different test cases (see the sketch after this list).

Ensure each test case passes.

Act as a key point of contact for interfacing with clients and for resolving contingencies.

Assume responsibility for gaining expertise in an automation tool, recommending methods to improve its efficiency and ensuring the client and developers use it for testing, resulting in considerable cost savings for the client.

Calculate RISK for Explain, Valuations and Positions on a monthly basis.

Attend daily client and PROD calls to update the status and ensure the daily PROD run is successful.
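A minimal SQL sketch of the mock-data approach used for the test scenarios above, written against the SQL Server environment named for this project; the staging table (risk_positions_stg) and the sample values are hypothetical, not actual test data.

  -- Hypothetical mock rows covering distinct test cases: a normal position,
  -- a zero-value position and a missing-counterparty case.
  INSERT INTO risk_positions_stg (trade_id, counterparty, cob_date, risk_value)
  VALUES (1001, 'CPTY_A', '2013-10-01', 2500.75);

  INSERT INTO risk_positions_stg (trade_id, counterparty, cob_date, risk_value)
  VALUES (1002, 'CPTY_B', '2013-10-01', 0);

  INSERT INTO risk_positions_stg (trade_id, counterparty, cob_date, risk_value)
  VALUES (1003, NULL, '2013-10-01', 980.10);

  -- Expected-result check for the missing-counterparty test case.
  SELECT COUNT(*) FROM risk_positions_stg WHERE counterparty IS NULL;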

TRAINING PROGRAMS ATTENDED

Trained on Unix, SQL, PL/SQL, Data Warehousing, Data Mining, Informatica, Ab Initio and Cognos as part of the Capgemini L&C Program.

Trained on C, C++ and Java at NIIT.

Trained at Pentasoft Technologies.

PERSONAL DETAILS

Date of Birth: 26-11-1987.

Nationality: Indian

Residency: Dubai

Visa: Spouse Visa

Languages Spoken: English, Tamil, Telugu and Hindi.


