
Customer Data

Location: Visakhapatnam, AP, India
Posted: September 23, 2014

Resume:

Preeti Roul

P ***/*, DIAT (D.U.), Girinagar, Near Khadakwasla Dam, Pune - 411025
Mobile: 888-***-****
Landline: 020-********
Email: ******.****@*****.***

Career Objective:

To pursue a growth-oriented career with a progressive company that provides scope to apply my knowledge and skills, helping me contribute my best to the organization.

Personal Attributes:

Good interpersonal, communication, and problem-solving skills; quick to grasp and willing to learn new concepts; able to organize and work in a team.

Educational Qualification:

Masters in Computer Application (MCA), with Distinction, from Pune University, India - 2004.
Bachelors in Computer Science (BCS), with 1st Class, from Pune University, India - 2001.

Experience (around 6 years)

Previous Employer: Nomura (formerly Lehman Brothers Services Ltd), Mumbai, India
Designation: Developer
Period: May 2008 to July 2010

Previous Employer: Tata Consultancy Services Ltd, Mumbai, India
Designation: Asst. System Engineer
Period: September 2007 to April 2008

Previous Employer: IBM Global Service India Pvt. Ltd, Pune, India
Designation: Software Engineer
Period: August 2004 to August 2007

Areas of Proficiency - Skill Set:

Operating Systems: Windows 2000, Windows 95/98, UNIX
Programming Languages: UNIX (AIX) shell scripting; Core Java, JSP, Servlets, Visual Basic; .NET exposure
Databases: Teradata; Sybase (exposure)
Tools: Teradata load utilities (BTEQ, FastLoad, MultiLoad, FastExport); Actuate; exposure to Informatica, Jasper iReport, and BO

Certifications:

1. Certified Teradata Professional V2R5
2. Teradata Certified SQL Specialist V2R5
3. Six Sigma White Belt

Paper Presented:

Participated in the Telecom Sharenet held at IBM, Pune, in November 2006 and presented a paper on "Telecom Industry Standards - SID".

Projects: Prime Services Reporting
Duration: Oct 2008 - July 2010
Environment: Sybase, Unix, Java
Client: Nomura

I worked on Sybase development along with basic Unix commands and shell scripting, and I have a basic knowledge of Java concepts. I worked as a Developer in the Prime Services reporting team, which included gathering requirements, analysing the impact on existing reports, and new development in reports and Sybase.

My responsibilities included:

1. Development of projects involving databases (Sybase 12.5, data warehouses, etc.) and reports (Actuate ERD Pro 9, Unix)
2. Detailed analysis of business requirements and their implementation, along with impact analysis of existing functionality

Projects: Asia DB Merge
Duration: May 2008 - Sept 2008
Environment: Sybase, Unix, Java
Client: Lehman Brothers - Prime Brokerage

The team is responsible for maintaining the data warehouse for the reporting suite of the Prime Brokerage division of the bank.

Overview of overall requirement:

Provide globally consistent Prime Brokerage reporting by offering the same set of tools, reports, accuracy, and availability of reports, regardless of the client/account's region. The requirements of this program are split across four phases.

Phase I: Sync up the Asia environment with the production software of the US/EU environment. Modify software (loader, reports, scripts, etc.) to maintain current reporting functionality (this may require some Asia-specific changes in the main cycle as well).

Phase II: Consolidate US/EU and Asia reports into a single set of global reports. Develop data feeds and modify reports to close regional data and reporting gaps. Improve SOD mark-to-market prices and FX rates for reporting.

Phase III: Support local time-zone-specific SLAs for delivering reports. Put relevant reconciliations in place to check the accuracy of reports.

Phase IV: Develop or enhance Prime Services applications outside of reporting to support global clients and client services.

Responsibilities:

- Analysis
- Coding
- Unit testing, System Testing & UAT
- Migration & Implementation

Projects: EDW ONE PMS - HFS BI
Duration: November 2007 - April 2008
Environment: UNIX, Teradata Loading Utilities, Teradata SQL, Informatica, BO
Client: Group Client - GE-CF

Migration of the HFS portfolio (with the exception of the Life Sciences portfolio) from CEF to VFS on the legacy side mandates sourcing a number of EDW Core attributes from the VFS staging area and adding/reorganizing existing EDW tables/fields for VFS, to make sure HFS reporting continues to work post-migration.

This includes:

- Sourcing existing attributes in EDW Core tables from TBID tables (these are currently populated as nulls)
- Adding new attributes to existing EDW Core tables and sourcing them from TBID
- Adding new tables to EDW Core and sourcing their attributes from TBID
- Planning and executing an X-ref update strategy to make sure the currently available history data for HFS is reported correctly post-conversion (see the sketch after this list)
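
As an illustration of the first item, the sourcing pattern reduces to updating the null EDW Core columns from the staging side. A minimal Teradata SQL sketch follows; EDW_CORE_ACCT, TBID_ACCT_STG, and all column names are hypothetical stand-ins, since the actual TBID and EDW Core layouts are not reproduced here:

    /* Hedged sketch: source a currently-null EDW Core attribute from a
       TBID staging table. Table and column names are illustrative
       placeholders, not the actual project objects. */
    UPDATE core
    FROM EDW_CORE_ACCT core, TBID_ACCT_STG stg
    SET VFS_PORTFOLIO_CD = stg.PORTFOLIO_CD
    WHERE core.ACCT_ID = stg.ACCT_ID
      AND core.VFS_PORTFOLIO_CD IS NULL;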

Responsibilities:

- Analysis and Design
- All mapping validation including tables and column mapping
- Coding
- Unit testing, System Testing & UAT
- Migration & Implementation

Projects: Embarq Finance Information System
Duration: April 2006 - April 2007
Environment: UNIX, Teradata Loading Utilities, Teradata SQL
Client: Embarq, USA

The Embarq data warehouse deals with customer and revenue information; the information concerns LTD, for which customer and revenue data exists. The wireless data is taken from Visage. The environment comprises Mainframe and Unix boxes, and the data from both is stored in the Teradata database.

The customer information from the existing systems is stored in the form of delta files, which need to be maintained as part of the warehouse. The revenue information is rolled up to the corresponding subsystems for which the customer is a subscriber, along with the taxes or discounts allocated on the billing amount, calculated for each bill cycle year and month. The customer data is processed by the CM Load process; there is a separate flow to process revenue data. The Embarq database is around 8 TB.
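
The rollup described above is essentially an aggregation per subsystem and bill cycle year/month. A minimal Teradata SQL sketch (REVENUE_DTL and its columns are illustrative placeholders, not the actual warehouse tables):

    /* Hedged sketch: roll billed, tax, and discount amounts up to the
       subsystem / bill-cycle-year-month level. All names are assumed. */
    SELECT SUBSYS_CD,
           BILL_CYC_YR_MO,
           SUM(BILL_AMT) AS TOT_BILL_AMT,
           SUM(TAX_AMT)  AS TOT_TAX_AMT,
           SUM(DISC_AMT) AS TOT_DISC_AMT
    FROM   REVENUE_DTL
    GROUP BY SUBSYS_CD, BILL_CYC_YR_MO;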

Populate the Source Bill Customer ID on TFR9 D6255.026/PR000658
Duration: Feb 2007 - April 2007
Environment: UNIX, Teradata Loading Utilities, Teradata SQL
Client: Embarq, USA

Description:

Currently the SRC_BILL_CUST_ID in the TFR9_LOCAL_SRC_INVC_CHG table is not populated for non-wireless customers. The column should be populated with the 13-digit CRB account number; this change accurately reflects the source customer identifier, making it easier for business users to report on revenue by customer. The approach is based on populating the SRC_BILL_CUST_IDs in TFR9_LOCAL_SRC_INVC_CHG where OS_BILL_SYS_CD is 'LT' and the first four positions of SRC_FILE_CD are not 'WRLS'. Per my analysis and discussions with business clients, we take the value of SRC_BILL_CUST_ID from the corresponding staging tables. We also sweep the TFRJ and TFR9 tables to update all 'LT' records where the first four positions of SRC_FILE_CD are not 'WRLS'.
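
In Teradata SQL, the sweep reduces to an update with the two predicates named above. A hedged sketch: the TFR9 table and the OS_BILL_SYS_CD / SRC_FILE_CD conditions come from the description, while STG_CRB_ACCT, its columns, and the OS_CUST_ID join are assumed stand-ins for the actual staging tables:

    /* Hedged sketch: populate SRC_BILL_CUST_ID with the 13-digit CRB
       account number for non-wireless 'LT' records. STG_CRB_ACCT and
       the join key are hypothetical. */
    UPDATE tfr9
    FROM TFR9_LOCAL_SRC_INVC_CHG tfr9, STG_CRB_ACCT stg
    SET SRC_BILL_CUST_ID = stg.CRB_ACCT_NBR
    WHERE tfr9.OS_BILL_SYS_CD = 'LT'
      AND SUBSTR(tfr9.SRC_FILE_CD, 1, 4) <> 'WRLS'
      AND tfr9.OS_CUST_ID = stg.OS_CUST_ID;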

Responsibilities:

- Analysis and Design
- All mapping validation including tables and column mapping
- Coding
- Unit testing, System Testing & UAT
- Migration & Implementation

Populate Source Invoice Date on TFR9 D6255.025/PR000665
Environment: UNIX, Teradata Loading Utilities, Teradata SQL
Client: Embarq, USA

Description:

SRC_INV_DT in TFR9_LOCAL_SRC_INVC_CHG is not populated for any CRB accounts except wireless accounts; it needs to be populated for CRB records on the revenue fact table. When loading TFR9 and all related revenue fact and aggregate tables, SRC_INV_DT should be populated with the TFHQ_BILL_DT field from the TFHQ_CRB_CUSTOMER table for ALL CRB accounts, taken from the corresponding customer record (based on an OS_CUST_ID join). This population should be done at the stage where the TFRJ table itself is populated, so that no records are missed at a later stage; this ensures the fact table is correctly populated and nothing is overlooked.
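
A rough Teradata SQL equivalent of that population step, as a sketch only; the table, column, and join names follow the description above, but the surrounding load logic is simplified away:

    /* Hedged sketch: populate SRC_INV_DT from the CRB customer record
       for all CRB accounts, joining on OS_CUST_ID as described above. */
    UPDATE tfr9
    FROM TFR9_LOCAL_SRC_INVC_CHG tfr9, TFHQ_CRB_CUSTOMER cust
    SET SRC_INV_DT = cust.TFHQ_BILL_DT
    WHERE tfr9.OS_CUST_ID = cust.OS_CUST_ID;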

Responsibilities:

- Analysis and Design
- All mapping validation including tables and column mapping
- Coding
- Unit testing, System Testing & UAT
- Migration & Implementation

Populate the SRC_LTD_BUS_UNIT_CD on TFR9 for CASS Records D6255.013
Environment: UNIX, Teradata Loading Utilities, Teradata SQL
Client: Embarq, USA

Description:

SRC_LTD_BUS_UNIT_CD on TFR9 is not populated for CASS records but is populated for all other Embarq source systems. The business unit code (company number) needs to be populated for CASS records on the revenue fact and aggregate tables; impacted applications include B5L and L7L. The approach is based on populating the SRC_LTD_BUS_UNIT_CDs with the last two characters of TF75_CUST.SRC_LTD_BUS_UNIT_CD from the corresponding customer record (based on an OS_CUST_ID join).
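
The last-two-characters rule maps onto a SUBSTR over the customer-record value. A hedged sketch: the TF75_CUST source and the OS_CUST_ID join come from the description, while the full TFR9 table name is assumed to be TFR9_LOCAL_SRC_INVC_CHG as in the earlier items, and the SRC_SYS_CD = 'CASS' predicate is a placeholder for however CASS records are actually flagged:

    /* Hedged sketch: populate SRC_LTD_BUS_UNIT_CD for CASS records with
       the last two characters of the TF75_CUST value. The 'CASS' filter
       column is hypothetical. */
    UPDATE tfr9
    FROM TFR9_LOCAL_SRC_INVC_CHG tfr9, TF75_CUST cust
    SET SRC_LTD_BUS_UNIT_CD =
        SUBSTR(cust.SRC_LTD_BUS_UNIT_CD,
               CHARACTER_LENGTH(cust.SRC_LTD_BUS_UNIT_CD) - 1, 2)
    WHERE tfr9.OS_CUST_ID = cust.OS_CUST_ID
      AND tfr9.SRC_SYS_CD = 'CASS';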

Responsibilities:

- Analysis and Design
- All mapping validation including tables and column mapping
- Coding
- Unit testing, System Testing & UAT
- Migration & Implementation

Create Archive Jobs for EQ Customer and Revenue Data Warehouse Tables D6255.020
Environment: UNIX, Teradata Loading Utilities, Teradata SQL
Client: Embarq, USA

Description:

It was discovered that several tables in the EQ Customer and Revenue Data Warehouse were not being archived, resulting in a lack of backups of critical files. The purpose of this project is to create jobs that archive the tables not currently being archived.

The approach is based on identifying the type of archive for each table: individual table archive, mass archive, database archive, or partition archive. Tables excluded from the database archive must be individually or mass archived; whether a table goes into the mass archive depends on its size and loading schedule, and PPI tables that are loaded for specific partitions only can be partition archived. Existing midrange code has to be changed to populate the TF59_TBL_ARC table, which is used by the partition archive job. We also have to run a PSR to populate all the history bill year-months in the TF59 table so that the mainframe job can read these values and archive the table. We did this for each table in the revenue flow, as well as for the tables present in job mb5lm526.
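
The PSR mentioned above essentially seeds TF59 with every historical bill year-month for a table, so the mainframe partition-archive job can pick the values up. A minimal sketch; the TF59_TBL_ARC columns and the source bill-cycle column are assumed, since the real DDL is not shown here:

    /* Hedged sketch: record all history bill year-months for one revenue
       table in TF59_TBL_ARC. Column names are assumed, not actual DDL. */
    INSERT INTO TF59_TBL_ARC (TBL_NM, BILL_CYC_YR_MO)
    SELECT 'TFR9_LOCAL_SRC_INVC_CHG', BILL_CYC_YR_MO
    FROM TFR9_LOCAL_SRC_INVC_CHG
    GROUP BY BILL_CYC_YR_MO;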

Responsibilities:

- Analysis
- Midrange Part Coding
- Unit testing, System Testing & UAT
- Migration & Implementation

Embarq Wireless Initiative Revenue D6093.55
Environment: UNIX, Teradata Loading Utilities, Teradata SQL
Client: Embarq, USA

Description:

This supports the NLC wireless initiative. We need to bring in the wireless revenue feed from CRB (LT) to provide reporting functionality to the business clients. B5L shall load the CRB wireless revenue and usage feed into staging and consolidated model tables to provide the business community with the data they need for their reporting.

Five new source/staging tables are introduced; the five files from different sources/mainframe are loaded into these source tables. We remove the duplicates from these tables, populate the cust_id from the TFS9_OS_ID_XREF table, and put all the data from these tables into a single table using a multi-statement insert, applying data conversions as necessary. The mb5ld201 job uses the staging tables to populate TFRJ_UNAPRVD_INVC_CHG, and from there revenue is copied into TFR9_LOCAL_SRC_INVC_CHG after being approved. The client also changed some of the existing business rules with this SR.
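
The dedup-and-consolidate step maps naturally onto Teradata's QUALIFY clause plus a multi-statement insert. A hedged sketch: the staging table names, key columns, and ordering column are illustrative placeholders. In BTEQ, beginning the next line with a semicolon chains the INSERTs into one multi-statement request:

    /* Hedged sketch: keep one row per key from each staging table while
       consolidating into a single table. All names are placeholders. */
    INSERT INTO STG_WRLS_ALL
    SELECT * FROM STG_WRLS_1
    QUALIFY ROW_NUMBER() OVER (PARTITION BY OS_CUST_ID, INVC_NBR
                               ORDER BY LOAD_TS DESC) = 1
    ;INSERT INTO STG_WRLS_ALL
    SELECT * FROM STG_WRLS_2
    QUALIFY ROW_NUMBER() OVER (PARTITION BY OS_CUST_ID, INVC_NBR
                               ORDER BY LOAD_TS DESC) = 1;
    /* ...repeated the same way for the remaining three staging tables. */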

Responsibilities:

- Analysis
- All mapping validation including tables and column mapping
- Midrange Part Coding
- Unit testing, System Testing & UAT
- Migration & Implementation

Sprint Financial Information System
Duration: December 2004 - April 2006
Environment: UNIX, Teradata Loading Utilities, Teradata SQL
Client: Sprint Telecommunications, USA

Description:

FIS is the financial information system dealing with customer and revenue information; the overall goal is to integrate multiple data marts into a single data warehouse. The information concerns three subsystems, LTD, PCS, and GMG, for which customer and revenue information exists. The environment comprises Mainframe and Unix boxes, and the data from both is stored in the Teradata database on the Unix box.

The customer information from the existing systems is stored in the form of delta files, which need to be maintained as part of the warehouse. The revenue information is rolled up to the corresponding subsystems for which the customer is a subscriber, along with the taxes or discounts allocated on the billing amount, calculated for each bill cycle year and month. The Sprint database is around 40 TB.

Customer Load Jobs Scheduling D3172.086
Environment: Unix, Teradata
Client: Sprint, USA

Description:

This project provides the capability to meet the business need of getting the day's customer changes loaded into the warehouse by the next morning, to increase the load windows, and to alter some of the scheduling. Created two new traffic-cop jobs (one for TF75 and one for TF10) that run every 30 minutes between 1:05 and 7:05 and kick off the CM load only if the wireless data is ready to load. If the wireless data is ready, the job loads any customer data that is ready, regardless of division.
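
A traffic-cop job of this kind usually reduces to a readiness probe that either kicks off the load or exits quietly until the next 30-minute cycle. A hedged BTEQ sketch; WRLS_LOAD_READY and its READY_IND column are hypothetical stand-ins for whatever readiness signal the real jobs checked:

    .LOGON tdpid/user,password

    /* Hedged sketch of the traffic-cop check: proceed only when the
       wireless data is flagged ready. The flag table is hypothetical. */
    SELECT 1 FROM WRLS_LOAD_READY WHERE READY_IND = 'Y';

    .IF ACTIVITYCOUNT = 0 THEN .QUIT 0
    /* Otherwise fall through and kick off the CM load steps here. */

    .LOGOFF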

Responsibilities:

- Coding
- Unit testing

ECID/Name Reference Table D3172.102
Environment: Unix, Teradata
Client: Sprint, USA

Description:

This project is required in order to keep records of the ECID match type indicator (UM and CM) and the corresponding surviving customer name, as stored in TF75_CUST, in reference tables. We populate one reference table with the Customer Match and corresponding customer name, and another with the Ultimate Match and corresponding customer name. The project gives the business the ability to track the ECID match type indicator for particular records in the TF75_CUST table: every time TF75_CUST is updated with ECID details, the reference tables are updated as well.
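
The mirroring step can be sketched as a simple refresh of one reference table from TF75_CUST. Illustrative only; REF_CUST_MATCH, its columns, and the 'CM' indicator value are assumed, and the real jobs may well have updated incrementally rather than rebuilding:

    /* Hedged sketch: rebuild the Customer Match reference table from
       TF75_CUST. Names and the match-type code value are assumed. */
    DELETE FROM REF_CUST_MATCH;
    INSERT INTO REF_CUST_MATCH (ECID, CUST_NM)
    SELECT ECID, CUST_NM
    FROM TF75_CUST
    WHERE ECID_MATCH_TYP_CD = 'CM';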

Responsibilities:

- Coding
- Unit testing

Daily update of new Segmentation on TF75 D3172.136
Environment: Unix, Teradata
Client: Sprint-Nextel, USA

Description:

This project supports a daily update of the new segmentation on TF75 in the ST_PROD_TBLS database. The OS_MKT_MNR_SEG_CD and os_lvl_0_vert_seg_cd columns of TF75_CUST are populated from the report_minor_seg and lvl_0_vert_seg_cd columns, respectively, of the SEG_PERM table in the IM_TMP2_TBLS database. The corresponding current row(s) of TFH3 on ST_PROD_TBLS are updated accordingly.
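
That daily refresh maps directly onto a Teradata UPDATE ... FROM over the two tables named above. A sketch; the column mappings come from the description, while the OS_CUST_ID join key is an assumption, since the actual key is not spelled out:

    /* Hedged sketch: refresh the two segmentation codes on TF75_CUST
       from SEG_PERM. The join key is assumed. */
    UPDATE tf75
    FROM ST_PROD_TBLS.TF75_CUST tf75, IM_TMP2_TBLS.SEG_PERM seg
    SET OS_MKT_MNR_SEG_CD    = seg.REPORT_MINOR_SEG,
        os_lvl_0_vert_seg_cd = seg.LVL_0_VERT_SEG_CD
    WHERE tf75.OS_CUST_ID = seg.OS_CUST_ID;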

Responsibilities:

- Coding
- Unit testing

Resourcing ECID for FIS and Daily Duns D4291
Environment: Unix, Teradata, Trillium, SyncSort, Abinitio
Client: Sprint, USA

Description:

The purpose of this project is to convert the sourcing of Corporate ECID to the One Sprint finance data warehouse (FIS). In addition, the DUNS-matching portion of ECID needs to be converted from a weekly to a daily process. Corporate ECID provides five levels of householding for various customers, including finance and marketing. The goal of the project is not to rewrite the matching logic, but rather to source the data from the FIS warehouse instead of PAR/PINNACLE and to add efficiencies that both speed up the delta process and allow the DUNS matching to occur on a daily basis, rather than the current weekly runs.

Responsibilities:

- Coding
- Unit testing

Production Tickets Resolved:

Issue 220

Description: Three given account numbers are not showing up in the ECID tables. The same accounts are visible in the SEG_PERM table and also in the T251 table (HU_PROD). Identify why they are not visible in ECID.

I cloned the sample data for these tables right from the mainframe part and ran the whole ECID process until it was populated in SEG_PERM and ECID. There was no problem with the code; the client was mistaken about the account numbers.

Skills required: Teradata, UNIX, strong business knowledge of FIS

Issue 242

Description: The entry for DUNS 71771562 needs to be deleted from the new DnB data. This DUNS is not a customer of Sprint but has the same address as a customer, so our customer is incorrectly rolling up to this DUNS.

To resolve this issue I found that we just needed a PSR to delete this particular record from T619_DNB_BUS_WORK.

Skills required: Teradata, UNIX, strong business knowledge of FIS

Issue 245

Description: Need to change the T619_DOM_ULT_DUNS_NBR in the production table T619_DNB_BUS_WORK to a given value.

To resolve this issue I found that we just needed a PSR to update this particular record in T619_DNB_BUS_WORK.

Skills required: Teradata, UNIX, strong business knowledge of FIS

Issue 254

Description: There appears to be an issue with TF75_GEO_ID not matching TFNX_CSA_ID at level 2. There are 37 instances, all from the base load.

I went through the data and confirmed that TF75_GEO_ID does not match TFN2_CSA_ID at level 2. The code is fine; a sweep is required to update the records in TF75 and TFH3 with those in TFN2.

Skills required: Teradata, UNIX, strong business knowledge of FIS

Personal Information:

Date of Birth: June 13, 1980
Marital Status: Married
Nationality: Indian
Languages Known: English, Hindi, Marathi, French

Preeti Roul


