
Data Project

Location: Wakefield, West Yorkshire, United Kingdom
Posted: September 23, 2015


Experience Summary

I am an intelligent, motivated and enthusiastic IT professional with over 15 years' experience in Business Analysis; Data Warehouse Design; Data Modelling; Extract, Transform and Load (ETL) development (using tool sets such as Informatica PowerCenter and IBM DataStage); Data Migration; Data Profiling; and Relational Databases (such as DB2 and Oracle), along with over 15 years' experience in programming.

The main focus of my career has been the technical design and development of ETL applications, Data Marts and Data Warehouses to provide OLAP / OLTP functionality using reporting tools such as Business Objects and OBIEE. I have gained considerable experience from working on projects for Norwich Union, HMRC, DWP, ABN AMRO, LLOYDS TSB, SANTANDER UK, Linde and CSC (BBC TV), developing their Financial Data Warehouses (FDW / CDW) and Management Information Improvement Projects (MIIP).

I try to share my knowledge with colleagues by providing training where necessary and always giving the benefit of my experience from past projects. I work well on my own initiative with a high level of motivation and organisation.

Consultant Personal File

Consultant Name: Sridhara Raju Pasuparthi
Nationality: British Citizen

Address: 106 Westwood Drive, HP6 6RR
E-mail: acrthl@r.postjobfree.com

Education

M.B.A. (Master of Business Administration)
Bachelor of Computer Science

PROFESSIONAL EXPERIENCE

Data Warehousing Projects

Technologies used

BBC TV Licence - CSC UK Ltd

Informatica, Trillium, Siebel, Unix.

Oracle, EBIS, HATS, DB2 and SQL Server databases.

LINDE (BOC – British Oxygen)

Linde Transition LT2 Migration

IDQ, IDE, Informatica v9.5.1, SAP, Business Objects Release III, Oracle 11g, AIX and UC4

SANTANDER U.K.

CHIP ALFA

RDR – Retail Distribution Review

ALM MIS - Risk QRM

Informatica v8.6.1 / v9.1.x, Business Objects Release III, QRM v16, Oracle 10g / 11g, PL/SQL, Unix and Perl CGI

LLOYDS TSB (LBG)

SCV (Single Customer View - GOLD) project across LLOYDS GROUP (LTSB, HALIFAX & RBS)

Teradata, SAS, IBM DataStage, Oracle 11g, PL/SQL and Unix

ABN AMRO - IBM

TIR - Data Migration Project

FORTIS to ABN AMRO BANK

IBM DataStage, Oracle 10g, Teradata, PL/SQL and Unix

Department for Work & Pensions (DWP)

CSA – Child Support Agency

Appeal Services

Informatica v7.1.3 / v8.1.1, Oracle 9i / 10g, Teradata, PL/SQL, SAS, Unix and Perl CGI

Capgemini Projects (HMRC)

BD4CT - Better Data For Corporation Tax

Informatica v7.x, Oracle 9i / 10g, PL/SQL, Business Objects VI, Unix, IBM Maestro Job Scheduler, SAS, XML and Perl CGI

Norwich Union - AVIVA

LFTP - Life Financial Transformation Project

www.NorwichUnion.com

Informatica 6.x, Oracle 9i, PL/SQL, Business Objects

Mainframes, DB2, PowerConnect for Mainframe (Striva), legacy source systems, Unix and Perl CGI

Summary of Skills and Experience

Skills – Programming

UNIX Shell, Informatica PowerCenter (v9.x)

Oracle, SQL Server, Teradata

Microsoft Analysis Services, Data Warehousing, Business Objects, OBIEE 10g

Source Control: Perforce, PVCS Admin, RCS, SCCS, CVS

Scheduling: AutoSys, Control-M, IBM Maestro

Skills – Operating Systems/Databases

UNIX (Solaris, Linux, AIX, HP-UX)

MS Windows NT, 2000, XP

Role Summary

Senior Analyst Programmer (6 Years)

Project Leader / Technical Architect (2 Years)

Team Leader (3 Years)

Applications Support/Systems Admin (5 Years)

Industry Experience

Investment Banking (2 Years)

Financial Software House (6 Years)

IT Consultancy in Central Government (5 years)

Period

Organization

Roles & Responsibilities

Nov 2014 to date

CSC UK - BBC TV Licensing

Working as a Senior Business Intelligence consultant on a data migration project for BBC TV Licensing.

My role is to design and develop the end-to-end Informatica applications, from the data extraction stage to the target load (EBIS, HATS and Siebel).

Development follows a 7-layer model in which source data is cleansed to create a GOLD database across the Address / Customer / Licence domains, using IDQ (Informatica Developer) transformations such as Address Validation, De-duplication and Match.

Migrating all IDQ objects to Informatica PowerCenter for scheduling and automated code release.

My role is to design, develop and roadmap the migration process and to act as the SPOC (single point of contact) for deliverables.
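For illustration, a minimal SQL sketch of the de-duplication step described above for the Customer domain (table and column names are hypothetical; on the project this matching was built with IDQ transformations rather than hand-written SQL):

-- Hypothetical example: keep one GOLD record per customer, matching on a
-- normalised name + postcode key; the most recently updated record wins.
INSERT INTO gold_customer (customer_key, full_name, postcode, source_id)
SELECT match_key, full_name, postcode, source_id
FROM (
    SELECT UPPER(TRIM(full_name)) || '|' || REPLACE(postcode, ' ') AS match_key,
           full_name, postcode, source_id,
           ROW_NUMBER() OVER (
               PARTITION BY UPPER(TRIM(full_name)) || '|' || REPLACE(postcode, ' ')
               ORDER BY last_updated DESC) AS rn
    FROM   staging_customer
)
WHERE  rn = 1;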

Sep 2013 to Nov 2014

LINDE

Lead Consultant in setting up the framework to build the GOLD / master data after cleansing, profiling and risk scoring, and in developing the migration process across 22 countries.

Data quality, profiling and cleansing were carried out with Informatica Developer (IDQ) 9.5.1; scheduling and data migration loading with Informatica PowerCenter Designer 9.5.1.

Responsibilities:

IDQ 9.5.1 has many new transformations compared with IDQ 8.6, such as Address Validator, Match, Merge, Association, Decision, Labeler, Parser, Standardizer, Key Generator, Exception, Consolidation, Comparison and Case Converter, in addition to the other PowerCenter transformations.

Created profile reports using IDE to identify the weaknesses and strengths of the source data, and also created scorecards.

Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica PowerCenter.

Responsible for identifying reusable logic to build several mapplets in Informatica Developer (IDQ) and exporting them to PowerCenter for scheduling.

Extensively used IDE profiling on database and flat-file data, applying different strategies to gain a detailed understanding of the data.

Escalated untrusted or probable-match records in Informatica MDM to the data steward for a final decision.

Cleansed, standardized, labeled and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.

Created match columns and rules to apply to base tables in Informatica MDM.

Worked extensively on address validation and name matching (standardization) tasks in IDQ and IDE (9.5.1).

Used IDE profiling on the given business data, applying column, primary key, functional dependency, foreign key and join profiles.

Created XML data services and ran them through web services after deployment in applications.

Imported IDQ mapplets into Informatica PowerCenter and integrated them into the regular mappings for cleansing, standardization and address validation.

Cleansed landing data with Informatica IDQ before loading the staging area in Informatica MDM.

Used properties such as name, uniqueness, nulls and data types for IDE profiles.

Generated the WSDL URL for the deployed XML Informatica process.

Created separate tables for bad records in Informatica PowerCenter and IDQ, and sent the data to SMEs for further analysis of this kind of data.

Created and applied profiling rules for flat-file and relational data in IDE; case-cleansed, parsed and standardized data through mappings in IDQ; and generated them as mapplets in PowerCenter.

Validated the web services with the SoapUI tool, from source to target XML files.

Created reference tables with the help of IDE and used them in IDQ to standardize IDs, names, addresses and other source data (a sketch of this kind of rule follows this list).

Monitored changes to the landing, staging, base, match and XREF tables to update or insert the new data with Informatica.

Tuned Informatica PowerCenter session performance for large data files by increasing the block size, data cache size, sequence buffer length and target-based commit interval.

Worked with business SMEs to develop the business rules for cleansing, and applied those rules using the Informatica Data Quality (IDQ) tool to cleanse the data.

Designed and documented validation rules, error handling routines and testing strategies for the Informatica PowerCenter mappings.

Extensively used the Control-M scheduler to schedule Informatica workflows.
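A minimal SQL sketch of the reference-table standardization rule mentioned above (all names are hypothetical; on the project the rule was implemented with IDQ transformations, not hand-written SQL):

-- Hypothetical example: standardize free-text country values against a
-- reference table built in IDE/IDQ.
UPDATE staging_address sa
SET    sa.country = (SELECT rt.standard_value
                     FROM   ref_country rt
                     WHERE  rt.raw_value = UPPER(TRIM(sa.country)))
WHERE  EXISTS (SELECT 1 FROM ref_country rt
               WHERE  rt.raw_value = UPPER(TRIM(sa.country)));

-- Rows with no reference match go to an exception table for the data
-- steward to review (the Exception step described above).
INSERT INTO address_exceptions (address_id, country_raw)
SELECT sa.address_id, sa.country
FROM   staging_address sa
WHERE  NOT EXISTS (SELECT 1 FROM ref_country rt
                   WHERE  rt.raw_value = UPPER(TRIM(sa.country)));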

Nov 2011 to

31st August 2013

SANTANDER UK

ALM – QRM

Balance sheet Management (BSM)

Roles & Skill-set

During this period I worked on three major projects.

1. CHIP ALFA – a loan and lease data interface added to the existing Financial Management Information System (FMIS), which has been developed using star-schema dimensional modelling with Business Objects as the reporting tool.

2. Retail Distribution Review – Risk Profile Questionnaire (RDR – RPQ)

A finance regulatory project whose objective is to provide additional data items on MarketDataBase to meet the new FSA RDR rules:

The ending of investment product related commission

The sole income for advisers from advised investment sales will be an advice fee paid by the customer

Increased professional qualifications & standards for advisers

Increased transparency of how services are described to customers

Increased FSA reporting requirements on individual adviser earnings & complaints

3. Asset and Liability Management – Quantitative Risk Management

ALM-QRM (BSM)

The ALM Data Warehouse (ALMDW) is a repository for source system contracts (Retail) and positions (Wholesale) where the contracts/positions from a multitude of different source systems are brought together and harmonised such that they can be treated uniformly. Once harmonised the source data is aggregated in a single pass to produce an input dataset suitable for QRM processing. A unified data layout common to all source systems, Retail and Wholesale, is used by the source systems to create a source input file for the ALMDW. This common layout consists of around 220 fields most of which will not be applicable to any given single source system, but collectively they represent the entire spectrum required for QRM and Liquidity.

Informatica PowerCenter v8.6.1 is the tool used to process the data through the harmonisation steps. My role involves:

Development and maintenance of Business Intelligence and OLAP databases for forecasting, reporting and consolidation for Banking Insight, Marketing and Finance department operations.

Design and develop the Financial Data Warehouse (FDW) using a star schema, and implement IDQ concepts / rules (handling duplicates, invalid data and data integrity) at all stages (Extract, Transform and Load).

Use of Informatica Data Director (IDD) to maintain the Data Dictionary and Data Integrity between the sources and targets.

Use of Informatica Data Profile (IDP) to create and analyse data validity and data patterns across the domain.

Produced and wrote the gap analysis document, working in concert with the users to identify and validate the data required from various source systems.

Involved in implementing the database design, both logical and physical data models, for the new feeds coming from 25 contract systems and Agent (in DB2) into the BI / data warehouse.

Design and develop the FDW, to allow parallel loading by partitioning the sources, workflows and targets.

Extensive experience in designing and maintaining ETL load processes using Informatica Designer, the Analyst service and the Metadata Repository service.

Performance tuning and scheduling of data loads (Control-M & IBM Maestro) involving huge volumes of data.

Data extraction, staging, transformation and loading from different data sources using Oracle PL/SQL procedures, the Import/Export utility and SQL*Loader, and development of Informatica mappings from staging to mart with reusable mapplets, transformations and worklets (see the sketch after this list).

I have written design documents such as the High Level Design, System Requirement Solution Document, Release Management, Change Request Management and Production Support documents for the newly built and existing databases.
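A minimal PL/SQL sketch of the staging-to-mart load pattern described above (table, column and procedure names are hypothetical; the production loads were Informatica mappings and SQL*Loader jobs):

-- Hypothetical example: aggregate harmonised staging contracts into a
-- mart table in a single pass, rerunnable per business date.
CREATE OR REPLACE PROCEDURE load_contract_mart (p_business_date IN DATE) AS
BEGIN
    DELETE FROM mart_contract_agg
    WHERE  business_date = p_business_date;   -- make the load rerunnable

    INSERT INTO mart_contract_agg
           (business_date, source_system, product_code, balance_total, contract_count)
    SELECT p_business_date, source_system, product_code, SUM(balance), COUNT(*)
    FROM   stg_contract
    WHERE  business_date = p_business_date
    GROUP  BY source_system, product_code;

    COMMIT;
END load_contract_mart;
/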

July 2011 to

October 2011

LLOYDS TSB (LBG)

SCV (Single Customer View - GOLD) project across LLOYDS GROUP (LTSB, HALIFAX & RBS)

GOLD is the Customer Insight programme that will enable Lloyds Banking Group to leverage insight derived from customer data across the group and turn deep customer understanding into value for us and for our customers. A major factor in enabling delivery of this customer understanding is the ability to exploit key data assets. These assets relate to customers within individual lines of business, across the group, external feeds and 3rd parties.

As a BUSINESS ANALYST, my main role is to identify and decompose the business requirements to create shareholder value through progressing 5 main business strategies within, and across, each business unit:

Increase acquisitions – new customers, and new products to existing customer

Increase needs met per customer – average number of products held per customer

Increase average margin per customer – profitability to the bank of each customer relationship

Increase customer retention – preventative action so as to not lose the best customers

Decrease the cost to serve customers – this will contribute to the average profit margin

June 2009 to

Mar 2011

IBM Netherlands, Amsterdam

TIR – CONVERTER

Data Migration Project

FORTIS to ABN AMRO BANK

IBM's role is to develop the CONVERTER module, in which all source feeds are extracted from the FORTIS system, business logic is applied to transform them into the ABN AMRO structure, and the results are loaded into the target system.

As a Technical Analyst, I was contracted to IBM Netherlands for these 22 months, working on the ABN AMRO BANK data migration project.

My primary role is as a senior ETL TECHNICAL ANALYST on a data migration project to migrate all FORTIS bank customers and their financial activities, with history, to ABN AMRO BANK.

My primary responsibilities are to analyse and design the ETL technical specifications which are used by the development and testing teams.

To act as the SPOC (single point of contact) for both internal and external teams.

I was involved in analysis of the source system, data modelling, data cleansing (the data is level 3), understanding the business requirements (speaking with users about the requirements), and preparing design documents such as the BSD, HLD (High Level Design) and LLD (Low Level Design) for all 3 releases.

I have written the design guidelines document for receiving and loading the data into data warehouse tables using the ETL tool, and used Unix/Perl scripting to automate the workflows and to aggregate the data before loading into the 3NF database.

Creation of migration mappings and scripts to extract the data from source files using ETL techniques.

Implementation of the required business rules using several data sources, lookups and aggregations where required, and successful loading into an Oracle database (see the sketch at the end of this section).

Setting development standards to avoid performance issues.

I have used the Teradata database to analyse the data and prepare mock-up reports.

Environment: Oracle 11g, DB2, IBM DataStage, ERwin 4.1

Role: Senior Data Warehousing Consultant / TA

Responsibilities: Data modelling, design, ETL tasks and developing reports
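A minimal SQL sketch of the lookup-and-aggregate pattern mentioned in the responsibilities above (all names are hypothetical; the production logic was implemented in DataStage jobs):

-- Hypothetical example: enrich FORTIS transactions with the target
-- ABN AMRO product code via a lookup, then aggregate per account.
INSERT INTO tgt_account_position (account_id, target_product_code, total_amount)
SELECT t.account_id,
       m.target_product_code,      -- lookup: FORTIS code -> ABN AMRO code
       SUM(t.amount)
FROM   src_fortis_txn t
JOIN   map_product m ON m.source_product_code = t.product_code
GROUP  BY t.account_id, m.target_product_code;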

Oct 2007 to

Mar 2009

CSA – DWP, U.K.

Child Support Agency

During this period I worked as a senior Informatica/ Oracle ETL consultant on a data warehouse project for Department for Work and Pensions (DWP). The primary objective of the warehouse is to provide the Management Information (MI) data and reports that are required to make strategic decisions by government ministers and senior management.

I have written design documents such as the High Level Design, System Requirement Solution Document and Data Dictionary for the newly built and existing databases using Teradata.

Involved in implementing the database design, both logical and physical data models, using ERwin for the new feeds coming from AGENCY, Beneficiary and CSD (which are in Oracle and Teradata) into the BI / data warehouse.

Creation of migration mappings and scripts to extract the data from source files using PL/SQL procedures and SQL*Loader.

Writing PL/SQL procedures for ETL processes, as well as PL/SQL packages and triggers to implement business rules and validations.

Writing Informatica mappings to track changes in the source data of the CS2 application and apply those changes to the target Oracle system, keeping the pre-cut-over version of the data in line with the live copy of the system (a sketch of this pattern follows at the end of this section).

I have used the Teradata database to analyse the data and prepare mock-up reports.

Tools used: Informatica v8, Oracle 10g, Teradata, Windows, Unix, Korn shell scripting
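A minimal sketch of the change-application pattern described above, with hypothetical table names (the production version was an Informatica mapping writing to Oracle; a MERGE expresses the same idea):

-- Hypothetical example: apply captured changes from the live CS2 copy to
-- the pre-cut-over copy so the two stay in line.
MERGE INTO pre_cutover_case tgt
USING (SELECT case_id, status, last_updated
       FROM   cs2_change_capture
       WHERE  processed_flag = 'N') src
ON    (tgt.case_id = src.case_id)
WHEN MATCHED THEN
    UPDATE SET tgt.status = src.status,
               tgt.last_updated = src.last_updated
WHEN NOT MATCHED THEN
    INSERT (case_id, status, last_updated)
    VALUES (src.case_id, src.status, src.last_updated);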

March 2007 to

June 2007

APPEALS

DWP, U.K.

Department for Work & Pensions

Providing principal consultancy to the Information Directorate APPEALS Management Information Improvement Project (MIIP) and performing the role of Senior Consultant in a leveraged data migration team which was concurrently involved in project releases, maintenance releases and live recovery actions.

Performing the role of principal consultant to a DWP Management Information project and being responsible for identifying and documenting customer requirements, liaising with the customer on a daily basis, providing the customer with status reports and ensuring delivery within the agreed deadlines.

Assisting the DWP Management Information project by providing guidance on usage of the Captured Change Data (CCD) table and interpretation of the data held within.

Full life cycle development from design through to testing and delivery.

Performance tuning of Informatica PowerCenter workflows and Oracle SQL

Tools used: Informatica v7 and v8, Oracle 9i / 10g, Windows, Unix, Korn shell scripting

Nov 2004 to

Feb 2007

Capgemini, U.K.

INLAND REVENUE - HMRC

BD4CT – PMI project

Fundamental to the success of PMI is the ability to integrate and co-ordinate information across the whole of the Inland Revenue estate. This calls for an extensible architecture which will take raw data from many sources (both internal and external to the Inland Revenue) and build from this data a model of the business which can be used to address all performance and management issues, from the reporting of actual performance, through the setting of performance targets, to the analysis and modelling functions carried out by Analysis & Research.

Involved in all stages of the software development life cycle.

Was a team member in Development, Live Services and System Test.

Was the main point of contact at whichever stage I worked on.

Data Warehouse Consultant (Development Technical Lead)

Working closely with business users and technical teams in a customer facing role to bring together requirements into a cohesive design whilst keeping focus on stringent timescales and external regulations.

High- and low-level detailed design of the Informatica mappings.

Development of several of these large Informatica PowerCenter mapping components whilst acting as the TECHNICAL LEADER for a small team of circa 10 developers.

Warehouse design and development comprised:

-To study Business end user requirements & existing source systems

-To create Logical dimension data model

-Develop the High / Low level schema design documents

-Develop the ETL Requirements / Informatica Technical documents

-Using Informatica 6.2 / 7.1, develop the complex mappings that transfer the data from data files to intermediate Oracle tables (ADS - Atomic Data Stage) and apply business transformation logic based on the record status (Insert / Update / Delete) to populate the warehouse and mart tables (sketched after this list).

-Coding (ETL using Informatica, Oracle PL/SQL and Unix)

-Use of the Maestro batch scheduler to schedule the jobs.

24*7 support for LIVE / PRE-PRODUCTION / UAT environments.
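A minimal PL/SQL sketch of the status-driven load from the ADS into a warehouse table (names are hypothetical; the production logic was an Informatica mapping):

-- Hypothetical example: apply ADS rows to the warehouse table according
-- to their Insert / Update / Delete status flag.
BEGIN
    FOR rec IN (SELECT key_id, attr1, attr2, status FROM ads_company) LOOP
        IF rec.status = 'I' THEN
            INSERT INTO wh_company (key_id, attr1, attr2)
            VALUES (rec.key_id, rec.attr1, rec.attr2);
        ELSIF rec.status = 'U' THEN
            UPDATE wh_company
            SET    attr1 = rec.attr1, attr2 = rec.attr2
            WHERE  key_id = rec.key_id;
        ELSIF rec.status = 'D' THEN
            DELETE FROM wh_company WHERE key_id = rec.key_id;
        END IF;
    END LOOP;
    COMMIT;
END;
/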

June 2002 to Nov 2004

Norwich Union Life

Finance Data HUB Designer/Developer

The current Norwich Union Finance source and target architecture relies on several different point-to-point interface solutions. LFTP, the Life Financial Transformation Project, is based on a hub-and-spoke architecture that replaces these interfaces. It ensures that data is populated in a structured manner, and that rules for data transformation are implemented in a single design.

LFTP

Consolidates data from different source systems (Life 70 OB, Life 70 Pension, PSA, Abacus, ConOB, Unisure etc.) into the GL.

Sends financial transaction data to the reconciliation engine, Smartstream, for matching, and

Stores the lowest-granularity analytical data in the financial data warehouse.

I have mainly worked on the following 3 subprojects:

1. Life 70 OB

2. Life 70 PENSIONS

3. Pensions and Schemes Account

As part of the LFTP project, there is a requirement to load the above 3 source systems' data into the reconciliation system (Smartstream), the General Ledger (GL) and the financial data warehouse (FDW).

The above 3 projects were initiated to develop an interface between each source system and the hub. Each system has millions of records with different record types, and every record is processed differently based on its record type.

Application Scope

Develop Extract-Transform-Load (ETL) routines using Informatica PowerCenter 6.1 running in an Oracle/Unix environment.

Write technical / design documentation

Responsibilities:

Creation of Design Document

Datawarehouse Schema Changes

Coding (ETL using Informatica and Oracle PL/SQL and Unix shell scripting)

Creation of Unit Test Plan and Unit Testing

Creation of Implementation notes.

July 2000 to May 2002

Wealth Management

www.norwichunion.com

Senior Software Consultant

Worked as a CONSULTANT to UNISYS SYSTEMS CORPORATION; the project client was NORWICH UNION.

An e-commerce project allowing the customer to choose products / services available at www.norwichunion.com. Customers can purchase / sell shares and save money in their Norwich Union (Wealth Management) bank account. Customers can open a bank account and carry out transactions such as withdrawals / deposits and all other bank functions, such as creating standing order or direct debit instructions.

The contribution involved the design of the Payment-Proxy module and the development of the Payment module, which validates the customer's payment transaction for products / services (by checking with the database, Solve SE, debit / credit card validation / authorization and cheque authentication). Involved in developing CrossWorlds business objects, using UML concepts, and Java (EJB) applications which act as user-friendly interfaces to CrossWorlds Connectors and Collaborations, storing client information in the related database (ODS).


