
SAP MDG BODS

Location:
Charlotte, NC
Posted:
January 29, 2025


Hari

+1-609-***-**** / ********.**@*****.***

Professional Summary

** ***** ** ********** ** IT: SAP MDG on S/4HANA 2022, SAP BODS, Information Steward, S/4HANA data migration, Data Services, data migration (LTMC), BW/HANA and Tableau, plus 2 years in Microsoft .NET and software testing

SAP MDG Data Modeling, UI Modeling, Process Modeling, DRF, BRF+, CBA, validations and workflows

SAP MDG Editions, Mass Change, File Upload, extending data models, Change Requests

Expertise in SAP MDG 9.2/9.1 data, process and UI modeling and data replication across the MDG-M/S/C/F domains on S/4HANA 1809/1909/2022

SAP MDG data domains: Material Master, Business Partner (Customer, Supplier), Finance

Involved in the analysis, design, development, testing, implementation and support phases of projects across SAP MDG, SAP BW, BO, HANA, BODS and Microsoft .NET

Extensively worked on BW CompositeProviders, ADSOs, InfoCubes, DSO objects, InfoObjects, InfoSources, DataSources, Transformations, DTPs, MultiProviders, transfer rules and update rules

Good Knowledge in Generic extraction and LO Extraction

Extensively worked on Web Analyzer and BEx Query development for report creation

Experience in CO-PA extraction process and have reports built on it

Proficient in handling errors that occur during Profitability Analysis (CO-PA) data loads

Familiar with loading data to CO-PA data targets COPAo01B and COPA_T02A

Involved in Business Objects administration: creating users and groups, scheduling and publishing reports, and planning security, authentication and single sign-on for reports developed using Business Objects 4.2/4.0

Used Open Hub Services to push data out of SAP BI using BODS

Worked with the Match, Address Cleanse and Data Cleanse transforms for US data cleansing in BODS Data Quality

Used Data Integrator and Platform Transforms extensively in BODS

Worked extensively with Sales & Distribution team to design data migration from Legacy system to SAP using BODS

Exporting and importing metadata in Native HANA; expertise in managing Native HANA modeling concepts and HANA Live (RDS) views

Good experience in PL/SQL (DDL, DML, DCL, stored procedures) using Native HANA as a database

Good experience with all the data provisioning techniques (SDI, SLT and SAP BODS), modeling constructs (Attribute, Analytic and Calculation views) and reporting solutions (the BOBJ suite of tools) with Native HANA information models as a source

Experience in handling the Business Objects modules: Designer, Web Intelligence, Central Management Console, InfoView, Crystal Reports, Lumira and Design Studio

Experience in the data visualization tools Lumira and Tableau

Good knowledge of Universe design and development and of generating complex ad-hoc reports, business views, lists of values, and reporting functionality such as master/detail, slice and dice, drill-down, filters, ranking, sections, graphs and breaks; SME in dashboards, dashboard design best practices, KPI formulation & web reporting, and developing information spaces and exploration sets in BOBJ

Exposure to Oil & Gas, Pharmaceutical, Retail, Telecommunication, Production and Sales data warehouses

S/4 HANA migration cocpit (LTMC) - Migration Object Modeler (LTMOM)

Legacy Master Data (Business Partner/Material/Financials) & TD migration to S/4 HANA 1909

Good knowledge in Eclipse based BW/4HANA & S/4 HANA (1809/1909), SAP DATASPHERE

Technical Skill Set

ERP Packages : SAP BW 7.0/7.3/7.4, ECC 6.0

Master Data Management : SAP MDG 9.2/9.1/9.0, DRF, DM, PM, UI Modeling, BRF+, DQ Search

Business Intelligence Tools : SAP Business Objects (WebI, Crystal Reports 2013), AO, OBIEE

Data Visualization : BO Lumira 1.31/2.0, Design Studio 1.6, Tableau 10/10.3

ETL Tools : BODS 4.1/4.2, BODI, Talend 7.1.1/7.2.1, SAP RDM on S/4HANA

Data Replication Technique : SLT, SDI, SDA

Databases : HANA 1.0/2.0, Oracle 10g/11g, SQL Server 2008/2012/2016

Microsoft Technologies : .NET 2005, C#

Programming Languages : ABAP/4-BW

Trainings

SAP MDG on S/4 HANA 2022/2021/1909/1809 (Material, BP, Customer, Supplier, Financial) Master Data

Big Data Insight/Talend Open Studio Data Integrator: job design, joins, filters, Oracle/flat-file loads

SAP BPC 10.0

Lumira Discovery 2.0

Assignment Summary

Client : HALLMARK CARDS/KANSAS CITY, MO

Environment : SAP MDG on S/4HANA 2022, SAP BODS 4.3 (14.3), SQL Server 2016, Teradata, mainframe systems

Duration : Nov 2022 to Present

Key Roles and Responsibilities:

Extending data models (Material/Business Partner) in SAP MDG on S/4HANA

Streamlining approval process design with SAP MDG workflows

Replicating and troubleshooting data to SAP Data Services, Ariba SLP and third-party systems via DRF IDocs/SOA in SAP MDG on S/4HANA

SAP MDG-C integration & Replication to SAP C4C (Cloud for Customer)

MDG CVI configuration for Customer

Set up MDG DRF filters to third-party systems

Governing UIBBs at the workflow step level in MDG on S/4HANA 2022

Understand BAdIs for parallel workflows and dynamic selection of agents in SAP MDG

Inbound: migration from mainframe files to the CSG Merchandise Planning tool using BODS 4.3

Move file location data from MicroStrategy to CSG Merchandise planning tool

Run stored procedures from SAP DS/BODS

IDoc/BAPI loads and ABAP dataflows using SAP BODS

Perform Data Conversions and SAP DS code migration

Data migration using the Fiori-based Migrate Your Data app (LTMC)

Outbound: load data from the CSG Merchandise Planning tool to the RDW (Teradata) for MicroStrategy reporting

Client : ENERGIZER/SAINT LOUIS

Environment : SAP MDG on S/4HANA 2021, SAP BODS 4.2, SQL Server 2018, S/4HANA 1909, SAP MDG 9.2, Fiori

Duration : Feb 2022 to August 2022

Key Roles and Responsibilities:

Understand the SAP Data Services and MDG hub & co-deployment landscape

Design MDG BRF+ workflows based on User Agent approval

SAP MDG-S integration & Replication to SAP Ariba SLP

Define parallel workflows for multi-distribution-center approval using a BAdI in SAP MDG on S/4HANA

Authorization and governance settings in SAP MDG for business users

Data migration from legacy to SAP S/4HANA 2021 using BODS 4.2

Migrate data using IDoc interfaces/BAPIs/function modules

Master Data: BP/C/S, Material

DRF: Material Master replication to ERP using ALE IDocs

Perform Data Conversions

SAP MM data migration, including Plant, Storage Location, Tax Classification and Sales Organization

Client : MOOG INC/NY

Environment : SAP MDG on S/4HANA 1909, SAP BODS 4.2, HANA 2.0, S/4HANA 1909, Fiori

Duration : July 2021 to Feb 2022

Key Roles and Responsibilities:

Change Request setup for the Finance (0G) domain using Editions in MDG 1909

Set up BRF+ validations for MM data

Perform Data Services, Mass Change and File Upload in SAP MDG 1909

Activate BC Sets in MDG for Change request

SAP MDG-S Data Replication to SRM

MDG CVI configuration for Supplier

Data migration from legacy to S/4HANA 1909 using BODS 4.2

Migrate data using IDoc interfaces

Master data: BP/C/S, Routings, Document Info Records, BOMs, Material, Equipment, Purchase Info Record migration

Perform Data Conversions

Migrate Your Data using the Fiori app (flat files/staging tables)

Knowledge of SAP Syniti ADM field mapping, data source connections and executions

SAP MDG/MDQ: configure MDQ (remediation) on product data with BRF+ validation rules

Client : TENNECO/MI

Environment : SAP BODS 4.2, BW 7.4, SAP ECC/CRM/SCM, Azure, HANA 2.0, S/4HANA 1809

Duration : Dec 2020 to May 2021

Key Roles and Responsibilities:

Implementation/AMS

Data migration from SAP ECC and flat files to SAP HANA via Data Services (BODS 4.2)

Data migration from SAP ECC and flat files to Microsoft Azure Data Lake

Defect fix/Data Reconciliation

Data Services Production system Improvement measures

Perform ad-hoc loads and archive source files once the data load to HANA 2.0 completes

S/4HANA migration cockpit (LTMC) and Migration Object Modeler (LTMOM) for Material, Cost Center, etc.

CPS Redwood workload automation: scheduling parameters, preconditions and mappings, similar to UC4

Client : ECOLAB/MN

Environment : MDG 9.1 on S/4HANA, SAP BODS 4.2, BW 7.4, SAP ECC/CRM/SCM, Big Data, Azure, Cloudera

Duration : July 2019 to July 2020

Key Roles and Responsibilities:

Implementation/AMS

Involved in SAP MDG functional modeling (data, process, UI), Data Services, and the Data Replication Framework

Flex/Reuse modes (hub & co-deployment) layout architecture

Used the hub deployment method for the MDG implementation: data resides in the MDG system, and approved Customer/Vendor and Material data is replicated to ECC; Finance data is replicated using SOA and ALE services

Material/Business Partner/Customer/Supplier/Financials master data replication to S/4HANA

Create/Change/Block/Unblock/Mass Change for Material, Business Partner, Customer and Supplier

Understanding Type 1/Type 3/Type 4 entities in data modeling

Configuring business objects, business activities and logical actions for Change Requests

BRF+ decision tables and static workflow steps for Change Requests

Troubleshoot FPM page errors in UI modeling

Perform Initial Load using SAP BODS in HUB-Reuse storage mode from ECC

Perform Data Quality search & duplicate check validation and set up HANA Search for all domains

Design and monitor the cutover checklist and go-live plans

Understand BAdIs, interfaces, methods and access classes for Material/Business Partner

Troubleshoot SAP MDG Change Requests and workflows during AMS activity

Designed and developed data ingestion (replicate/ETL/ELT), conceptual/logical/physical data models, data marts, analytics and visualization, and performance tuning

Ingest SAP ECC/SCM full & delta loads via Data Services to a Microsoft Azure Blob container

Ingest SAP CRM/ECC BW extractor data & flat files to Microsoft Azure Blob

Master data attributes, texts & transactional data

Build DSOs/Open Hub and ingest Nalco system data to Azure Blob (see the sketch below)
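
The Blob landing itself was configured through Data Services data stores and file locations; purely as an illustrative sketch of the landing step (not the project's actual pipeline), a minimal Python upload with the azure-storage-blob SDK could look like this, where the connection string, container and blob names are hypothetical placeholders:

    # Minimal sketch, assuming the azure-storage-blob package is installed
    # and AZURE_STORAGE_CONNECTION_STRING is set in the environment.
    # Container and blob paths are hypothetical placeholders.
    import os
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"])
    blob = service.get_blob_client(container="sap-ingest",
                                   blob="ecc/material_full.csv")

    # Full load: overwrite the previous extract in the landing container.
    with open("material_full.csv", "rb") as data:
        blob.upload_blob(data, overwrite=True)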

Collect and import BW transports to Production

Build batch jobs and perform delta initialization in SAP Data Services

Automate batch jobs with a load-type variable for full, delta and ad-hoc loads, as in the sketch below
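
In BODS this dispatch is typically driven by a global variable evaluated in a conditional; the Python sketch below is only an analogue of that pattern, with illustrative names rather than the actual job code:

    # Sketch: one entry point branching on a load-type variable, analogous
    # to a BODS global variable driving a conditional in the batch job.
    def run_load(load_type: str) -> None:
        if load_type == "FULL":
            print("truncate the target, reload every record")
        elif load_type == "DELTA":
            print("load only records changed since the last run")
        elif load_type == "ADHOC":
            print("load a user-supplied date range")
        else:
            raise ValueError(f"unknown load type: {load_type}")

    run_load("DELTA")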

Expose custom extractors via the ODP API for consumption in Data Services

Import extractors in Query & CDC mode

Perform archive operations and MD/TD data reconciliation with the source

Define primary keys, delta type & metadata for extractors (ECC, SCM, CRM)

Assist the Hadoop team with schema creation, schema enhancement, full/delta loads and data validation

Automate the SAP DS jobs in UC4

Data migration via LTMC/LTMOM to S/4HANA using files & staging tables with BODS/Data Services

Troubleshooting the Data Services BW extractor delta monitor

Recovering files from Archive to Azure blob

Analyze data in Azure Databricks/Azure Data Lake/Azure Data Factory

Recovering failed Delta from BW extractors (SCM) in ODQMON

Data Services Production system Improvement measures

Handling data reconciliation issues between Azure Data Lake and ECC/CRM/BW

Set up a data store connection to Attunity and extract full loads to Azure using BODS

Knowledge of Python scripting

Client : DuPont/DE

Environment : SAP MDG 9.1, SAP BODS 4.2, Tableau, SQL Server 2012, SAP HANA 2.0, BI 4.2, BW

Duration : December 2018 to May 2019

Key Roles and Responsibilities:

Implementation/Enhancements

Involved in Configuring SAP MDG Material, Supplier, Customer and Finance Master data

Extend Material/Customer/Supplier with a new entity in SAP MDG 9.1

Involved in Data Services and in troubleshooting DRF data replication and workflow issues

Define Change Request actions and understand scenario for Parallel Workflows

Data Replication and Monitoring Change Requests in SAP MDG

Troubleshoot BODS connectivity issues with source systems, SQL Server and HANA

Develop code migration methodology for SAP BODS across Dev, Test, Prod landscape

Migrating Specialty Products (SPECCO) and CORTEVA (AGCO) master/transactional data

Master data includes Customer Master, GL Account, Profit Center, Material Master, Trade, Plan, Company Code, Cost Element, Cost Center, etc.

Transactional data includes Open Orders, Cost Center, Daily Sales and GL

Sources include SAP and non-SAP (flat files/JSON/Excel)

Worked on the Data Quality transforms Address Cleanse, Match and Data Cleanse

Read inbound SharePoint files and batch numbers and load them automatically to the extraction layer

Three-layer architecture: extract data from the source and load it to the output layer

Then load data from the staging layer to the Data Inbound/Data Hub

Target system data is posted to HANA 2.0

Perform ad-hoc loads and archive source files once the data load to HANA 2.0 completes

Analyze the reports built on Target System with Tableau

Wrote PL/SQL queries to extract source data to Data Services staging

Managed error handling and success results to trace the data migration

Worked on incremental loading for all dimensions after the initial load, from staging to the data mart (star schema); see the sketch below
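
A minimal sketch of the key-based upsert logic behind such an incremental load, with mocked data and a type 1 overwrite for simplicity (the actual jobs used BODS transforms such as Table_Comparison rather than hand-written code):

    # Sketch: key-based incremental (upsert) load of a dimension after the
    # initial full load. All rows and column names are mocked.
    dim = {1: {"name": "Acme", "city": "Wilmington"}}          # current rows
    stage = [{"id": 1, "name": "Acme", "city": "Newark"},      # changed row
             {"id": 2, "name": "Borg", "city": "Charlotte"}]   # new row

    for row in stage:
        key = row["id"]
        attrs = {k: v for k, v in row.items() if k != "id"}
        if key not in dim or dim[key] != attrs:
            dim[key] = attrs  # insert a new member or overwrite a changed one

    print(dim)  # both members present, city updated for key 1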

Data migration to SAP HANA Enterprise; built information models (calculation views) and SQL scripting

Involved in Unit and Integration Testing on reports and then exported them to Schedule Manager for Scheduling and Refreshing.

Performed Data Conversions/functions before migrating data

Worked on the Data Integrator transforms History_Preserving, Table_Comparison and XML_Pipeline

Setting up of Data store connection to read source data from multiple source systems in BODS

Extensively used BODS Management Console to schedule and execute jobs, manage repositories and perform metadata reporting.

Extracted large volumes of data (SDI) into the HANA DB using the ABAP dataflow transform in BODS

Created BODS Jobs, Work Flows, Data Flows, and Scripts using various Transforms (Integrator, Quality and Platform) to successfully load data from multiple sources into a desired target

Retrieved data using SQL Select statements as part of performance tuning in Data services 4.2

Worked on scripts while reading files from a folder using a while loop in Data Services, as in the sketch below
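
A rough Python analogue of that pattern (the actual implementation used BODS scripting; folder names here are hypothetical):

    # Sketch: pick up every file in an inbound folder, process it, then move
    # it to an archive folder, an analogue of a BODS while loop over files.
    import shutil
    from pathlib import Path

    inbound, archive = Path("inbound"), Path("archive")
    archive.mkdir(exist_ok=True)

    for path in sorted(inbound.glob("*.csv")):
        for line in path.read_text().splitlines():
            pass                 # stand-in for the per-record staging load
        shutil.move(str(path), archive / path.name)  # archive on success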

Client : Canon/NY

Environment : BODS 4.2, BI 4.2 (WebI), Lumira, Oracle 11g, XML, web services, SAP HANA 2.0

Duration : April 2018 to November 2018

Key Roles and Responsibilities:

Handling priority/problem tickets, incidents and housekeeping activities

Troubleshooting Data Services/BODS connectivity issues with source systems and the SOAP web service

Develop code migration methodology for SAP BODS across Dev, Test, Prod landscape

Migrating Internal, Partner, Admin, Principal, Marketing, Promotion, Service, Pricing and Super users

Master data, user core attributes, roles and Sell-to/Bill-to information

Sources include the CNA UAM Oracle schema and LZ (flat files)

The target system is a SOAP web service; interim data is posted to an Oracle DB

Validate against the HR database to classify users as internal or external

Generating a unique identity for each user using a function (see the sketch below)
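
A minimal sketch of such a function, assuming a deterministic ID derived from one core attribute (the attribute and namespace choices are illustrative, not the project's actual rule):

    # Sketch: derive a stable, repeatable identity per user from an email
    # address; uuid5 is deterministic, so the same input yields the same ID.
    import uuid

    def user_identity(email: str) -> str:
        return str(uuid.uuid5(uuid.NAMESPACE_DNS, email.strip().lower()))

    print(user_identity("jane.doe@example.com"))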

Convert user-level codes for Internal and Partner users based on requirements

Worked on SQL transforms, DB links and stored procedures

Wrote PL/SQL queries to extract source data to Data Services staging

Consolidate data from source systems and build the XML structure; see the sketch below
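
Illustratively, the consolidation step might assemble a payload like the one below; element names are hypothetical placeholders, not the actual service schema:

    # Sketch: consolidate user rows into one XML payload for the SOAP target.
    import xml.etree.ElementTree as ET

    users = [{"id": "U1", "role": "Partner"},
             {"id": "U2", "role": "Internal"}]

    root = ET.Element("Users")
    for u in users:
        node = ET.SubElement(root, "User", id=u["id"])
        ET.SubElement(node, "Role").text = u["role"]

    print(ET.tostring(root, encoding="unicode"))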

Managed Data Services error handling and success results to trace every user migration

Worked on incremental loading for all dimensions after the initial load, from staging to the data mart (star schema)

Data migration to SAP HANA Enterprise; built information models (calculation views) and SQL scripting

Sending and receiving IDocs from SAP through BODS integration systems

Involved in Unit and Integration Testing on reports and then exported them to Schedule Manager for Scheduling and Refreshing.

Wrote and performed data conversions/functions in PL/SQL (Oracle 11g) before data migration

Worked on BO universes (IDT), WebI and Crystal Reports as per business requirements

Used Information Steward for data profiling, validation rules, scorecards and metadata management

Worked on the Data Integrator transforms History_Preserving, Table_Comparison and XML_Pipeline

Setting up of Data store connection to read source data from multiple source systems in BODS

Worked on scripts while reading files from a folder using a while loop in Data Services

Extensively used the BODS Management Console to schedule and execute jobs, manage repositories and perform metadata reporting

Created BODS Jobs, Work Flows, Data Flows, and Scripts using various Transforms (Integrator, Quality and Platform) to successfully load data from multiple sources into a desired target

Extracted large volumes of data (SDI) into the HANA DB using the ABAP dataflow transform in BODS

Retrieved data using SQL Select statements as part of performance tuning in Data services 4.2

Extensively worked on data validation and reconciliation across BW, ECC, BO, HANA and Data Services; see the sketch below
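
The simplest form of such a check compares row counts and measure totals between source and target; the sketch below uses mocked data purely for illustration:

    # Sketch: reconcile a target extract against its source on row count
    # and an amount total. Keys and amounts are mocked.
    source = [("M100", 10.0), ("M200", 25.5)]
    target = [("M100", 10.0), ("M200", 25.5)]

    checks = {
        "row_count": (len(source), len(target)),
        "amount_sum": (sum(a for _, a in source), sum(a for _, a in target)),
    }
    for name, (s, t) in checks.items():
        print(name, "OK" if s == t else f"MISMATCH: source={s} target={t}")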

Client : AMGEN (Life Sciences)/NJ & Bangalore

Environment : SAP HANA 1.0, BODS 4.2, BI 4.2, Tableau, BW 7.4, IP

Duration : July 2015 to Feb 2018

Key Roles and Responsibilities:

Handling priority/problem tickets, incidents and housekeeping activities

Monitoring and Scheduling process chains.

Handling RFCs through the ServiceNow tool

Created Native HANA information models: Attribute, Analytic and Calculation views

Worked on explain-plan tables, visual plans and SQL tracing as part of performance tuning in Native HANA

Configured and replicated data from SAP/non-SAP sources to SAP HANA using SAP Landscape Transformation (SLT); extracted external system data using SDI/SDA

Built graphical calculation views using table functions as the data source for complex calculations in Native HANA

Worked on performance optimization of SQL queries/models using Explain Plan, Visualization Plan and Expensive Statements trace

Good experience working with HANA-optimized Calculation Engine functions (CE functions for data source access & relational operators)

Built calculation views on virtual tables with the Smart Data Access (SDA) mechanism in HANA

Converted Attribute and Analytic views to Calculation views to reduce data transfer between engines in HANA

Created SQL-script-based calculation views

Analyzed and fixed real-time SLT replication delta failures in HANA

Followed Best practices/Performance tuning in designing and building HANA information models

Transported HANA model objects from source to target systems using HANA Application Lifecycle Management

Functional verification activities of Development, Quality and Integration systems

Enhancing reports in Lumira and WebI; working on a Lumira 2.0 proof of concept with HANA as the source

Worked on linking datasets and building and publishing Lumira stories to the BI platform

Created reports including formatting options such as Chart, Grouping, Sorting, Alerter, Ranking, Breaks, Sections, parameterized prompts.

Involved in Unit and Integration Testing on reports and then exported them to Schedule Manager for Scheduling and Refreshing.

Modified existing BO universes and WEBI reports as per Business requirements.

Created Crystal Reports when needed and exported them to the enterprise repository to make them accessible

Support Business Objects WebI reports and universe-handling issues through incidents

Used Information Steward for data profiling, validation rules, scorecards and metadata management

Extensively used BODS Management Console to schedule and execute jobs, manage repositories and perform metadata reporting.

Worked on scripts while reading files from a folder using a while loop in Data Services

Client : STATOIL (Oil & Gas)/Capgemini/Bangalore

Environment : SAP BODS, BO 4.0, Tableau, HANA SPS 11, IP, BPC 10.0, BW 7.0

Duration : Jan 2014 to June 2015

Key Roles and Responsibilities:

Involved in the Statoil BO/BW transition from IBM to Capgemini

Handling priority/problem tickets, incidents and housekeeping activities

Monitoring and Scheduling process chains.

Handling RFCs through the ServiceNow tool

Created classic and SQL-script analytic privileges and assigned them to users to restrict data access in Native HANA

SAP process areas involved: SD, CRM, HR, Treasury & Payments, Accounting & Controlling, Plant O&M

Business Objects reports on HANA modeling constructs (Attribute, Analytic & Calculation views)

Created data visualization dashboards using embedded Excel sheets integrated with BEx queries

Worked on complex reports that require running totals, sub-totals, sub-reports, cross-tab and conditional display of groups, fields, and messages.

Created, managed & formatted standard reports, master/detail reports and cross tab reports using Report Manager

Developed crystal reports and performed unit testing

Migration of BO reports, folders and universes from one repository to another

Support Business Objects WebI reports and universe-handling issues through incidents

Developed and executed unit test and integration test plan for BO reports

Data load experience with SAP BW and SuccessFactors as targets using Data Services

Extensively worked on Data Services (code migration, error handling, recovery mechanisms, IDocs)

Used the SQL transform to read data from the source system (Oracle) in BODS

Setting up projects, jobs, workflows and dataflows and grouping them uniquely while moving across environments in BODS

Extensive use of Data Integrator and Platform transforms such as Key_Generation, Table_Comparison, Case, Merge, Validation, SQL and Query for transformations

Created profile and central repositories to manage data consistency in BODS

Migrated data for SAP FICO and HCM module objects such as Vendors, Customers, Materials, Purchase Orders, Cost Centers and Profit Centers, plus transactional data such as AP, AR and GL, to a single ECC 6.0 system in BODS

Performed full pushdown, source- and target-based performance tuning, and bulk loading in BODS

Worked on Data Services Management console to create users and grant privilege to access repositories

Client : Philadelphia Energy Solutions (Energy Oil & Gas)/Capgemini/Bangalore

Environment : SAP BODS, BI 7.3, SQL Server

Duration : June 2013 – December 2013

Key Roles and Responsibilities:

Involved in designing technical specification document for PES_BW

Involved in build phase of the project

Developed BI data flows using flat-file extraction

Responsible for creation of test scripts and documentation

Involved in Data Loading & Process chain Monitoring

Involved in unit testing, test case reviews and documentation

Created Crystal Reports when needed and exported them to the enterprise repository to make them accessible

Design and Develop BODS batch jobs for Material Master, Purchase orders, Asset Master and Batch Determination

Use of try/catch and while-loop constructs when splitting records into multiple files in BODS; see the sketch below
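
A rough Python analogue of that try/catch-plus-loop pattern (chunk size and file names are illustrative, not the project's actual values):

    # Sketch: split records into fixed-size chunks across multiple output
    # files, with a catch block so a failed write surfaces to the scheduler.
    records = [f"record_{i}" for i in range(10)]
    chunk_size = 4

    try:
        for n, start in enumerate(range(0, len(records), chunk_size)):
            with open(f"out_part{n}.txt", "w") as f:
                f.write("\n".join(records[start:start + chunk_size]) + "\n")
    except OSError as err:
        print(f"write failed, aborting so the job can be retried: {err}")
        raise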

Use BODS conditional clauses such as WHERE, GROUP BY and ORDER BY while restricting records

Generating tab-delimited output files as per the load program

Publishing data quality reports for quick status meetings and decision making

Assisting business users in modifying and correcting data as per the data quality report

Objects worked on as part of the data reconciliation activity: PO vendor master, Asset PO, WBS master, asset location and cost center in BODS

Client : Target (Retail)/Capgemini/Bangalore

Environment : SAP BI 7.3, BO XI R3, SQL Server

Duration : December 2012 - June 2013

Key Roles and Responsibilities:

Involved in requirements gathering, designing technical specification document for Project Systems

Involved in BUILD and UAT phase of Project Systems Module (Master Data and Transaction Data)

Responsible for creation of test scripts and documentation

Involved in planning and co-ordination of tasks

Involved in Report testing, test case reviews and documentation

Involved in unit testing, test case reviews and documentation

Migration of existing SSIS reports to Crystal Reports from scratch

Migrated the reports using LCM and import wizard

Developed and executed unit test and integration test plan for BO reports

Creation and maintenance of BO Universe and reports

Worked on CMC module in creating user groups and folders and managing security in BO

Client : Centrica (Gas & Energy)/Capgemini/Bangalore

Environment : SAP BODS, BI 7.3, SAP CRM, SAP IS-U, Oracle

Duration : June 2012 – December 2012

Key Roles and Responsibilities:

Involved in designing the technical specification document for the British Gas Data Warehouse

Involved in build phase of BGDW (Master Data and Transaction Data)

Developed BI data flows using LSA methodology

Involved in enhancing CRM & IS-U DataSources

Responsible for creation of test scripts and documentation

Involved in data loading & process chain monitoring

Test case reviews and documentation

Involved in unit testing, test case reviews and documentation

Understanding the client's business setup and structuring the object repository in Business Objects Data Services

Developed and implemented solutions with data warehouse, ETL, data analysis and BI reporting technologies

BODS transforms used most: Case, Merge, Sequence Generator, Query, Map_Operation, Table_Comparison, SQL and Validation

Created Data Flows to load data from flat file, CSV files with various formats into Data Warehouse.

Involved in creating batch jobs for data cleansing and address cleansing.

History preservation: capturing slowly changing dimension data at every interval, script writing, and preparing data from an initial-load perspective in BODS

Client : Fossil (Lifestyle brands)/Capgemini/Bangalore

Environment : SAP BI7.3, Oracle

Duration : December 2011 – May 2012

Key Roles and Responsibilities:

Involved in requirements gathering and in technical pre-upgrade and post-upgrade activities

Responsible for technical pre-upgrade and post-upgrade activities

Responsible for creation of test scripts and documentation

Involved in planning and co-ordination of tasks

Co-ordination in India with respect to testing

Involved in unit testing, test case reviews and documentation

Client : Vodafone (Telecom)/IBM/Bangalore

Environment : SAP BI 7.0, BO, ECC 6.0, Oracle

Duration : Jan 2010 – Nov 2011

Key Roles and Responsibilities:

Interacting with business analysts for requirements gathering and clarifications; provided weekly status updates on project progress

Created Cubes, DSOs, MultiProviders, InfoSets, Transformations, InfoPackages and DTPs

Involved in implementing logic at Start Routine level

Created Process Chains for automating data load process

Created Open Hub destinations for extracting data from InfoCubes into flat files/tables

Involved in developing Interaction Analysis Queries using Restricted Key figures, Calculated Key figures, Filters, Free Characteristics, and Variables.

Worked on Report-to-Report Interfaces (RRI) to drill down to detailed reports

Involved in transporting all objects (InfoObjects, queries, DSO objects, routines, programs, etc.) into the BI test system after validating and testing them in development

Prepared project documents such as the transport checklist, go-live checklist and support document, and provided support training to the support teams

Created Unit Test Cases and Performed Unit testing

Extensively involved in system testing and User Acceptance Testing

Created Crystal Reports, including graphical, formula-based and well-formatted reports, according to user requirements

Enhancements in BO universes and reports

Interacted with users for analyzing diverse BO reports and scheduling them

Involved in the creation of WebI, Crystal and dashboard reports according to user requirements

Client : Vodafone (Telecom)/IBM/Bangalore

Role : Application Developer

Environment : C#, Windows application, SQL Server 2005

Duration : March 2007 – Dec 2009

Description: The Unified Front End (UFE) was built as a single interface for Vodafone stores and customer care services. The legacy system that existed prior to UFE used Siebel and SAP applications to support call centre and store activities. Even when optimized, these applications were not process oriented, did not allow optimal service times for repetitive, simple operations, implied a substantial training period, and required extensive software licensing.

The main objectives behind the creation of UFE are:

To have a single interface which can be customized according to the user profile

An interface which is independent of backend systems (Smart Client)

Easy to use (reduced training for users and short handling time)

Key Roles and Responsibilities:

Creation of the High-Level Design (HLD) document from the feasibility document

Involved in preparing the Detailed Design (LLD) document

Preparing Unit Test Plan and Unit Test Result document

Decision making on UI design of new applications; Suggesting changes to customer

Coding of the functionality from the Detailed Design (LLD) documents

Database related activities and implementation

Preparing release notes using open-source wiki software

Writing scripts for database related changes

Developing Windows Services, Web Services

Client : Royal Bank of Canada (Banking - CANADA) /IGate/Bangalore

Environment : C#, ASP.NET, SQL Server 2000, JavaScript

Duration : July 2006 – Jan 2007

Key Roles and Responsibilities: Designing UI, Coding, Unit Testing.

Safekeeping is a banking project to generate fee files for specific Canadian regions: Ontario, International and Quebec

Developed on the OPS Tools framework; modules are Excluded Accounts, Fee File Generation, TaxMaster, Confirmation Page and Fee File Generation Report

The OPS Tools framework is designed with XML and web services using YDO0_DEV (SQL Server) as the database

Project workflow: the UI is built from user controls

The Excluded Accounts screen displays previously excluded accounts; these accounts can also be re-included for fee file generation

From the Account Filtration/Fee File Generation page, accounts are checked or unchecked with the available checkboxes. The fee file is displayed on the Confirmation screen, and a PDF report can be generated. TaxMaster updates tax details (GST, QST, HST) for the Ontario, International and Quebec regions

Education

Bachelor's Degree (B.Tech), CSIT, India


