Location:
Bolingbrook, IL
Posted:
December 03, 2018


Rohit Rastogi

E-mail: ac7u4g@r.postjobfree.com Phone: 925-***-****

Summary

Extensive experience with the ETL tool SAP BusinessObjects Data Services, designing and developing jobs with complex mappings, transforms, workflows and data flows, and configuring and scheduling workflows.

IT experience in Analysis, Design, Development and Implementation of Data Warehousing and Database applications using SAP Data Services (BODS-ETL).

Hands-on experience with SAP HANA Cloud Integration for Data Services (SAP HCI-DS) and SAP Integrated Business Planning (SAP IBP).

Used SAP Data Services to connect to SAP ECC 6.0 and SAP BW as source and target systems; also loaded data from PSA to data targets using transformations and DTPs.

Experienced with Data Quality Management, Metadata Management, Data Profiling, Master Data Management and Data Model Standards.

Experience in Migration of Jobs and workflows from Development to Test and to Production Servers to perform the integration and system testing.

Experienced in data conversions, data integration and data migration with specialization in BODS; expert in Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating target tables to maintain history.

Efficient in Optimizing, Debugging and testing SQL queries, views and stored procedures written in Teradata, Oracle and SQL Server.

Worked with SMEs and Solution Architects to design and implement the best solutions.

Proficient in Data Analysis in BI.

Excellent Technical, Interpersonal, and Communication skills along with Time and Project management.

Experience

Ulta Beauty September 2018 - Present

Bolingbrook, Illinois

Role: BODS Developer

Ulta Beauty, Inc. is the largest beauty retailer in the U.S. and the premier beauty destination for cosmetics, fragrance, skin care products, hair care products and salon services.

Involved in BODI/BODS (Business Objects Data Services) projects including end-to-end implementation, data migration and upgrades.

Working on multiple projects simultaneously.

Extensively used SAP Data Services to create ETL jobs and put data into staging tables and transform it further to populate the data warehouse tables.

Proficiency in data warehousing techniques for Slowly Changing Dimension phenomenon, surrogate key assignment and Change Data Capture.

Used Data Services Management Console and Tidal to schedule and execute jobs, manage repositories and perform metadata reporting.

Drilled analytics down to detail-level data.

Involved in client meetings to understand the requirements.

Loaded data into the SAP ECC system using IDocs.

Designed various Data Flows for extracting data from various sources including flat files and relational tables.

Implemented conditional flows to run either a full load or a delta load.

Handled exceptions using Try/Catch blocks.

Worked with different transforms such as History Preserving, Table Comparison, Validation, SQL, Case and Row Generation.

Used SQL queries (inner joins, outer joins) and DDL/DML commands to test data at the database level.

Created custom functions to implement reusable logic.

Participated in business requirements analysis, development of technical specifications, testing and Implementation.

Created starting and ending scripts for each job, wrote scripts to send job notifications to users, and declared local and global variables.

Working with various cross functional teams and the business users.

Toshiba March 2018 – September 2018

Irvine, CA

SAP BODS Lead

Toshiba is a $60 billion global company employing nearly 200,000 people in 30 countries. Its diversified products and services include information technology and communications equipment and systems, electronic components and materials, consumer electronics, household appliances, medical equipment, and office equipment.

Responsibilities:

Led the Oracle ERP upgrade project (from 11.5.10 to 12.2.6).

Participated in all phases of Software Development Life Cycle SDLC (Requirement Analysis, Design, Development, Testing and Documentation).

Created complex Jobs, Work Flows, Data Flows, and scripts using various Transforms to successfully load data from multiple sources into a desired target.

Proficiency in data warehousing techniques for Slowly Changing Dimension phenomenon and Change Data Capture.

Extensively used the Data Services Management Console to schedule and execute jobs and manage repositories.

Participated in data mapping workshops and follow-up meetings with client business teams.

Designed and developed ETL routines in BODS, modified or created new Data Services jobs to meet project requirements.

Interpreted, configured and implemented data mapping and transformation business rules in Data Services.

Performed SAP data migration using Business Objects Data Services as the ETL tool.

Used the Case, Validation and Merge transforms and the lookup function to validate Customer Master basic data.

Moved the jobs into QA and Production.

Well versed in using Performance Tuning techniques to improve Throughput of data.

Fluent in SQL coding using tools such as Oracle SQL Developer, TOAD and MS SQL Server Management Studio 2008.

Experience in creating Type I and Type II SCD (Slowly Changing Dimension) jobs.

McKesson Corporation Feb 2017 – Feb 2018

San Francisco, CA

SAP BODS Developer - Consultant

McKesson helps health care providers improve business health and deliver better care to patients. As the largest pharmaceutical distributor and a health care information technology company, McKesson provides systems and technology solutions for medical supply management, clinical workflow, practice management, pharmacy automation and care management.

Responsibilities:

Participated in business requirements analysis, report designing and modeling, development of technical specifications, testing and Implementation.

Responsible for extracting data from ECC (using BAPI calls and the RFC direct-read method) and other legacy systems into SAP BW, SQL Server and SAP HANA using SAP Data Services 4.2.

Collaborated with business teams to develop BODS jobs that extract, transform and load data, populating and refreshing DW tables from Oracle (IW) tables, flat files and SQL tables.

Analyzed business requirements and worked closely with various application and business teams to develop ETL procedures that are consistent across all applications and systems.

Wrote ETL design documents per the ETL coding standards, with mapping documents, and performed transformation reviews.

Worked with SAP Information Steward for data profiling and created custom cleansing packages.

Performed column, address, dependency, redundancy and uniqueness profiling.

Created complex validation rules and rule bindings for data profiling according to business requirements.

Generated data quality reports on the dashboard for data analysis.

Responsible for data loads into IBP from SAP ECC, flat files and BW on HANA.

Performed data extracts from the SAP HANA-based IBP system to file formats, then loaded them back into BW on HANA and pushed them to ECC using BAPI calls through BODS.

Extensively worked on creating HCI projects, tasks and processes to perform ETL from SAP BW/SQL Server into SAP IBP, and on flat files to send the data back to SAP.

Worked on identifying the SAP ECC -> SAP BW -> SAP HCI mapping.

Tested the data loaded into IBP using the IBP Excel Plug-in.

Scheduled HCI tasks to run weekly and fixed any failures when schedules failed.

Prepared and delivered Functional and Technical specification documents for all the HCI Tasks.

Responsible for publishing BODS batch jobs as web services, as well as through executable commands, so they could be called from the third-party scheduler (Maestro).

Created starting and ending scripts for each job, wrote scripts to send job notifications to users, and declared local and global variables.

Designed and wrote jobs to fetch data from SAP BW using the Open Hub Destination, kicking off process chains and pushing the data to BW DSOs using external source system configurations.

Expert in implementing SCD1, SCD2 and SCD3 ETL concepts for direct updates and history preservation using the target-based Table Comparison, History Preserving and Map Operation transforms.

Created several jobs for loading the data to SQL Server 2008 from flat files, data files, excel files, oracle tables, oracle views.

Created jobs to perform the full initial data loads and then switch these jobs to Delta mode to append the delta data in the existing data warehouse.

Involved in SAP ECC R/3 source extraction using SAP source extractors and SAP functions making BAPI calls with BODS 4.1/4.2.

Created physical design documents, technical specification documents for all ETL jobs.

Extensively worked on error handling and performance tuning of programs and processes by using concepts like parallel execution, recovery mechanism, degree of parallelism (DOP), increasing number of concurrent loaders and bulk-loading options.

Provided knowledge transfer to end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Loaded data from multiple sources into multiple data targets with different transformations and DTPs by scheduling different updates.

Responsible for the execution, monitoring and scheduling of data services jobs using Data Services Management Console (Administrator) and used web services to publish the jobs.

Used Data Services Workbench to load and provision data into SAP HANA.

Environment:

SAP BO Data Services (Ver 4.1/Ver 4.2), SAP Information Steward (Ver. 4.2), SAP Business Objects XI R3.1, Oracle 11g, Designer, SQL Server 2008, Windows Server 2008 R2, HANA 1.x, IBP 1708/1711, HCI DSoD

EMC2 Corporation July 2016 – Feb 2017

Santa Clara, CA

Data Warehousing Analyst/ SAP Data Services (Solution Developer)

EMC is a global leader in enabling businesses and service providers to transform their operations and deliver information technology as a service (ITaaS). Through innovative products and services, EMC accelerates the journey to cloud computing, helping IT departments to store, manage, protect and analyze their most valuable asset, information, in a more agile, trusted and cost-efficient way.

Responsibilities:

Extensively used SAP Data Services to create ETL jobs and put data into staging tables and transform it further to populate the data warehouse tables.

Worked with multiple joins in the IBP task dataflow and filtered data based on business transformation rules.

Ran and scheduled tasks for (1) inbound data loads into IBP and (2) outbound extraction from IBP to Oracle EDW database tables.

Monitored and interpreted logs and fixed issues as they arose.

Worked with the SAP Basis team to upgrade the sandbox and production agents per SAP recommendations.

Responsible for loading the ECC Master Data (Attributes data into Simple and Compound Master Data Objects) into IBP first and then load the Key Figures data.

Set up processes in IBP to maintain dependencies among the dataflows, essentially sequencing the dataflows.

Loaded the Key Figures data (Transactional Data like Demand Sales History) after the corresponding master data loads.

Responsible for Extracting Data from ECC and other legacy systems into SAP BW and SQL server using SAP Data Services (Data Integrator).

Responsible for the execution, monitoring and scheduling of data services jobs using Data Services Management Console (Administrator) and used web services to publish the jobs.

Defined datastores and configurations in SAP Data Services to connect to R/3 and SAP BW as source and target systems.

Used SAP Business Objects Data Services to create work flows and data flows to load the data into SAP BW staging tables (PSA) and transform it further to populate the DW tables.

Developed complex mappings in BODS to load the data from various sources into the Data Mart, using different transformations like Query, Case Transformations, Table Comparison, Merge, Pivot, Reverse Pivot, Lookup, Key Generation, History Preserving, Date Generation, Map Operation and Validation etc.

Created several jobs for loading the data to SQL Server 2008 from flat files, data files, excel files, oracle tables, oracle views.

Created jobs to perform the full initial data load and then switched the jobs to Delta mode to append the delta data in the existing data warehouse (EDW).

Involved in SAP R/3 source extraction using SAP source extractors with BODS 4.0.

Responsible for pushing the daily delta data load to SAP APO.

Defined file format to use as source and target files and developed several reusable transformations and custom functions.

Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.

Involved in migrating jobs from development to testing and then finally to the production.

Created physical design documents, technical specification documents for all ETL jobs.

Involved in error handling and performance tuning of data flows.

Expertise in dimensional data modeling, Star Schema Modeling, Snow-Flake Modeling, fact and dimensional table design, physical and logical data modeling.

Extensively used Try/Catch blocks to handle exceptions and wrote scripts to automate the job process.

Responsible for creating and maintaining database objects and structures (e.g. tables, indexes, views, triggers, functions, procedures).

Environment:

SAP BO Data Services (Ver.4.0), SAP HCI, SAP IBP 4.x, EIS SmartOps 6.10 SP2, SAP Business Objects XI R3.1, Oracle 11g, Designer, SQL Server 2008, Windows Server 2008 R2.

Levi Strauss & Co. Sep 2015 – June 2016

San Francisco, CA

SAP BODS/ETL Developer

Levi Strauss & Co. is one of the world's largest brand-name apparel marketers, with sales in more than 110 countries. The project involved migrating and integrating sales, distribution and marketing data from different legacy systems, then loading this data into SAP BW and Teradata systems. The data is then consumed from the data marts to report measures using Business Objects tools such as Universe Designer, WebI and Crystal Reports.

Responsibilities:

Developed and supported the Extraction, Transformation and Load (ETL) process using Business Objects Data Services to populate tables in the data warehouse and data marts.

Expertise in gathering, analyzing and documenting business requirements, functional requirements, and data specifications for Business Objects Universes and Webi Reports.

Defined datastores and configurations to connect to R/3 and SAP BW as source and target systems.

Extensive experience in implementation of Data Cleanup procedures, transformations, Scripts, Stored Procedures and execution of test plans for loading the data successfully into the targets.

Loaded SAP R/3 tables and developed transformations that apply the business rules given by the client and loaded the data into the target database.

Identified bugs in existing mappings by analyzing the data flow and evaluating transformations, fixed the bugs, and redesigned the existing mappings to improve performance.

Used SAP Data Services Data Quality to develop components that would help to clean data elements like addresses and match entity names.

Expert on Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating Target tables for maintaining the history.

Extensively used the Query, Map Operation, Table Comparison, Merge, Case, SQL and Validation transforms, along with the lookup function, to load data from source to target systems.

Created starting and ending scripts for each job, wrote scripts to send job notifications to users, and declared local and global variables.

Defined file format to use as source and target files.

Performed ABAP data flow processing using Data Services.

Wrote functions to make code reusable and used lookup functions to access reference table data.

Extensively used Try/Catch to handle exceptions and wrote scripts to automate the job process.

Experience in Migration of Jobs and workflows from Development to Test and to Production Servers to perform the integration and system testing.

Created physical design documents, technical specification documents for all ETL jobs.

Performed unit testing and data validation testing of all mappings end to end, as well as UAT.

Used check-in/check-out procedure to import and export projects/jobs for secure access to central repository objects and maintaining version history.

Experience in Debugging and Performance Tuning of targets, sources and mappings.

Created several jobs for loading the data to Teradata 13.0 from flat files, data files, excel files, oracle views and SQL tables.

Designed, developed, deployed and maintained universes using Business Objects designer.

Developed custom hierarchies to support drill up/down reports, and created cascading LOVs (Lists of Values) to be used for cascading prompts.

Environment:

SAP Data Services XI (Ver. 3.2), SAP Business Objects 4.0, Designer, Web Intelligence, Info View, Crystal Reports 2008, Teradata 13.0 (Teradata SQL Assistant), Oracle 10g, MS-SQL, SQL Server, Toad, Windows 2003 Server.

Ameriprise Financial Ltd. Oct 2011 – June 2013

Gurugram, Haryana, India

Role: SQL Developer

Responsibilities:

Designed, developed, and maintained relational databases.

Documented the business requirements and technical requirements (from a database perspective) in Confluence.

Worked on creating databases and various SQL Server objects (schemas, tables, indexes, indexed views) per the specifications and requirements to meet business functionality.

Worked with business users to design the database model and maintained referential relationships for data integrity.

Translated business needs into data analysis, business intelligence data sources and reporting solutions for the clients depending upon business requirement.

Created indexes to improve performance and retrieve data faster.

Expert in using tools such as MS SQL Profiler, Database Tuning Wizard and Windows performance monitoring for monitoring and tuning SQL Server performance.

Recovered the databases (from backups) to a specific point of time, as per the requests.

Troubleshot various problems arising in day-to-day work and fixed the issues.

Created views and assisted the QA team in performing data validation.

Mercer Aug 2010 - Sep 2011

Gurugram, Haryana, India

Role: SQL Developer

Responsibilities:

Analyzed project information requirements, data relationships and attributes, producing data flows, program structures and outputs for the projects using Crystal Reports and other tools.

Provided support to the finance team in reconciling the financial system.

Analysis, design, production support, tuning and maintenance of Oracle stored procedures and SQL Scripts.

Developed an in-depth knowledge of the company's data-related processes and systems. Worked on problems of diverse scope where analysis of situations or data required a review of a variety of factors.

Coded Oracle PL/SQL procedures and performed code reviews for database objects such as views, tables, stored procedures, functions and packages; wrote ad-hoc queries, tuned SQL, handled data-loading tasks and produced documentation.

Provided direct support to end users.

Provided recommendations for improving processes and procedures.

Education

Master of Science (Computer Science) 2013 - 2015

Post Graduate Diploma in Computer Application 2009 – 2010

Bachelor of Commerce 2006 - 2009


