Data Customer Service

Location: Fremont, CA
Salary: $130,000
Posted: February 10, 2017


Summary

Sagar is a seasoned SAP BODS/ETL lead with over 11 years of experience in the analysis, design, development, testing, maintenance and implementation of business application systems for the retail, service management and healthcare sectors. He is also a certified Hadoop developer with experience in Spark, Pig, Hive, Sqoop, Flume, HBase and Scala.

Experience in data migration using SAP BODS from SAP ECC, SQL Server and Oracle to SAP HANA.

4 years of experience as a senior technical lead and architect, with strong hands-on knowledge of ETL technologies.

Experience in SAP Information Steward for master data management, using Data Insight, Metadata Management and Cleansing Packages for profiling and cleansing master data.

Worked on ETL processing and extracting data from multiple source environments.

Good knowledge of different phases of Data warehousing projects.

Experience with BW extractors such as Profit and Loss, Cost Control, Inventory and Material Ledger.

Worked extensively on legacy-to-SAP migration in Business Objects Data Services using IDocs, BAPIs and LSMW.

Experience migrating Data Services jobs from version 3.2 to 4.1 within an existing environment.

Good exposure to implementing Rapid Deployment Solutions (RDS) as part of master data and transaction data migration.

Experience replicating SAP system data into SAP HANA via SLT, using LTRC, mass transfer IDs and data provisioning.

Experience in BODS ETL performance tuning.

1+ years of experience as an onsite coordinator.

Very good experience and knowledge of SAP BODS administration.

Knowledge of HANA modeling.

Experience in people management and stakeholder management.

Experience in debugging execution errors using Data Services logs (trace, monitor and error) and by examining the target data.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Migrated and tested jobs in different instances and validated the data by comparing source and target tables.

Expertise in writing, testing, debugging, documenting and implementing complex batch jobs to extract data from a variety of data sources, transform the data, and load it to specified destinations.

Very good knowledge of Business Objects universes, with WebI and Crystal Reports testing experience.

Worked on CDC (Change Data Capture): Map CDC operation, Table Comparison, Query transforms and History Preserving (a minimal sketch of this pattern appears at the end of this summary).

Used transforms such as Query, Case, Key Generation, Merge, Date Generation and Pivot, and functions such as ifthenelse, substr, lookup and lookup_ext.

Experience in big data projects using Apache Hadoop and the Cloudera distribution.

Experience working in an Agile methodology with the Scrum process.

Very good knowledge of Business Objects installation and integration with Service Desk Manager.
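
As an illustration of the Table Comparison / History Preserving pattern referenced above, here is a minimal Python sketch of an SCD Type 2 merge. It is not BODS syntax; the table, key and column names are invented for the example, and inside BODS this is done with the built-in transforms rather than hand-written code.

    from datetime import date

    # Hypothetical SCD Type 2 merge: what Table Comparison + History
    # Preserving do inside a BODS dataflow, written out by hand.
    HIGH_DATE = date(9999, 12, 31)

    def scd2_merge(target_rows, source_rows, key, tracked_cols, load_date):
        """Return the (action, row) pairs an SCD Type 2 load would emit."""
        # Index the currently-open target versions by business key.
        current = {r[key]: r for r in target_rows if r["valid_to"] == HIGH_DATE}
        actions = []
        for src in source_rows:
            existing = current.get(src[key])
            if existing is None:
                # Unknown key: Table Comparison would flag this row INSERT.
                actions.append(("insert", {**src, "valid_from": load_date,
                                           "valid_to": HIGH_DATE}))
            elif any(src[c] != existing[c] for c in tracked_cols):
                # Changed row: close the open version, then insert the new
                # one (History Preserving behaviour).
                actions.append(("close", {**existing, "valid_to": load_date}))
                actions.append(("insert", {**src, "valid_from": load_date,
                                           "valid_to": HIGH_DATE}))
            # Unchanged rows are discarded, as Table Comparison does.
        return actions

    # Example: customer 1 moved city, customer 2 is new.
    target = [{"cust_id": 1, "city": "Fremont",
               "valid_from": date(2015, 1, 1), "valid_to": HIGH_DATE}]
    source = [{"cust_id": 1, "city": "San Jose"}, {"cust_id": 2, "city": "Dallas"}]
    for action, row in scd2_merge(target, source, "cust_id", ["city"],
                                  date(2016, 6, 1)):
        print(action, row)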

Educational Qualifications

Postgraduate: Master of Computer Applications, Berhampur University, 2003.

Graduate: Bachelor of Commerce, Berhampur University, 2000.

Certification and Achievements

CCA Spark and Hadoop Developer by Cloudera.

Received awards multiple times from previous and current employers.

Nominated as Subject Matter Expert for Business Objects and GIS.

IBM DB2 Fundamentals certified.

Work Experience

Numbers Only, Inc. USA as SAP BODS Lead from June 2016

Capgemini as Sr. Consultant (P5), from Dec 2013 to May 2016.

Computer Associates (CA) as Sr. Software Engineer from Jan 2008 to Dec 2013.

HCL Technologies Ltd., Hyderabad as a Software Engineer from May 2006 to Jan 2008.

Professional Experience

Lam Research, California

Client: Lam Research June 2016 to date

Project: SO BOM - Migration from ECC, SQL Server and Oracle to HANA

Designation: SAP BODS/Big Data Lead

Project Description

Lam Research Corporation is an American corporation that designs, manufactures, markets and services semiconductor processing equipment used in the fabrication of integrated circuits. Its products are used primarily in front-end wafer processing, which involves the steps that create the active components of semiconductor devices and their wiring. The company also builds equipment for back-end wafer-level packaging and for related manufacturing markets such as microelectromechanical systems. The data is pulled from the different source systems, in the form of text files and Oracle tables, into staging and the enterprise data warehouse (EDW).

Responsibilities

Playing the role of Sr. Technical Lead cum Onsite Coordinator for Dev & QA

Point of contact at Onsite for ETL deliverables

Guiding and monitoring offshore team and reviewing all the ETL deliverables

Building stage environments and loading data

Discussing the requirements with the client, onsite manager and data modelers.

Prepare Data Mapping Reports (DMR) and create ETL mappings to load data from text files to staging and the EDW.

Created complex Jobs, Work Flows, Data Flows and Scripts using various transforms (Data Integrator, Data Quality and Platform) to load data from multiple sources into the desired targets

Extensively used Data Services Management Console to schedule and execute jobs, manage repositories and perform metadata reporting

Used the Open Hub Service to push data out of SAP BI using BODS

Generate Data quality and Data profiling reports.

Worked with Match Transforms, Address Cleanse and Data Cleanse Transforms for US Data Cleansing in Data Quality.

Used Data Integrator and Platform Transforms extensively.

Worked on Change Data Capture at both source and target level for SCD (Slowly Changing Dimension) Types 1, 2 and 3.

Customizing code as per Lam Research project framework, adhering to all processes and standards of Lam Research.

Moved the projects from DEV to Central repository, Central repository to UAT and UAT to Production.

Scheduled the jobs, managed users and repositories in Management Console.

Test the code in the development environment and document changes and issues (a small source-to-target validation sketch follows this list)

Monitoring and Managing day to day ETL/ELT project needs

Functional Testing and Regression Testing

Documentation, Reporting and Mentoring new team members
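
A minimal sketch of the kind of source-to-target validation used when testing these loads: compare row counts and simple column aggregates between the two systems. It uses in-memory SQLite so it runs standalone; the table and column names are placeholders, and in practice the two connections would point at the real source and target databases.

    import sqlite3

    # Self-contained sketch: compare row counts and column totals between
    # a "source" and a "target" connection. In-memory SQLite stands in
    # for the real databases; table and column names are placeholders.
    def validate(src_conn, tgt_conn, table, numeric_cols):
        checks = {"row_count": "SELECT COUNT(*) FROM {t}"}
        for col in numeric_cols:
            checks[f"sum({col})"] = f"SELECT TOTAL({col}) FROM {{t}}"
        mismatches = []
        for name, sql in checks.items():
            src_val = src_conn.execute(sql.format(t=table)).fetchone()[0]
            tgt_val = tgt_conn.execute(sql.format(t=table)).fetchone()[0]
            status = "OK" if src_val == tgt_val else "MISMATCH"
            print(f"{table}.{name}: source={src_val} target={tgt_val} {status}")
            if status == "MISMATCH":
                mismatches.append(name)
        return mismatches

    # Demo: the target is missing one row, so both checks should fail.
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for conn in (src, tgt):
        conn.execute("CREATE TABLE so_bom (item_id INTEGER, qty REAL)")
    src.executemany("INSERT INTO so_bom VALUES (?, ?)", [(1, 2.0), (2, 5.0)])
    tgt.execute("INSERT INTO so_bom VALUES (1, 2.0)")
    print(validate(src, tgt, "so_bom", ["qty"]))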

Environment: Business Objects Data Services 4.1/4.2, HANA 2.0, Oracle 10g, MS SQL Server 2008, SAP ECC and SAP BW

Capgemini, New Jersey

Client: UNILEVER Dec 2014 to May 2016

Project: Finance Connect

Designation: SAP BODS Technical Lead

Project Description

Unilever is one of the major consumer care companies in the world. The main objective of Finance Connect is to provide a finance master data warehouse that can be used across all Unilever applications. Currently, master data such as customers, products and channels is scattered across different systems; the project migrates this data from the different systems into a Teradata data warehouse. The data is pulled from the source systems, in the form of text files and Oracle tables, into staging and the enterprise data warehouse (EDW).

Responsibilities

Playing the role of Sr. Technical Lead for Dev & QA

Prepare Data Mapping Reports (DMR) and create ETL mappings to load data from text files to staging and the EDW (a simplified sketch of such a load follows this list).

Created complex Jobs, Work Flows, Data Flows and Scripts using various transforms (Data Integrator, Data Quality and Platform) to load data from multiple sources into the desired targets

Extensively used Data Services Management Console to schedule and execute jobs, manage repositories and perform metadata reporting

Worked with BW extractors such as Profit and Loss, Cost Control, Inventory and Material Ledger.

Used the Open Hub Service to push data out of SAP BI using BODS

Generate Data quality and Data profiling reports.

Worked with Match Transforms, Address Cleanse and Data Cleanse Transforms for US Data Cleansing in Data Quality.

Used Data Integrator and Platform Transforms extensively.

Worked on Change Data Capture at both source and target level for SCD (Slowly Changing Dimension) Types 1, 2 and 3.

Customizing code as per Unilever project framework, adhering to all processes and standards of Unilever.

Test the code in the development environment and document changes and issues

Building stage environments and loading data

Monitoring and Managing day to day ETL/ELT project needs

Functional and Regression Testing

Documentation, Reporting and Mentoring new team members
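
A simplified sketch of a delimited-text-file-to-staging load of the kind produced from the DMRs above. The pipe-delimited layout, staging table and column names are invented for illustration; in the project itself these loads were implemented as BODS dataflows rather than Python.

    import csv
    import io
    import sqlite3

    # Simplified text-file-to-staging load. The file layout and the
    # stg_customer table are invented for the example.
    sample_file = io.StringIO(
        "customer_id|name|country\n"
        "100|Acme Ltd|GB\n"
        "200|Globex Corp|US\n"
    )

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE stg_customer (
                        customer_id INTEGER,
                        name        TEXT,
                        country     TEXT,
                        load_ts     TEXT DEFAULT CURRENT_TIMESTAMP)""")

    # Parse the delimited file and bulk-insert into the staging table.
    reader = csv.DictReader(sample_file, delimiter="|")
    rows = [(int(r["customer_id"]), r["name"].strip(), r["country"].strip())
            for r in reader]
    conn.executemany(
        "INSERT INTO stg_customer (customer_id, name, country) VALUES (?, ?, ?)",
        rows)
    conn.commit()
    print(conn.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0],
          "rows staged")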

Environment: Business Objects Data Services 4.1/4.2, Oracle 10g, MS SQL Server 2008, SAP ERP, SAP BW and Windows XP

Capgemini, Chicago

Client: SABMILLER Mar 2014 to Nov 2014

Project: Migration from ECC to ECC

Designation: SAP BODS Technical Lead

Project Description

SABMiller plc (LSE: SAB, JSE: SAB) is a British multinational brewing and beverage company headquartered in London, England. It is the world's second-largest brewer measured by revenue (after the Belgian-Brazilian Anheuser-Busch InBev) and is also a major bottler of Coca-Cola. Its brands include Foster's, Grolsch, Miller Brewing Company, Peroni Nastro Azzurro and Pilsner Urquell. It has operations in 75 countries across Africa, Asia, Australia, Europe, North America and South America and sells around 21 billion litres of lager per year.

Responsibilities:

Discussing the requirements with the client, onsite manager and data modelers.

Using Master Data Reports (MDR), create ETL mappings to load data from source to target, and generate data quality and data profiling reports.

Customizing code as per the RDS (Rapid Deployment Solution) within the SABMiller project framework, adhering to all SABMiller processes and standards.

Good exposure to implementing Rapid Deployment Solutions (RDS) as part of master data and transaction data migration.

Test the code in development environment, documenting changes and issues

Worked on complex Jobs, Workflows, Dataflows and Scripts for ETL processing using BODS transforms (Platform, Data Integrator, Data Quality)

Worked on the vendor master, customer master and material master IDocs: CREMAS, DEBMAS and MATMAS.

Environment: Business Objects Data Services 4.0, Oracle 10g, MS SQL Server 2008, SAP ECC and SAP BW

Capgemini, Hyderabad, India Dec 2013 to Feb 2014

Client: Unilever Company

Project: Master Data Track (MDT)

Designation: SAP BODS Technical Lead

Project Description

Unilever is one of the major consumer care companies in the world. The main objective of Master Data Track (MDT) is to provide Unilever with a master data warehouse that can be used across all Unilever applications. Currently, master data such as customers, products and channels is scattered across different systems; the project migrates this data from the different systems into a Teradata data warehouse. The data is pulled from the source systems, in the form of text files and Oracle tables, into staging and the enterprise data warehouse (EDW).

Responsibilities

Discussing the requirements with the client, onsite manager and data modelers.

Prepare Data Mapping Reports (DMR) and create ETL mappings to load data from text files to staging and the EDW.

Created complex Jobs, Work Flows, Data Flows and Scripts using various transforms (Data Integrator, Data Quality and Platform) to load data from multiple sources into the desired targets

Extensively used Data Services Management Console to schedule and execute jobs, manage repositories and perform metadata reporting

Worked with BW extractors such as Profit and Loss, Cost Control, Inventory and Material Ledger.

Used the Open Hub Service to push data out of SAP BI using BODS

Generate Data quality and Data profiling reports.

Worked with Match Transforms, Address Cleanse and Data Cleanse Transforms for US Data Cleansing in Data Quality.

Used Data Integrator and Platform Transforms extensively.

Worked on Change Data Capture at both source and target level for SCD (Slowly Changing Dimension) Types 1, 2 and 3.

Customizing code as per Unilever project framework, adhering to all processes and standards of Unilever.

Migrated Data Services jobs from version 3.2 to 4.1 within the existing environment

Test the code in the development environment, documenting changes and issues

Environment: Business Objects Data Services 4.1, BODS XI 3.2, Oracle 10g, MS SQL Server 2008, SAP ERP, SAP BW, TOAD and Windows XP

Client: Computer Associates International, Islandia, Jan 2008 to Dec 2013

Project: GIS

Designation: Sr. Software Engineer

Project Description

Service desk software from CA Technologies lets you optimize the business user's support experience and deliver high-quality, consistent IT service support with Service Desk Manager. It makes it easy to automate incident, problem and knowledge management, interactive support, self-service and advanced root cause analysis, and to deliver superior end-user support with simplified change and configuration management. Its extensive automated support tools let you resolve issues, achieving a higher quality of customer service while lowering costs for your business.

Automate incident, problem, change and knowledge management

Automate interactive support, self-service and advanced root cause analysis

Achieve a higher quality of customer service while lowering costs for your business

Responsibilities

Played a key role during migration and tested jobs in different instances and validated the data by comparing source and target tables

Involved in writing, testing, debugging, documenting and implementing complex Batch jobs to extract data from a variety of data sources, transform the data, and load the data to specified destinations

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings

Took complete end-to-end ownership of Business Objects universe, WebI and Crystal Reports validation and review before moving them to production.

Took complete ownership of Business Objects installation and integration with Service Desk Manager.

Took complete ownership of Business Objects installation and certification on behalf of customers, and incorporated the required changes into the release notes.

Took complete ownership of data loading from the different sub-products as sources back into the main server database (the central MDB) as the target, using different transforms.

Analyze existing system to understand table metadata, source systems, file formats, Data Types, Column Names, Field lengths and schedule.

Configure Data Stores and File formats.

Design Jobs, Work Flows, Data Flows and scripts to do ETL processing using BODS Transformations.

Use Platform, Data Integrator and Data Quality transforms. Execute and Monitor the ETL jobs and scheduled Batch jobs using Data Services Management Console.

Load the test data to estimate job run time and Performance Statistics and optimize/tune the job.

Used pushdown where applicable. Performed data profiling, including column and relationship profiling, on source data.

Worked on ETL development using multiple sources, both datastores and file formats (XLS, CSV, XML, DB2, Oracle, SAP ERP and BW).

Used Information Steward to build data profiling rules.

Worked on Data cleansing activities like address cleanse and data cleanse.

Used transforms like Query, Case, Merge, Date and Key Generation. Used Map CDC, Table Comparison and History Preserving.

Writing Pig scripts, HiveQL and Sqoop jobs (an illustrative Sqoop wrapper follows this list)

Test Planning and Execution. Perform Peer review, estimations, and KT sessions. Meeting with end users for freezing requirement.
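
An illustrative wrapper for the Sqoop imports mentioned above. The JDBC URL, credentials, table and HDFS target directory are placeholders, and the sketch assumes the sqoop CLI is installed and on the PATH.

    import subprocess

    # Illustrative Sqoop import wrapper; all connection details below are
    # placeholders, not the project's actual systems.
    def sqoop_import(jdbc_url, user, password, table, target_dir, mappers=4):
        cmd = [
            "sqoop", "import",
            "--connect", jdbc_url,
            "--username", user,
            "--password", password,   # a --password-file is safer in practice
            "--table", table,
            "--target-dir", target_dir,
            "--num-mappers", str(mappers),
        ]
        print("Running:", " ".join(cmd))
        subprocess.run(cmd, check=True)  # raises CalledProcessError on failure

    if __name__ == "__main__":
        sqoop_import("jdbc:oracle:thin:@dbhost:1521:ORCL", "etl_user", "secret",
                     "MDB_ASSETS", "/data/staging/mdb_assets")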

Environment: Business Objects Data Services 4.0, BODS XI 3.2, GRLoader, Hadoop (MapReduce), Pig, Hive, Sqoop, Flume, shell scripting, Oracle 10g, MS SQL Server 2008, SAP ERP, SAP BW, BO XI 3.1, DB2, Universe Designer, Web Intelligence, Windows XP, Linux, Solaris and AIX

HCL Technologies Ltd., Hyderabad, India Jan 2006 to Jan 2008

Client: Wolters Kluwer

Project: WK - Engagement

Project Description

WK is a multinational publishing company active in legal, tax, business, medical, scientific and educational publishing. CCH is one of WK's brands and a leading provider of tax research and software products in the United States. To automate its products, CCH started the ProSystem fx(r) Engagement project. ProSystem fx(r) Engagement provides powerful tools to help prepare audit and tax workpapers and reports in Microsoft Word and Excel. The core of the Engagement system is the trial balance. Engagement contains two modules: Engagement Administrator and Workpaper Management.

Responsibilities

Test Script Review for Automation

Analyzing and understanding the Requirement Specification.

Participated in preparation of Automation Framework.

Identifying the Automation scenarios.

Participated in Designing and Configuring Central Repository.

Designing and executing the test scripts. Involved in preparing test data.

Analyzing the results and reporting the defects.

Communicating with the development team regarding bugs and other issues.

Environment: QTP 8.2, QC, C++, VB, ASP, .NET, Windows 2000

HCL Technologies Ltd., Hyderabad, India May 2006 – Dec 2006

Client: LexisNexis, USA

Project: Client Test System (CTS)

Project Description

The project (LN-CTS) is aimed at testing LexisNexis infrastructure beyond QLRB by building and maintaining a testing application by the name of CTS (Client Test System). The application is meant to simulate actual production scenarios and test the search and retrieval applications of LexisNexis for accuracy of data and performance. Any changes in the applications that are part of the search and retrieval infrastructure are first tested using CTS and released to production once the concerned code change has been certified. CTS has to be appropriately enhanced for it to be able to certify any code change in the infrastructure layer of LexisNexis. This makes it indispensable in the development cycle of LexisNexis applications.

Responsibilities

Tested and certified enhancements like JNORM changes, Rosetta 4.1 Cite Normalization, Rosetta 5 NLS fix & Rosetta R5.1 Global Source Directory.

Recruited offshore staff to build out the offshore Rosetta team.

Review functional and design specifications.

Develop Test Plans, Test Scenarios & Test strategies. Prepare test documents and set the test environment. Running test cases and posting defects using PVCS.

Prepared Test Logs and Problem reports.

Reporting and prioritizing software bugs with the development and QA managers.

Environment: C++, J2EE, Assembly language, EJB, Mainframes, XML, Sun Solaris Release 5.8


