
Data Manager

Location:
Fremont, CA
Posted:
December 27, 2019


Resume:

ANUSHA MUPPALA
Email: *********@*****.***

Phone: 510-***-****

Professional Summary

* ***** ** ********** ** IIS DataStage 9.1, IIS DataStage 8.5, DataStage 8.1 & Ascential DataStage 7.5 EE (PX) with various roles and responsibilities.

Excellent experience with IBM Information Analyzer.

Good work experience in UNIX shell scripting.

Good work experience in Teradata, DB2, Oracle and SQL.

Good knowledge of Data Warehousing concepts.

Work experience with the Control-M scheduling tool, MKS and RTC.

Excellent interpersonal, communication, technical and management skills; strong logical and analytical skills; a quick learner.

Expertise in Extraction, Transformation and Loading (ETL) of data from various operational sources such as Oracle and flat files into a Data Warehouse.

Knowledge of Tableau.

Educational Qualification

MCA from Osmania University, Hyderabad, with 77%.

Technical Expertise

ETL

IBM IIS DataStage 9.1, IBM IIS DataStage 8.5, WebSphere DataStage 8.1, DataStage Version Control, DataStage SE 7.5

Languages

SQL, PL/SQL Programming, HTML and UNIX Shell Scripting

Database

Teradata, DB2, SQL, Oracle

Operating System

Unix, Linux

Scheduling Tools

Control-M

Other Tools

Information Analyzer, MKS and RTC

Learnings

Tableau

Projects

Employer

Titan Technologies Inc.

Onsite: May 2019 to Present
Offshore (Mumbai, India): Jan 2017 to May 2019

Role: Analyst Programmer

CLIENT

Mercadien Group (Hamilton, NJ)

Role

Development and Testing

Team size

4

Environment

DataStage 9.1, Oracle, UNIX Shell Scripting & Information Analyzer.

Project Description

Was involved in a data analytics project for the Mercadien Group's tax accounting and financial services group. Mercadien Group was awarded a multi-year SBA contract to provide data analytics services for the SBA's Small Business Loans program.

The backend data analytics project work was outsourced to Titan Technologies.

Responsibilities

As part of the offshore data analytics team, I was responsible for the following:

Involved in analyzing the data with Information Analyzer; extracted the data from flat files, loaded it into staging tables, applied the business logic in the transformation jobs, and loaded the results into the target Teradata tables.

Involved in developing and testing the data loads, and provided knowledge transfer (KT) to the UAT team on the business requirements.

Designed parallel jobs using the Sort, Join, Lookup, Funnel, Transformer and Filter processing stages.

Created monthly data load extracts from all SBA loan servicing agencies for the Data Analytics team; the file sources were several thousand CSV, Access, SQL and flat files.

Wrote extensive backend SQL scripts to scrub and load data files into a DB2 database for each month's analytics reporting services (a simplified scrub-and-load sketch appears after this list).

Designed ETL technical specifications and performed analysis.

Conducted live sessions to demonstrate prototypes, run UAT labs and resolve issues.

Involved in defect analysis.

Involved in preparing a test case document and writing test cases in Quality Center (QC).
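
The following is a minimal sketch, not actual project code, of the kind of shell-driven scrub-and-load step described above. The directory layout, file names and the STG.STG_LOANS staging table are illustrative assumptions, and it assumes the IBM DB2 command line processor (db2) is configured on the host.

#!/bin/sh
# Illustrative scrub-and-load step: clean a monthly CSV extract and load it into DB2.
# Directories, file names and the staging table are assumptions for this sketch.

SRC_DIR=/data/sba/incoming
WRK_DIR=/data/sba/work
MONTH=$(date +%Y%m)

for f in "$SRC_DIR"/loans_*.csv; do
    base=$(basename "$f" .csv)
    # Scrub: drop the header row, strip carriage returns, remove blank lines
    tail -n +2 "$f" | tr -d '\r' | grep -v '^[[:space:]]*$' > "$WRK_DIR/${base}_${MONTH}.dat"
done

# Load the scrubbed files into a DB2 staging table (db2 CLP assumed to be available)
db2 connect to SBADW
for d in "$WRK_DIR"/loans_*_"$MONTH".dat; do
    db2 "IMPORT FROM $d OF DEL COMMITCOUNT 1000 INSERT INTO STG.STG_LOANS"
done
db2 connect reset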

Future Business Intelligence Landscape (FBIL), IKEA, May 2014 to August 2016

CLIENT

IKEA

Role

Designing and Testing

Team size

30

Environment

DataStage 9.1, Teradata, UNIX Shell Scripting & Information Analyzer.

Project Description

Currently the IKEA BI architecture landscape does not meet business expectations and does not support a growing IKEA. The long-term vision is to streamline the way IKEA collects and manages data, restructure the complex current landscape and create the pre-conditions to deliver business value efficiently. By establishing a new IKEA common DWH (CLA/IDSS), the journey will start towards a new BI landscape that, in the long run, will be used to consolidate today's outdated and complex BI landscape and cater to the business needs of the future. The legacy data warehouses use different standards and practices for making information available to their information consumers. Consolidating several legacy data warehouses will save money while giving the possibility of adding new capabilities for data quality and traceability, which will increase trust in the data.

Responsibilities

Producing high-quality work products and deliverables on time (zero roll-backs or backouts), within budget and within the allowable Delivery Excellence metrics, adhering to process.

Taking on additional team responsibilities and remaining flexible for the benefit of the project.

Ensuring timely and prompt reporting of risks, issues and dependencies.

Proficient in developing strategies for the Extraction, Transformation and Loading (ETL) mechanism.

Expert in designing parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Funnel and Transformer.

Designing reusable components (parameter sets, job sequencers) which can be used across the project.

Creating UNIX shell scripts for the audit process (a simplified sketch appears after this list).

Expert in working with DataStage Manager, Designer and Director.

Involved in performance tuning to improve the performance of the DataStage jobs.

Involved in Unit Testing, System Testing and UAT.

Involved in creating data lineage reports with Metadata Workbench.

Involved in deployment of code in UAT.

Involved in defect analysis.

Involved in analyzing the data through Information Analyzer.

Involved in creating test case documents.

Involved in interacting with the testing team for any correction of data.

Involved in writing test cases in QC.

Providing knowledge transfer (KT) to the team in the required areas.
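
A minimal sketch of the kind of UNIX audit script referred to in this list is shown below; it reconciles the record count in a source file against the row count reported by the load job and appends an audit log entry. The paths, file names and log format are illustrative assumptions only.

#!/bin/sh
# Illustrative audit script: compare the record count in the source file against
# the row count reported by the load job, and append the result to an audit log.
# Paths, file names and the log format are assumptions for this sketch.

SRC_FILE=$1          # incoming flat file
LOADED_COUNT=$2      # row count reported by the DataStage load job
AUDIT_LOG=/data/idss/audit/audit_$(date +%Y%m%d).log

SRC_COUNT=$(wc -l < "$SRC_FILE")

if [ "$SRC_COUNT" -eq "$LOADED_COUNT" ]; then
    STATUS=PASS
else
    STATUS=FAIL
fi

echo "$(date '+%Y-%m-%d %H:%M:%S')|$(basename "$SRC_FILE")|source=$SRC_COUNT|loaded=$LOADED_COUNT|$STATUS" >> "$AUDIT_LOG"

# A non-zero exit code signals the job sequencer that the audit failed
[ "$STATUS" = "PASS" ] || exit 1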

Clean Well of Data (CWoD), HSBC, July 2012 to April 2014

CLIENT

HSBC, US

Role

Designing, Testing, Production Implementation and Maintenance

Team size

11

Environment

DataStage 8.1, DB2, UNIX Shell Scripting & Control-M.

Project Description

The Clean Well of Data (CWoD) is the foundation for management reporting, analysis and modeling for HSBC Mortgage Corporation, US. The CWoD is the central source of data and of the tools to mine those data for valuable business intelligence. Data are cleansed and standardized when loaded to the CWoD; where multiple sources exist for a field, the best source is selected for each loan. Commonly used aggregates and derived data (such as product hierarchy) are calculated and loaded as fields that can be queried. We receive flat files from different source systems, cleanse them and load them into staging tables. The data is then read from the staging tables, the business logic is applied, and the results are loaded into the target tables. From the CWoD database, the reporting team uses SAS to create reports.

Responsibilities

•Responsible for gathering the requirements from the onsite team and analyzing the requirements.

•Involved in creating the mapping sheet and design document.

•Understanding the technical specifications and developing DataStage jobs for the Extraction, Transformation, Cleansing and Loading processes of the DW.

•Designing the parallel jobs using various stages such as Transformer, Lookup, Join, Filter, Funnel, Copy, Remove Duplicates, Sort and Sequential File.

•Designing reusable components (parameter sets, job sequencers) which can be used across the project.

•Used DataStage as an ETL tool to extract data from source systems and load it into the Teradata, Oracle and DB2 databases.

•Extensively involved in creating UNIX shell scripts for the audit process.

•Reviewing the code delivered by the team members and raising defects in QC.

•Designing, compiling, validating, running, monitoring, exporting and importing the DataStage jobs using the client components.

•Involved in performance tuning to improve the performance of the DataStage jobs.

•Involved in Unit Testing, System Testing and UAT.

•Involved in scheduling the jobs using the Control-M tool (a wrapper-script sketch appears after this list).

•Involved in deployment of code in production.

•Involved in defect analysis.

•Extensively involved in validation of data.

•Involved in interacting with the source team for any correction of data in the files.

•Involved in writing test cases in QC.

•Involved in production support.
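
As an illustration of how such DataStage jobs can be launched from a Control-M scheduled script, the following is a minimal wrapper sketch. The project name, job name and install path are hypothetical, and it assumes the DataStage dsjob command-line client is available and the dsenv environment file is at the default location.

#!/bin/sh
# Illustrative Control-M wrapper: run a DataStage job via the dsjob client and
# return a non-zero exit code on failure so the scheduler can alert or retry.
# Project name, job name and install path are assumptions for this sketch.

. /opt/IBM/InformationServer/Server/DSEngine/dsenv   # DataStage environment (path assumed)

PROJECT=CWOD_PROD
JOB=seq_load_cwod_daily

# With -jobstatus, dsjob waits for the job to finish and its exit code reflects the job status
dsjob -run -jobstatus "$PROJECT" "$JOB"
RC=$?

# Status 1 (finished OK) and 2 (finished with warnings) are treated as success here
if [ "$RC" -eq 1 ] || [ "$RC" -eq 2 ]; then
    exit 0
fi

echo "DataStage job $JOB failed with dsjob status $RC" >&2
exit "$RC"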

Global Data Management (GDM), HSBC, March 2011 to June 2012

CLIENT

HSBC, US

Role

Designing and Testing

Team size

6

Environment

DataStage 8.1, DB2, UNIX Shell Scripting & Control-M.

Project Description

HBUS (HSBC Bank Group) wants a DW/BI solution to deliver Global Data Management for the group.

The GDM solution is a group-wide solution for analytics. The current GDM solution is built and deployed in SAS for multiple countries in Asia. It provides an analytics solution for the Deposits, Loans, Investment, Insurance, Cards, Applications, Channels, Campaign, Contact and Customer subject areas of HSBC. The process has daily and monthly files, which involve account/customer-level information, transaction data and summarized data as per the business requirements. The source data for GDM is the OHBI interface staging tables.

Responsibilities

•Analyzing the RSS document received from the user.

•Design and development of jobs.

•Mapping data items from the source system to the target system.

•Understanding the technical specifications and developing DataStage jobs for the Extraction, Transformation, Cleansing and Loading processes of the DW.

•Designing reusable components (parameter sets, job sequencers) which can be used across the project.

•Involved in Unit Testing, System Testing and UAT.

•Extensively involved in creating UNIX shell scripts for the audit process.

•Designing, compiling, validating, running, monitoring, exporting and importing the DataStage jobs using the client components.

•Involved in scheduling the jobs using the Control-M tool.

•Involved in deployment of code in production.

•Extensively involved in validation of data.

•Involved in interacting with the source team for any correction of data in the files.

•Involved in writing test cases in QC.

•Involved in production support.


