Data Manager

Location:
Bentonville, AR
Salary:
120000
Posted:
May 03, 2018

Resume:

PROFESSIONAL ASSETS

ETL Developer

+1-410-***-****

Certified IBM DataStage professional

Email: ac5b2b@r.postjobfree.com

Summary:

Over 12 years of experience with Fortune companies in system analysis, design, development, and testing, including over 10 years implementing data warehousing, data integration, data migration, and client-server applications using ETL tools like IBM DataStage 11.5/9.1/8.5/7.x and Informatica 7.x/8.x, along with SAP BW, Teradata, DB2, SQL procedural language, and shell scripts. Industry experience in banking and financial services, marketing, refinery, and retail.

Extensive experience in client/server architecture and good knowledge of the Software Development Life Cycle.

Expert in the insurance, finance, and retail domains.

Strong data warehousing experience using IBM DataStage Components like DataStage Designer, DataStage Director, DataStage Administrator and Parallel Extender, ERwin, SQL Server, Oracle PL/SQL and involved in all the stages of Software Development Life Cycle (SDLC).

Proven track record in troubleshooting of DataStage jobs and addressing production issues like performance tuning and enhancement.

Thorough understanding of DW principles (fact tables, dimension tables, and dimensional data modeling with star and snowflake schemas).

Worked on various transformations like Expression, Joiner, Lookup, Sorter, Filter, and Aggregator, as well as mapplets and reusable transformations, using Informatica PowerCenter Designer.

Experienced in developing UNIX shell scripts for file validation, batch-process execution, and summarized data-load reporting.
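
A minimal sketch of such a validation wrapper, assuming a hypothetical inbound feed and a control file that carries the expected row count (all names and paths below are illustrative, not from any actual project):

    #!/bin/ksh
    # Hypothetical file-validation wrapper for a daily feed.
    DATA_FILE=/data/inbound/customer_20180503.dat
    CTL_FILE=/data/inbound/customer_20180503.ctl   # holds the expected row count

    # 1. The feed must exist and must not be empty.
    if [ ! -s "$DATA_FILE" ]; then
        echo "ERROR: $DATA_FILE is missing or empty" >&2
        exit 1
    fi

    # 2. Reconcile the actual record count against the control file.
    actual=$(wc -l < "$DATA_FILE")
    expected=$(cat "$CTL_FILE")
    if [ "$actual" -ne "$expected" ]; then
        echo "ERROR: record count mismatch (expected $expected, got $actual)" >&2
        exit 2
    fi

    # 3. Summarized report of the data load.
    echo "Load summary: $DATA_FILE validated, $actual records ready at $(date)"

A wrapper like this would typically run from the scheduler just before the batch load is triggered.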

Experience in implementing full lifecycle management to capture and migrate DataStage ETL metadata, including data mappings and other data integration artifacts (such as schedulers and scripts), across environments, including the establishment of standards, guidelines, and best practices.

Experienced in Data Warehousing, Application Development and Production Support.

Experienced in conducting design review sessions with different stakeholders and code review sessions with developers.

Good experience with Oracle 11g/10g/9i/8i, DB2, Teradata 13/14, SAP HANA, flat files, and XML sources.

Worked on loading data from several flat-file sources to staging using Teradata loading utilities like TPump, MultiLoad (MLOAD), FastLoad (FLOAD), and BTEQ.
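
As an illustration, a minimal FastLoad (FLOAD) invocation wrapped in a shell script might look like the sketch below; the TDPID, credentials, database, table, and file names are placeholders, not taken from any actual project:

    #!/bin/ksh
    # Hypothetical FastLoad script: bulk-loads a pipe-delimited flat file
    # into an empty Teradata staging table.
    fastload <<'EOF'
    LOGON tdprod/etl_user,etl_password;
    DATABASE stg_db;
    SET RECORD VARTEXT "|";
    BEGIN LOADING stg_db.customer_stg
        ERRORFILES stg_db.customer_err1, stg_db.customer_err2;
    DEFINE cust_id   (VARCHAR(10)),
           cust_name (VARCHAR(100))
    FILE = /data/inbound/customer.dat;
    INSERT INTO stg_db.customer_stg VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;
    EOF

FastLoad requires an empty target table, which is why loads like this land in staging; incremental applies would use MultiLoad or TPump instead.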

Expertise in creating reusable components such as shared containers and local containers, and in creating custom transformations, functions, and routines.

Experienced in onshore-offshore coordination across development and production support teams.

Excellent understanding of the DBA functions including creating tables, table spaces, databases and indices.
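
As a sketch of the DDL involved (shown here for Oracle through SQL*Plus; the schema, tablespace, and file names are invented for illustration):

    #!/bin/ksh
    # Illustrative DDL only: every identifier below is a placeholder.
    sqlplus -s etl_user/etl_password@ORCL <<'EOF'
    CREATE TABLESPACE stage_ts
        DATAFILE '/u01/oradata/orcl/stage_ts01.dbf' SIZE 500M;

    CREATE TABLE stg.customer_stg (
        cust_id   NUMBER(10)    NOT NULL,
        cust_name VARCHAR2(100)
    ) TABLESPACE stage_ts;

    CREATE INDEX stg.customer_stg_ix1 ON stg.customer_stg (cust_id);
    EXIT;
    EOF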

Extensive experience in UNIX Shell Scripting.

Good knowledge of Agile (Scrum) and Waterfall methodologies.

Good knowledge of the finance, banking, manufacturing, retail, and insurance domains.

Worked on scheduling tools like Crontab, CA7, and Control-M for automating job runs.

Possess good knowledge of the onsite-offshore model, proven problem-solving skills, a good conceptual foundation, and the ability to handle pressure.

Excellent track record as a strong team player with effective communication, analytical and multi-tasking skills.

Results-driven and self-motivated.

CERTIFICATIONS:

Completed the C2090-303 – IBM InfoSphere DataStage v9.1 certification.

Completed the HSBC internal DataStage Consultant (Professional) certification.

Received the Feather in the Cap award for best performance during the DI migration project.

Received the Rise award for delivering high-profile projects as a team.

EDUCATIONAL QUALIFICATIONS

Master of Computer Science from Marathwada University, Aurangabad, India.

Bachelor of Computer Science from Kakatiya University, Warangal, India.

TECHNICAL SKILLS:

Operating Systems: Windows NT/2000/2003, MS-DOS, AIX, Linux.

RDBMS: Oracle 11g/10g/9i/8i, Teradata 13.x/14.x, SAP HANA, IBM DB2.

Data Warehousing: Star and Snowflake schema modeling, facts and dimensions, physical and logical data modeling.

ETL Tools: IBM InfoSphere DataStage 11.5/9.1/8.x/7.x (Manager, Designer, Director, Administrator), IBM InfoSphere Information Analyzer, DataStage Version Control, Informatica PowerCenter 7.x/8.x.

Languages: SQL, UNIX shell programming, C, C++.

Scheduling Tools: Control-M, CA7, Tivoli Batch Scheduler, Crontab, DataStage Scheduler.

BI Tools: Cognos, Tableau.

DB Tools: Teradata SQL Assistant, TOAD 10.6/9.55, SQL*Plus, Oracle Developer.

Testing/Defect Tracking: HP Quality Center.

Release Management: CA Global Service Desk R12, JIRA.

Other Tools: Slack, LeanKit, SAPLGPAD, OneNote, JIRA, MS Visio 2010, PuTTY, UltraEdit, FileZilla/WinSCP, Excel, etc.

EXPERIENCE

Miracle Software Systems, Inc. Client: Walmart Jul 17 – Present

Sr. ETL (DataStage) Developer, Project Name: DataStage Upgrade

Domain: Retail

Walmart is an American multinational retail corporation that operates a chain of discount department stores and warehouse stores, headquartered in Bentonville, Arkansas.

Description: This project involves the migration of each module's jobs from DataStage 9.1 to 11.5, along with modifying the complete set of UNIX scripts into Red Hat Linux scripts. The main aim is to capture the data in the data warehouse and make it available in reports for business users in the new environment.

Responsibilities:

Designed the ETL processes using DataStage to load data from Teradata, DB2, HANA, Informix, flat files (fixed width), and XML files to the staging database, and from staging to the target SAP HANA data warehouse database.

Gathered requirements from the client and implemented projects end to end.

Migrated 1,200+ DataStage jobs from v9.1 to v11.5.

Worked in the onsite-offshore model, coordinating teams and following up on deliverables.

Extensively worked on job sequences to control the execution of the job flow using various activities and triggers (conditional and unconditional), including Wait For File, Email Notification, Sequencer, and Exception Handler activities.

Wrote new UNIX scripts and modified existing ones to suit the new environment and requirements.

Worked closely with the Business Intelligence, SAP BI, SAP Basis, DI Admin, Mainframe, and Teradata/DB2 DBA teams.

Involved in every step of project planning and analysis.

Used DataStage Parallel Extender stages such as Dataset, Sort, Lookup, Change Capture, Funnel, Peek, Row Generator, and SCD.

Used the SAP R/3 pack to extract data directly from the SAP DB2 database.

Experienced with SAP (ABAP/BAPI/IDoc) stages.

Migrated data from DB2 on the mainframe to Teradata, handling billions of records.

Also experienced with the SAP BI stage.

Extensively worked with the Teradata connector; loaded high volumes of data into the HANA database.

Used the above stages in accomplishing the ETL coding.

Developed job sequencers with proper job dependencies, job control stages, and triggers.

Used the DataStage Director and its run-time engine to monitor the running jobs.

Involved in performance tuning and optimization of DataStage mappings, using features like partitioning and data/index caches to manage very large volumes of data.

Parameterized all the parallel DataStage jobs.
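
For example, a parameterized parallel job can be launched from the command line with the dsjob client; the project, job, and parameter names below are hypothetical:

    #!/bin/ksh
    # Run a parameterized DataStage job and check its final status.
    PROJECT=DSUpgrade            # placeholder project name
    JOB=seq_load_customer        # placeholder sequence name

    dsjob -run \
          -param PRM_SRC_DIR=/data/inbound \
          -param PRM_RUN_DATE=2018-05-03 \
          -jobstatus "$PROJECT" "$JOB"
    rc=$?

    # With -jobstatus, dsjob waits for completion and its exit code reflects
    # the final job status (1 = finished OK, 2 = finished with warnings).
    case $rc in
        1|2) echo "$JOB completed with status $rc" ;;
        *)   echo "ERROR: $JOB failed with status $rc" >&2; exit 1 ;;
    esac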

Experienced in working with the mainframe CA7 scheduling tool.

Worked closely with the DataStage Admin, SAP BI, and SAP Basis teams.

Organized data in the reports using filters, sorting, and ranking, and highlighted data with alerts.

Involved in data validation using Excel.

Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, and expected results; prepared test data and loading for testing, error handling, and analysis.

Extensively worked on error handling, data cleansing, and performing lookups for data access.

Involved in unit and system testing to check whether data extracted from the different source systems loaded into the target according to the user requirements.

Involved in production support, working on various mitigation tickets created while users were retrieving data from the database.

Involved in all steps of development and deployment processes.

Coordinated with the offshore team on a daily basis.

Environment: IBM InfoSphere DataStage 9.1/11.5 (Designer, Director, Manager, Analyzer), DB2, HANA, Teradata 12.0, SQL Server 2005/2008/2012, TOAD 9.6, SQL*Plus, shell scripts, CA7 scheduler.

Miracle Software Systems, Inc. Client: Walmart Sep 16 – Jun 17

Sr. ETL (DataStage) Developer, Project Name: SAP HANA Migration

Domain: Retail

Walmart is an American multinational retail corporation that operates a chain of discount department stores and warehouse stores, headquartered in Bentonville, Arkansas.

Description: This project migrates SAP BW to SAP HANA. SAP BW currently uses DB2 as its database, and the project migrates that DB2 database to HANA. The SAP HANA migration was very challenging, as it involved migrating the whole infrastructure, including the hardware, from older versions to newer versions. After establishing the HANA connection in QA, testing was carried out; the source path for both projects remained the same, and the jobs in both projects ran in parallel. The migration covered both the ODS and the warehouse: DB2, HANA, UNIX, ETL servers, CA7 servers, and source servers were all moved onto new servers.

Responsibilities:

Designed the ETL processes using DataStage to load data from Teradata, DB2, HANA, Informix, flat files (fixed width), and XML files to the staging database, and from staging to the target SAP HANA data warehouse database.

Gathered requirements from the client and implemented projects end to end.

Worked in the onsite-offshore model, coordinating teams and following up on deliverables.

Extensively worked on job sequences to control the execution of the job flow using various activities and triggers (conditional and unconditional), including Wait For File, Email Notification, Sequencer, and Exception Handler activities.

Wrote new UNIX scripts and modified existing ones to suit the new environment and requirements.

Worked closely with the Business Intelligence, SAP BI, SAP Basis, DI Admin, Mainframe, and Teradata/DB2 DBA teams.

Involved in every step of project planning and analysis.

Used DataStage Parallel Extender stages such as Dataset, Sort, Lookup, Change Capture, Funnel, Peek, Row Generator, and SCD.

Used the SAP R/3 pack to extract data directly from the SAP DB2 database.

Experienced with SAP (ABAP/BAPI/IDoc) stages.

Migrated data from DB2 on the mainframe to Teradata, handling billions of records.

Also experienced with the SAP BI stage.

Extensively worked with the Teradata connector; loaded high volumes of data into the HANA database.

Used the above stages in accomplishing the ETL coding.

Developed job sequencers with proper job dependencies, job control stages, and triggers.

Used the DataStage Director and its run-time engine to monitor the running jobs.

Involved in performance tuning and optimization of DataStage mappings, using features like partitioning and data/index caches to manage very large volumes of data.

Parameterized all the parallel DataStage jobs.

Experienced in working with the mainframe CA7 scheduling tool.

Worked closely with the DataStage Admin, SAP BI, and SAP Basis teams.

Organized data in the reports using filters, sorting, and ranking, and highlighted data with alerts.

Involved in data validation using Excel.

Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, and expected results; prepared test data and loading for testing, error handling, and analysis.

Extensively worked on error handling, data cleansing, and performing lookups for data access.

Involved in unit and system testing to check whether data extracted from the different source systems loaded into the target according to the user requirements.

Involved in production support, working on various mitigation tickets created while users were retrieving data from the database.

Involved in all steps of development and deployment processes.

Coordinated with the offshore team on a daily basis.

Environment: IBM InfoSphere DataStage 9.1 (Designer, Director, Manager, Analyzer), DB2, HANA, Teradata 12.0, SQL Server 2005/2008/2012, TOAD 9.6, SQL*Plus, shell scripts, CA7 scheduler.

AirySoft Inc, Grand Rapids, MI, USA Mar 16 – Jun 16

Computer Programmer, Project Name: Tedpros Analytical Systems

Domain: SCM (Supply Chain Management)

Description:

The Tedpros system brings together customer and vendor data from various business units, as well as the affiliate file. This enables cross-business reporting, such as the Balanced Scorecard, as well as business-unit-specific analytics, which are predominantly marketing focused.

Responsibilities:

Interacted with the end-user community to understand the business requirements and with functional analysts to understand the functional requirements, identifying data sources and their implementations.

Studied and analyzed existing source data.

Defined source and target definitions.

Extensively used DataStage to load data from relational databases and flat-file sources to the target Oracle database.

Developed and supported the extraction, transformation, and load (ETL) process for a data warehouse from various data sources, using DataStage Designer to build the jobs and load the target tables.

Developed parallel jobs using various development/debug stages (Peek, Row Generator, and Column Generator).

Environment: IBM InfoSphere DataStage 9.1 (Designer, Director), Oracle 10g, sequential files, UNIX shell scripting, Tableau 8.3.

HSBC Software Development Jul 15–Feb 16

Senior Software Engineer, Project Name: Digital Insight (DI)

Domain: Financial

Description: DI deals with profile information and events performed by users and customers of personal internet banking as well as corporate banking. It provides the bank with information about the applications customers access, enabling it to deploy new features that improve services and customer satisfaction. The main aim is to capture this data in the data warehouse and make it available in reports for business users.

Responsibilities:

Worked closely with Business analysts and Business users to understand the requirements and to build the technical specifications.

Responsible for creating source-to-target (STT) mappings.

Involved in day-to-day production support activities.

Worked on various defects raised by the concerned business teams from various entities.

Developed and supported the extraction, transformation, and load (ETL) process for a data warehouse from various data sources, using DataStage Designer to build the jobs and load the target tables.

Worked on various stages like Transformer, Join, Lookup, Sort, Filter, Change Capture, Change Apply, and Quality Stage.

Developed parallel jobs using various development/debug stages (Peek, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicates).

Designed job sequencers to run multiple jobs with dependencies and email notifications.

Involved in unit testing, SIT, and UAT; worked with the users on data validations.

Prepared documentation for unit, integration, and final end-to-end testing.

Optimized/tuned DataStage jobs for better performance and efficiency.

Responded to customer needs as a self-starter with a customer-service orientation.

Provided support and guidance by creating Release, Deployment & Operation guide documents.

Involved in Performance tuning of complex queries.

Developed SQL scripts to facilitate the functionality of various modules.

Environment: IBM InfoSphere DataStage 8.x (Designer, Director), Oracle 10g, Teradata 13, sequential files, UNIX shell scripting, Cognos 10.01.

HSBC Software Development Aug 14 – June 15

Sr. ETL (DataStage) Developer, Project Name: EBI Evergreen Migration

Domain: Financial

Description: This project involves migration from an Oracle 10g database to Teradata. It also involves migrating and redesigning each module's jobs from DataStage 7.5 to 8.5, and similarly modifying the complete set of UNIX scripts into Red Hat Linux scripts. The main aim is to capture the data in the data warehouse and make it available in reports for business users in the new environment.

The EBI migration was very challenging, as it involved migrating the whole infrastructure, including the hardware, from older versions to newer versions. It covered both the ODS and the warehouse: Oracle, Teradata, DataStage, UNIX, ETL servers, Control-M servers, and source servers were all migrated onto new servers.

Responsibilities:

Assisted in the creation of an enterprise operational data store and analytical data warehouse.

Built DataStage ETL interfaces to aggregate, cleanse and migrate data across enterprise-wide MDM ODS and Data Warehousing systems using staged data processing techniques, patterns and best practices.

Conducted design review sessions with different stakeholders.

Responded to customer needs as a self-starter with a customer-service orientation.

Experience with ETL Unit and Integration Test creation and verification.

Worked on various stages like Transformer, Join, Lookup, Sort, Filter, Change Capture, Change Apply, and Quality Stage.

Involved in unit testing, SIT, and UAT; worked with the users on data validations.

Extensively worked on improving the performance of jobs by avoiding as many transforms as possible.

Prepared documentation for unit, integration, and final end-to-end testing.

Helped in preparing the mapping document from source to target.

Used DataStage stages, namely Sequential File, Transformer, Aggregator, Sort, Dataset, Join, Funnel, Row Generator, Remove Duplicates, and Copy, in accomplishing the ETL coding.

Developed job sequencer with proper job dependencies, job control stages, triggers, and used notification activity to send email alerts.

Extensively used DataStage Director for monitoring job logs to resolve issues.

Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.

Used Control-M job scheduler for automating the monthly regular run of DW cycle in both production and UAT environments.

Environment: IBM InfoSphere DataStage 8.5/7.5, Oracle 10g, Teradata 13, sequential files, Control-M, UNIX shell scripting, Cognos 10.01.

HSBC Software Development Jun 08-Jul 14

Senior Software Engineer, Project Name: e-Business Insight.

Domain: Financial

Description: P2G is a strategic group flagship project to deliver the next generation of internet banking. e-Business Insight (eBI) is a suite of products and common components that helps a business measure and optimize its e-business propositions. It provides e-businesses with management information and reporting capabilities through Cognos ReportNet.

Responsibilities:

Understood the mapping documents and data model, and thereby created the system design document.

Created the low-level design document.

Developed parallel jobs and sequence jobs for different interfaces.

Developed static load jobs using change capture stage to load the current mapping data.

Scheduled the interfaces to run daily/weekly/monthly/yearly based on the type of interface.

Worked on SQL queries and UNIX while developing jobs and debugging issues.

Extensively used Crontab to schedule DataStage jobs.
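
For instance, crontab entries along the following lines would fire a wrapper script around dsjob; the times, paths, project, and job names are illustrative, and run_ds_job.ksh stands in for a hypothetical wrapper:

    # m  h  dom mon dow  command
    # Daily interface at 02:30; weekly interface on Sundays at 04:00.
    30 2 * * * /home/dsadm/scripts/run_ds_job.ksh eBI seq_daily_load  >> /home/dsadm/logs/daily.log  2>&1
    0  4 * * 0 /home/dsadm/scripts/run_ds_job.ksh eBI seq_weekly_load >> /home/dsadm/logs/weekly.log 2>&1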

Created error files and log tables containing data with discrepancies, to analyze and re-process the data.

Created UNIX shell scripts to perform the necessary validations, transformations, and file transfers.

Prepared unit test cases and executed them.

Helped in preparing the mapping document from source to target.

Fixed defects raised by the testing team.

Migrated jobs from one environment to another.

Involved in performance testing and thereby tuned the jobs to yield better results.

Extensively used SQL tuning techniques to improve performance in DataStage jobs.

Environment: IBM InfoSphere DataStage 8.5/7.5, Oracle 10g, Teradata 13, sequential files, Control-M, UNIX shell scripting, Cognos 10.01.

Principal Financial Group. Nov 07-May 08

ETL Developer, Project Name: EDW Account Delta and History.

Domain: Financial/Insurance

Description: The EDW brings together customer and account data from various business units, as well as the affiliate file. This enables cross-business reporting, such as the Balanced Scorecard, as well as business-unit-specific analytics. Currently, the EDW serves as the source for several data marts (Customer Data Mart and Princor Fund Mart), which are predominantly marketing focused. The vision for the EDW is that it will eventually contain several 'domains' of data (Customer, Marketer, Financial, etc.) that will be integrated, creating a one-stop shop for key enterprise analytical data. The EDW will then serve as a source for analytical data marts throughout the company.

Responsibilities:

Interacted with the end-user community to understand the business requirements and with functional analysts to understand the functional requirements, identifying data sources and their implementations.

Studied and analyzed existing source data.

Defined source and target definitions.

Extensively used Informatica to load data from relational databases and flat-file sources to the target DB2 database.

Created Mapplets for reusable business rules.

Created tasks, workflows, and worklets using Workflow Manager.

Used Workflow Monitor to monitor workflows.

Used the Tivoli job scheduler to monitor the jobs.

Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.

Environment: Informatica 7.1.3, DB2, Tivoli, UNIX and Oracle, SQL Server, Business Objects.

Principal Financial Group. Aug 06 to Nov 07

ETL Developer, Project Name: MDM (Marketing Data Mart)

Domain: Financial/Insurance

Description: The MDM is a copy of the EDW, materialized as a set of Oracle tables known as the Marketing Data Mart (MDM). The MDM is populated by a set of Informatica mappings that perform a direct pull of all records from UDB to Oracle. This workflow is run once a month following EDW certification. Each mapping is specifically designed to be limited to a read of the UDB table and a write to the Oracle table.

Responsibilities:

Studied and analyzed existing source data.

Defined source and target definitions.

Extensively used Informatica to load data from relational databases and flat-file sources to the target DB2 database.

Created mappings using Mapping Designer to load data from various sources, using different transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Filter, and Router.

Created Mapplets for reusable business rules.

Created tasks, workflows, and worklets using Workflow Manager.

Used Workflow Monitor to monitor workflows.

Used the Tivoli job scheduler to monitor the jobs.

Responded to customer needs as a self-starter with a customer-service orientation.

Provided support and guidance by creating Release, Deployment & Operation guide documents.

Worked closely with QA testers to resolve the data defects they identified through Application Lifecycle Management.

Environment: Informatica 7.1.3, DB2, Tivoli, UNIX and Oracle, SQL Server, Business Objects.

Wipro Technologies Aug 05 to Feb 06

Software Engineer, Project Name: Carrier Intelligent System

Domain: Retail

Description: Carrier Intelligent System is a decision support system for Carrier Air Conditioning Pvt. Ltd., implemented using ERwin, Informatica, and BusinessObjects for efficient querying and analytical reporting. Carrier has been a world leader that manufactures, sells, and services 50 different brands of residential and commercial AC units across 170 countries, with 80 manufacturing units all over the world. The fundamental goal of this project is to study and analyze the existing database and transform business data into information for making decisions more easily and effectively. Carrier Intelligent System was developed to analyze sales and service of different types of air conditioners, ensuring steady growth in both sales volume and market percentage. Several reports were developed for field sales engineers and field service engineering.

Responsibilities:

Studied and analyzed existing data.

Defined source and target definitions.

Designed and created mappings.

Designed and created jobs.

Performed extraction, transformation, and loading for the data warehouse.

Used Mapping Designer and mapplets to generate different mappings for different loads.

Used Workflow Manager to schedule and monitor the different loading sessions.

Wrote procedures for data extraction, transformation and loading.

Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.

Created and executed sessions and batches.

Environment: Informatica 7.1.3, DB2, Tivoli, UNIX and Oracle, SQL Server, Business Objects.


