
Test Data Management, QA Analyst 11

Location:
Monroe Township, NJ
Salary:
80000
Posted:
December 10, 2015


Professional Summary

** ***** ** ******* ********** in Information Technology, including three years in Data Archiving and Test Data Management using IBM InfoSphere Optim.

Strong expertise in implementing Data Growth Solutions and Data Archiving using IBM Optim.

Configured development and QA environments to meet Data Privacy, Data Masking and Data Sub-setting requirements using IBM InfoSphere Optim Test Data Management (TDM) services.

Excellent backend skills in creating SQL objects like Tables, Stored Procedures, Views, Indexes, Triggers, Rules, Defaults, and functions.

Strong experience across the complete information lifecycle, including Data Analysis, Database Design, Data Mapping, Conversion and Data Load.

Led onshore/offshore teams for project implementation, including daily offshore task assignment and review of daily offshore status, tailored to client expectations.

Experience in multi-dimensional data modeling (star schema): identified business and data requirements and converted them into conceptual and logical data models.

Possess a very strong technical foundation with excellent interactive communication and interpersonal skills.

Ability to learn new technologies and functionalities quickly and adapt to changing requirements.

Team player with demonstrated collaborative working relationships.

Job Experience

Client: Nationwide, OH Apr 2015 - Present

Environment: IBM Optim, Oracle 11g/10g, LINUX, SQL Server, TOAD, SQL, PL/SQL.

Role: IBM Optim Consultant

Implement Enterprise Data Management using the IBM InfoSphere Optim tool for Test Data Management (TDM) and Data Growth Solutions.

Gather requirements for Test Data Management and Data Archival.

Create OPTIM Directories and DB Aliases for Oracle, SQL Server, Teradata databases.

Work with relational databases installed on Oracle RAC, SQL Server clusters and Teradata.

Derive database relations using Optim Discovery.

Create OPTIM access definitions, Archive Requests, Extract Requests, Table Mappings and Column Mappings, and apply Data Masking functions.

Create OPTIM Extract / Archive, Convert, Insert, Compare and Load requests based on the masking requirements.
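For illustration only, the kind of column-level transformation applied through these masking requests can be sketched in plain SQL; Optim drives such functions through its column maps rather than hand-written statements, and the table and column names below are hypothetical.

    -- Illustrative sketch, not Optim syntax: mask identifying columns in a staging copy.
    -- CUSTOMER_STG, SSN, EMAIL and CUSTOMER_ID are hypothetical names.
    UPDATE customer_stg
    SET    ssn   = LPAD(TO_CHAR(ORA_HASH(ssn, 999999998)), 9, '0'),  -- repeatable scramble that keeps a 9-digit format
           email = 'user' || TO_CHAR(customer_id) || '@example.com'  -- replace the address but keep it unique
    WHERE  ssn IS NOT NULL;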

Worked closely with data architects and database administrators to understand and configure OPTIM and database relationships.

De-identify sensitive data in unstructured documents in production and nonproduction systems using IBM Guardium to manage risk and satisfy compliance requirements.

Provide an easy-to-use, highly consumable and reusable process for users and test practitioners to obtain testing data that is complete and accurate.

Worked with the Optim Support team to resolve data issues with complex data types and provide workaround solutions.

Responsible for scheduling and automation of OPTIM Extract, Convert and Load jobs.

Client: Express Scripts, NJ Jan 2014 – Mar 2015

Environment: IBM Optim, Oracle 11g/10g, SQL Server, IBM DB2, SQL, PL/SQL.

Role: IBM Optim Consultant

Express Scripts, an online Pharmacy Benefits Management firm, acquired Medco Inc. and identified a number of redundant business applications and processes to be consolidated, archived or retired as part of its Systems Retirement Project. I was engaged to support the consolidation and systems retirement process; the role was later extended to provide Data Growth Solutions using IBM InfoSphere Optim.

Implemented end-to-end Enterprise Data Management using the IBM InfoSphere Optim tool for Data Archival (Data Growth, Performance Archival and Systems Retirement Archival) and Test Data Management.

Coordinate meetings with business analysts, project managers and client management teams to schedule archiving initiatives.

Gather requirements for Data Archival and Data Life Cycle Management.

Worked closely with data architects and database administrators to understand and configure OPTIM and database relationships.

Understand how data is related across heterogeneous systems by discovering data transformation rules and application relationships.

Design and develop Optim access definitions, archive requests, table mappings and column mappings, and apply transformation and data masking functions.

Convert flat files and CSV files into a relational model for use with Optim (see the sketch below).
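As a sketch of that conversion step, assuming an Oracle target and using hypothetical file and column names, a CSV extract can be exposed as a relational source through an external table:

    -- Hypothetical example: make a CSV extract queryable as a relational table.
    CREATE TABLE claims_ext (
      claim_id    NUMBER,
      member_id   VARCHAR2(20),
      paid_amount NUMBER(12,2)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir           -- DATA_DIR must already point at the file location
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('claims.csv')
    );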

Create and test Optim extract, archive, convert, insert jobs.

Implemented Test Data Management and data sub-setting services for Oracle and SQL Server databases.

Worked with Optim support team to resolve the archive data access issues with data types.

Client: Merck, NJ Jan 2012 - Oct 2013

Environment: IBM Optim, Oracle 11g/10g, LINUX, SQL Server, TOAD, SQL, PL/SQL.

Role: IBM Optim Consultant

Implement Enterprise Data Management using the IBM InfoSphere Optim tool for Data Archival (Data Growth, Performance Archival and Systems Retirement Archival) and Data Purge.

Coordinate meetings with business analysts, project managers and client management teams to schedule archiving initiatives.

Gather requirements for Data Archival and Data Life Cycle Management.

Create OPTIM Environments, Directories and DB Aliases for Oracle, SQL Server and DB2 databases.

Work with relational databases installed on Oracle RAC and SQL Server Clusters.

Derive database relations using Optim Discovery.

Create OPTIM access definitions, archive requests, Extract Requests, table mappings and column mappings, and apply masking functions.

Create OPTIM Extract, Convert, Insert, Compare and Load requests based on the masking requirements.

Identified application-level dependencies and created lookup and reference tables (illustrative sketch below).
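A minimal sketch of such a reference table and how archived rows point at it; the names are hypothetical, not the client's schema:

    -- Hypothetical lookup of status codes referenced by archived rows.
    CREATE TABLE status_lkp (
      status_code VARCHAR2(4)  PRIMARY KEY,
      status_desc VARCHAR2(60) NOT NULL
    );

    -- Keep the dependency explicit so restored data remains interpretable.
    ALTER TABLE orders_hist
      ADD CONSTRAINT fk_orders_hist_status
      FOREIGN KEY (status_code) REFERENCES status_lkp (status_code);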

Worked closely with data architects and database administrators to understand and configure OPTIM and database relationships.

Understand how data is related across heterogeneous systems by discovering data transformation rules and application relationships.

Convert flat files and CSV files into a relational model for use with OPTIM.

Worked with Optim Support team to resolve the archive data access issues with complex data types and provide workaround solutions.

Responsible for scheduling and automation of OPTIM Archive, Extract, Convert and Load jobs.

Accenture Plc. Dec 2005 – Jan 2011

Client: JPMC

Role: Software QA Lead for Salesforce.com implementation.

Gain functional knowledge (application and DB) sufficient to drive the offshore team.

Daily offshore task assignment.

Lead the QA efforts for functional and Performance testing of Salesforce.com Implementation.

Review daily offshore status and tailor reporting to the expectations of the IST lead and client.

Defect follow-up, re-test.

Work on any issue resolution with the PM / Dev leads of the respective work stream.

Onshore-Offshore co-ordination.

Lead the QA teams responsible for functional and Performance testing using QTP and Load Runner.

Collect and summarize performance test results from support teams.

Provide appropriate recommendations to ensure data Quality and test robustness.

Role: Software QA Specialist for Stock Loan Applications

Gain knowledge of the Stock Loan applications and perform test planning and test execution.

Share the knowledge and support the team members in testing the applications.

Coordinate testing efforts with Business analyst, development and management teams.

Ensure timely completion of the test execution and submission of the deliverables.

Escalate issues and coordinate their resolution with team members.

Environment: Salesforce.com, Sybase, JSP, XML, HTML, EJB, Oracle, HP Quality Center, QTP, LoadRunner.

Client: Novartis Pharmaceuticals Oct 2007 - Aug 2008

Role: Siebel CRM Implementation Lead

Novartis Legal Data Services (LDS) provides Novartis employee (field force) information from Novartis Roster (the field force information system) to the Legal Department, detailing who was historically responsible for a field force detailing call on a specific call date.

As the Novartis base employee information storage system was upgraded and the underlying data storage structure changed, the PL/SQL procedures that generate the requested reports had to be rewritten.

As part of the team providing data to the Novartis LDS team:

Gained an understanding of the existing process.

Analyzed the data entities.

Documented the functional requirements.

Created the detailed design document.

Developed PL/SQL procedures for generating the requested reports from Novartis Roster.
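A skeleton of the style of procedure delivered; the Roster table and columns here are hypothetical stand-ins, since the actual objects followed the LDS report specifications:

    -- Hypothetical sketch: who was responsible for a territory on a given call date.
    CREATE OR REPLACE PROCEDURE get_responsible_rep (
      p_territory_id IN  VARCHAR2,
      p_call_date    IN  DATE,
      p_results      OUT SYS_REFCURSOR
    ) AS
    BEGIN
      OPEN p_results FOR
        SELECT r.employee_id,
               r.employee_name,
               r.assign_start_dt,
               r.assign_end_dt
        FROM   roster_assignments r      -- hypothetical Roster history table
        WHERE  r.territory_id = p_territory_id
        AND    p_call_date BETWEEN r.assign_start_dt
                               AND NVL(r.assign_end_dt, p_call_date);
    END get_responsible_rep;
    /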

Environment: Siebel CRM, Oracle

Client: New York City Administrative Services Sep 2006 - Oct 2007

NYCAPS (New York City Automated Personnel System) is a PeopleSoft HR implementation for New York City to consolidate and manage the HR activities of various city agencies in a central PeopleSoft HRMS system. The project converted New York City Department of Education (DOE) human resources data from legacy systems to NYCAPS-HRMS.

Role: Senior Conversion Analyst

As a senior conversion analyst/developer, reviewed and updated requirements and detailed design documents, data layouts and rules to convert the data from legacy systems to NYCAPS.

Developed and modified SQRs for data conversion from legacy systems to PeopleSoft records, performing the required validations, and gained good working knowledge of Fit/Gap analysis.

Converted data from staging to PeopleSoft records for Personal Data, Tax Data, Recruitment Processing, Pre Employment processing, Probation and Employee Performance Management.

Executed and modified the conversion run book during mock conversions.

Coordinated with development, test and functional teams in processing and analyzing the converted data, and troubleshot issues encountered during processing.

Ensured all prep scripts were executed and data validation was complete in the testing environment.

Provided support to functional teams during Mock conversions in analyzing and fixing incorrect data using SQL.

Executed, analyzed and presented converted data reconciliation to client manager.

Environment: PeopleSoft HRMS, Oracle SQL, PeopleSoft SQR, UNIX, XML, Load Runner.

Role: Auto Step Process Implementation - Configuration Lead

Was a team member implementing the Auto Step process for employees whose salaries progress in steps in accordance with their contractual agreements.

Derived the contractual salary increments for New York City employees in uniformed titles and configured the HRMS setup data to implement PeopleSoft HR step progression as Salary Plan, Grade and Step.

Analyzed employee salaries and converted employee salary plan information to fit the configured values by leveraging an existing Application Engine program for conversion and mass updates.

Ensured correctness of the converted data.

Used SQL queries for data analysis and presented the results to the project team (an example query follows below).
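An example of the kind of analysis query used, with simplified, hypothetical table and column names rather than the delivered PeopleSoft records:

    -- Hypothetical sketch: converted rows whose salary disagrees with the configured step rate.
    SELECT j.emplid, j.sal_plan, j.grade, j.step, j.annual_rt, s.step_rate
    FROM   emp_job_stg  j                 -- converted job/salary rows (hypothetical)
    JOIN   sal_step_cfg s                 -- configured plan/grade/step values (hypothetical)
           ON  s.sal_plan = j.sal_plan
           AND s.grade    = j.grade
           AND s.step     = j.step
    WHERE  j.annual_rt <> s.step_rate;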

Provided support to the Testing Team in analyzing converted data and providing input on the functionality of the application.

Responsible for documenting key discussion points during team meetings, and distributing meeting minutes to team members.

Environment: PeopleSoft HRMS, Oracle SQL, PeopleSoft SQR, UNIX, XML

Client: Pfizer Inc. Dec 2005 - Sep 2006

Project: Chantix implementation with Get-Quit support program

Chantix is a smoking cessation prescription drug. Get-Quit.com is a web- and IVR-based behavioral modification program aimed at supporting and encouraging Chantix users to quit smoking and stay quit.

Role: Onshore/ Offshore Testing Lead

Manage the offshore QA team responsible for testing the web/IVR applications that manage the customer base and progression of the Get-Quit support program for Chantix users.

Coordinate with business analysts, project managers, development and client management teams on test scheduling and defect reporting and tracking.

Prepared detailed test plan specifying the system inputs, outputs, limitations, entry and exit criteria.

Reviewed test scenarios and test cases on a regular basis to ensure that the test cases are most current and in accordance with changes in requirements.

Coordinated meetings for the test team with business analyst to understand new business rules as compared to the existing ones.

Managed all phases of testing (unit, regression, system and user acceptance) to ensure test deliverables were met on time.

Analyzed and communicated the impact of requirements changes on testing deliverables.

Coordinated with business analyst, development team and management team to prioritize and resolve issues.

Environment: XML, HTML, Java, Oracle, HP Mercury Quality Center

Stay in Front May 2004 – Dec 2005

Project: Implementation of Stay in Front CRM

Role: QA Lead / senior QA Analyst

Led the QA team responsible for testing CRM and analytical systems for pharmaceutical clients.

Prepared validation scripts with system inputs, outputs, limitations, entry and exit criteria specified.

Coordinated with project manager, development and client management teams to provide estimates required for testing, reporting and tracking defects.

Was responsible for ensuring test deliverables were met on time.

Reviewed test plans and test cases on a regular basis to ensure that test cases were updated in accordance with changes in requirements.

Assessed and communicated the impact of requirement changes on the testing schedule.

Managed all phases of testing (unit, system, user acceptance and regression)

Conducted meetings with the business analysts and team members to understand business rules, and understand the existing legacy systems.

Created test plan, test strategy and test cases that would be used for all phases of testing.

Performed manual and automated testing of client-server and web-based applications using QuickTest Professional.

Used HP Mercury Quality Center for defect tracking.

Coordinated with business analyst and development team to resolve issues.

Environment: SQL Server, XML, HTML, VC++, VBScript, Oracle, HP Mercury Quality Center, WinRunner.

Condé Nast Publications Aug 2003 – May 2004

Project: Brides.com web application

Role: QA Lead / Manager

Led the QA team responsible for testing online applications.

Prepared detailed test plan specifying the system inputs, outputs, limitations, entry and exit criteria.

Performed server administration and upgrades for LoadRunner.

Analyze and migrate project data for Performance and Load testing scenarios.

Coordinate with project manager, development and content management teams to provide estimates required for testing, reporting and tracking of defects.

Recorded and scripted load test scenarios using LoadRunner and Quality Center.

Ensured the test team deliverables are on schedule.

Reviewed test plans and test cases on a regular basis to ensure that the test cases are most current and in accordance with changes in requirements.

Analyzed and communicated the impact of requirement changes on testing delivery dates.

Provided 24x7 support during the website launch.

Managed all phases of testing (unit, system, user acceptance and regression)

Conducted meetings with the business analyst to understand business rules, and understand the existing legacy systems.

Created test plan, test strategy and test cases that would be used for all phases of testing.

Performed manual and automated testing of client-server and web-based applications using QuickTest Professional.

Managed defect tracking and issue resolution using Mercury TestDirector.

Trained business and functional users on TestDirector and set up processes for the defect cycle.

Worked with business analyst and development team to resolve open defects.

Drafted release notes with new features and fixed defects.

Environment: JSP, XML, HTML, EJB, Oracle, Mercury TestDirector, LoadRunner.

Berkely Group, Aon Corp., NY Aug 2001 - Jan 2002

Role: PL/SQL Developer

Environment: Oracle 8i, Developer 9i, Reports 9i on Windows 2000 and Solaris 8

Project Name: Salvage and Document System

Berkely Group is the leader in travel insurance brokerage and administration services. Berkely salvages Trip Cancellation, Trip Delay, Sickness and Medical Evacuation claim payments from airlines and primary health insurance providers. Salvage records are created automatically, or manually by Claims Examiners, when claims are paid, based on pre-defined rules. Salvage payments received are processed and credited to the appropriate underwriters by the accounting system.

The Document system is used to generate the Claim Forms, Action Sheets and letters needed to process claims. It allows the user to generate documents in ad-hoc and batch mode; documents are mailed, emailed or faxed to the addressee based on the preference set. It is integrated with the Diary module of the claims system and comprised about 200 reports and five forms.

Performed system analysis for upgrading existing Salvage screens.

Gathered business requirements, prepared functional and technical documentation, programmed, unit tested, and delivered software.

Used PL/SQL packages (see the skeleton below).
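A minimal package skeleton in the style used; the package, table and sequence names are hypothetical:

    -- Hypothetical sketch of a salvage-processing package.
    CREATE OR REPLACE PACKAGE salvage_pkg AS
      PROCEDURE create_salvage (p_claim_id IN NUMBER);
    END salvage_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY salvage_pkg AS
      PROCEDURE create_salvage (p_claim_id IN NUMBER) IS
      BEGIN
        -- Record a salvage item for a paid claim that meets the pre-defined rules.
        INSERT INTO salvage (salvage_id, claim_id, status, created_dt)
        VALUES (salvage_seq.NEXTVAL, p_claim_id, 'OPEN', SYSDATE);
      END create_salvage;
    END salvage_pkg;
    /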

Provided application and database analysis and support during clients' pre-go-live testing, go-live, and post-go-live support efforts.

Coordinated with technical team leader to have technical specifications written and programming tasks completed.

Provided unit/system-testing data analysis using SQL*Plus and other data analysis tools.

DSQ Software Limited

Role: Programmer Analyst

Client Name: Punjab Tractors Ltd., Chandigarh, India

Application development for Finance, Sales, Purchase, Plant Maintenance, Production Planning, Stores, Export-Import, Costing & Budget and Miscellaneous areas. Each section was divided into nine different modules and integrated into a single product. There were a number of users, each with an associated role; depending on the role, a user was allowed access to different screens in the product.

Responsible for Design, Development, Implementation & Technical Support of the Integrated Business Solution Applications (IBS-Appl) at DSQ Software Ltd.

Used Designer/2000 for Process and ER modeling, Functional hierarchy and Data Flow; created Tables, Views, Synonyms, Sequences, Stored Procedures, Functions & Packaged Procedures.

Deployed database triggers (example below).
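For example, an audit-stamping trigger of the kind deployed; the table and columns are hypothetical:

    -- Hypothetical example: stamp audit columns on stores transactions.
    CREATE OR REPLACE TRIGGER trg_stores_txn_audit
    BEFORE INSERT OR UPDATE ON stores_txn
    FOR EACH ROW
    BEGIN
      :NEW.last_upd_by   := USER;
      :NEW.last_upd_date := SYSDATE;
    END;
    /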

Optimized queries using rule-based and cost-based approaches.

Used SQL*Loader for data loading (representative control file below).
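A representative SQL*Loader control file for such a load; the file, table and column names are hypothetical:

    -- parts_stg.ctl (hypothetical): load a comma-separated parts extract into a staging table.
    LOAD DATA
    INFILE 'parts.csv'
    APPEND
    INTO TABLE parts_stg
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (part_no, part_name, qty_on_hand, unit_cost)

It would be invoked with the sqlldr utility (for example, sqlldr control=parts_stg.ctl), with connection details supplied per environment.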

Was responsible for developing Forms and Reports, Preparing Unit Test Plans and Unit testing.


