QA Analyst

Location:
Wildwood, NJ
Posted:
February 24, 2012


Aparna

Edison, New Jersey

Executive Summary

• Over five years of extensive and progressive experience as a QA Analyst in test design and execution, with an earned reputation for meeting strict deadlines and delivering mission-critical solutions on time and within budget.

• Five years' experience as a QA Analyst testing the SRM module of Oracle Customer Relationship Management.

• Strong and extensive business knowledge of the Telecommunication and Banking domains.

• Expert in ERP, SAP FICO and SOA testing as a QA Analyst.

• Good experience in both manual and automation testing.

• Extensive work experience in Gray Box Testing, System Testing, Interface Testing, Functionality Testing, Integration Testing, Smoke Testing, Security Testing, Web-based and Client/Server application testing, User Acceptance Testing (UAT) and Regression Testing.

• Extensive work experience testing Web-based and Client/Server applications using HP Quality Center, Test Director, Team Track and CDETS as a QA Analyst.

• Experienced in Requirement Analysis, Test Plan preparation, Test Case Design, Documentation, Execution, Defect Tracking and Analysis.

• Worked on Quality Process CMMI Level 5 activities.

• Expert in writing Test Cases.

• Expert in creating, editing and executing Test Scripts.

• Participated in teams through the design, implementation and testing of software.

• Remarkable experience in the SDLC and test planning process.

• Expert in performing backend testing by writing and executing SQL queries in TOAD.

• Good experience in data validation in Excel using Excel formulas.

• Involved in project planning, coordination and implementing various QA methodologies.

• Good understanding of software development processes such as Waterfall, V-model and Agile.

• At ease in high-stress environments requiring a superior ability to handle multiple levels of responsibility effectively.

• Exceptional ability to quickly master new concepts; a very cooperative team player.

• Excellent communication, interpersonal, intuitive, analytical and leadership skills.

• Proven ability to work efficiently both independently and in teams.
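The backend testing mentioned above (writing and executing SQL queries to validate loaded data) could be sketched as follows. This is a minimal illustration only: sqlite3 stands in for the Oracle/TOAD setup, and the table and column names are invented.

```python
import sqlite3

# Invented source/target tables standing in for an Oracle schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);
""")

# Row-count reconciliation between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count

# Detect rows present in the source but missing from the target.
missing = conn.execute("""
    SELECT id FROM src_orders
    EXCEPT
    SELECT id FROM tgt_orders
""").fetchall()
print("missing rows:", missing)  # an empty list means the load is complete
```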

Technical Skills

Languages: Visual Basic, SQL, PL/SQL, JAVA, Perl

Operating Systems: MS-DOS, UNIX, and Windows XP/2000/98

Databases: MS SQL Server 6.5, MS Access, Oracle 7.x/9i/10g

Database Tools: TOAD

Testing Tools: QTP

Management Tools: Test Director, Quality Center, CDETS

Network: TCP/IP, FTP, LAN

Technical Certifications

•Oracle Certified Associate (OCA)

•HP-M015 (Quality Center Professional)

•HP-M016 (Quick Test Professional)

Professional Experience

CISCO (San Jose, CA) Sep’10 to Feb’12

Projects: TACPAC 1.0, 2.0, 3.0 Versions

Role: QA Analyst

TACPAC – A web-based tool that helps CSEs address customer issues by giving them better insight into customer sentiment and SR history, and by providing complete information about SRs (such as SR type, access to tools, and CAP information) to the CSE engineer. A new URL was launched for the tool, and access to tools such as CKT and CSE WB is also provided.

Cisco’s TAC engineers need to know the current and most recent customer information: how many times the customer has been moved from queue to queue, whether the customer is in CAP (Customer Assurance Program) status, which SRs the customer currently has open or has recently closed and their possible relevance to current SRs, which Cisco products and technologies the customer utilizes in its network, bugs impacting the customer environment (DDTS), RMA information, and so on. This information is important so that the CSE can gain a better understanding of the entire customer support situation, and it enables the delivery of a more informed service level based on an integrated view of current and recent support experiences. The main goal of this project is to change how the customer feels when engaging the TAC by enabling “personalized” enhancements that make relevant customer information readily available to CSEs.

Responsibilities:

• Understood the business requirements and interacted with Business Analysts and QA Managers to clarify issues related to them

• Prepared the Test Plan and Test Strategy documents for different projects and obtained sign-off

• Reviewed the test cases prepared by the QA Engineers, made sure they covered all the scenarios per the requirements, and updated them where necessary

• Held daily meetings with the QA Engineers, assigned tasks and prepared the status report

• Participated in weekly project meetings along with the QA Manager, submitted the status report, and discussed priority issues with the Project Manager, Business Analyst and developers

• Verified that all the requirements, test cases and defects were available in Quality Center for all the projects

• Involved in setting up the testing environment along with the developers

• Worked with the developers to resolve issues and load the data into the database

• Checked the database by executing SQL queries to verify whether the data had loaded correctly

• Tested the application in multiple browsers, viz. IE 6.0 and Firefox 3.0, to ensure cross-browser support

• Performed Gray Box Testing to validate the data in the CKT tool against the SRM interface

• Performed Regression, Functional and User Acceptance Testing

• Validated large amounts of data in Excel using formulas and prepared sheets with the before-and-after data results

• Logged defects in the CDETS tool and assigned them to the developers

• Gave QA sign-off after completion of testing

• Prepared testing documents for each functionality of the application, covering what needed to be tested and how, and obtained sign-off from the BAs and developers

• Prepared Test Evidence documents for the business to obtain sign-off

• Involved in production checkouts

• Involved in writing SOPs

• Prepared test metrics on a monthly basis to track status
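The before-and-after data validation done in Excel for this role could be expressed the same way in code. A minimal sketch, with invented SR ids and statuses, mirroring an Excel formula such as =IF(B2=C2, "SAME", "CHANGED"):

```python
# Invented "before" and "after" extracts keyed by record id.
before = {"SR-1001": "Open", "SR-1002": "Closed", "SR-1003": "Open"}
after  = {"SR-1001": "Closed", "SR-1002": "Closed", "SR-1003": "Open"}

# Collect every record whose value changed between the two extracts.
changed = {k: (before[k], after[k]) for k in before if before[k] != after[k]}
print(changed)  # {'SR-1001': ('Open', 'Closed')}
```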

Environment: Quality Center, Oracle Apps, MS Office, Windows XP, SQL, CDETS, Microsoft Excel, TOAD

CISCO (San Jose, CA) May’09 to Aug’10

Project: Smart Services Dashboard

Role: QA Analyst

Smart Services Dashboard uses various ticketing systems through which partner/customer issues are logged, including RNT, Remedy and TSRT. The dashboard is being enhanced to provide a better view into trends related to these different cases. The project goal is to deliver a dashboard that helps track cases which have been escalated and have traversed the different customer service applications: Remedy (Alliance), RightNow and TSRT.

Responsibilities:

• Understood the business requirements and interacted with Business Analysts and QA Managers to clarify issues related to them

• Arranged web meetings with the offshore QA testers to clarify issues related to the business requirements

• Held daily meetings with the offshore QA testers to collect the day's status and assign work for the next day

• Developed the test plans and test strategies to facilitate the testing process

• Ensured traceability of test cases back to the business requirements

• Performed manual testing on the front-end web application and logged defects in both QC and the Cisco Defect Enhancement Tracking Tool

• Performed Functional, Integration, Regression, System and User Acceptance Testing of the Smart Services Dashboard application

• Checked the data table records by executing SQL commands in TOAD

• Created various types of service requests in the RightNow (RNT), Remedy and TSRT tools and added notes to each request

• Verified the back-end Oracle table test data to make sure it was properly reflected in the dashboard after the nightly batch cycle completed

• Performed Gray Box Testing to validate the data in the CSEWB tool against the SRM interface

• Participated in daily and weekly overall project status meetings from the testing side

• Trained the offshore team and new team members on the front-end web application and on manually checking the database table records
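Verifying that back-end table data is properly reflected in the dashboard after the batch cycle amounts to a record-by-record reconciliation. A minimal sketch, with invented case ids and fields (not the actual Cisco schemas):

```python
# Invented back-end rows and the rows the dashboard displays after batch.
backend_rows = [
    {"case_id": "C-1", "source": "RNT",    "status": "Escalated"},
    {"case_id": "C-2", "source": "Remedy", "status": "Open"},
]
dashboard_rows = [
    {"case_id": "C-1", "source": "RNT",    "status": "Escalated"},
    {"case_id": "C-2", "source": "Remedy", "status": "Open"},
]

def mismatches(backend, dashboard):
    """Return case ids whose dashboard record differs from the back end."""
    dash = {r["case_id"]: r for r in dashboard}
    return [r["case_id"] for r in backend if dash.get(r["case_id"]) != r]

print(mismatches(backend_rows, dashboard_rows))  # [] when in sync
```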

Environment: Quality Center, Team Track, Oracle 10g, TOAD, MS Office, Windows XP, Remedy Tool, RNT tool, TSRT tool

CISCO (San Jose, CA) Mar’08 to Apr’09

Organization: Mahindra Satyam

Project: NLS Tac Tool Timers

Role: QA Tester

NLS TAC Tool Timers is a web-based application; the main goal of the project is to display the Restoration SLA time (elapsed time) that the engineer worked on the NLS customer Service Request in the SRRAT tool. The application also:

• Sends email alerts to the CSE, Manager and/or subscription mailing list.

• Adds an internal note to the Service Request for every alert that goes out from ILS.

SRRAT (Service Request Risk Assessment Tool) was created to assist and ensure that “at risk” SRs receive timely attention, that SRs progress properly, and that timely actions are taken with respect to excessive CSE backlogs.

Responsibilities:

• Involved in the development of test cases and test plans by reviewing the Business Requirement documents

• Interacted with Business Analysts and QA Managers to clarify issues related to requirements

• Arranged meetings with the Business Analyst to clarify issues related to requirements

• Developed the test plans and test strategies to facilitate the testing process

• Ensured traceability of test cases back to the business requirements

• Created NLS-type SRs in Oracle Apps and validated whether the SR information was reflected in various Cisco tools such as CSEWB, CKT and SRRAT

• Escalated SRs to different levels based on the time by which they had to be fixed

• Manually checked the database records by executing SQL commands in TOAD

• Performed manual testing of the application using Quality Center to develop test cases and test scripts, execute the scripts and log defects

• Executed positive and negative test cases to test the response of the application under test by creating data-driven tests

• Provided a weekly defect status report to the team, including test metrics and the status of defects
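The positive/negative data-driven testing mentioned above can be sketched as a table of inputs and expected outcomes run against a validator. The validator and its format rule below are invented purely to illustrate the pattern:

```python
import re

def is_valid_sr_number(value):
    """Hypothetical rule: accept 'SR' followed by exactly 8 digits."""
    return bool(re.fullmatch(r"SR\d{8}", value))

# Each row is (input, expected): positive cases expect True,
# negative cases expect False.
cases = [
    ("SR12345678", True),    # positive: well-formed SR number
    ("SR1234", False),       # negative: too few digits
    ("XX12345678", False),   # negative: wrong prefix
    ("", False),             # negative: empty input
]

for value, expected in cases:
    assert is_valid_sr_number(value) == expected
print("all cases passed")
```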

Environment: Quality Center, Oracle Apps, SRRAT tool, CSEWB, ILS, TOAD, SQL, MS Office and Windows XP.

CISCO (San Jose, CA) Aug’07 to Feb ’08

Organization: Mahindra Satyam

Project: Cisco Security Center

Role: QA Team Member

Cisco Security Center is a web-based application whose service space is presented to consumers through distinct, non-integrated and disjoint delivery systems, such as MySDN, PSIRT, IPS Signature, TAC, Intellishield Alert Manager, Advanced Services and BU sites.

CSC has two distinct parts: an Admin tool and an End User (Consumer) interface. End users (consumers) can access the CSC application from non-Cisco networks such as the internet, whereas the Admin tool must be accessed from the Cisco intranet.

• The Admin can administer the CSC application in two ways: by choosing a page to administer, or by editing functionality on the landing page.

• End users (consumers) can view credible information about threats and vulnerabilities that may affect their environments. Intellishield Alert Manager allows organizations to spend less effort researching threats and vulnerabilities and to focus more on a proactive approach to security.

Responsibilities:

• Involved in developing test plans from business requirements by interacting with end users and developers on production problem fixes

• Ensured traceability of test cases back to the business requirements

• Tested the application in multiple browsers, viz. IE 6.0, Firefox 3.0 and Safari, to ensure cross-browser support

• Involved in modeling the QA process workflow using Quality Center

• Involved in manual testing using Quality Center to develop test cases and test scripts, execute the scripts and log defects

• Interacted with developers to explain software bugs where required, and retested the fixed issues

• Identified and documented all issues and defects to ensure application software functionality
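Tracing test cases back to business requirements, as described in several roles above, is essentially a coverage check over a mapping. A minimal sketch with invented requirement and test-case ids:

```python
# Invented requirement ids and a test-case-to-requirements mapping.
requirements = {"REQ-01", "REQ-02", "REQ-03"}
test_cases = {
    "TC-001": {"REQ-01"},
    "TC-002": {"REQ-02", "REQ-03"},
}

# A requirement is "traced" if at least one test case covers it.
covered = set().union(*test_cases.values())
uncovered = requirements - covered
print("uncovered requirements:", sorted(uncovered))  # empty when fully traced
```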

Environment: Windows XP, Safari browser, TOAD, Quality Center, Oracle 9i, Team Track

National Australia Bank Dec’06 to July’07

Organization: Infosys

Project: General Ledger Migration

Role: QA Engineer

Over time, National Australia Bank (NAB) has implemented a number of systems that could be considered "General Ledgers" throughout its Finance IT architecture; often these systems were implemented to rectify perceived shortcomings in other systems. This "bolt-on" approach now means transactions can flow through a number of systems (at each step potentially being summarized or translated) before being ultimately reported. The goal of the GL Migration is to achieve a 'like for like' migration into SAP of the existing GL processes and functionality within:

• the Millennium Australia accounting environment for Australian Financial Services (AFS), and

• the Millennium Treasury Sub Ledger accounting environment for Australian, UK and US Wholesale Financial Services.

Responsibilities:

• Involved in the development of test requirements from business use cases by interacting with Business Analysts and developers

• Involved in the development of test plans and test strategies to facilitate the testing process

• Involved in modeling the QA process workflow using Quality Center

• Ensured traceability of test cases back to the business requirements

• Created various test account numbers in SAP for various regions (Australia, US, UK) and verified that they were reflected in the database

• Performed System Testing, Integration Testing, Regression Testing and User Acceptance Testing

• Performed SAP interface testing to validate source data

• Performed manual testing of the application using Quality Center to develop test cases and test scripts, execute the scripts and log defects

• Involved in Functional, Integration, Regression, System and User Acceptance Testing

• Executed positive and negative test cases to test the response of the application under test by creating data-driven tests

• Provided a weekly project report that included test metrics and status reports

• Reported defects, participated in review and project status meetings, and interacted with developers to resolve problems

• Involved in the business assurance test strategy for the testing platform
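The weekly test metrics mentioned above typically roll execution results up into counts and a pass rate. A minimal sketch using invented sample statuses (not actual project data):

```python
from collections import Counter

# Invented weekly test-execution results.
results = ["Passed", "Passed", "Failed", "Passed", "Blocked", "Failed"]

counts = Counter(results)
# Pass rate is computed over executed tests only (blocked tests excluded).
executed = counts["Passed"] + counts["Failed"]
pass_rate = counts["Passed"] / executed * 100

print(f"Passed: {counts['Passed']}, Failed: {counts['Failed']}, "
      f"Blocked: {counts['Blocked']}, Pass rate: {pass_rate:.1f}%")
```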

Environment: Quality Center, TOAD, UNIX, MS Office, Windows XP, SAP 6.8 and Putty.


