
Project Manager Testing

Location:
Los Angeles, CA
Posted:
September 15, 2011


Resume:

Vajeed

Email: ********@*****.***

Summary

• Over 8 years of experience in the field of Software Testing and Quality Assurance.

• Extensive knowledge of the Software Development Life Cycle (SDLC).

• Expertise with Quality Center, QuickTest Pro, LoadRunner, PVCS Tracker, and SharePoint.

• Hands-on experience in both manual and automated testing.

• Good understanding of the QA life cycle, from system test planning through test execution to defect reporting.

• Extensive knowledge of defect life cycle management.

• Experienced in black-box, white-box, regression, exploratory, integration, data-driven, unit, user acceptance, GUI, usability, end-to-end, ad hoc, and keyword-driven testing.

• Experience in backend testing and in executing SQL commands.

• Configured and implemented defect-tracking processes to provide test metrics.

• Experience with load, performance, and stress testing.

• Expertise with defect-tracking tools such as Jira, Bugzilla, and Rational ClearQuest.

• Knowledge of and experience creating automation frameworks.

• Experienced in database testing using SQL and PL/SQL queries against relational databases such as SQL Server and Oracle.

• Experience programming in VB, UNIX shell, C, C++, Perl, HTML, DHTML, CSS, XML, and Python.

• Maintained Requirements Traceability Matrices (RTM) to track test coverage.

• Excellent verbal and written communication and presentation skills; team player and self-starter.

• Able to work independently whenever needed as well as in a group.

Technical Skills

Testing and Reporting Tools: HP Quality Center 8.0/9.0/10.0, HP QuickTest Pro (QTP) 8.0/9.0/10.0, HP LoadRunner 8.0/8.2/9.5, TestDirector 8.0, WinRunner 7.5, Merant PVCS Tracker 6.0

Operating Systems: UNIX, Windows 95/98/2000/XP/NT/Vista, IBM AIX

Database and Tools: Oracle, SQL Server 6.5/2000, DB2, MS Access, TOAD

Defect Tracking Tools: Jira, Bugzilla

Languages: SQL, PL/SQL, VBScript, J2EE, UNIX shell, HTML, XML, CSS, Perl

Professional Experience

First Data Corporation, Dallas, TX Feb 2011 – Present

Role: UAT Tester

Responsibilities:

• Involved in preparing the System Test Plans based on Business Requirements Documents (BRD).

• Analyzed business requirements, performed ambiguity reviews, and communicated inconsistencies between the requirements and the application. Participated in requirements estimation and requirements review meetings.

• Developed Test Design Specifications and Test Scripts for various UAT test scenarios.

• Participated in UAT planning and execution activities for the business.

• Worked through the full defect management life cycle.

• Created Oracle SQL performance and functional test cases.

• Executed database performance and functional test cases.

• Created and executed QTP test scripts and monitored testing progress and results in each test cycle.

• Developed and maintained automated test scripts for web-based applications using Python.

• Analyzed the test results by using the Expert View and the Keyword View.

• Inserted database checkpoints, bitmap checkpoints, and text checkpoints in the QTP scripts to verify application behavior.

• Inserted output values in QTP scripts to capture and reuse runtime data.

• Conducted regression testing for various scenarios using QTP.

• Performed defect tracking and reporting. Provided production support to troubleshoot and resolve users' website issues.

• Functional responsibilities included performing smoke, GUI, system, database, usability, security, interface, and regression testing.

• Developed SQL queries against the database to test the application's back-end processes (an illustrative sketch follows this list).

• Participated in daily project meetings with the entire testing team.

• Tracked all bugs to closure using the ClearQuest tracking system and generated reports and test metrics.

• Generated defect reports and metrics to provide project status to upper management.

• Used QTP for test automation and HP Quality Center to analyze requirements coverage, assess business risk, manage tests, and track defects.
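
A minimal sketch of the kind of back-end SQL check described above, written as a QTP VBScript step. The connection string, table, column, and expected value are hypothetical placeholders, not actual First Data details.

  ' Hypothetical back-end check: confirm a transaction shown in the UI is settled in the database.
  expectedTxnId = "1000245"                     ' hypothetical test value
  Set conn = CreateObject("ADODB.Connection")
  conn.Open "Provider=OraOLEDB.Oracle;Data Source=QADB;User Id=qa_user;Password=****"
  sql = "SELECT status FROM merchant_txn WHERE txn_id = '" & expectedTxnId & "'"
  Set rs = conn.Execute(sql)
  If Not rs.EOF And rs.Fields("status").Value = "SETTLED" Then
      Reporter.ReportEvent micPass, "Back-end check", "Transaction " & expectedTxnId & " settled as expected"
  Else
      Reporter.ReportEvent micFail, "Back-end check", "Unexpected or missing status for " & expectedTxnId
  End If
  rs.Close
  conn.Close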

Environment: HTML, QTP, VBScript, Quality Center, Performance Center, Rational ClearQuest, Perl, SQL Server, .NET, J2EE, Java, XML, WSDL, Oracle 9i, Visio, Windows XP.

United Health Care, Frisco, TX Jan 2010 – Jan 2011

Role: UAT Tester

Responsibilities

• Involved in developing the system test plan and test scripts using business and system requirement documents and use cases.

• Worked on multiple releases in different environments at the same time.

• Executed the test cases written for drugs, tests, and procedures in the Test Lab in Quality Center.

• Involved in peer review of test scripts written by team members before the test scripts were finalized.

• Was responsible for the web services and authentication domains in the project.

• Worked extensively on the registration and eligibility functionality.

• Created and reported defects for unexpected behavior of the portal.

• Performed regression, performance, end-to-end, and user acceptance testing of the portal.

• Recorded scripts in automation tools such as QuickTest Pro and LoadRunner.

• Used parameterization extensively to provide varied inputs for the scenario under test (see the sketch after this list).

• Inserted transactions in scripts to record the application's response times under realistic scenarios.

• Worked with developers to get defects fixed.

• Created backups of testing artifacts such as test scenarios, test scripts, and defects found or fixed.

• Helped workstream leads perform smoke tests.

• Reported work status to workstream leads whenever needed.

• Worked with the database support team to verify the registration process.

• Attended many BRD review meetings.
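
To illustrate the parameterization and transaction timing mentioned above, a minimal QTP VBScript fragment is sketched below. The data sheet column and page objects are hypothetical; Services.StartTransaction/EndTransaction are QTP's transaction markers, intended to report response times when the test runs under LoadRunner/Performance Center.

  ' Pull this iteration's member ID from the Global data sheet (hypothetical column name).
  memberId = DataTable("MemberID", dtGlobalSheet)

  Services.StartTransaction "EligibilitySearch"
  Browser("Portal").Page("Eligibility").WebEdit("MemberId").Set memberId
  Browser("Portal").Page("Eligibility").WebButton("Search").Click
  Services.EndTransaction "EligibilitySearch"

  Reporter.ReportEvent micDone, "Eligibility search", "Searched member " & memberId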

Environment: HTML, HTTP, Java, J2EE, Windows 2003, HP Quality Center 10.0, Performance Center, HP LoadRunner 9.0, HP QTP 10.0, SharePoint, VBScript, XML, TOAD, Oracle 10g, SQL Server, SQL, UNIX.

Comcast, Mount Laurel, NJ Sept 2008 – Dec 2009

Role: IVR (UAT) Tester

Responsibilities

• Involved in the complete testing cycle, from planning of project testing estimates to deployment of the project to production.

• Created the test strategy document that defined the test environment, the phases of testing, and the entrance and exit criteria for each phase.

• Created test plans covering the approach, description, assumptions, constraints, and risks involved in the project.

• Conducted BAT (Build Acceptance Testing) after each build was deployed to QA.

• Developed and executed test cases and participated actively in GUI, functional, unit, integration, system, and regression testing for both the web-based and Windows-based applications.

• Performed regression testing by executing the baseline scripts to identify functional issues.

• Involved in user acceptance testing with the business to walk them through the major functionalities of the application.

• Integrated the IVR and routed the call records to the billing servers. Worked on designing test cases and the test plan for IVR testing and integration.

• Developed various reports to communicate testing issues to the project manager.

• Conducted a defect review meeting with the project team after each test cycle to discuss issues by severity.

• Because the application contains sensitive customer information, performed security testing to ensure that security was maintained throughout the application.

• Maintained defects in Quality Center and participated in daily defect review calls.

• Ensured that all test cases were kept up to date in Quality Center along with the master test plan.

• Created and implemented a quality control process for protocol configurations within the internal IVR/IWR system.

• Verified results by entering PINs on the IVR systems and validating the outcomes.

• Developed and executed SQL queries to perform backend database testing.

• Performed thorough backend tests to validate server-side functionality; verified the PL/SQL triggers, procedures, and table constraints that implement the business logic.

• Performed data integrity testing by executing SQL and PL/SQL statements (a reconciliation sketch follows this list).

• Performed post-implementation testing of the application after the production deployment.
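
A small standalone VBScript sketch of the sort of data-integrity reconciliation mentioned above, run outside QTP under the Windows Script Host. The connection string and table names are hypothetical rather than the actual Comcast schema.

  ' Hypothetical reconciliation: call records routed to billing should match the IVR source count.
  Set conn = CreateObject("ADODB.Connection")
  conn.Open "Provider=OraOLEDB.Oracle;Data Source=QADB;User Id=qa_user;Password=****"
  Set rsSrc = conn.Execute("SELECT COUNT(*) FROM ivr_call_records WHERE call_date = TRUNC(SYSDATE)")
  Set rsTgt = conn.Execute("SELECT COUNT(*) FROM billing_call_records WHERE call_date = TRUNC(SYSDATE)")
  If rsSrc.Fields(0).Value = rsTgt.Fields(0).Value Then
      WScript.Echo "PASS: counts match (" & rsSrc.Fields(0).Value & ")"
  Else
      WScript.Echo "FAIL: source " & rsSrc.Fields(0).Value & " vs billing " & rsTgt.Fields(0).Value
  End If
  conn.Close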

Environment: Java, J2EE, MS Visual Studio, Quality Center 9.0, IVR, Oracle, TOAD, PL/SQL, UNIX.

T-Mobile, Plano, TX July 2007 – Jun 2008

Role: QA Engineer/Automation Engineer

Responsibilities

• Reviewed software and business requirement documents to better understand the system from both technical and business perspectives.

• Analyzed the current system in operation and developed the test strategy based on the client's information.

• Involved in writing and executing test plans and manual test cases for different modules of the application.

• Developed manual test cases to perform functional, system, and user acceptance testing of the application.

• Used data-driven testing and database access techniques to support the scripts (see the sketch after this list).

• Wrote queries in TOAD to extract data from various database tables for testing purposes.

• Ran sessions using the Workflow Manager in the testing environment during regression testing.

• Used TestDirector as the defect management system; mapped requirements, tracked results, and reported defects within TestDirector, and created graphs and reports for management and steering committee meetings.

• Interacted with the business development team, application developers, the project manager, and other team members on application testing status when necessary.

• Analyzed software problems, identified software anomalies, investigated their cause and effect, and followed the QA standards and protocols prescribed by the SQA lead/manager.

• Communicated release status with QA, development, support, and external software vendors to reduce development time.
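
A minimal sketch of the data-driven approach referenced above, looping over the QTP Global data sheet. The column name and page objects are hypothetical.

  ' Data-driven loop: run the same steps once per row of the Global data sheet.
  rowCount = DataTable.GetRowCount
  For i = 1 To rowCount
      DataTable.SetCurrentRow i
      planCode = DataTable("PlanCode", dtGlobalSheet)   ' hypothetical column
      Browser("TMobile").Page("RatePlans").WebEdit("PlanCode").Set planCode
      Browser("TMobile").Page("RatePlans").WebButton("Lookup").Click
      Reporter.ReportEvent micDone, "Plan lookup", "Checked plan " & planCode
  Next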

Environment: Quality Center, Performance Center, QTP, SQL Server, .NET, J2EE, Java, XML, WSDL, Oracle 9i, Visio, Windows XP.

Client: Bank of New York Dec 2006 – May 2007

Role: QA Engineer/Automation Engineer

Responsibilities

• Involved in creating Test strategy, Test plan and Test cases.

• Developed QA Test plan from technical specifications and requirements for this project.

• Manually tested the whole application before moving on to automated testing.

• Developed and executed test cases and verified actual results against expected results.

• Tested the entire functionality of the application on different browsers such as IE, Netscape, and Safari.

• Used Quality Center for bug tracking and reporting; followed up with the development team to verify bug fixes and update bug status.

• Exported test cases from MS Excel and requirements from MS Word into Quality Center.

• Maintained and used IBM Rational ClearQuest for defect reporting, tracking, and change management.

• Also performed User Acceptance Testing (UAT) on all the trading applications.

• Interacted with developers and business users to communicate defects.

• Tested the Options Price Reporting Authority (OPRA) tool for options securities trading through market data vendors, covering last-sale information and current options data.

• Involved in testing various investment securities such as equities, fixed income, and mutual funds.

• Established the Automation Framework using a modular foundation of generic functions and libraries.

• Inserted checkpoints in QTP to check for broken links, text, and standard object properties.

• Created user-defined functions in QTP.

• Also designed Business Process Testing (BPT) components.

• Inserted XML checkpoints.

• Worked in both the Expert View and the Keyword View in QTP.

• Used parameterization to drive the application with varied test data in QTP.

• Performed negative testing to find out how functions and variables behave when they encounter invalid and unexpected values.

• Developed automated scripts from scratch that are reusable and maintainable.

• Actively involved in black box, white box, negative testing of the application.

• Performed Regression testing on the successive releases.

• Worked on Parameterization in QTP and data driven testing.

• Used Quality Center as a repository for automation test cases.

• Performed advanced descriptive programming in QTP using VBScript (illustrated in the sketch after this list). Used the Recovery Scenario wizard.

• Performed data-driven testing; designed input/output checkpoints to validate the data and developed effective automated QTP scripts.

• Was heavily involved in descriptive programming for objects that were causing recognition issues.

• Used VBScript heavily for writing the automation test scripts.

• Developed, modified, and debugged scripts in the automation framework using QTP for future releases.

• Modified and created SQL queries and stored procedures for quality assurance and analysis.

• Involved in extensive data validation using SQL queries.

• Verified files in the designated landing directory on UNIX.

• Monitored the servers as the scripts were executed.

• Executed smoke testing to exercise the main features of the application as and when required.

• Defined a number of test cases using quality data for end-to-end business processes during UAT and validated the system setup for transactions and user access in UAT.

• Participated in walkthroughs and technical reviews throughout the testing phase.
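
A minimal sketch of the QTP descriptive-programming style mentioned above, identifying web objects by property values at run time instead of through the object repository. The page title, control names, and credential are hypothetical.

  ' Descriptive programming: locate objects by their properties rather than repository entries.
  With Browser("title:=Trading Portal").Page("title:=Trading Portal")
      .WebEdit("name:=username").Set "qa_user"
      .WebEdit("name:=password").Set "********"          ' placeholder credential
      .WebButton("name:=Login").Click
  End With

  ' The same idea with a Description object, useful when properties are built dynamically.
  Set btnDesc = Description.Create()
  btnDesc("micclass").Value = "WebButton"
  btnDesc("name").Value = "Submit Order"
  Browser("title:=Trading Portal").Page("title:=Trading Portal").WebButton(btnDesc).Click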

Environment: Quality Center 9.2, QTP 9.2, LoadRunner, Oracle, Excel, MS Word, JavaScript, XML, VBScript, ASP.NET, UNIX, Rational ClearQuest.

Canara Bank, Hyderabad, India Sept 2004 – Nov 2006

Role: QA Tester

Responsibilities

• Involved in the complete QA life cycle activities, including the pre-testing, acceptance testing, and testing phases.

• Analyzed system and functional requirements; developed and executed a detailed system test plan, test cases, and test scripts for functional testing using QTP.

• Created SQL views and queries to validate the data updates in database tables and the data displayed through web screens and reports.

• Involved in automated testing of web screens using QTP.

• Developed and maintained automated test scripts for client-server and web-based applications using Python and shell scripting.

• Used QTP to automate manual test scripts for regression.

• Created test cases and performed various levels of testing such as unit, integration, regression, system, end-to-end, API, and user acceptance testing (UAT) for different modules.

• Executed UAT scripts to ensure operational quality and system integrity and to verify that system processes met user needs.

• Used a VBScript driver file to load all the function libraries into QTP (see the sketch after this list).

• Configured QTP to work with shared object repositories.

• Performed back-end testing to ensure data consistency on the front end by writing and executing SQL queries against the Oracle database.

• Prepared detailed test metrics on a weekly basis so project members knew the status of the application.
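
A minimal sketch of the library-loading pattern described above, together with the kind of reusable function such a library might contain. The file paths, table, and helper GetBalanceFromDb are hypothetical.

  ' Driver script: load shared function libraries before the test steps run (hypothetical paths).
  ExecuteFile "C:\QTP\Libraries\db_utils.vbs"
  ExecuteFile "C:\QTP\Libraries\ui_utils.vbs"

  ' Example of a reusable function a library like db_utils.vbs might define.
  Function VerifyAccountBalance(accountNo, expectedBalance)
      actual = GetBalanceFromDb(accountNo)   ' hypothetical helper that runs a SQL lookup
      If CDbl(actual) = CDbl(expectedBalance) Then
          Reporter.ReportEvent micPass, "Balance check", accountNo & " = " & expectedBalance
          VerifyAccountBalance = True
      Else
          Reporter.ReportEvent micFail, "Balance check", accountNo & ": expected " & expectedBalance & ", got " & actual
          VerifyAccountBalance = False
      End If
  End Function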

Environment: QTP, VBScript, UNIX, Python, J2EE, Java, XML, Quality Center, LoadRunner, MS Office, Oracle, SQL Server, PL/SQL, Perl, TOAD, WebSphere, Windows 2000/NT.

Associated Road Carriers, Hyderabad, India Sept 2003 – Aug 2004

Role: Manual Tester

Responsibilities

• Involved in analyzing the applications and developing test cases.

• Tested the applications manually.

• Designed the application using VB and SQL.

• Tested various modules manually.

• Identified the test requirements based on the business requirements.

• Responsible for system testing of the entire application along with team members.

• Analyzed results with business analysts.

• Worked with technical analysts to develop functional specifications for the features.

Environment: Visual Basic, TestDirector 7.6.

EDUCATION

Bachelor's degree in Electronics and Communication.


