QA Analyst

Location: Fayetteville, NC 28314
Posted: January 16, 2009

HEMA SINGH

Sr. QA Analyst

SUMMARY:

• 6+ years of extensive Quality Assurance experience with client/server and web-based applications, with an earned reputation for meeting demanding deadlines and delivering critical solutions.

• Strong hands-on experience with LoadRunner, QTP, WinRunner and manual testing. Developed test plans, test scenarios and test scripts for various applications to ensure proper business compliance.

• Testing expertise includes formulation of test assets and strategies, and report generation.

• Excellent understanding of Software Quality Assurance techniques and complete knowledge of the software development life cycle (SDLC).

• Extensively worked with the testing tools QTP, WinRunner, LoadRunner, Quality Center and TestDirector.

• Good understanding of GUI and web-based applications.

• Experience with Agile development methodology.

• Experience in Design and implementation of Test Automation Frameworks.

• Strong skills in black-box test methodologies, including Functional, System, GUI, Regression, Load, Performance, Stress, Configuration, Database and User Acceptance Testing.

• Experience in test case design, test execution and test reporting.

• Experience in testing Web Services in a multi-tier environment.

• Testing experience in various environments, including Windows- and Linux-based setups.

• Experience in Setting up Test Environments, Creating and managing Test Data for various phases of testing.

• Used techniques such as Equivalence Partitioning and Boundary Value Analysis to prepare complex sets of test cases (a brief sketch follows this summary).

• Expertise in writing SQL statements for backend testing.

• Expertise in bug tracking and reporting using Quality Center.

• Good planning and organizational skills in coordinating with clients in all aspects of a project from inception through completion.

• Ability to work on tight schedules and on multiple applications concurrently.

• Excellent verbal and written communication skills with an eye for detail.

• Solid analytical and troubleshooting skills.
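
Below is a minimal VBScript sketch (runnable under Windows Script Host) of how Equivalence Partitioning and Boundary Value Analysis translate into concrete test values; the age field, the 18-65 valid range and the expected accept/reject behaviour are hypothetical, chosen only for illustration.

    ' Hypothetical field whose valid equivalence class is 18-65.
    ' Boundary Value Analysis picks values at and around each boundary;
    ' Equivalence Partitioning adds one representative per valid/invalid class.
    Const MIN_AGE = 18
    Const MAX_AGE = 65

    ' Assumed behaviour of the application under test (for this sketch only)
    Function IsAgeAccepted(age)
        IsAgeAccepted = (age >= MIN_AGE And age <= MAX_AGE)
    End Function

    ' Boundary values plus one representative from each equivalence class
    testValues = Array(MIN_AGE - 1, MIN_AGE, MIN_AGE + 1, 40, MAX_AGE - 1, MAX_AGE, MAX_AGE + 1)
    expected   = Array(False, True, True, True, True, True, False)

    For i = 0 To UBound(testValues)
        If IsAgeAccepted(testValues(i)) = expected(i) Then
            result = "PASS"
        Else
            result = "FAIL"
        End If
        WScript.Echo "age=" & testValues(i) & ", should accept=" & expected(i) & " -> " & result
    Next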

CERTIFICATION:

• ISTQB Certified Tester, Test Manager level.

EDUCATION:

• Bachelor of Science from Science College, Jiwaji University, India.

• Post Graduate Diploma in Computer Application from APARK Institute, India.

TECHNICAL SKILLS:

Functionality/Regression Tools: WinRunner 7.0, QTP 8.2/9.0

Performance Testing Tools: LoadRunner 8.0, 9.0

Test Management Tools: TestDirector 6.0, Quality Center 8.2/9.0

Bug Tracking Tools: Bugzilla

Configuration Management: VSS

Build Management: Ant

Programming: C, C++, Shell Scripting, XML

Scripting Languages: VBScript, JavaScript

Modeling Tool: Rational Rose (UML 2.0)

Database Systems: Oracle DB Server

Web and Mail Servers: Apache Web Server, Jakarta Tomcat, IIS, WebLogic

Operating Systems: Linux (Red Hat), SCO UNIX, Windows 2000/XP/Advanced Server

Protocols/Standards: POP3, SMTP, WAP, HTTPS, MAPI

PROFESSIONAL EXPERIENCE:

Client: HSBC, Syracuse, NY Mar ’08 – Present

Position: Sr. QA Analyst

Online banking system

Project Description: HSBC Bank North America is one of the top ten financial services companies in the United States, offering 3 million customers access to global markets and services. The Online Personal Banking module of the website lets account holders check their accounts and balances by entering their username and password. It also provides instant transfers of funds between checking and savings accounts, online bill payment, profile changes, and other services. The fundamental goal of the project was to enhance the entire product line to be more robust in the current global segment.

Responsibilities:

• Worked with business users to define the scope of testing for each release. Established communication between the test team and the business users to help the test team understand business needs.

• Hired new resources to carry out the testing life cycle and provided detailed training to help them understand the testing process, functionality and dependencies in the application.

• Reviewed and approved test cases created by test team. Designed test cases for complex functionality of the application.

• Conducted data-driven testing using parameterization in Quick Test Professional (QTP) to test the application with different sets of data (a simplified sketch appears at the end of this section).

• Used Quick Test Professional (QTP) to validate links, objects, images and text on Web pages to make sure they function properly.

• Developed automation scripts in Quick Test Professional (QTP) to automate regression testing.

• Modified the Object Repository to help Quick Test Professional (QTP) identify GUI objects and enhanced the QTP scripts.

• Recorded various tests, enhanced the scripts, and documented the test steps/procedures.

• Created test cases and mapped them to requirements using Quality Center. Managed the testing process, scheduled batch tests, and logged and tracked defects using Quality Center.

• Validated data integrity by creating and executing SQL queries and Database Checkpoints.

• Developed test metrics for weekly evaluation of test status.

• Created load/stress scenarios for performance testing using the LoadRunner Controller.

• Created Vuser scripts in LoadRunner by recording and incorporating rendezvous points.

• Enhanced LoadRunner Vuser scripts with parameterization, checkpoints, and correlation to test new builds of the application.

• Performed stress testing of the application, by creating and executing different LoadRunner scripts, to verify that the required load had no negative impact on performance.

• Used the LoadRunner performance monitors to analyze the performance/stress/load behavior of the application.
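
A minimal sketch of the data-driven QTP approach described above, in QTP-style VBScript. The browser/page/object names and data-table columns are hypothetical placeholders (not the actual HSBC test assets), and iteration over the data-table rows is assumed to be driven by the test's run settings.

    ' Read the current row of the Global data sheet (parameterized test data)
    userName = DataTable("UserName", dtGlobalSheet)
    password = DataTable("Password", dtGlobalSheet)
    expected = DataTable("ExpectedTitle", dtGlobalSheet)

    ' Drive the (hypothetical) login page with the current data row
    Browser("OnlineBanking").Page("Login").WebEdit("txtUserName").Set userName
    Browser("OnlineBanking").Page("Login").WebEdit("txtPassword").Set password
    Browser("OnlineBanking").Page("Login").WebButton("btnLogin").Click

    ' Verify the landing page title and report the result to the test results
    actualTitle = Browser("OnlineBanking").Page("AccountSummary").GetROProperty("title")
    If InStr(actualTitle, expected) > 0 Then
        Reporter.ReportEvent micPass, "Login " & userName, "Landed on: " & actualTitle
    Else
        Reporter.ReportEvent micFail, "Login " & userName, "Unexpected page: " & actualTitle
    End If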

Environment: QTP 9.0, LoadRunner 9.0, Quality Center 9.0, Java, Servlets, JSP, WebLogic, Oracle, UNIX, Internet Explorer 7.0.

Client: MetLife, Jersey City, NJ Jun ’07 – Mar ’08

Position: QA Analyst

Multi Access Integration

Project Description: MetLife is one of the leading insurance companies. It provides auto quotes for customers along with policy processing. Personal liability coverage, included with the auto insurance policy, protects the policyholder and relatives who are part of the policy. A variety of personal property insurance coverages are available, including auto and homeowner's. I was involved in testing of the Auto, Home, and Property and Casualty insurance products.

Multi Access Integration (MAI) is designed to provide auto insurance quotes online. Customers can log on to the MetLife site to get quotes, look up their premiums, retrieve saved quotes at any time through the new business model, and make payments.

Responsibilities:

• Involved in gathering specifications and requirements from development personnel prior to testing.

• Performed manual testing of the user interface.

• Involved in the Automation Feasibility Study of the application.

• Involved in designing the automation framework for the AUT using QTP.

• Responsible for creating business specific Automation flows.

• Performed Black Box Testing including Smoke, Regression and Functional Testing using QTP.

• Responsible for modifying the Object Repository to help QTP identify the GUI objects.

• Responsible for creating the folder structure for scripts based on the functionality.

• Involved in analyzing the functionality of the AUT and designing common function libraries that can be reused during automation (a sketch appears at the end of this section).

• Responsible for enhancing the test scripts as per the scripting standards.

• Analyzed the requirements and design documents to capture performance requirements.

• Responsible for creating LoadRunner scripts for the above application using the Virtual User Generator (VuGen).

• Handled dynamic content in the Vuser scripts using correlation.

• Worked closely with the development team to schedule the execution of Performance tests.

• Handled the daily production/execution of the regression test packs, maintaining test logs and reporting daily status of the testing performed.
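
A minimal sketch of the kind of reusable function library mentioned above, written as a QTP VBScript function library; the object names and the login flow are hypothetical placeholders rather than the actual MAI automation framework.

    ' Shared function library (associated with the test via its resources settings)

    ' Log in to the (hypothetical) MAI quote application; returns True on success
    Public Function Login(userName, password)
        Browser("MAI").Page("Login").WebEdit("txtUser").Set userName
        Browser("MAI").Page("Login").WebEdit("txtPwd").Set password
        Browser("MAI").Page("Login").WebButton("btnSignIn").Click
        Login = Browser("MAI").Page("Home").Exist(10)   ' wait up to 10 seconds
    End Function

    ' Report a verification step in a consistent way across all scripts
    Public Sub VerifyEqual(stepName, expected, actual)
        If CStr(expected) = CStr(actual) Then
            Reporter.ReportEvent micPass, stepName, "Value: " & actual
        Else
            Reporter.ReportEvent micFail, stepName, "Expected: " & expected & ", Actual: " & actual
        End If
    End Sub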

Environment: LoadRunner 9.0, Quick Test Professional 9.0, Quality Center 9.0, Oracle, PL/SQL, DB2, VB.NET, ASP.NET, PVCS, C#, Java, VBScript, XML, HTML, DHTML, WebLogic Server, UNIX, Windows NT/XP, MS Office Tools.

Client: Wachovia Bank, Charlotte, NC Sep ’06 – Jun ’07

Position: QA Engineer

Wachovia Connection application

Project Description: This application maintains information concerning all of the consumer's banking transactions. In addition, it stores an image of the check in the database for check transactions. Wachovia Connection includes an improved design offering intuitive menu and search tools and easier access to popular functions and reports, as well as enhancements to Information Management and Image Services capabilities.

Responsibilities:

• Participated in Requirement Analysis, Business Analysis and Risk Analysis of applications.

• Involved in the Quality Assurance Activities and Testing of the Web application which was designed and developed for the Wachovia Connection application for Wachovia Bank.

• Test case management, bug tracking and bug reporting were done using Quality Center.

• Developed Test Plans for project and managed the scope of testing sub-projects.

• Also involved in developing test scripts and conducting functional and regression testing.

• Tested the applications and used Quick Test Professional to automate some of the applications.

• Developed test scripts and reusable functions in Quick Test Professional.

• Created standard checkpoints to check the values of different objects' properties.

• Created page checkpoints to verify whether there are any broken links on the page and how long the web page takes to load.

• Created Database checkpoints to verify the contents of the database accessed by the web site.

• Parameterized the test scripts to check whether the web application performed well with different sets of data and under repetitive stress, to flush out any memory-leak bugs.

• Interacted with users and conducted UAT effort.

• Worked with SQL queries to perform database testing (see the sketch at the end of this section).

• Conducted weekly status meetings and defect review meetings with the test team.

• Reported periodic project status and updates to the QA Manager with the help of different test metrics.
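
A minimal sketch of the backend SQL validation described above, using ADO from a QTP-style VBScript step; the connection string, table, column names and UI object are hypothetical placeholders.

    ' Compare a value shown in the UI against the backing database record
    Dim conn, rs, dbAmount, uiAmount

    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=OraOLEDB.Oracle;Data Source=TESTDB;User Id=qa_user;Password=****"

    ' Hypothetical query: balance for the account currently displayed in the UI
    Set rs = conn.Execute("SELECT balance FROM accounts WHERE account_no = '1234567890'")

    If Not rs.EOF Then
        dbAmount = rs.Fields("balance").Value
        ' Assumes the UI shows a plain numeric value
        uiAmount = Browser("Connection").Page("AccountDetail").WebElement("lblBalance").GetROProperty("innertext")
        If CDbl(uiAmount) = CDbl(dbAmount) Then
            Reporter.ReportEvent micPass, "Balance check", "UI and DB both show " & dbAmount
        Else
            Reporter.ReportEvent micFail, "Balance check", "UI: " & uiAmount & "  DB: " & dbAmount
        End If
    Else
        Reporter.ReportEvent micFail, "Balance check", "No database row found for the account"
    End If

    rs.Close
    conn.Close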

Environment: Quick Test Professional 8.2, Quality Center 8.2, J2EE, WebLogic, Oracle, UNIX, MS Word, MS Excel

Client: State Farm Insurance, Bloomington, IL Jul ‘05 – Sep ‘06

Position: QA Engineer

Online Quotes

Project Description: State Farm Insurance is one of the leading insurance companies in the United States. The aim of the project was to help customers get a free online quote instantaneously. The application also enabled the company's field agents/customer representatives to view prospective customers' data.

Responsibilities:

• Analyzed business requirements, module specific functionalities, identified testing requirements and formulated test plans.

• Created test cases with the help of requirements specifications for GUI, Functionality, Regression, Integration and User Acceptance Testing (UAT) using Quality Center after establishing the critical values and workflows.

• Executed positive and negative test cases to verify the response of the application under test by creating data-driven tests.

• Automated the scripts for Regression testing using Quick Test Professional.

• Verified Load Time, Broken Links and Number of links on the Web Page using Page Checkpoints.

• Verified that the correct data is displayed by the application using Database Checkpoints.

• Created output values for object values that change during each iteration and are used by the test script in subsequent runs during data-driven testing (a sketch appears at the end of this section).

• Automated performance testing with LoadRunner using the Virtual User Generator.

• Created performance testing scenarios using the LoadRunner Controller.

• Analyzed the performance of the application using various graphs in LoadRunner Analysis.

• Used Quality Center to organize and manage all phases of the software testing process, including planning tests, executing tests, and tracking defects.

• Defects were raised, tracked and resolved using Quality Center.

• Provided the management with test metrics, reports, and schedules as necessary and participated in the design walkthroughs and meetings.

• Attended Change Request (CR) meetings to document changes and updated Test Cases and Test Procedures to reflect those changes.
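
A minimal sketch of the output-value idea described above, in QTP-style VBScript: a value generated at run time (here a hypothetical quote number) is written back to the data table so a later step can reuse it; all object and column names are placeholders.

    ' Capture the quote number generated by the application at run time
    quoteNo = Browser("Quotes").Page("QuoteResult").WebElement("lblQuoteNumber").GetROProperty("innertext")

    ' Store it as an output value in the Global data sheet for later steps/iterations
    DataTable("QuoteNumber", dtGlobalSheet) = quoteNo
    Reporter.ReportEvent micDone, "Capture quote", "Stored quote number " & quoteNo

    ' A later step retrieves the stored value to drive the quote-retrieval flow
    Browser("Quotes").Page("RetrieveQuote").WebEdit("txtQuoteNumber").Set DataTable("QuoteNumber", dtGlobalSheet)
    Browser("Quotes").Page("RetrieveQuote").WebButton("btnRetrieve").Click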

Environment: Quick Test Professional 8.2, LoadRunner 8.0, Quality Center 8.2, SQL Server, UNIX, VB, ASP, IIS, XML, .NET, VB.NET, Visual SourceSafe, and Windows XP Professional

Client: Euclid InfoTech Pvt. Ltd., India Aug ’04 – Jul ’05

Position: QA Engineer

Tender Information System

Project Description: The Tender Information System is a tool that records information about tenders available on different websites and monitors those sites for new tender entries and any changes compared to previously recorded information. The tool stores the URLs of the tender sites and of derived URLs, and records the steps for reaching the derived URLs as play pieces. It also stores the required information in a database and, at regular intervals, monitors the sites for new tender entries and changes by comparing the currently available information with the previously recorded information.

Responsibilities:

• Analyzed the Business Requirements Document (BRD), modified the Master Test Plan and prepared detailed Test Cases for the AUT.

• Prepared Test Data for the AUT as per the specifications of the BRD.

• Documented and executed Test plans, Test cases and Test scripts based on baseline requirements.

• Conducted System and Integration testing, Functional testing and interacted with developers to resolve technical issues.

• Analyzed the user requirements by interacting with system architect, developers and business users.

• WinRunner was used to automate the Regression Testing after every version change of the application.

• Created automation test scripts for functional testing, GUI validation and data-driven testing using WinRunner.

• Automated test scripts, inserted verification/checkpoints to validate the test cases and created synchronization points to handle timing issues with WinRunner.

• Participated in all the bug meetings and QA meetings.

• Executed test scripts using WinRunner, analyzed the results and reported bugs using TestDirector.

• Developed baseline scripts for performance testing of the application using LoadRunner.

• Created performance test scenarios and modified run-time settings in the LoadRunner Controller to simulate the real production environment.

• Executed Performance Testing using LoadRunner Controller.

• Analyzed the performance of the application using different graphs in LoadRunner Analysis.

• Conducted the Database testing of the application by writing SQL Queries and creating database checkpoints.

Environment: WinRunner 7.0, LoadRunner 8.0, TestDirector 6.0, JSP, Servlets, Oracle, Windows 2000 Server

Client: Intellicus Technologies, India Sep ’03 – Aug ’04

Position: QA Engineer

Intellicus Enterprise Reporting Suite

Project Description: The Intellicus Enterprise Reporting Suite is a collection of report designing, deployment, on-demand generation and delivery components built on an extensible architecture. These components are independent and can be deployed on any network model to suit the enterprise software architecture. The suite empowers an organization and its partners on the Web with a simple yet powerful, high-performance information delivery capability. It can be used as a standalone enterprise reporting system or integrated with any enterprise application, web-based or client/server, using any of the provided integration API mechanisms in JSP, COM or .NET.

Responsibilities:

• Involved in writing and implementation of the various test cases.

• Performed Manual and automated Functional Testing.

• Prepared Test Case Scripts using TSL in WinRunner.

• Created GUI checkpoints for single and multiple properties of different objects in the application.

• Created Bitmap checkpoints to check the graphs generated by the tool with the data provided during test.

• Created Database Checkpoints to check the integrity of data fetched by the application from the target database.

• Verified the reports and graphs generated by the application with the provided data.

• Created synchronization points to make the TSL scripts wait for the application to complete any time-consuming function before moving on to the next step.

• Analyzed user and functional requirements to point out gaps between them and the application.

• Used TestDirector for test case and test procedure preparation and for bug reporting and tracking.

• Followed up with the development team to verify bug fixes and update bug status.

• Performed System Testing and User Acceptance testing.

• Reported periodic project status and updates to the QA Lead and QA Manager.

Environment: WinRunner 7.0, TestDirector 6.0, Java, Windows NT, Oracle, SQL Server.

Client: DACCIT Pvt. Ltd., Indore, India Jul ’02 – Sep ’03

Position: Test Engineer

Execution Manager - Test Case Management Tool

Project Description: The Test Execution Manager provides a solution to the problems encountered while keeping track of the software testing process and its results. It is a web-based application for maintaining information related to the definition, execution environment and execution results of test cases for testing the functional and performance capabilities of any software system.

Responsibilities:

• Documented requirement specifications by discussing them with the senior testers and the test manager on the team.

• Gathered ideas from the key stakeholders about quality risks and was involved in preparing the Quality Risk Analysis document using informal techniques. Identified and documented the recommended action to mitigate each risk.

• Test activities included functional, GUI and integration testing.

• Wrote test cases and step-by-step test procedures.

• Executed test cases manually.

• Generated bug reports using Bugzilla.

Environment: Bugzilla, IIS, SQL Server 2000, ASP, VB 6.0, Windows 2000 Server


