Quality Assurance Project

Location:
North Lauderdale, FL
Posted:
December 12, 2013


Resume:

Sharon A. Smith

**** *. ***** ******

North Lauderdale, FL 33068

954-***-****

**********@*****.***

Dedicated IT professional with over twelve years of experience as a QA Analyst who embraces a passion for quality, rejects the status quo, and acts like an owner throughout all facets of every assigned project.

Summary of Qualifications

• Skilled in black box testing of web and mobile software applications

• Experience across all stages of the software development life cycle (SDLC), including planning, requirements, development, and testing efforts such as QA/UAT

• Possess in-depth knowledge of testing strategies, including integration, system/functional, regression, security, usability, GUI, and performance testing

• Strong ability to translate customer requests into objective project goals

• Experienced in interacting with and making presentations to staff and customers

• Strong networking and people skills with an ability to get the job done

• In-depth knowledge of, and hands-on participation in, Agile/Scrum methodologies

• Automated testing using record and playback techniques

TECHNICAL EXPERTISE

• Software: MS Office 2010, SharePoint, JIRA, Pivotal Tracker, TOAD, pgAdmin III, QMetry, XML, OnTime, HP/Mercury QTP/QC, QTrace, TestComplete, Cross Browser, GWT, ImageNow, PeopleSoft, Oracle 6i, 10g, 11i and R12, Selenium

• Programming Languages: JavaScript, PL/SQL, HTML, Ruby on Rails

• Operating Systems: Windows XP, Vista, Linux

• Databases: Oracle, MS SQL Server, PostgreSQL

PROFESSIONAL PROGRESSION

QA Lead, MCNA Dental Sept 2012 – Present

As a Lead Quality Assurance Analyst, I was brought on board to develop and implement a QA team. I am currently responsible for the planning, execution, and reporting of test results for all products developed within the IT department. In addition, I provide direction to the test engineers and analysts as needed, based on project needs, to ensure that the appropriate test strategies are executed. I am also responsible for mentoring, motivating, and training Quality Assurance Analysts and Test Engineers as needed. I work independently with little to no direction from mid-level to upper management, and I provide updates, including but not limited to progress and sources of delay, directly to the Software Development Director and CIO in a timely manner.

• Validate reports required for multiple state-run programs such as Medicaid and Healthy Kids

• Write and execute SQL queries to validate the data output of generated reports (a sample query appears after this list)

• Perform data analysis at the database level using tools such as TOAD and pgAdmin

• Validate inbound and outbound data file output against the applicable EDI format/layout for each state's requirements

• Develop and implement training programs that prepare the user audience to use tested applications

• Develop and maintain Test Plans, Test Cases, Test Results, and Traceability Matrices, using HP Quality Center as the repository, to validate the functionality of multiple applications

• Author end-user manuals that guide users in the proper operation of newly implemented web applications

• Perform usability, GUI, functional, and regression testing

• Utilize Pivotal Tracker to report software defects and monitor their resolution status

• Schedule and coordinate regular QA team meetings to discuss the testing process and resolve issues

• Serve as liaison between QA team members and developers to resolve issues

• Test the web GUI component of the system for usability, navigation, and performance

• Provide functionality, regression, release acceptance, performance, and integration testing

• Review functional specifications to develop detailed test cases

• Execute test cases in the QA environment across multiple browsers

• Test application installations and report the problems found

• Execute cross-browser and cross-platform tests in a virtual machine (VMware) environment

• Develop and implement quality assurance standards and procedures for the team according to best practices

• Verify bug fixes in new releases

• Conduct QA team meetings and regular discussions on new features, software bugs, software testing problems, and focus areas
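
The following is an illustrative sketch of the kind of SQL validation query referenced in the list above; the table and column names are hypothetical and are not taken from the actual MCNA systems.

-- Compare the member count shown on a generated state report
-- against the source enrollment data; any row returned indicates a mismatch.
-- (Table and column names are hypothetical, for illustration only.)
SELECT r.report_month,
       r.member_count     AS reported_count,
       COUNT(e.member_id) AS source_count
FROM   monthly_report r
       LEFT JOIN member_enrollment e
              ON e.enrollment_month = r.report_month
GROUP  BY r.report_month, r.member_count
HAVING COUNT(e.member_id) <> r.member_count;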

QA Analyst, Cross Country Home Services April 2000 – Sept 2012

In the role of Quality Assurance Analyst, my key responsibilities included, but were not limited to, providing support across all phases of the SDLC for a combination of simple and complex implementations. These efforts were performed in collaboration with the IT department and extended across multiple business units to execute and validate test cases/scenarios based on system requirements. In addition, I identified and documented defects throughout test execution in a fast-paced, fluid, and changing environment using defect-tracking tools.

• Validated web applications to ensure performance within established business practices and company performance objectives

• Managed and performed test efforts for a large data migration during system conversion and upgrade (PeopleSoft to Oracle)

• Applied in-depth knowledge of and skills in ETL processes and procedures as required for data extraction, load, and conversion

• Reported measurement results for the performance and progress of QA resources using documented metrics

• Used SharePoint as the main daily communication channel for document distribution and results updates among the business and the rest of the project team

• Developed Test Plans and Test Cases to validate functionality

• Entered and tracked defects in the test management tool (HP Quality Center) by severity level and functional module

• Mentored junior QA resources within established QA policies and procedures

• Queried Oracle databases with TOAD/pgAdmin III to ensure the use of current and accurate test data

• Tested mobile/web applications (automated and manual) developed for members to process and track the results of their service requests

• Studied business/functional requirements and technical specifications to create test documentation, within project scope, for internal and external web applications (user interface)

• Created and executed detailed test cases to validate web applications, such as the provider and external member portals, on mobile devices

• Performed smoke, functional, integration, usability/GUI, and regression testing as applicable during test efforts

• Identified software failures and reported them using Bugzilla, which integrates with Jira

• Participated in regular PMO project meetings to provide the PM/team with estimates for QA efforts

• Participated in peer reviews of Test Plans and Test Cases

• Communicated with QA team members and developers to resolve issues

• Tested the web GUI component of the system for usability, navigation, and performance

• Developed and executed formal test plans/test cases to ensure delivery and quality for multiple projects of varying size

• Performed testing on various application and reporting systems, such as SharePoint, OnTime, and ImageNow, for company-wide use

• Validated that user expectations were met during the testing process

• Documented and reported system defects to the project team

• Identified and communicated business risks related to testing and implementation for evaluation by project teams and management

• Worked with business analysts and project managers to determine the most effective testing methods and inspection strategies for projects of varying complexity and size

• Gained experience with functional, integration, and regression testing of web applications

• Ensured that measurable test objectives were established for each system being tested

• Ensured that defects uncovered in testing were recorded, summarized, and used in post-project reviews to improve the development and test processes

• Worked with the Systems Development team and the business to ensure timely, thorough testing of technological enhancements prior to release to the production environment

• Wrote use cases, test plans, and testing criteria, including data-type validation, pass/fail conditions, and boundary conditions; designed, developed, and tested data processes

• Reviewed business requirements, functional specifications, and design documents to create detailed test cases/scenarios and test scripts

• Ensured proper mapping of test steps, test scripts, and test cases to requirements using a Requirements Traceability Matrix (RTM)

• Accomplished assigned work by the designated due dates

• Identified issues/risks and alerted supervisors with sufficient lead time to avert crises

• Identified changes in scope or work effort that could result in budget overruns or missed delivery dates

• Operated independently as appropriate and worked cooperatively as part of a team to accomplish goals

• Designed and executed functional and non-functional test cases; debugged and reported product issues; and participated in or led defect triage within the project team

• Contributed to the development of quality assurance best practices and established quality measures; collaborated with other QA team members to bring continuous improvement to strategy, processes, and tools

• Validated product deployment transitions from the test environment to staging and from staging to production

• Wrote and executed PL/SQL queries to validate database tables and contents, read extract files, and perform field-level validation (a sample check appears after this list)

• Collaborated with necessary peers and proactively pursued required knowledge and/or problem resolution in support of the program and the team
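
The following is an illustrative sketch of the kind of field-level validation check referenced in the list above; the table and column names are hypothetical and assume an Oracle environment.

-- Flag staged extract rows whose fields violate basic format rules
-- before they are reconciled against the source system.
-- (Table and column names are hypothetical, for illustration only.)
SELECT stg.row_id,
       stg.member_id,
       stg.service_date
FROM   extract_staging stg
WHERE  NOT REGEXP_LIKE(stg.member_id, '^[0-9]{9}$')
   OR  stg.service_date > SYSDATE;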

ACADEMIC CREDENTIALS

• HS Graduate

• Testing Certification

• Some College


