Sheena David
(QA Analyst)
PROFESSIONAL PROFILE:
• Accomplished professional with over 7 years of experience in Software Quality Assurance,
development and support of software applications.
• Worked extensively as a QA Analyst on Web applications, Client/Server applications and Database
systems across multiple software industries.
• More than 5 years of experience in Manual Testing, along with Automation testing using
automation tools and Quality Center.
• Expertise in Installing and configuring the test environment on both Windows and UNIX platforms.
• Expertise in testing various applications in Java, Oracle and C++.
• Well versed in analyzing Requirements, System Specifications, Use Cases and Technical
Specifications.
• Experienced in developing Test plans, Test Cases and Test Scenarios for Web and Client/Server
applications.
• Experienced in creating and implementing testing and SQA strategies using Agile testing
methodology in the UAT phase.
• Experience working in an Agile environment as an active participant, alongside the technical
leads, in tasking and estimating stories.
• Worked with Mainframe technologies like COBOL, JCL, VSAM, DB2 and CICS on MVS, OS/390
and z/OS systems.
• Experience in GUI Testing, Regression Testing, Data Driven Testing, Functional Testing, Performance
Testing, Database Testing and User Acceptance Testing.
• Experience in Black Box and White Box Testing.
• Good at bug reporting and bug tracking using Test Management tools such as Quality Center,
ClearQuest, Bugzilla and JIRA.
• Efficient in reviewing project requirements and creating test plans and test cases.
• Proficient in test execution and management of testing defects.
• Considered an innovative, fast learner and self-starter who operates independently and is also a
good team player.
TECHNICAL SKILLS:
Java, CICS, Web Services, MQ Series, SPUFI, FILE-AID, JCL, CA7, CA11, TSO, Test Director, HP
Quality Center, Team Foundation Server (TFS), QTP, SQL, COBOL, DB2, SQA methodologies,
Agile/Scrum methodologies
TMX Finance - Atlanta, GA March 2011-April 2014
QA Analyst – Manual & Automation Testing
Scope:
TMX Finance offers a title loan product that allows customers to meet their liquidity needs by borrowing
against the value of their vehicles while retaining use of their vehicle during the term of the loan. The
company operates more than 728 stores in twelve states.
Qfund is the application that facilitates and controls all the activities required for the business,
including Customer Registration, the Title Loan process, Rescinding, Renewal, Repossession and Sale.
Responsibilities:
• Involved in gathering and analyzing the Business & Technical Requirements.
• Designed and developed Test Plans based on BRDs and FSDs.
• Involved in creating Test Strategy documents, planning test execution activities, defect reporting
and analyzing test metrics.
• Led Defect Review Meetings.
• Followed Agile/Scrum Methodologies for all of the project activities.
• Involved in walkthroughs with BAs, the Development team and the Quality team. Performed Manual
and Automation testing on the application.
• Wrote and maintained all the Functional and Regression test cases in Microsoft Test
Manager (MTM).
• Tested DW Reports, including the Cash Sheet and Daily Summary Report, using various complex
SQL queries against the database.
• Performed Functional, System, Integration, Performance, End to End, Regression & UAT
testing.
• Checked the database to confirm that test data transactions from the application completed
successfully, establishing connectivity and running SQL commands (an illustrative query sketch
follows this list).
• Involved in cross browser testing on IE8 and IE9.
• Created Test Data for QA and UAT using the automation test scripts in QTP.
• Developed different kinds of reports that showed the number of test cases executed, number
passed/failed and the number of test cases left to execute on a daily and weekly basis.
• Wrote SQL queries to generate reports from TFS for management.
• Responsible for writing and executing test cases and for creating Test Execution reports in
Team Foundation Server (TFS), including the Test Execution Burndown, Defect
Report, etc.
• Developed Test Automation plan and strategies to run test cases in QTP.
• Analyzed test cases, test scenarios, test strategies, test data, etc., to automate them in QTP,
and ran the scripts after defining them.
• Performed Mobile Testing of text messaging functionality.
• Tested various Customer Print Forms for different products and states with respect to their
Form Definitions.
• Performed extensive backend testing during data migrations from one application to another.
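The database checks above were typically row-level lookups and simple reconciliations against the Qfund
backend. A minimal sketch of the kind of query used, with hypothetical table and column names (not the
actual Qfund schema):

    -- Hypothetical check: confirm a title loan created through the UI reached the backend
    SELECT loan_id, customer_id, loan_status, created_date
    FROM title_loans
    WHERE customer_id = 'QA_TEST_0001'      -- test customer registered during the run
      AND loan_status = 'ACTIVE';

    -- Hypothetical reconciliation: DW Cash Sheet total against the transactional source
    SELECT store_id, SUM(payment_amount) AS total_cash
    FROM loan_payments
    WHERE payment_date = '2013-06-15'       -- example business date
    GROUP BY store_id;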
Environment: JAVA, Web Applications, TFS, SQL, QTP, Mobile Testing, DW Reports, VBScript, MTM,
XML web services testing.
Blue Cross Blue Shield, Michigan Jan 2009 – Feb 2011
Testing Analyst
Scope:
Blue is a web application built to transmit transactions in real time. Ultimately, Blue will allow Plans to
manage claims inventory, request claim status, view claim formats, request medical attachments and
send electronic attachments on-line. The Blue software is designed, developed and distributed by the
Blue Cross and Blue Shield Association (BCBSA).
Responsibilities:
• Analyzed all Technical Transformation Rules and Business Translation Rules for both functional and
business requirements from the Mapping Specifications and Functional Design Specifications.
• Executed all written test cases against the Business Requirements for Claims Processing,
Enrollment, and Group and Individual Member billing.
• Mainly involved in the 410 to 510 conversion project.
• Executed SQL statements for data validation to check data integrity, error handling, data
redundancy and data consistency (see the sample queries after this list).
• Acted as a Defect Manager responsible for reviewing, assigning and tracking defects until
closed.
• Reviewed the Business Requirement Documents and the Functional Specification and discussed
them with the team to ensure a consistent understanding of the requirements.
• Prepared Test Plan from the Business Requirements and Functional Specification.
• Developed a Traceability Matrix mapping Business Requirements to Test Scripts to ensure that any
Change Control in requirements leads to a test case update.
• Verified that all data was synchronized after conversion, and used SQL to
validate test cases against the UI data.
• Validated notices, claim detail reports and formats from the provider.
• Developed automated test scripts using QTP to perform Functional and Regression testing.
• Worked closely with the development team in making the detailed QC test plan regarding the
scope and focus of the testing.
• Performed Functional and UAT Testing.
• Involved in Agile and Waterfall Development Cycle, and used Agile testing methods such as
Scrum and Extreme Programming (XP).
• Ran batch jobs on UNIX and checked the results in log files.
• Performed regression testing for various application releases by executing baseline scripts,
identified functional errors and interacted with developers to resolve technical issues.
• Worked with flat files in a UNIX environment and acquired knowledge of shell scripting.
• Executed test scripts and reported defects in Quality Center tool.
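Much of the data validation for the conversion and migration work came down to source-to-target
comparisons in SQL. A minimal sketch with hypothetical table names (not the actual BCBSA schema):

    -- Hypothetical row-count comparison between pre- and post-conversion claim tables
    SELECT (SELECT COUNT(*) FROM claims_source)    AS source_rows,
           (SELECT COUNT(*) FROM claims_converted) AS converted_rows
    FROM SYSIBM.SYSDUMMY1;   -- DB2 one-row dummy table

    -- Hypothetical integrity check: converted claims that lost their member linkage
    SELECT c.claim_id, c.member_id
    FROM claims_converted c
    LEFT JOIN members m ON m.member_id = c.member_id
    WHERE m.member_id IS NULL;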
Environment: QC/ALM, VBScript, Windows XP, Java, DB2, J2EE, UNIX, ETL, SQL Server 2008, SQL
Scripts, MS Word, SoapUI, QTP.
J B Hunt Transport Inc, AR Jan 2007- Dec 2009
QA Analyst
Scope:
J B Hunt Transport Services is a leading transportation and logistics company with over
20,000 employees, annual revenue of over $2 billion, and customers in the United States,
Canada and Mexico.
The Finance System runs business processes that involve the transfer of funds in and out of the
company. Enhancement projects are undertaken on a regular basis to add new functionality and to fix
production bugs.
Responsibilities:
• Worked on the System Testing and User Acceptance teams for multiple enhancement projects for
the Finance Department, involving Java, PeopleSoft Financials and Mainframe applications.
• Participated in Requirements review meetings with the Business Analysts and Business Team
and was involved in creating Test plan documentation.
• Created Target/progression Test cases in Quality Center, based on project requirements.
• Also involved in creating and updating regression test cases.
• Involved in mapping test cases to requirements and maintaining the Requirements Traceability
Matrix.
• Created test data, executed test cases and managed defects during the System Testing and UAT
phases.
• Executed testing of in-house custom CICS-based screens for vendor maintenance, as well as
batch file processing for AP invoice loads using file-load JCLs.
• Validated batch file processing results through Java-based online application pages and by
running SQL queries against the underlying DB2 database using QMF (an illustrative query
appears after this list).
• Involved in testing item creation and maintenance, payment processing and also write-off
functionality within the Accounts Receivable module.
• Performed Functional, System, Integration, Performance, End-to-End, Regression & UAT testing.
• Logged and managed Testing defects using IBM Rational ClearQuest.
• Conducted weekly Defect Status meetings to check status of defects and provide information to
assist with the defect resolution.
• Involved in providing timely testing status during testing phases and also in generating Test
Results summary reports.
• Responsible for ensuring that testing objectives and the testing exit criteria are met.
• Wrote SQL queries in QMF to query the DB2 database on the mainframe.
• Worked closely with the development team to resolve bugs, and verified and closed bug reports
once they were resolved.
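Verifying the AP invoice batch load through QMF generally meant querying the DB2 tables behind the
load. A minimal sketch with hypothetical table and column names (not the actual J B Hunt schema):

    -- Hypothetical check: invoices from a given load date that landed in DB2
    SELECT INVOICE_NO, VENDOR_ID, INVOICE_AMT, LOAD_DATE
    FROM AP_INVOICE
    WHERE LOAD_DATE = '2008-06-15'          -- example batch business date
    ORDER BY VENDOR_ID;

    -- Hypothetical reconciliation: loaded invoice counts per vendor vs. the input file totals
    SELECT VENDOR_ID, COUNT(*) AS LOADED_INVOICES
    FROM AP_INVOICE
    WHERE LOAD_DATE = '2008-06-15'
    GROUP BY VENDOR_ID;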
Environment: COBOL, JCL, DB2, CICS, VSAM, SQL, FILE-AID/MVS, FILE-AID/DB2, CEDF, QMF,
XPEDITOR, ENDEVOR, TEST DIRECTOR 8.0, MS OFFICE, MQ Series, Java, PeopleSoft Financials.
Tyson Foods Inc, Sep 2006 – Jan 2007
Test Analyst
Scope:
Tyson Foods, Inc. is one of the world's largest processors and marketers of meat products, as well as
prepared foods such as appetizers and snacks.
The SolArc RightAngle application is used to plan and enter data for freight movement, and to control
chicken feed movement between company plant locations.
Responsibilities:
• Worked on the User Acceptance Team for projects in the Freight Movement Department.
• Involved in all phases of the project Software Testing Life cycle.
• Created and executed manual test cases for the SolArc RightAngle application, to test the
freight movement process.
• Executed Automated Test scripts using QTP for test cases that needed to be repeated several
times.
• Ensured that the test cases were successfully executed and defects were resolved and retested,
thus satisfying the project requirements.
• Conducted Defect Status meetings to gather defect resolution status, discuss defect details and
to close out testing defects.
• Provided testing status reports during the Test execution phase and also contributed to the Test
results summary report.
• Prepared End-user documentation in MS Word.
• Exported requirements and test cases into Quality Center, and ran manual and automated test
cases from the Test Lab.
• Generated regression test reports in MS Excel (graphs and bar charts) to document test results
across builds, and validated the regression results against the baseline.
• Reproduced customer-reported issues and provided interim fixes.
• Worked closely with the Developers and Engineering Teams in the review and modification of the
product and its specifications using Agile testing methodology.
• Involved in daily Sprint stand-up meetings.
• Created and executed test scripts for Integration, System and Regression testing based on
analysis of Business Requirement Documents.
• Performed Functional, System, Black box, Backend and performance testing of the web
applications.
• Reported all the issues and tracked defects using PVCS Tracker and JIRA.
Environment: UNIX, Windows 2000 Server, Windows XP, DOS, MS Excel, MS Word, MS PowerPoint,
PVCS Merant Tracker, Putty, XML, Java, Test Runner, Smartscript, Quality Center 9.0.
Google, Inc. Dec 2005 – Aug 2006
Quality Rating Evaluator
Scope:
A Google Quality Rating Evaluator acts as a quality tester for search keywords, rating the websites that
appear in the search engine results. Google uses computers and sophisticated algorithms to perform
some of its tasks related to ranking websites, but the company also relies on Quality Evaluators to do part
of the search quality evaluation.
Responsibilities:
• Researched Google search queries and their results. Evaluated the search result pages based
on relevance to the search query and utility to the user, using a Google proprietary online
tool.
• Assigned ratings to the web search results based on a predetermined rating scale.
• Provided feedback and suggestions based on personal understanding of web searches.
• Interacted with superiors and other quality raters to resolve differences in ratings.