
Test Cases SQL Server

Location: Denver, CO
Salary: 90000
Posted: March 30, 2017


Resume:

Professional Highlights:

Extensive experience in manual, automation, web services, iOS, Android, and database testing.

Created and maintained automation scripts and frameworks with Selenium 2.0 and UFT.

Certified ScrumMaster (CSM) with expertise in SDLC and STLC methodologies.

Led testing in Scrum environments with defined sprints and daily scrum meetings.

Experienced in developing automation frameworks, test plans, test strategies, and test cases for sprints.

Well versed in writing automated test scripts using Java and VBScript.

Proficient in documenting QA deliverables such as test plans, test scenarios, test cases, and test results; recording, tracking, and prioritizing defects using test management tools; and designing test strategies.

Well versed in executing and managing testing with BDD and ATDD framework approaches.

Experienced with test management tools such as RQM, HP ALM, Rally, and TFS 2012.

Interacted with developers and business analysts and maintained detailed QA documentation for training.

Convened daily Scrum meetings and attended sprint cycle and retrospective meetings with developers, stakeholders, and business analysts.

Technical Skills:

Languages: Java, VBScript, UNIX

Testing Tools: UFT 12.5, Selenium 2.0, SOAP UI, Jenkins, Android Studio

Databases: MySQL

Test Management Tools: ALM 12.5, Rally, RQM, ClearQuest, MS TFS 2012, JIRA, Eclipse

Environment: Windows, Linux

Work History:

UAT Analyst at ServiceLink (Fidelity National Financial) Nov 2015 – Current

Responsibilities

Designed and managed test procedures and test plans for UAT, including a traceability matrix.

Worked with software development and Business testing teams to develop and integrate business use case scenarios for user acceptance testing (UAT) and user training

Worked with business leaders to determine training needs for software deployment

Collaborated with managers and department leaders to identify and oversee UAT testing and tester resources

Created automation test scripts and frameworks using Selenium WebDriver and Java in Eclipse (see the illustrative sketch at the end of this section).

Analyzed skill and knowledge gaps to determine appropriate training solutions so that training requirements for deployment were met.

Worked in conjunction with the Business Analysts and Development teams to ensure that applications meet requirements and are of high quality

Reviewed Business Requirements and Functional Specifications

Oversaw UAT utilizing a standard approach

Defined, captured, and reported testing and quality metrics

Developed, maintained and enhanced testing standards, guidelines and processes.

Executed projects within the stipulated timelines.

Environment: TFS 2012, SQL Server 2008, Selenium WebDriver, Java, Eclipse
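
Illustrative sketch only, not code from the project: a minimal Selenium WebDriver test in Java, the kind of UAT automation script referenced above. The URL, element locators, credentials, and expected page title are hypothetical placeholders.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class LoginSmokeTest {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            // Hypothetical application URL and locators, for illustration only
            driver.get("https://example.com/login");
            driver.findElement(By.id("username")).sendKeys("uat_user");
            driver.findElement(By.id("password")).sendKeys("uat_password");
            driver.findElement(By.id("loginButton")).click();
            // Assertion-style check on the landing page title
            if (!driver.getTitle().contains("Dashboard")) {
                throw new AssertionError("Unexpected page title: " + driver.getTitle());
            }
            System.out.println("Login smoke test passed");
        } finally {
            driver.quit();
        }
    }
}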

Information Systems Analyst at Dimension Data Apr 2014 – Nov 2015

Responsibilities

Created test plans and documented functional and end-to-end test scenarios for an e-commerce application for each new requirement, and worked with offshore teams to identify and document the corresponding impact/regression scenarios.

Executed SQL queries to compare data in the database with the GUI and web services (see the sketch at the end of this section).

Executed UNIX / Linux commands and transferred files from UNIX / Linux to Windows.

Performed functional and database testing for web-based applications using SQL queries and UFT.

Performed root cause analysis on all the issues logged by users during the UAT cycles and identified the areas of focus/improvements from a testing coverage and test case completeness standpoint.

Established an automation framework using UFT and ALM with a BDD approach to automate test scripts for reusability and efficiency.

Developed function libraries in VBScript for key features of the web-based application from a customer usability standpoint.

Identified and created object repositories for the automation framework so that scripts could be reused across multiple platforms.

Executed all test cases and logged critical and other identified defects in ALM for test reporting and status updates.

Environment: ALM, UFT, SQL Server 2008, VBScript, SOAP UI
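
Illustrative sketch only: the kind of database-versus-GUI comparison described above, shown here in Java/JDBC rather than UFT. The connection string, credentials, table, and column names are hypothetical placeholders (the Microsoft JDBC driver is assumed to be on the classpath).

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderTotalDbCheck {
    public static void main(String[] args) throws Exception {
        // Value captured from the GUI or a web service response (hypothetical)
        BigDecimal uiTotal = new BigDecimal("149.99");
        // Hypothetical SQL Server connection string and credentials
        String url = "jdbc:sqlserver://localhost;databaseName=ecommerce";
        try (Connection con = DriverManager.getConnection(url, "qa_user", "qa_password");
             PreparedStatement ps = con.prepareStatement(
                     "SELECT total_amount FROM orders WHERE order_id = ?")) {
            ps.setInt(1, 1001); // hypothetical order id
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next() && rs.getBigDecimal("total_amount").compareTo(uiTotal) == 0) {
                    System.out.println("GUI and database totals match");
                } else {
                    System.out.println("Mismatch between GUI and database values");
                }
            }
        }
    }
}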

Software Test Engineer at US Patent & Trademark Office Dec 2011 – Apr 2014

Syneren Technologies

Responsibilities

Developed functional test cases and performance test scripts from use cases, user interface specifications, and technical specifications derived from business requirements, using Rally.

Prepared test data by executing SQL queries and joins.

Measured and validated system performance requirements via automated test development.

Performed data form change analysis for performance test plan implementation.

Conducted system and production service tests on a regular basis.

Added assertions to validate the XML in SOAP and RESTful web services and validated the WSDL files for the web services (see the sketch at the end of this section).

Attended Sprint review Meetings with End Users and other teams to discuss User Stories for each cycle.

Performed functional and regression testing; reviewed manual test cases, executing them where necessary to identify the functions required to enable scripting/coding. Wrote validations using client-side and server-side scripts in Rally.

Used Selenium RC/WebDriver to generate automated test scripts for functional and GUI testing, and to enhance the existing scripts, for a complex database project.

Created SQL queries for data validation testing at the back-end for database related applications.

Performed web services testing with the SOAP UI tool using WSDL files and other related test data.

Identified application components to be automated based on both the business priority and expected benefit of automation.

Conducted data driven testing and automation of Performance test cases by using Selenium RC/Web Driver.

Documented bugs using ClearQuest/Rally and tracked them to completion by communicating and coordinating with the development and support groups.

Environment: ClearQuest, Selenium WebDriver 2.0, Rally, ReqPro, Java, SQL Server 2008, SOAP UI
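
Illustrative sketch only: the kind of XML response assertion described above, shown here in plain Java (javax.xml XPath) rather than inside SOAP UI. The response payload and element names are hypothetical placeholders, not details from the actual services.

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class SoapResponseAssertion {
    public static void main(String[] args) throws Exception {
        // Hypothetical response body; in practice this would come from the service call
        String xml = "<response><status>SUCCESS</status><requestId>12345</requestId></response>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        // Assert that the status element carries the expected value
        String status = xpath.evaluate("/response/status", doc);
        if (!"SUCCESS".equals(status)) {
            throw new AssertionError("Expected status SUCCESS but was: " + status);
        }
        System.out.println("XML assertion passed, status = " + status);
    }
}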

Software Test Engineer June 2011 – Dec 2011

Angarai International

Responsibilities

Analyzed business requirement documents, functional requirement documents, use cases and identified business critical transactions.

Developed test plans, test strategies and developed test cases from the requirement documents.

Extensively documented test requirements and the test plan using Quality Center to track the stakeholders' requested enhancements and changes.

Organized the test cases in Quality Center for traceability.

Performed manual testing and automation testing of the web application.

Performed various types of testing, such as functional, integration, system, regression, and acceptance testing, during different stages of application development.

Performed Back End testing of the database by using SQL queries to verify data conversion and data integrity.

Developed scripts using regular expressions in QTP to handle dynamic data names (see the sketch at the end of this section).

Environment: ALM/QC, QTP/UFT, VBScript, SQL Server 2008
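
Illustrative sketch only: the regular-expression idea used above to handle dynamically named data, shown in Java for consistency with the other sketches (in QTP itself this would be written in VBScript, typically via descriptive programming). The naming pattern is a hypothetical example.

import java.util.List;
import java.util.regex.Pattern;

public class DynamicNameMatcher {
    public static void main(String[] args) {
        // Hypothetical runtime names that share a stable prefix but end in a changing id
        List<String> names = List.of("btnSubmit_1024", "btnSubmit_2048", "lnkHelp_17");
        Pattern submitButtons = Pattern.compile("^btnSubmit_\\d+$");
        for (String name : names) {
            System.out.println(name + " -> " + (submitButtons.matcher(name).matches()
                    ? "handled as the Submit button" : "ignored"));
        }
    }
}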

Business and QA Analyst Jul 2010 – June 2011

Responsibilities

Created Business Requirements documents based on the functional requirements of end users.

Detailed the functionality in the functional specification document for the development and QA teams.

Presented the changes and design outline, together with the development team, to QA before the QA cycle.

Reviewed assigned clarification requests and bugs to provide clarification during the testing phase.

Conducted impact analysis meetings with developers to understand the impact of new functional changes/requirements on existing functionality, identify regression scenarios, and provide comprehensive testing coverage for the web application during each new release.

Created test plans, documented functional and end-to-end test scenarios for each new requirement, and worked with the teams to identify and document the corresponding impact/regression scenarios.

Reviewed all test cases for each requirement and provided a sign off on the test case creation from a test coverage completeness and requirement traceability standpoint.

Set up test case walkthrough sessions (both functional and regression) to seek feedback from the BAs and the development team and a formal sign-off from the BAs on each requirement.

Documented defects in QC for review and followed up with the development teams for fixes and retesting.

Performed a thorough root cause analysis of all issues encountered during QA, UAT, and post-production, analyzed the defect trend, and identified areas of focus/improvement for upcoming releases.

Environment: HP Quality Center, SQL Server 2008

Education

MS, Morgan State University (3.9/4.0 GPA)

BS, Jawaharlal Nehru Technological University (3.6/4.0 GPA)


