
Test Cases Data

Location:
Alexandria, VA
Posted:
July 20, 2017

Resume:

Jahangir Khan

E-mail: ****************@*****.***

Phone: 571-***-****

SUMMARY:

Over 8 years of experience in Software Quality Assurance Testing.

Proficient in reviewing all kinds of testability documents, including requirement specifications and functional documentation, and in converting requirements documents into detailed test plans, test cases and test scripts.

Experienced in test design, development and quality assurance of cost-effective enterprise applications.

Experienced in manual and automated testing of web based and client-server applications.

Proficient in performing Black Box testing, which includes functional, non-functional and regression testing.

Experienced in smoke, functional, integration, regression, GUI, cross-browser and user acceptance testing (UAT), as well as static and dynamic testing.

Experienced in working with HP ALM/Quality Center including Site Administration.

Extensively experienced in creating and developing test plans, test cases and test data, and in executing test cases to report bugs/defects and track resolved bugs using HP ALM/Quality Center.

Experienced in software QA, performing manual and automated testing processes managed through HP ALM/Quality Center.

Profound knowledge of web services testing using Service Oriented Architecture (SOA) tools SOAP UI and REST API.

Experienced in writing SQL queries and executing stored procedures to perform back-end testing, verifying expected results and checking data integrity with PL/SQL queries (a brief back-end check sketch follows this summary).

Strong knowledge of designing and creating reusable automated scripts using QTP/UFT descriptive programming, environment variables and function libraries to deliver customized test results.

Experienced in using Team Foundation Server (TFS) to manage test data, track issues and open bugs.

Experienced in testing across different environments: Dev, SAT, Pre-Prod and Prod.

Strong VBScript skills; implemented various advanced object identification mechanisms.

Expertise in QTP/UFT utilities such as the Run Results Deletion Tool and the Test Batch Runner Tool.

Experienced in developing and implementing Data Driven Framework, generic and application based Keyword Driven Framework and Hybrid Framework in QTP/UFT.

Good knowledge of web technologies, XML and API testing.

Experienced in analyzing test results to assist in identifying system defects and breaking points.

Strong organizational and presentation skills; a good team player with strong interpersonal skills.
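
The back-end checks noted above usually come down to running a query against the database and comparing the returned values with what the application produced. Below is a minimal VBScript/ADODB sketch of that idea; the connection string, table and column names are hypothetical placeholders, not any specific project's schema.

    ' Minimal back-end data check; server, database, table and column names are placeholders.
    Option Explicit
    Dim conn, rs, expectedStatus
    expectedStatus = "APPROVED"

    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Driver={SQL Server};Server=QA_DB_SERVER;Database=QA_DB;Trusted_Connection=Yes;"

    ' Pull the record the front end just updated and compare it with the expected value.
    Set rs = conn.Execute("SELECT STATUS FROM SUBMISSIONS WHERE SUBMISSION_ID = 1001")
    If rs.EOF Then
        WScript.Echo "FAIL: no row found for submission 1001"
    ElseIf rs.Fields("STATUS").Value = expectedStatus Then
        WScript.Echo "PASS: status matches " & expectedStatus
    Else
        WScript.Echo "FAIL: status is " & rs.Fields("STATUS").Value
    End If

    rs.Close
    conn.Close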

EDUCATION:

Master in International Business Administration (IMBA-Major in Information System)

Stratford University VA, USA

Master in Business Administration (MBA-HR), IQRA University Islamabad, Pakistan

Bachelors in Information Technology, Preston Institute of Management Sciences and Technology (PIMSAT), Islamabad, Pakistan

TECHNICAL SKILLS:

Testing Tools

ALM/Quality Center, UFT, HP Quick Test Pro, SOAP UI, REST API, Selenium, Test Complete

Defect Tracking Tools

TFS, Quality Center, FogBugz, Rally

Programming Language

C++, Java, VB Script, JavaScript, HTML, XML, SQL, PL/SQL

Database

Oracle, SQL Server, MySQL

Operating System

Windows

Certifications

SAFe (Scaled Agile Framework) Practitioner

PROFESSIONAL EXPERIENCE:

United States Patent and Trademark Office (USPTO), Alexandria, VA August 2016 - Present

Senior Test Engineer

The CPC-IP is a shared repository for the patent classification scheme approved by the USPTO and the EPO. The intent is to provide another resource for patent examiners in their work.

Responsibilities:

Actively involved in peer review of documents that will eventually be submitted to the client for future reference and implementation.

Peer-reviewed test cases and test plans, provided feedback on updates or corrections, and provided knowledge transfer to other members of the team.

Plan and execute tests and related QA activities, including manual test execution, ad-hoc testing and regression test execution, and communicate with developers and other team members to ensure defect resolution.

Conduct root cause analysis of identified defects, providing development/business partners with sufficient detail to understand, replicate and resolve them.

Performed manual and automated test procedures for functional testing of web services using SOAP UI 5.3.0 (an illustrative request/response sketch follows this section).

Participated in Sprint planning, task estimates, task sequencing, task assignments, sprint reviews and retrospectives.

Attended daily scrum calls as part of agile methodology.

Provide the test lead with accurate estimated LOE for updating the regression cases and executing test cases.

Communicate daily status to the Test Manager, Project Manager and Test Lead to keep them informed of testing progress and to address any issues that result from testing.

Raise requests and coordinate activities with external groups as required for test environment setup, test data setup and test execution.

Use Rally to track issues, tasks and defects and to communicate with the team consistently.

Environment: Rally, SOAP UI, REST API, MS Office, Selenium, Test Complete, Windows, SharePoint, .Net, Java, Lync
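
SOAP UI itself is a standalone tool, so the web-service testing above was done in its interface. Purely as an illustration of the request/response checks involved, the VBScript sketch below posts a SOAP envelope with MSXML2.XMLHTTP; the endpoint URL, operation and payload are hypothetical and not the actual USPTO/CPC services.

    ' Hedged sketch of a SOAP request/response check; endpoint and payload are placeholders.
    Option Explicit
    Dim http, envelope

    envelope = "<soapenv:Envelope xmlns:soapenv=""http://schemas.xmlsoap.org/soap/envelope/"">" & _
               "<soapenv:Body><GetClassificationRequest><Symbol>A01B</Symbol>" & _
               "</GetClassificationRequest></soapenv:Body></soapenv:Envelope>"

    Set http = CreateObject("MSXML2.XMLHTTP")
    http.Open "POST", "http://example.test/classification-service", False
    http.setRequestHeader "Content-Type", "text/xml;charset=UTF-8"
    http.Send envelope

    ' Assert on the HTTP status plus a fragment expected in the response body.
    If http.Status = 200 And InStr(http.responseText, "<Symbol>A01B</Symbol>") > 0 Then
        WScript.Echo "PASS: service returned the expected symbol"
    Else
        WScript.Echo "FAIL: HTTP " & http.Status
    End If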

Federal Energy Regulatory Commission (FERC), Washington DC Oct, 2015 – July, 2016

Senior Assurance Test Engineer

FERC Online is the single entry point for all of FERC's electronic access applications: eFiling, eRegistration, eSubscription, eService, eComment, eLibrary and Company Registration.

Responsibilities:

Created test logs, test cases and test summary reports during all testing phases of the software development lifecycle for new and existing applications.

Created and maintained test plans that define test objectives, methods and tools to be employed for the assigned Projects.

Created root Dockets and Sub-Dockets in the FOLA interface, then submitted eFilings for the newly created dockets to verify the integration between different applications.

Processed eFiling submissions in the FOLA interface and later verified approved submissions in eLibrary.

Participated in the creation, distribution and walkthrough of software test cases, scripts and other documents surrounding testing activities, and ensured that all testing activities and deliverables were conducted/produced in compliance with company standards.

Evaluated business requirement and technical specification documents to craft the test strategies and LOE assessments needed to fulfill test objectives.

Used Team Foundation Server (TFS) to track issues, tasks, defects and communicate with the team consistently.

Participated in Sprint planning, task estimates, task sequencing, task assignments, sprint reviews and retrospectives.

Verified Accession Numbers at all security levels (Public, Privileged and CEII) in eLibrary and made sure the data was uploaded correctly.

Developed Decision Analysis Resolution (DAR) for automation tools selection to help accelerate Functional and Regression testing for the existing FERC’s applications.

Assisted the team with making appropriate commitments through story selection and task definition.

Provided operations and management support to improve the effectiveness of Client’s Section 508 Program.

Supported the SharePoint team in executing tests and generating documents.

Performed leadership responsibilities including creating tasks, assigning resources, monitoring and reviewing the testing team's progress, delivering results on schedule and ensuring overall testing quality.

Environment: TFS, SQL Server, MS Office, Windows, SharePoint, .Net, Java, Skype for Business, CUPC

American Institutes for Research (AIR), Reston, VA Nov 2013 – Sept 2015

Test Automation Engineer

AIR's intuitive online reporting system allows educators to easily analyze their data. Data from assessments guide students, parents and educators toward improved teaching and learning. AIR works with its clients and educational stakeholders to create customized paper and online reports that present data leading to appropriate interpretations and effective actions.

Responsibilities:

Maintained and integrated manual regression test cases into UFT using requirements gathered from the ALM Requirements module.

Involved in improving testing efficiency by preparing and executing Batch tests using UFT.

Performed Functional testing using UFT; extensively used Checkpoints for object, text, table and pages.

Wrote VBScript to develop user-defined reusable custom functions, dictionary objects and descriptive programming in UFT (a descriptive-programming sketch follows this section).

Created a set of reusable and scalable scripts using VBScript in order to perform Functional testing and Regression testing.

Developed and maintained UFT test scripts to support Regression testing whenever a Change Request was approved.

Developed Data Driven Framework to test large data sets.

Familiar with user acceptance testing.

Reviewed weekly testers' status reports and took necessary actions.

Analyzed different testing metrics and measurements throughout the software testing life cycle.

Worked with team members to understand the Business Specification Requirement Document and translated the requirements into test cases.

Carried out negative and positive testing of the front-end GUI, assuming different user roles within the application.

Prepared the Traceability Matrix and Test Results documents.

Conducted walkthroughs with Business Analysts and Developers to understand the functionality of, and risk involved in, the application, and discussed possible application changes or improvements that would increase the business value of the product.

Identified the different test scenarios and created appropriate test cases and test data to effectively test the application.

Scheduled manual test cases through HP ALM; imported test cases from Excel into ALM.

Performed manual testing of the application under test and used ALM for test management.

Allocated work to testing team to prepare Regression test cases in ALM for monthly regression project.

Performed back-end testing using SQL queries to make sure that updated data had been uploaded correctly into the database tables.

Environment: ALM, UFT, Java, VB Script, XML, SQL Server, MS Office, Windows, SOAP UI.
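
As a rough illustration of the descriptive programming and data-driven UFT work described above, the sketch below iterates a data sheet and identifies web objects by description rather than through the object repository. It runs only inside UFT, and the object descriptions, sheet and column names are hypothetical, not the real AIR reporting objects.

    ' UFT-only sketch: descriptive programming driven by a data table; all names are placeholders.
    Dim r
    For r = 1 To DataTable.GetSheet("SearchData").GetRowCount
        DataTable.GetSheet("SearchData").SetCurrentRow r

        ' Identify objects by description instead of the object repository.
        Browser("title:=AIR Reporting").Page("title:=AIR Reporting") _
            .WebEdit("name:=studentId").Set DataTable("StudentId", "SearchData")
        Browser("title:=AIR Reporting").Page("title:=AIR Reporting") _
            .WebButton("name:=Search").Click

        ' Checkpoint-style comparison against the expected value from the data sheet.
        If Browser("title:=AIR Reporting").Page("title:=AIR Reporting") _
            .WebTable("name:=results").GetCellData(2, 3) = DataTable("ExpectedScore", "SearchData") Then
            Reporter.ReportEvent micPass, "Score check", "Row " & r & " matched"
        Else
            Reporter.ReportEvent micFail, "Score check", "Row " & r & " did not match"
        End If
    Next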

Department of Defense (DoD), Washington, DC Aug 2011 – Oct 2013

Test Engineer

The GCSS (Global Combat Support System) is a United States Defense Information Systems Agency (DISA) information technology program whose objective is to improve the effectiveness and efficiency of the IT systems used to fulfill mission responsibilities related to GPS (Global Positioning System), nationally and internationally, for the defense of the US.

Responsibilities:

Designed test cases and test scenarios, conducted functional, ad-hoc and exploratory testing for the variety of applications with expanded test coverage.

Performed input field validations through database testing using data tables and flat files; created both positive and negative data for the same.

Analyzed and identified the areas of a project for Test Automation.

Created automation scripts in such a way that results can be generated in text, Excel and HTML formats.

Involved in Automation Scripts development, debugging, reviews, batch scripts execution, analyzing results and reporting defects.

Worked closely with the Test Manager and Lead to perform installation and configuration of Quality Center, created users and user groups, and set up access control and filtering.

Worked as Quality Center Site Admin and managed project content in QC environments.

Scheduled and executed manual test cases through HP Quality Center; involved in defect reporting in Quality Center.

Checked data flow from front-end to back-end and used SQL queries to retrieve data from database.

Created a set of reusable scripts using VBScript in QTP to perform end to end testing.

Developed test scripts using QTP to perform Functional, Smoke and Regression testing.

Created recovery scenario files using the Recovery Scenario Manager and associated them with the scripts to instruct QTP how to handle unexpected events.

Developed and maintained QTP test scripts along with updating test input data to support Regression testing whenever a Change Request was approved.

Developed a Data Driven Framework and a Hybrid Framework, defining a function library, reusable VBScript classes, keyword functions and enhancements such as checkpoints and comments (a small framework sketch follows this section).

Attended daily scrum calls as part of Agile methodology.

Interacted with developers to resolve the bugs.

Attended weekly review meetings with testers and developers to discuss defects.

Environment: SQL Server, Quality Center, QTP, PL/SQL, Windows, Java, Oracle, SOAP UI.
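
The hybrid framework mentioned above combines a function library, reusable VBScript classes and keyword functions. The standalone sketch below shows the general shape of one such building block: a reusable class plus a simple keyword dispatcher. The names and steps are illustrative placeholders rather than the actual GCSS scripts; in UFT the class methods would drive application objects instead of just logging.

    ' Illustrative hybrid-framework piece: reusable class plus keyword driver (placeholder names).
    Option Explicit

    Class LoginActions
        ' In UFT these methods would act on repository or descriptive objects;
        ' here they only log so the sketch runs standalone under cscript.
        Public Sub Login(userName)
            WScript.Echo "Logging in as " & userName
        End Sub
        Public Sub Logout()
            WScript.Echo "Logging out"
        End Sub
    End Class

    ' Keyword driver: each step is "KEYWORD|argument", as it might come from a data sheet.
    Dim actions, testSteps, stepLine, parts
    Set actions = New LoginActions
    testSteps = Array("LOGIN|qa_user01", "LOGOUT|")

    For Each stepLine In testSteps
        parts = Split(stepLine, "|")
        Select Case UCase(parts(0))
            Case "LOGIN"
                actions.Login parts(1)
            Case "LOGOUT"
                actions.Logout
        End Select
    Next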

Wells Fargo, Richmond, VA Jan, 2009 – July 2011

QA Tester

The Commercial Loan Origination (CLO) project supports the AFS Commercial Loan Origination lender-servicing module created by Automated Financial Systems (AFS). CLO is an automated workflow system that bank personnel can use to originate, approve and close commercial loans. The CLO application is combined in AFS with the Loan Administration (LA) application, which was released into production in an earlier release. The LA application includes payment, advance and inquiry functionality, along with additional customizable administration options.

Responsibilities:

Evaluated business requirements, understood system change requirements and participated in the analysis and design of project specifications with the project management team.

Worked closely with developers and SME to make sure the new application met the business requirements.

Implemented agile QA processes and practices, including defect and test management.

Performed data-driven testing, reading test input data from an Excel file in order to test the application with different positive and negative data (an Excel-reading sketch follows this section).

Responsible for creating and filing bugs using Quality Center.

Prepared test cases and test requirements in HP Quality Center.

Used Quality Center to track and report system defects and bug fixes.

Created and prepared the test data and test scripts in QTP for Data Driven Testing.

Used Quick Test Professional (QTP) to create, manage and execute test sets.

Developed automated test scripts in Quick Test Professional (QTP) to expedite Regression testing.

Conducted user acceptance testing.

Held meetings with the technical teams and management, developed proper documentation and validated the current production environment.

Attended weekly review meetings to discuss defects along with testers and Developers.

Environment: Quality Center, QTP, Oracle, Java, Java Script, SQL, Windows.
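
As an illustration of the data-driven testing above, the sketch below reads test input rows from an Excel workbook through COM automation; the file path, sheet name and column layout are assumptions, not the real CLO test data.

    ' Standalone sketch: read positive/negative test rows from Excel (path and layout are placeholders).
    Option Explicit
    Dim xl, wb, ws, lastRow, r

    Set xl = CreateObject("Excel.Application")
    xl.Visible = False
    Set wb = xl.Workbooks.Open("C:\TestData\CLO_LoanInput.xlsx", , True)  ' open read-only
    Set ws = wb.Worksheets("LoanData")

    lastRow = ws.Cells(ws.Rows.Count, 1).End(-4162).Row  ' -4162 = xlUp

    For r = 2 To lastRow  ' row 1 holds the column headers
        ' Column A: loan amount (positive or negative case); column B: expected message.
        WScript.Echo "Case " & (r - 1) & ": amount=" & ws.Cells(r, 1).Value & _
                     ", expected=" & ws.Cells(r, 2).Value
    Next

    wb.Close False
    xl.Quit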

References available upon request


