Project Engineer

Location:
Washington, DC
Posted:
January 22, 2016

GREGORY L. JOHNSON

EXECUTIVE SUMMARY

Quality Assurance consultant with 15 years of software testing expertise. Highly proficient in the Software Development Lifecycle and in analyzing and generating test strategies, test plans, and test cases. Extensive experience in manual testing, including UI test scripts, using Microsoft Visual Studio 2010. Performs Functional, Regression, System, Integration, Verification, User Acceptance, Black Box, and White Box testing.

EDUCATION

Bachelor of Science in Business Management

Howard University, Washington, DC

CERTIFICATIONS

Scrum Master Certification, May 2012

SECURITY CLEARANCE

Holds an active SECRET clearance, granted November 2009

TECHNICAL & APPLICATION EXPERIENCE

Microsoft Windows, Microsoft Office Suite, Microsoft Access, HP ALM, Microsoft Team Foundation Server (TFS) and Microsoft Test Manager (MTM), Microsoft Project, WinRunner, TestComplete, Rally, IBM Rational Quality Manager (RQM), IBM ClearQuest (CQ), TestDirector, and QuickTest Professional (QTP)

PROFESSIONAL EXPERIENCE

Sr. Manual Test Engineer October 2015 – Present

(Contractor) US Coast Guard, Crystal City, VA

Perform formal full-spectrum system testing activities for the Application Product Lines Enterprise Services (APLES) project.

Provide valuable input and coordination to support the development and implementation of enterprise-wide testing activities, plans, processes, procedures, and standards.

Perform detailed analysis of documented user requirements and direct the design of test plans in support of user requirements for moderately complex to complex software and IT systems.

Review user application system requirements documentation. Design, define and document unit and application test plans. Transform test plans into test scripts and execute those scripts.

Participate in all phases of the risk management assessment and software/hardware development. Ensure the proper execution of test scripts and documentation of test results in test logs or defect tracking systems.

Ensure that test designs and documentation support all applicable client, agency, or industry standards, timelines, and budgets. Develop test data for use in performing the required tests.

Ensure that testing conclusions and recommendations are fully supported by test results.

Ensure that the project managers are fully informed of the testing status and application deviations from documented user requirements.

Analyze test results, document conclusions, and make recommendations as supported by that analysis.

Sr. Test Engineer March 2015 – October 2015

(Contractor) NARA, College Park, MD

Developed and executed automated (and non-automated) test plans, scenarios and cases to demonstrate system compliance with requirements and conformance to system design.

Prepared test scripts and all related documentation.

Served as a key resource for ownership of one or more testing tools, including but not limited to Quality Center, IBM Rational, and Blaze.

Designed and deployed automated and non-automated performance testing strategies to test design performance including capacity, search and access, multiple users, etc.

Performed load testing of web-based systems and applications, including static, dynamic, and cyclical testing.

Participated in SCRUM meetings.

Supported User Story reviews.

Performed Baseline Regression Testing

Ensured that each sprint and release of the system was fully tested (complete coverage of user stories, capabilities, and constraints)

Ensured that the system met the Acceptance Criteria

Reported on and reviewed all defects entered

Supported User Testing

Created/Updated Test Procedures

Created Test Plans and Reports

Attended Technical Meetings as needed

Provided Monthly Progress Reports

Sr. Test Engineer June 2013 – February 2015

(Contractor) USPTO, Alexandria, VA

Project: Office of Enrollment and Discipline (OED)

Designed and implemented system test and regression plans, test cases, and automated test scripts.

Reported issues using a formal bug tracking system and interfaced with the development team to help resolve bugs/issues.

Worked closely with developers, requirements analysts, and PMs in an Agile/Scrum environment.

Developed a process to evaluate and establish standards for determining requirements' suitability for test automation.

Supported implementation and maintenance of automated tools and procedures to collect test-related statistics and improve the Functional Test Division (FTD) testing process.

Designed, developed, and conducted Functional Test Plans according to procedures defined for the Functional Test Division (FTD) to verify functionality for specified applications, hardware, and software deployments.

Designed and developed test cases and procedures mapped to the appropriate requirement(s)/design.

Conducted analysis of requirements throughout the life cycle of the project plan, ensuring that requirements are testable.

Designed and developed the Test Strategy to ensure traceability from each test case back to its requirement; a minimal illustration of such a coverage check is sketched at the end of this project's duties.

Captured and recorded defects in a single system and participated in the bug triage process.

Designed and developed Test Reports and Results. Analyzed and evaluated test results and made improvement recommendations.

Executed testing.

Ensured all CM procedures were followed.

Conducted QA on all test results.

Conducted regression testing to determine the effects of any performance improvement activities
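
To make the traceability duty above concrete: the short Python sketch below checks a requirements traceability matrix (RTM) for coverage gaps. It is a minimal illustration only; the requirement and test case IDs are hypothetical, not taken from the OED project.

```python
# Minimal requirements-traceability coverage check (illustrative only;
# all requirement and test case IDs below are hypothetical).

# Which requirement(s) each test case verifies.
test_to_requirements = {
    "TC-001": ["REQ-100"],
    "TC-002": ["REQ-100", "REQ-101"],
    "TC-003": ["REQ-103"],
}

# The full set of requirements that must be covered.
requirements = {"REQ-100", "REQ-101", "REQ-102", "REQ-103"}

# Invert the mapping to build the RTM: requirement -> covering test cases.
rtm = {req: [] for req in requirements}
for test, reqs in test_to_requirements.items():
    for req in reqs:
        rtm.setdefault(req, []).append(test)

# Report coverage and flag any requirement with no covering test case.
for req in sorted(rtm):
    print(f"{req}: {', '.join(sorted(rtm[req])) or 'NOT COVERED'}")
uncovered = [req for req in sorted(rtm) if not rtm[req]]
if uncovered:
    raise SystemExit(f"Traceability gap: {', '.join(uncovered)}")
```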

Project: Project and Release Management Division (PRMD)

Participated in applicable customer and development team meetings.

Developed and/or revised Product Verification Testing (PVT) documentation according to the existing Release Control Branch (RCB) processes, including test plans, test cases, test summary reports, and other support documentation for 200 large new-release project cases.

Deposited PVT documentation into the enterprise configuration management tools, including IBM Rational and ClearQuest.

Conducted software PVT, including regression testing, interface testing, and end-to-end testing as appropriate.

Managed and wrote test reports, discrepancy reports, and test observation reports.

Conducted Software Compatibility Testing (SCT) for the various workstation baselines used at the United States Patent and Trademark Office (USPTO).

Coordinated User Acceptance Testing with the RCB Test Manager and Development Team.

Conducted and monitored testing and verified requirement traceability and testability.

Produced and evaluated the test information and specification documents for each new release.

Sr. Test and Developmental Test Engineer January 2012 – June 2013

TASC, Inc., Fort Meade, MD

Project: Program Executive Office for Mission Assurance (PEO-MA)

Project: Joint Interoperability Test Command (JITC)

Developmental Test Engineer for DISA's PEO-MA Secure Configuration Management Team (SCM)

Performed installations and upgrades of the SCM program systems assigned (ACCM, OAM, HBSS and ePO)

Documented discrepancies in the installation/upgrade/user guide identified during testing

Developed test plans for evaluating the functionality, interoperability, performance, scalability, and usability of each release of the SCM programs.

Executed test plans in structured test events.

Briefed major impacts identified during testing to the Senior Test Engineer and Government Test Lead.

Participated in regular meetings with the engineering team of the SCM programs assigned.

Assisted the Senior Test Manager and System Administrator as needed with lab configuration and maintenance.

Server, Storage and Infrastructure Lead for the DoD Enterprise Test Environment (DETE)

Supported and assisted the Test Executive Office (TEO).

Supported and assisted the test team with research into test requirements, test planning, and analysis support for the eventual testing of a variety of DoD software systems, data analysis, and test reporting.

Supported Integrated Product Teams (IPT)/Working-level Integrated Product Teams (WIPT) at customer sites or via teleconference.

Performed software/system acceptance testing at defense contractor sites and recommended/reviewed test procedures and test plans.

Reviewed requirements documentation and supported development of test plans and documents.

Attended weekly project status meetings to discuss timeline and milestones of the project

QA Test Engineer July 2011 – December 2011

Bam Technologies, Arlington, VA

Project: Various Projects with the U.S. Air Force

Provided test and requirements support for multiple applications, including quarterly and annual reports, student test scheduling, virtual education, tuition reimbursement assistance, and management for multiple military branches, including the U.S. Air Force, Army, Navy, and Marines

Developed and maintained standard systems testing documents including test plans, procedures, and reports

Developed and managed defects using MTM and TFS Test Manager

Attended weekly project status meetings to discuss timeline and milestones of the project

Wrote detailed test cases based on requirement documents that specified the system's expected functional results

Performed extensive ad hoc duties as assigned to support all testing project needs

Test Engineer August 2009 – June 2011

Consolidated Analysis Center Incorporated (CACI), Arlington, VA

Project: eBusiness

Optimized the AT&L eBusiness Software Test Plan that defines testing criteria per application release

Managed, maintained, and executed system and deployment test scripts for the Executive Information System (EIS), External Customer Support (ECS), ITM Training Calendar, AT&L Property Pass, Executive Conference Facilitator (ECF), and the 419 System

Lead tester for the AT&L eBusiness initiative to upgrade the ITM Training Center, ECF, and EIS to the Oracle 10g platform

Developed and managed defects using Merant PVCS Tracker and TFS Test Manager

Developed and distributed defect reports to upper management and facilitated Defect Status Meetings with the project team

Conducted peer reviews of test management documentation and test scripts

Led test procedures and deployment test readiness reviews for the ITM Training Center, Property Pass, ECF, ECS, and 419 System

Developed test summary reports and performed post-deployment validation

Developed and maintained standard systems testing documents including traceability matrices, test plans, procedures, and reports

Attended weekly project status meetings to discuss timeline and milestones of the project

Supported the organization's continuous process improvement initiative

Supported the software development life cycle (SDLC) reviews by assisting with standardizing test artifacts

Supported the migration of applications from the CIO enclave to the AT&L sub-enclave by performing post-deployment validation

Supported the Oracle database patch effort by validating applications in the test and production environments

Senior Quality Assurance Test Analyst November 2005 – August 2009

Pitney Bowes/Group 1 Software, Lanham, MD

Project: OnRoute

Senior Quality Assurance Lead of the OnRoute product

Served as liaison between all software developers located on-site and at remote sites around the country

Created numerous project requirements and performed project scope analysis

Attended daily project status meetings to report the plan, status and performance for all test project tasks

Utilized project status meetings to perform face-to-face follow-up meetings with Developers to resolve reported defects

Conducted requirements analysis for completeness and testability

Developed viable test strategies under stringent time constraints

Identified and created OnRoute test tools and developed comprehensive test plans

Performed risk analysis and communicated risks to other team members

Executed tests to achieve the best possible coverage

Performed new functionality, stress, integration, system, and regression testing for the OnRoute project

Captured test data and metrics and communicated test status to other sub-departments

Recorded defects in bug tracking software, including useful debugging process documentation

Reported and managed defects in Rational ClearQuest

Verified all defect fixes

Used SQL to query the Oracle database and verify test data (a minimal illustration is sketched below)
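
As a minimal illustration of the SQL-based test-data verification in the last bullet: the Python sketch below uses the cx_Oracle driver and a hypothetical ORDERS table. The table, columns, credentials, and expected count are all invented for the example and are not the actual OnRoute schema.

```python
# Illustrative SQL-based test-data verification (hypothetical schema and
# credentials; not the actual OnRoute database).
import cx_Oracle

conn = cx_Oracle.connect(user="qa_user", password="qa_pass", dsn="testdb")
try:
    cur = conn.cursor()
    # Confirm the test run produced exactly the expected number of rows.
    cur.execute(
        "SELECT COUNT(*) FROM orders WHERE status = :status AND run_id = :run_id",
        status="ROUTED",
        run_id=42,
    )
    actual = cur.fetchone()[0]
    expected = 10  # hypothetical baseline from the test plan
    assert actual == expected, f"expected {expected} routed orders, found {actual}"
    print("Test data verified.")
finally:
    conn.close()
```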

Quality Assurance (QA) Test Engineer December 2001 – August 2005

Allegis Group Inc. / Baltimore County Government, Baltimore, MD

Project: Website Redesign

Worked actively with Sponsors to document and prioritize list of business and functional requirements

Organized requirements using the Unified Modeling Language (UML) to assess the requirements viability

Used object-oriented analysis (OOA) to account for system encapsulation, inheritance, and organization

Wrote detailed Test Cases based on requirement documents that specified the system's expected functional results

Developed test cases aimed at capturing and assessing system’s performance requirements

Performed extensive Black Box testing to assess if system performed according to functional user specifications and standards

Performed extensive White Box testing to assess the inner workings of the system and to identify the root cause of defects found during Black Box testing

Wrote WinRunner scripts using Test Script Language (TSL) to automate Regression Testing and to expeditiously assess any changes in the application (an illustrative modern equivalent is sketched after this role's duties)

Wrote SQL statements to query and communicate with the associated tables in the Oracle database

Conducted Integration testing to expose faults in the interfaces and in the interaction between integrated components

Identified application bugs and entered trouble tickets into PVCS Tracker and Mercury Interactive’s TestDirector

Followed up on documented bugs until they were resolved by Systems Administrators, Database Administrators, or Developers

Performed tests on Windows 95 and Windows 2000 platforms, a Java-based application, and an Oracle 9i database

Maintained user login IDs and passwords for the testing server, which were checked against a UNIX database and Lightweight Directory Access Protocol (LDAP) directory listings

Established TestDirector Test Sets and executed WinRunner Automated Test runs through TestDirector’s Test Lab

Attended weekly project status meetings to report the plan, status and performance for all project Tests

Utilized project status meetings to perform face-to-face follow-up meetings with Developers to resolve reported defects
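
The WinRunner/TSL regression scripts referenced above belong to a now-retired toolchain. Purely as an illustration of the same baseline-driven regression idea, here is a minimal sketch using Python's built-in unittest module; the function under test and its expected values are hypothetical, not a reconstruction of the original TSL scripts.

```python
# Baseline-driven regression check using Python's unittest — a stand-in
# illustration; the original scripts were written in WinRunner TSL.
import unittest


def normalize_zip(raw):
    """Hypothetical function under test: normalize a U.S. ZIP code string."""
    return raw.strip().zfill(5)


class RegressionSuite(unittest.TestCase):
    # Each (input, expected) pair is a previously passing baseline case;
    # any failure signals a regression introduced by a new build.
    BASELINE = [
        ("1234", "01234"),
        (" 20740 ", "20740"),
        ("00501", "00501"),
    ]

    def test_baseline_cases(self):
        for raw, expected in self.BASELINE:
            with self.subTest(raw=raw):
                self.assertEqual(normalize_zip(raw), expected)


if __name__ == "__main__":
    unittest.main()
```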


