

VICTOR L. LEE

Accokeek, MD

Email: adyymp@r.postjobfree.com

Social Profile: http://linkedin.com/in/victor-lee-68483138

Cell: 443-***-****

SECURITY CLEARANCES

Intelligence Agency sponsored Top Secret/SCI with CI Polygraph – Active Status

PROFESSIONAL SUMMARY

23 years of software quality assurance testing experience

17 years of software quality assurance Test Lead / Management experience

23 years of developing and implementing quality assurance processes and procedures within various companies’ Software Development Lifecycle (SDLC) QA environments

Published author of two books, ‘Feng Shui for Intelligent Process Improvement’ and ‘12 Steps to Intelligent Process Improvement (IPI)’, whose principles were used to implement QA processes and procedures within various companies’ QA environments

Recipient of Lockheed Martin’s “Strategic Battle Command Certificate of Appreciation” for the DRRS-A 2.3 OSD Web Services, NECC Command and Control Messaging (C2M) and External Interface Management (EIM) programs during eight End-to-End Testing Events

Recipient of SAIC’s “2000 Recognition and Appreciation Award” for the PICS/DCPR product release 13.6.1

Member of the Software Process Improvement Network (SPIN) – for both the Atlanta & Virginia chapters

Undergoing official patenting of a personal software quality assurance testing framework derived from the aforementioned self-published books – Pending Status

PROFESSIONAL EXPERIENCE

Jan 2022 – Present

US Department of Intelligence, Reston VA

Senior Test Engineer (Contractor)

Provided test support for DIA’s modernization of a legacy system, including development in an AWS Cloud environment, as part of a SAFe (Scaled Agile Framework) project and delivery.

Completed bi-weekly build verification testing for both low and high systems, remotely and on-site in a SCIF.

Provided quality analysis for completed features and provisioned test cases for traceability and automated testing.

Provided QA support for the development team, using Scrum and Agile testing practices, performing exploratory testing and producing test outlines for refinement.

Member of a joint Kanban test team responsible for daily QA test work items and for producing test artifacts: test outlines, test scope, and test cases.

Served as liaison for the development team, escalating test issues across the many teams and jointly resolving them.

Led issue-support initiatives to gather and report findings on feature implementations that affected the Program teams.

Operated and maintained an AWS EC2 instance running CentOS, obtaining and installing certificates and software bundles as needed for testing.

Authored bug reports leveraging Chrome Developer Tools to document specific API failures for expedient fixes by the development teams.

Contributed to the overall Program Test Strategy, providing QA process feedback to the Test Lead for implementation and improvement.

Performed API failure analysis using Chrome Developer Tools and Postman (with JSON payloads) for bug and issue analysis.
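For illustration, a minimal Python sketch of reproducing an API failure outside the browser so the evidence can be attached to a bug report; the endpoint and token below are hypothetical placeholders, not the program’s actual services:

    import json
    import requests

    def capture_api_evidence(url, token):
        """Call an API the way Postman would and return report-ready details."""
        response = requests.get(
            url,
            headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
            timeout=30,
        )
        # Capture exactly what a bug report needs: status, latency, and a body excerpt.
        evidence = {
            "url": url,
            "status_code": response.status_code,
            "elapsed_seconds": response.elapsed.total_seconds(),
            "body_excerpt": response.text[:500],
        }
        return json.dumps(evidence, indent=2)

    if __name__ == "__main__":
        print(capture_api_evidence("https://example.internal/api/v1/features", "PLACEHOLDER-TOKEN"))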

Apr 2021 – Jan 2022

Federal Bureau of Investigation, Washington DC

Senior QA Tester (Contractor)

Prioritized and planned customer testing engagements across five Pega software programs that resulted in the timely go-live of enterprise-wide applications impacting over 35,000 end users.

Performed manual and automated testing against the FBI’s in-house software application on the Pega platform to uncover defect anomalies where behavior did not align with the system requirements.

Tracked, escalated, and remediated technical blockers and provided engineering feedback for the resolution of over 500 system defects.

Orchestrated knowledge transfer sessions with stakeholders to establish testing best practices, gather business requirements, and identify blockers.

Onboarded, coached and mentored a team of seven junior test team members, serving as the team lead and presenting testing analytics to stakeholders during daily scrums.

Developed relationships with key members of the customer’s software engineering, stakeholder, and leadership teams to foster cross-group collaboration and communication.

Managed testing documentation using both Confluence and Jira tools to capture bugs, requirements and metrics that were leveraged for sprint planning.

Jan 2019 – Mar 2021

US Department of State, Lorton VA

Senior Test Automation Engineer (Contractor)

Member of the test automation team performing automation, smoke, functional, API, and regression testing utilizing Tricentis Tosca, Azure DevOps, and Team Foundation Server (TFS) for the FOIA Modernization effort at the U.S. State Department.

Worked in an Agile environment, attending daily stand-ups, sprint reviews, and retrospectives to discuss Product Management test goals by way of Epics, Features, PBIs, and Tasks that provide traceability of the test scripts within each sprint.

Developed and maintained the Test Strategy Plan for both FOIA and CDS product lines.

Member of the AMO Teams Committee, providing product-line service support and working with the client to develop divisional SQA best practices for the current automation testing framework.

Designed, built, tested, and deployed effective test automation solutions with a focus on the continuous integration and continuous delivery (CI/CD) pipeline, leveraging the Azure DevOps SaaS platform.
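For illustration, a minimal pytest-style smoke-test sketch of the kind a CI/CD pipeline stage could run after each deployment; the base URL and endpoints are hypothetical placeholders, and the project itself executed Tricentis Tosca test cases from Azure DevOps rather than this script:

    import requests

    BASE_URL = "https://foia-test.example.gov"  # placeholder test-environment URL

    def test_home_page_is_reachable():
        # A build that cannot serve its landing page should fail the pipeline fast.
        response = requests.get(f"{BASE_URL}/", timeout=15)
        assert response.status_code == 200

    def test_health_endpoint_reports_ok():
        # Assumes the service exposes a simple health endpoint; adjust to the real API.
        response = requests.get(f"{BASE_URL}/health", timeout=15)
        assert response.status_code == 200
        assert response.json().get("status") == "ok"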

Oct 2017 – Jan 2019

US Department of Commerce, Washington DC

SQA Test Manager (Contractor)

Responsible for leading and managing the following technical tasks: Framework for SQA and Testing, Independent SQA, Test Support and Monitor, Analyze and Report Quality Measures.

Defined and implemented IT quality assurance practices and procedures; managed a group of quality assurance analysts and test engineers who tested, evaluated, and validated IT initiatives and identified issues in software or services.

Analyzed discrepancies in service or performance and made recommendations for product or service updates within a complex, project-based Agile environment.

Managed quality assurance of software testing for recurring user acceptance testing (UAT) events.

Implemented and managed an automated software testing platform utilizing Progress’ Telerik Test Studio to conduct automated functional, regression, and performance testing.
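Telerik Test Studio is a commercial record-and-playback tool; as a rough, generic analogue only, a scripted functional regression check of a web form could look like the Python/Selenium sketch below (the URL and element locators are hypothetical placeholders):

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def check_search_returns_results(base_url):
        driver = webdriver.Chrome()  # assumes a local ChromeDriver is available
        try:
            driver.get(f"{base_url}/search")
            driver.find_element(By.ID, "query").send_keys("trade statistics")
            driver.find_element(By.ID, "submit").click()
            results = driver.find_elements(By.CSS_SELECTOR, ".result-row")
            assert results, "Search returned no result rows"
        finally:
            driver.quit()

    if __name__ == "__main__":
        check_search_returns_results("https://app-test.example.gov")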

Jan 2016 – Sept 2018

Paradyme Management, Rosslyn VA

Senior QA Test Lead (Contractor)

Providing systems, integration and performance testing for a new IT system (NextGen) to replace a legacy system.

Responsible for all activities associated with properly testing NextGen, including coordinating test work among many stakeholders, such as the PMO, developers, business owners, UAT participants, and fellow test team members, channeling all communications through the Team Foundation Server (TFS) test tool.

Operating in an Agile Scrum software development environment.

Operating across the full scope of the CI/CD pipeline (e.g., developing test plans, creating test scripts, testing the system, coordinating with business analysts and developers to resolve defects, and facilitating System, UAT, Regression & Performance testing) utilizing the TFS test tool.

Utilizing the automated testing tool Telerik Test Studio to create functional test scripts leveraged for Load, Stress, and Soak testing in order to pinpoint the system’s performance bottlenecks and track results to convey to the technical development group for the proper enhancements (a simplified load-test sketch follows this job entry).

Providing input on functional requirements as they are written in order to enable effective testing.

Designing customized test cases and scripts based on the requirements for both internal testing (within the Scrum team) and UAT testing (business owners and end users) utilizing Team Foundation Server (TFS) test tool.

Leading user acceptance testing (UAT) and performance testing.
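For illustration, a simplified load-test sketch: fire concurrent requests at one endpoint and report latency percentiles that hint at bottlenecks. The URL and user counts are hypothetical placeholders; the actual testing used Telerik Test Studio rather than this script:

    import statistics
    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    URL = "https://nextgen-test.example.gov/search"  # hypothetical endpoint
    VIRTUAL_USERS = 25
    REQUESTS_PER_USER = 20

    def one_user(_):
        # Each simulated user issues a series of requests and records response times.
        timings = []
        for _ in range(REQUESTS_PER_USER):
            start = time.perf_counter()
            requests.get(URL, timeout=30)
            timings.append(time.perf_counter() - start)
        return timings

    if __name__ == "__main__":
        with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
            all_timings = [t for user_timings in pool.map(one_user, range(VIRTUAL_USERS)) for t in user_timings]
        all_timings.sort()
        print(f"median latency:  {statistics.median(all_timings):.3f}s")
        print(f"95th percentile: {statistics.quantiles(all_timings, n=20)[18]:.3f}s")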

April 2015 – Dec 2015

Aviture Inc., Alexandria VA

Software Quality Assurance (SQA) Test Lead (Contractor)

SIRIS is a government owned high-performance web-based capability that revolutionizes the way operators and analysts apply dynamic geospatial collaboration to focus on the mission in a real-time, 3D geospatial environment; it’s a capability that better equips them to make informed and timely decisions.

SIRIS 2.x is an enterprise-level system framework that facilitates effective sharing of information between remotely piloted aircraft (RPA) operators, the supported warfighters in the field, and other mission partners. It unites the collection of features from SIRIS 1.x into an interconnected network of information, instantly reducing task saturation and increasing a user’s situational awareness and capacity for key decision-making activities.

Supported all phases of the CI/CD pipeline, relying on pre-established processes, guidelines, and standards to perform all levels of testing (manual and automated) and documentation, leveraging the Jenkins, Jira/Confluence, and AWS project tools.

Worked with all Aviture engineers to ensure high quality of software through manual and automated testing processes, tools and techniques.

Led the development of SQA test plans, test procedures, standards and tool requirements, and engineering activities within projects and teams that were housed in AWS.

Created a traceability matrix linking requirements to functional specs to test scenarios/cases via AWS (see the traceability sketch after this job entry).

Understood the technical aspects of software testing using Agile practices.

Participated as a member of the Agile team, reviewing user stories, estimating and creating sprint backlogs, and taking part in sprint reviews, demos, and retrospectives.

Triaged and resolved bug findings submitted in JIRA by fellow project stakeholders and myself.

Collaborated with developers to recreate bug findings for root-cause analysis, leveraging the Jenkins, Confluence, and AWS tools.

Communicated information, both verbally and in writing, to all members of the project team.

Mentored team members in testing processes and the utilization of testing tools in order to execute the deliverables within the test plan.

Worked closely with project teams to ensure our testing met their needs and timeframes.

Provided necessary support for business user/end customer during various phases of the SDLC.

Managed time effectively with minimal supervision.

Exemplified a passion for Quality Assurance through automation with a solid work ethic and team player attitude.
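For illustration, a minimal sketch of a requirements-to-tests traceability matrix kept as plain data and exported to CSV; the requirement, spec, and test-case identifiers are made up for the example:

    import csv

    TRACEABILITY = [
        # (requirement ID, functional spec, test case IDs)
        ("REQ-001", "FS-3.1 Map layer toggling", ["TC-101", "TC-102"]),
        ("REQ-002", "FS-3.2 Real-time track sharing", ["TC-110"]),
        ("REQ-003", "FS-4.0 Role-based access", []),  # gap: no coverage yet
    ]

    def export_matrix(path="traceability_matrix.csv"):
        with open(path, "w", newline="") as handle:
            writer = csv.writer(handle)
            writer.writerow(["Requirement", "Functional Spec", "Test Cases", "Covered"])
            for requirement, spec, cases in TRACEABILITY:
                writer.writerow([requirement, spec, "; ".join(cases), "yes" if cases else "NO"])

    if __name__ == "__main__":
        export_matrix()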

Sept 2014 - Feb 2015

CRGT Inc., Washington DC

Test Lead (Contractor – Securities and Exchange Commission)

Responsible for all aspects of testing, including test planning, test strategy, test script development, and test execution, leveraging ALM to ensure continuous integration and continuous delivery.

Led a testing team of 5-10 resources, ensuring that the team followed the testing standards, guidelines, and methodology specified in the testing approach; reviewed test and validation results to ensure they met the entry and exit criteria.

Responsible for all aspects of Product Testing including Test Planning, Documentation, Execution, Defect Management and Reporting.

Responsible for all Quality Assurance Testing, including Manual, Automation and Performance Testing.

Responsible for categorizing and prioritizing analysis and mapping workload for planning.

Knowledge of integration points and the order in which these must be executed.

Responsible for the implementation of all QA methodologies, best practices, and standards.

Mentored team members in technology, architecture and delivery of applications.

Nov 2013 - Sept 2014

Octo Consulting Group, Alexandria VA

QA Test Subject Matter Expert (Contractor – US Patent & Trademark Office)

Served as the USPTO’s Functional Testing Division’s (FTD) Test/QA SME by assessing their current test division’s processes and providing recommendations for process improvement and implementations based upon the industry’s best practice standards for the following areas:

Identify positive and negative testing requirements for a test engagement by analyzing the requirements, and ensure negative testing for each test case.

Review test plans, test cases, and bug fixes, and maintain testing of upgrades as well as new code.

Identify test strategies for environmental setup and test deployment, including remote and offsite testing.

Create test data strategies, including adding bad data to a test data set.

Develop defect tracking and reporting processes.

Identify defect severity levels and priorities across multiple tools.

Use Agile & Waterfall methodologies and frameworks for NextGen & legacy projects.

Provide guidance to the client staff so they can learn Test Automation and Quality Assurance practices.

Support new development testing strategies and existing or legacy system test strategies.

Conduct testing remotely and develop appropriate plans to support remote and onsite testing.

Plan tests by analyzing test engagement requirements, user stories, and acceptance criteria and translating them into a comprehensive test plan.

Track and report defects, and define the processes that govern the defect tracking tools (Rally, ClearQuest (CQ), and Rational Quality Manager (RQM)).

Conduct bug scrub / bug triage processes and mitigate defects.

Use test metrics and reporting: define metrics to track phase completion and use them to support continued test-process improvement (a brief metrics sketch follows this list).
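For illustration, a small sketch of the kind of completion and defect metrics used to track a test phase; the counts are made-up sample data:

    # Made-up sample data; in practice these counts come from the test management tool.
    executed, passed = 180, 162
    total_planned = 220
    defects_by_severity = {"critical": 1, "high": 4, "medium": 11, "low": 9}

    execution_progress = executed / total_planned  # how much of the plan has run
    pass_rate = passed / executed                   # quality of what has run
    open_defects = sum(defects_by_severity.values())

    print(f"Execution progress: {execution_progress:.0%}")
    print(f"Pass rate:          {pass_rate:.0%}")
    print(f"Open defects:       {open_defects} (critical: {defects_by_severity['critical']})")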

Apr 2013 - Oct 2013

APEX Systems, Inc. Quantico VA

Lead SQA Test Analyst (Contractor – NCIS)

Served as Lead SQA Test Analyst in testing the Consolidated Law Enforcement Operations Center (CLEOC), a web-based, incident-based reporting, case management, and data collection system for unclassified criminal investigations that will be the primary tool for reporting unclassified criminal investigations and for the collection and analysis of crime statistics within the Department of the United States Navy.

Tested the auto-generation functionality of the Case Control Number (CCN) for all classified (Category 3/5/9) and sensitive investigations (Category 1, 2 and any case designated as sensitive by NCISHQ regardless of case category).

Tested the following additional web interface functionalities: role-based access; integration/assimilation of the existing systems to serve as a common interface; enhanced information sharing among relevant units, commands, agencies, and communities; tracking and monitoring from initial incident to final adjudication and archive; workflow management; associating attachments; monitoring deadlines; enhanced legal and administrative processes, including auto-populated data; customized reporting and data extracts using the VITALS data warehouse; and individual/unit search and minimization of duplicate entries, all to satisfy the Defense Incident-Based Reporting System (DIBRS) requirements.

Tested existing enhancements for both Navy and Marine Corps users to remain consistent with shared functionality used by NCIS.

Tested the collection and transmission of the data required for the DON to be Defense Incident-Based Reporting System (DIBRS) compliant.

Dec 2012 - Apr 2013

VRH Consulting, Inc. Washington DC

Lead Quality Assurance Analyst (Contractor – US Department of Energy)

Served as Lead Quality Assurance Analyst (“LQAA”) providing support to the Loan Programs Office, Office of the Secretary, U.S. Department of Energy (DOE).

Provided comprehensive testing of DOE’s Quicksilver system and associated applications, identified issues during testing, and worked with the development team to resolve them.

Responsible for testing software applications, including defect tracking, database testing and web testing of the applications developed in .Net.

Participated in requirements gathering, development of document traceability matrices, development of test designs and test design plans, and the development and execution of test scripts.

Performed manual and automated testing (unit, functional, integration, regression, performance, and acceptance tests); tracked bugs, interfaced with the user group community, proposed improvements, and performed other duties as assigned or requested.

Sept 2008 - Sept 2012

Aviture, Inc. Springfield VA

Senior SQA Lead Test Engineer (Contractor – Lockheed Martin)

Worked independently as site test lead in developing and executing test cases and scenarios to verify requirements in an Army Battle Command System (ABCS) environment for all Defense Readiness Reporting System – Army (DRRS-A) products while utilizing Agile methods.

Lead Test Engineer for the Battle Command Websis (BC Web) product, a user-driven common thin-client solution aimed at reducing ABCS footprints by providing: a single thin client to consolidate Army Battle Command data (Tactical, Strategic, Fires, Log & Airspace) via a web browser; a Common Ozone Widget Framework (OWF) giving widgets the ability to cross-communicate between communities of interest; common viewing components for ABCS data; and, lastly, Simple Object Access Protocol (SOAP) extensions to implement message content integrity and confidentiality.

Responsible for the generation, documentation, and execution of test cases and plans to verify requirements in the ABCS environment, specifically to ensure the successful transformation of electronic XML files and messages via the SoapUI client (a simplified SOAP verification sketch follows this job entry).

Attended daily Scrums to meet with the project’s major stakeholders and obtain the latest updates and action items from the IPT involved with software pre-delivery and post-delivery testing events.

Isolated, analyzed and reported software defects utilizing Rational ClearQuest & Xplanner bug tracking tools.

Planned, implemented, tested, documented, and maintained solutions for the integration and testing of in-house developed and COTS/GOTS components, elements, subsystems, and/or systems.

Worked with systems engineers and software developers to synthesize customer contractual needs and requirements into system test solutions that acknowledge technical, schedule and cost constraints.

Established functional and technical specifications and standards, solved hardware/software interface problems, defined input/output parameters, and ensured integration of the entire system or subsystem.

Reviewed, evaluated, and derived requirements for testability; developed and directed preparation and execution of comprehensive test plans, procedures, and schedules for complete systems and/or subsystems.

Coordinated subsystem and/or system testing activities with programs and other organizations.

Wrote discrepancy reports and performed integration regression testing to verify/validate incorporated fixes to software, components, subsystems, and systems.
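For illustration, a minimal sketch of verifying a SOAP/XML exchange outside a GUI client: post an envelope, then check that the response parses and the SOAP Body is populated. The endpoint, SOAPAction, and element names are hypothetical placeholders; the project itself used SoapUI for this kind of check:

    import xml.etree.ElementTree as ET

    import requests

    ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
      <soapenv:Body>
        <GetReadinessStatus xmlns="http://example.mil/placeholder">
          <UnitId>PLACEHOLDER</UnitId>
        </GetReadinessStatus>
      </soapenv:Body>
    </soapenv:Envelope>"""

    def verify_soap_response(url):
        response = requests.post(
            url,
            data=ENVELOPE.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": "GetReadinessStatus"},
            timeout=30,
        )
        assert response.status_code == 200, f"Unexpected HTTP status {response.status_code}"
        root = ET.fromstring(response.content)  # fails loudly on malformed XML
        body = root.find("{http://schemas.xmlsoap.org/soap/envelope/}Body")
        assert body is not None and len(body) > 0, "SOAP Body is missing or empty"
        return body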

June 2007 - Aug 2008

Allstates Technical Services, Huntsville AL

Senior SQA Test Engineer Lead/Process Engineer (Contractor – Intergraph Corporation)

Lead SQA Test Engineer supporting the Department of Defense (DoD) in deploying the Joint Technical Data Integration (JTDI) Service Pack 2.0 & 2.5 systems for the US Air Force, US Navy, US Army, and US Coast Guard at the Redstone Arsenal in Huntsville, AL and NAS Patuxent River, MD locations, utilizing CMM Level 3 specifications to provide traceability analysis of the Operational Requirements Document (ORD) for deployment of the JTDI 2.0 & 2.5 systems on a specific platform.

Led a team of 5 in testing the current JTDI system, which consists of a three-tier, web-enabled Delivery Management System (DMS) that automatically delivers updated technical, supply, and maintenance information to aviation and ground support organizations both ashore and afloat.

Tested to ensure that the JTDI system allows users with appropriate network access to download and/or browse technical data on specific weapons systems; the three JTDI tiers are the top tier, the mid tier, and the PEMA tier (formerly PEDD – Portable Electronic Display Device).

Led a team of 7 providing software testing support for the JTDI 2.0 & 2.5 product running on a designated computer initially located at Redstone Arsenal in Huntsville, Alabama and finally in the Demilitarized Zone (DMZ) at Patuxent River, Maryland.

Implemented the Software Engineering Institute’s (SEI’s) Team Software Process (TSP), including producing the following CDRLs: Software Test Plan (STP), Software Verification Test Procedures (SVTP), Acceptance Test Plan (ATP), and Software Test Report (STR).

Performed a Software Verification Test (SVT) on each software version, including JTDI updates prior to official release in accordance with the JTDI performance specification.

Performed an Acceptance Test (AT) on the new version at the Patuxent River, Maryland location including JTDI updates prior to official release in accordance with the JTDI performance specification.

Performed testing of the two deployed JTDI top-tier virtualization servers, which consist of Egenera hardware and software, in a production environment to which all weapon system website technical data and structure had been transitioned.

Performed an analysis of the JTDI top-tier virtualization servers to determine the best allocation of JTDI functionality to specific refreshed hardware platforms and to identify specific deltas between U.S. Army and U.S. Navy requirements in terms of allowed ports and protocols, allowed applications, and site-specific information and physical security requirements.

Sept 2006 - May 2007

Tek Systems, Atlanta GA

Senior QA Tester/Process Engineer (Contractor – United States Centers for Disease Control and Prevention)

Served as a QA Tester supporting the Individual Learning Account (ILA) and Integrated Resources Information System (IRIS) software development projects within the Management Information Systems (MISO) Branch by producing test plan documents for client/server applications, participating in test requirements efforts, creating and running test scripts, compiling test scripts into test procedures, performing regression testing, tracking the status of defects utilizing Rational’s ClearQuest component, creating manual test scripts for mainframe applications, and performing manual functional testing for mainframe systems.

Served as QA Performance Test project lead, researching and demonstrating Mercury Interactive’s LoadRunner testing tool, serving as liaison between the CDC QA/management team and the HP Mercury sales team by communicating the attributes and benefits of LoadRunner, and assisting in the cost analysis of the LoadRunner implementation.

Served as QA Implementation Lead / Process Improvement Gap Analysis Workgroup coordinator on the MISO CMMI implementation team, meeting with upper management to create strategic plans for the implementation effort, creating and presenting CMMI implementation training material for employees to reference and adhere to, initiating a Champions Committee to assist in executing and mentoring the organization on CMMI practices, and conducting internal audits to prepare employees for external auditing events.

May 2005 - Aug 2006

Enterprise Information Services, Vienna VA

Quality Assurance Specialist II (Contractor – United States Department of Agriculture)

Served as a Quality Assurance Specialist II in support of the Food Nutrition Services (FNS) FPRS and STARS food stamp projects, participating in independent validation and verification (IV&V) testing (functionality and regression testing), writing test plans, and performing analysis using Rational’s ClearQuest, TestManager, and RequisitePro testing software suite for the U.S. Department of Agriculture (USDA).

Followed the CMM Level process to ensure proper functionality against the system requirements, documenting and communicating change enhancement requests (CERs) and documenting failed test results with the development team utilizing Rational’s ClearQuest bug tracking system.

Performed day-to-day activities (testing, writing test plans and test summaries, analyzing test results using Microsoft SQL Server Query Analyzer, and attending project meetings) in support of the thorough testing of FNS applications from documentation through software release in client-server and web-based environments.
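For illustration, a sketch of checking test results directly in the database, analogous to the ad hoc queries run in SQL Server Query Analyzer; the connection string, table, and column names are hypothetical placeholders:

    import pyodbc

    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=test-db.example.gov;DATABASE=FNS_Test;Trusted_Connection=yes;"
    )

    def count_failed_cases(release_id):
        # Returns the number of failed test cases recorded for one release.
        with pyodbc.connect(CONN_STR) as conn:
            cursor = conn.cursor()
            cursor.execute(
                "SELECT COUNT(*) FROM TestResults WHERE ReleaseId = ? AND Status = 'FAIL'",
                release_id,
            )
            return cursor.fetchone()[0]

    if __name__ == "__main__":
        print("Failed cases in release 2.1:", count_failed_cases("2.1"))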

Sept 2003 - May 2005

Acumen Solutions Inc, Vienna VA

SR. SQA Analyst/Tester (Contractor – Nextel Communications)

Served as an SQA Analyst/Tester for the EGS (Encryption Gateway Server) test group, responsible for testing messages and files sent via the EAI (Enterprise Application Integration) layer for encrypted and decrypted confidential files, using DataIntegrator and Nextel’s public and private keys to ensure Sarbanes-Oxley (SOX) compliance.

Involved in the preparation and execution of test plans and test cases for various interfaces developed within EAI using a variety of technologies, including CrossWorlds, UNIX, MQSeries, and MQSI, to route, filter, and format XML messages among disparate applications (receiving XML messages from MQSeries, performing the required tasks, and returning the XML messages to MQSeries), as well as a custom File Transfer Protocol (FTP) application.

Delivered detailed status reports documenting test results to senior management and provided daily QA status reports for the entire project team.

Worked on all phases of the testing lifecycle, including installation testing, system testing, integration testing, performance testing, regression testing, failover testing, and disaster recovery testing

Worked with EAI project managers, architects and developers to analyze business/system requirements and develop high-level plans (including documentation) for validating EAI functionality and performance

Acted as the EAI QA liaison to external teams in providing end-to-end testing support

Worked on the Automated SQA Team for performance testing of the Order Entry System OSS (Operations Support System) platform using Mercury Interactive’s LoadRunner and TestCenter.

Responsible for determining, reporting, and resolving issues found during testing such as long response times in the application or throughput issues related to the network and application servers

Tested the client’s CRM (Customer Relationship Management) Commissioning systems utilizing Mercury’s Vugen to create, debug, and execute Vuser scripts

Utilized Mercury’s TestCenter and Test Director (Quality Center) to create, manage, and execute various load tests against a 3-tier architecture hosting the PeopleSoft HR and Financials web-based application as well as enter & track defects

Performed quality assurance testing for two main projects: the implementation of an internal PeopleSoft human resource management system application and the implementation of a temporary WNP (Wireless Number Portability) Eligibility Check application on the company’s website.

June 1999 - Aug 2003

Telcordia Technologies, Inc. (Bellcore), Piscataway NJ

Software Quality Assurance Engineer/Process Engineer

Telcordia Technologies (a Science Applications International Corporation (SAIC) company) is the world’s largest CMM Level 5 rated organization supplying telecom operation support systems and other telecom network related services to firms throughout the world.

Served as a software quality assurance test engineer for the Plug-in Inventory Control System/Detailed Continuing Property Record (PICS/DCPR) project, developing and maintaining test cases under the CMM Level 5 specifications for both batch and GUI projects according to specific release requirements. PICS/DCPR, Telcordia’s second-largest software system, is an inventory management system controlling acquisition, movement, repair, re-inventory, billing reconciliation, and journalization for both plug-in and hardwired equipment.

Performed online and batch-job software testing to verify software modification requests and features, utilizing the MVS operating system, IMS (Information Management System) databases, JCL (Job Control Language), and TSO (Time Sharing Option) to initiate the online or batch jobs while maintaining the Quality Methods of Operations (QMO) standards methodology.

Assumed the position of lead metrics coordinator, developing, maintaining, and providing the quality assessment report for the scope of the product releases.

Served as a member of the Quality Methods of Operations (QMO) Champions Committee, analyzing and discussing with fellow committee members the latest issues within the QMO release, conducting various internal audits, and performing in-depth QMO research to resolve assigned action items.

May 1998 - Aug 1998

Northrop Grumman, Ferndale MD

Assistant Program Analyst (Summer Intern)

Supported the Infrastructure Standardization Project team (ISP) of the Electronic Sensors and Systems Division (ESSD) of Northrop Grumman.

Performed IIS IT acquisition and asset management support for ESSD, processed and tracked orders for PC hardware and services in accordance with approved requests, and processed charge distributions for equipment purchases utilizing the Lotus Notes NT 4.0 software application.

Assisted the ESSD’s focal point for warranty and reliability issues for all leased hardware.

My understanding of network file structures and PC configuration in both the legacy and ISP environments was a major asset to the ISP team.

Assisted in desktop computing installation and consulting, where my overall experience at ESSD allowed me to resolve customer problems with desktop computing equipment and related network infrastructure.

Nov 1996 - May 1998

Earl G. Graves School of Business & Management, Baltimore, MD

Laboratory Technician (MSU Work Study Program)

Installed various software packages and hardware components in addition to analyzing and rectifying hardware and software related issues.

Assisted professors by acting as their aide in overseeing and providing ‘


