

DAVID XU

Ottawa, Ontario, Canada ● 613-***-**** ● ********@*****.***

OBJECTIVE

Seeking a Senior Software Tester opportunity

QUALIFICATIONS

20+ years of professional software testing experience developing automated test cases using HP UFT and ALM, Quick Test Pro, Quality Center Server, WinRunner, LoadRunner, Test Director, Selenium, Rational Robot and Rational Functional Tester test tools

20+ years of working experience as a Quality Assurance Specialist/Software Tester developing quality assurance policies and test packages, including test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting

20+ years of hands-on experience developing, maintaining and executing automated test scripts, as well as performing manual testing, for JavaEE, SAP, .Net and cloud applications

4+ years of software development and engineering experience with strong expertise in C/C++, VB, VBScript, TSL, Java/JavaEE, SQL, XML, Linux/Unix and TCP/IP

Solid working knowledge of relational database systems (Oracle, DB2, MySQL, MS SQL Server, etc.) and operating systems (Mainframe, Unix/Linux and Windows)

Strong communication, problem-solving and teamwork skills

TECHNICAL SKILLS

Operating Systems: UNIX (Sun Solaris, AIX, HP-UX, SCO UNIX), Linux, Windows XP/Vista/7/10, Windows Server 2003/2008, VxWorks, IBM Mainframe

Programming Languages: C/C++, Java/JavaEE, JavaScript, VB, VBScript, COBOL, TSL, Perl/CGI, SQL, Shell scripts, HTML, CSS, XML, MFC, UML, .Net

Networking: TCP/IP, FTP, HTTP/HTTPS, LDAP, ATM, ISDN, Ethernet, SNMP, Novell Networks

Databases and Tools: HP UFT and ALM, HP Quick Test Pro, Quality Center Server, WinRunner, LoadRunner, Test Director, Selenium, Rational Robot, Rational Functional Tester, Rational Purify, Rational Rose, JMeter, TSO/ISPF, JCL, CA JMR, Compuware File-AID, Visual Test, Visual Studio, IBM MQ, MS .Net Framework, ClearCase, ClearQuest, Citrix, Crystal Reports, Oracle 9i/10g/11g, DB2, MySQL, MS SQL Server, Informix, MS Access, Oracle VM VirtualBox, VMware, Oracle SQL Developer, DbVisualizer, Entrust, WinSCP, CVS, SMART, SAP, INFOMAN, Eclipse, Windows IIS, Active Directory, Apache HTTP Server, InstallShield, Tomcat, SPUFI, Iris Kiosk, Cognos BI tools (Cognos ReportNet, PowerPlay, Upfront), JIRA, EJB, Servlets/JSPs, WebSphere, SoapUI, Azure, AWS (Amazon Web Services)

WORK EXPERIENCE

Senior Software Tester 09/2023–Present

Military Command Software Centre (MCSC) - Director Human Resources Information Management (DHRIM) - Chief Information Officer Group (CIOG), National Defence - Government of Canada

Project Monitor MASS (Military Administrative Support System)

Responsible for the development of test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting to test Project Monitor MASS (Military Administrative Support System) using Micro Focus UFT One and ALM testing tools

Conducting testing and implementation readiness reviews by walking through the BR (Business Requirements), BUC (Business Use Case), SUC (System Use Case) and application stereotype to analyze the system requirements and provide an estimate of the test level of effort (Test Impact) to the Manager

Developing, maintaining, and executing test packages including test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting for functional testing, integration testing and regression testing

Implementing automated test scripts for functional and performance testing with Micro Focus UFT One and ALM testing tools, and performing automated smoke, functional, regression and performance tests on Windows Server 2022 and Citrix Thin Client

Developing and maintaining test documentation including test cases, change requests, amended requirements, test impacts, defect logs, proof of test, release schedules and plans for functional, integration and regression testing

Attending development meetings with management, development, IT Support, architects, and clients to analyze system requirements/scenarios, and preparing testing documents (test procedures, test data, expected outcomes, software versioning information, release notices and update procedures).

Developing and executing SQL queries to create and verify test data, such as Leave Requests, Activity Management records, pre-conditions and expected test results, using SQL Navigator 8.0 against an Oracle 12c database (a minimal sketch of this kind of test-data setup appears below)
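
For illustration, an equivalent test-data INSERT via Java/JDBC; the connection string, credentials and leave_request schema here are hypothetical stand-ins, not the project's actual names:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class LeaveRequestTestData {
        public static void main(String[] args) throws Exception {
            // Hypothetical Oracle connection details and table layout, for illustration only
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/TESTDB", "test_user", "secret");
                 PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO leave_request (request_id, member_id, leave_type, status) "
                         + "VALUES (?, ?, ?, ?)")) {
                ps.setLong(1, 1001L);        // primary key of the test record
                ps.setLong(2, 42L);          // member the leave request belongs to
                ps.setString(3, "ANNUAL");   // leave type under test
                ps.setString(4, "PENDING");  // pre-condition: request not yet approved
                ps.executeUpdate();
            }
        }
    }

A matching SELECT on the same key would then confirm the expected post-condition after the test run.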

Setting up, creating, initializing, executing and optimizing Micro Focus UFT One and ALM test environments, creating automation test standards, user guides, criteria, projects and common function libraries

Planning work and collaborating on code development, build & release management and deployment of solutions across different environments with cloud-based Azure DevOps services (Azure Repos, Azure Boards, Azure Pipelines, Azure Test Plans, etc.)

Analyzing and documenting performance testing results and reporting them to the Architect team to help eliminate bottlenecks and establish a baseline for future regression testing.

Logging, reporting, and tracing defects based on test results for each release, and constantly updating project management and release management on test status and critical issues using bug tracking tools including Azure Boards and/or ACDC (ASST Communication and Development Center)

Documenting and maintaining Training Guide & User Manual using Infowiki, MS Office 365 and OneDrive

Mentoring team members to improve testing skills, conducting presentations and demos to team members, and doing knowledge transfer upon request

Updating baseline test plans with retrofits after release sign-off, and archiving current test documents, such as test plans, defect list, test impacts, quality matrix and test scripts using MS Office 365

Senior Software Tester 01/2016–08/2023

SDAT – Service Delivery Assurance Team, Information, Science and Technology Branch, Canada Border Services Agency

Project eManifest, Traveller System and ICS, including

eManifest Portal

CAED (Canadian Automated Export Declaration)

Trusted Traders

CPSG (Commercial Passage)

ARD (Automated Risk Determination)

CTAS (Commercial Threat Assessment System)

MDM (Master Data Management)

RAPM (Risk Assessment Program Maintenance)

ExCC (External Client Communication)

CDEM (Commercial Data Acquisition and Communication Services)

CoDaCS (Commercial Data Cleansing and Standardization)

RDS (Reference Data Service)

SWI (Single Window Initiative)

ACROSS (Accelerated Commercial Release Operations Support System)

MOPIL (Mobile Primary Inspection Line (PIL))

BWT (Border Wait Time)

Responsible for the development of test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting to test Project eManifest, Traveller System and ICS (Integrated Customs System) using HP UFT and ALM testing tools, following Agile (SCRUM) or RUP methodologies

Conducted testing and implementation readiness reviews by walking through the BR (Business Requirements), BUC (Business Use Case), SUC (System Use Case) and application stereotype to analyze the system requirements and provide an estimate of the test level of effort (Test Impact) to the Release Management team.

Worked as lead tester in developing, maintaining and executing test packages, including test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting, for the functional, integration, regression and performance testing of high-criticality, high-availability, high-reliability and security-sensitive Java-based applications (eManifest, Integrated Customs System (ICS), etc.), using HP UFT, ALM and LoadRunner testing tools

Designed and developed an automation test framework for CBSA high-criticality, high-availability, high-reliability and security-sensitive Java-based applications (Traveller System, etc.) with Selenium, TestNG, Java, ExtentReports, JSON, XPath, XML, Maven, Eclipse and an Oracle database; executed automation test cases/test suites with AWS DevOps Continuous Integration/Continuous Deployment (CI/CD) tools, such as Jenkins, Git and Maven, and analyzed test result reports (see the sketch below)
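
As a sketch of what one test in such a framework might look like (the URL, element locators and page title are hypothetical, and the real framework also wires in ExtentReports and data-driven inputs):

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.testng.Assert;
    import org.testng.annotations.AfterMethod;
    import org.testng.annotations.BeforeMethod;
    import org.testng.annotations.Test;

    public class PortalSmokeTest {
        private WebDriver driver;

        @BeforeMethod
        public void setUp() {
            driver = new ChromeDriver();  // fresh browser per test method
        }

        @Test
        public void loginLandsOnDashboard() {
            driver.get("https://test-env.example/portal/login");        // hypothetical test URL
            driver.findElement(By.id("username")).sendKeys("testuser"); // hypothetical locators
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("loginButton")).click();
            Assert.assertTrue(driver.getTitle().contains("Dashboard"),
                    "Expected the dashboard page after login");
        }

        @AfterMethod
        public void tearDown() {
            driver.quit();
        }
    }

A test like this is run from a TestNG suite XML or via Maven Surefire, which is how a Jenkins CI/CD job would typically invoke it.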

Developed and maintained test documentation including test plans, test cases, change requests, amended requirements, test impacts, defect logs, proof of test, release schedules and plans for the functional, integration and regression testing of the complex IM/IT applications: the eManifest program, Traveller System and Integrated Customs System (ICS)

Applied iterative (RUP) or agile (SCRUM) methodologies to develop the whole test package for testing activities, including testing processes, procedures, strategies, plans, cases and test-bed data in synchronized test libraries, in multiple development, testing and production environments (YT1, YTM, YH1, RTL-1 and PSL servers) with multiple parallel releases, such as D2 Release R101, D3 Release R179, D5A Release R180, D5B Release R220, D4 Release R206, MDM Release R448, eHouse Bill Releases R855/R856 and CERS Release R898.

Created and updated the test plan and test cases for Releases R191 and R195 to test the mobile applications BWT (Border Wait Time) and MOPIL (Mobile Primary Inspection Line (PIL)) on the iOS and Android platforms in the YTST environment.

Defined, created and executed HTTP and XML format scripts using SoapUI to create test data in the DB2 database for the functional, integration and regression testing of eManifest and the Traveller System, i.e., Releases R448 and R553 for the MDM releases and the mobile applications BWT and MOPIL on the iOS and Android platforms (see the sketch below).
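
SoapUI drives these requests through its test runner; the equivalent raw XML-over-HTTP call, sketched with Java's built-in HttpClient (the endpoint and payload are hypothetical; real messages follow the project's XML schemas):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class TestDataPoster {
        public static void main(String[] args) throws Exception {
            // Hypothetical endpoint and payload, used only to illustrate the mechanics
            String xml = "<cargoReport><conveyanceRefNo>TEST-0001</conveyanceRefNo></cargoReport>";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://test-env.example/ingest"))
                    .header("Content-Type", "application/xml")
                    .POST(HttpRequest.BodyPublishers.ofString(xml))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            // The HTTP status and response body indicate whether the record reached the DB2 store
            System.out.println(response.statusCode() + " " + response.body());
        }
    }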

Performed integration testing, including manual and automated tests, for the projects ARD (Automated Risk Determination), CTAS (Commercial Threat Assessment System) and RAPM (Risk Assessment Program Maintenance) in YT1, and for the mobile applications BWT and MOPIL on the iOS and Android platforms in the YTST environment, using IBM ODM (Operational Decision Manager) in Releases R180, R220, R191 and R195 of the eManifest program and Traveller System.

Attended development meetings with management, development, IT Support, architects and clients to analyze the system requirements and scenarios for preparing testing documents (test procedures, test data, expected outcomes, software versioning information, release notices and update procedures).

Developed, maintained and executed HP UFT and ALM automation test scripts to perform automated tests, including smoke testing, functional testing, regression testing and performance testing, for the eManifest Portal on Windows, WebSphere, Mainframe, Unix/Linux and AWS DevOps platforms

Performed various testing activities, including unit testing, functional testing, regression testing, integration testing, performance testing, load testing and documentation testing, for the CBSA eManifest Portal, which used Java, EJB, Servlets/JSPs, XML and WebSphere technologies with various database systems (DB2, Oracle, PostgreSQL, etc.) and IBM MQ, in multiple development, testing and production environments (YT1, YTM, YH1, RTL-1 and PSL servers) with multiple parallel releases, such as D2 Release R101, D3 Release R179, D5A Release R180, D5B Release R220, D4 Release R206, MDM Release R448, eHouse Bill Releases R855/R856 and CERS Release R898, across Windows, Mainframe, Unix/Linux and cloud-based AWS DevOps platforms

Set up, created, initialized, executed and optimized HP UFT and related test environments (ALM, Selenium, etc.), communicated with multiple scrum teams to establish best practices for test automation and Continuous Integration/Continuous Deployment (CI/CD), and created automation test standards, user guides, criteria, projects and common function libraries

Worked with developers to run load and stress testing for eManifest applications (ACROSS, ExCC and CDEM) and provided best practices to improve system performance.

Analyzed and documented performance testing results and reported them to the Architect team to help eliminate bottlenecks and establish a baseline for future regression testing.

Logged, reported and traced defects based on test results for each release testing and constantly updated test status and critical issues to project management and release management using bug tracking tools including AWS DevOps, SMART and JIRA.

Documented Training Guide and User Manual using MS-Office suite for new testers, conducted presentations and demos to team members and mentored/coached and transferred knowledge to the team members to improve testing skills.

Updated baseline test plans with retrofits after release sign-off, and archived current test documents, such as test plans, defect lists, test impacts, quality matrices and test scripts, using the MS-Office suite.

Senior Software Tester 01/2013–12/2015

Commercial Portal Team, Commercial Systems Development Division, ITSB, Canada Border Services Agency (CBSA), Ottawa, Canada

Project eManifest Portal and ICS

Responsible for the development of test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting to test ICS (Integrated Customs System) Web Applications using Quick Test Pro and Quality Center Server Tester tools

Facilitated testing and implementation readiness by reviewing and walking through the application systems (i.e., Proof of Test and Stereotype) and documentation (i.e., Project Charters, Test Contents, Change Requests, Impacts, PR logs, Release Schedules and Plans, Models, Prototypes, Business & System Use Cases)

Conducted testing and implementation readiness reviews by walking through the BR (Business Requirements), BUC (Business Use Case), SUC (System Use Case) and application stereotype to analyze the system requirements and provide an estimate of the test level of effort (Test Impact) to the Release Management team

Applied iterative or agile methodologies, such as RUP or STAR methodologies, to develop the whole test package, including testing processes, procedures, strategies, plans, cases and data of test beds, etc., in synchronized test libraries

Attended brainstorm meetings with management, development, IT Support, architect and clients to analyze the system requirements and scenarios for preparing testing documents (test procedures, test data, expected outcomes, software versioning information, release notices, and update procedures, etc.)

Created and executed HTTP and XML format scripts using SoapUI to ingest test data into the DB2 database for the eManifest projects, such as the MDM releases

Regularly loaded XML files into the IBM MQ system, then accessed the IBM Mainframe system via TSO/ISPF (Time Sharing Option/Interactive System Productivity Facility) to run JCL (Job Control Language) batch jobs using the CA JMR (JOBLOG Management & Retrieval) application to create, populate, customize, refresh and authorize test data in the DB2 database system (see the sketch below)
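
Where the XML load is scripted rather than manual, a minimal sketch using the IBM MQ classes for JMS might look like this (the queue manager, channel, queue and file names are hypothetical):

    import javax.jms.JMSContext;
    import javax.jms.Queue;
    import com.ibm.mq.jms.MQQueueConnectionFactory;
    import com.ibm.msg.client.wmq.WMQConstants;

    public class MqTestDataLoader {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details for a test queue manager
            MQQueueConnectionFactory cf = new MQQueueConnectionFactory();
            cf.setHostName("mqhost");
            cf.setPort(1414);
            cf.setQueueManager("QM.TEST");
            cf.setChannel("DEV.APP.SVRCONN");
            cf.setTransportType(WMQConstants.WMQ_CM_CLIENT); // client mode, not bindings

            String xml = java.nio.file.Files.readString(
                    java.nio.file.Path.of("testdata/declaration.xml")); // hypothetical file

            try (JMSContext ctx = cf.createContext()) {       // JMS 2.0 simplified API
                Queue queue = ctx.createQueue("queue:///TEST.INBOUND.QUEUE");
                ctx.createProducer().send(queue, xml);        // payload goes out as a TextMessage
            }
        }
    }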

Developed, maintained and executed SQL queries by running SPUFI SQL batch jobs with Compuware File-AID to verify actual output in the DB2 database against the expected test results (an equivalent JDBC-style check is sketched below)
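
SPUFI runs the SQL inside TSO batch; the same pass/fail check expressed as a small JDBC verification (connection details, table and expected value are hypothetical):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class ReleaseStatusCheck {
        public static void main(String[] args) throws Exception {
            // Hypothetical DB2 connection and schema, for illustration only
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:db2://dbhost:50000/TESTDB", "test_user", "secret");
                 PreparedStatement ps = conn.prepareStatement(
                     "SELECT status FROM cargo_release WHERE transaction_no = ?")) {
                ps.setString(1, "TEST-0001");
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next() || !"RELEASED".equals(rs.getString("status"))) {
                        throw new AssertionError("Expected status RELEASED for TEST-0001");
                    }
                    System.out.println("Actual output matches the expected result");
                }
            }
        }
    }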

Developed, maintained and executed Quick Test Pro automation test scripts to perform automated tests (e.g., smoke test, functional test, regression test and performance test) for CBSA ICS Web Applications on Windows, Mainframe and Unix/Linux

Performed various unit tests, functional testing, regression testing, integration testing, performance testing, load testing and documentation testing for CBSA Web Applications with various database systems (i.e., DB2, Oracle, etc.) and IBM MQ across Windows, Mainframe and Unix/Linux platforms

Analyzed and documented performance testing results, and reported them to the Re-Engineering team to help eliminate bottlenecks and establish a baseline for future regression testing

Logged, reported and traced PR (Problem Report) based on test results for each release testing and constantly updated test status and critical issues to TLC and release management using bug tracking tools, i.e., SMART, JIRA

Documented Training Guide and User Manual using MS-Office suite for new testers, conducted presentations and demos to team members and mentored/coached and transferred knowledge to the team members to improve testing skills

Analyzed and triaged PRs to assist with troubleshooting, and submitted the quality matrix upon release sign-off

Updated baseline test plans with retrofits after release sign-off, and archived current test documents, such as test plans, PR list, Test Impacts, Quality Matrix and test scripts using MS-Office suite

Senior Software Tester 01/2012–12/2012

Testing Team, DRMIS System, the Department of National Defence

and the Canadian Forces (DND), Ottawa, Canada

Project DRMIS (the Defence Resource Management Information System)

Responsible for developing quality assurance policies and strategies, test plans, test procedures, test utilities (metrics, forms and tools), test analysis and test scripts (automated and manual) for DRMIS SAP (desktop & web portal) financial applications/systems using Quick Test Pro (HP QTP 11) and Quality Center Server (HP QC ALM 11) testing tools, following Agile methodology

Facilitated testing and implementation readiness by reviewing and walking through the application systems and documentation (Project Proposal, Change Management, Release Schedule and Plans, Business Requirements and System Use Cases)

Applied iterative and agile methodologies to develop the whole quality assurance policies and test package, including testing processes, procedures, strategies, plans, cases, utilities (metrics, forms and tools) and data of test beds, etc.

Attended standup and bi-weekly meetings with management, development, IT Support and clients to analyze the system requirements and scenarios for preparing testing documents (test procedures, test data, expected outcomes, software versioning information, release notices, system update procedures, and QC server updates assessment, etc.)

Developed, maintained and executed Quick Test Pro automation test scripts (e.g. smoke test, functional test, regression test and performance test) by building the blocks of BPT (Business Process Tests) for DRMIS SAP applications across Windows, Mainframe and Unix/Linux platforms

Responsible for configuration, maintenance and administration of HP Quality Center ALM 11.0 server, such as server configuration, user accounts and authentication maintenance, QTP 11 License Server management, database management, project migration and organization, etc.

Troubleshot HP Quality Center ALM 11.0 server issues and communicated with HP to assess the need for QC upgrades, then executed and verified upgrade procedures.

Planned and performed upgrades (QC Patch 9 and QTP Hotfix 44, etc.) in the TDC test environment, and ensured the upgrades were appropriate for deployment to the end-user DWAN environment

Performed various unit tests, functional testing, regression testing, integration testing, performance testing for DRMIS SAP applications on Windows, Mainframe and Unix/Linux platforms with DB2 database systems

Logged, reported and traced defects and change requests based on test results for each release testing and constantly updated test status and critical issues to Architect and Manager using bug tracking tools, i.e., SolMan, Charm

Documented the Training Guide and User Manual of automation test projects using MS-Office suite for new testers, conducted presentations and demos to team members and senior management, and mentored/coached and transferred knowledge to the team members to improve testing skills

Senior Quality Assurance Specialist 01/2011–12/2011

HR Systems, Information Technology Services Branch (ITSB), Public Works and Government Services Canada (PWGSC), Gatineau, Canada

Project LIMS 8.0 (the Leave Information Management System)

Responsible for developing quality assurance policies and strategies, test plans, test procedures, test utilities (metrics, forms and tools), test analysis and test scripts (automated and manual) for the LIMS 8.0 .NET/SAP applications using Selenium, Rational Functional Tester, Rational Robot, and Quick Test Pro & Quality Center Server testing tools

Facilitated testing and implementation readiness by reviewing and walking through the application systems (the existing LIMS 7.4 system) and documentation (Project Proposal, Change Requests, Release Schedules and Plans, Business Rules, and System Use Cases)

Applied iterative methodologies to develop the whole quality assurance policies and test package, including testing processes, procedures, strategies, plans, cases, utilities (metrics, forms and tools) and data of test beds, etc.

Attended weekly meetings with management, development, IT Support and clients to analyze the system requirements and scenarios for preparing testing documents (test procedures, test data, expected outcomes, software versioning information, release notices, and update procedures, etc.)

Submitted Automation Test Feasibility reports to senior Management, and facilitated best practices by prototyping auto-test framework using Selenium, Quick Test Pro, Rational Functional Tester and Rational Robot

Developed, maintained and executed Selenium, Quick Test Pro, Rational Functional Tester and Rational Robot automation test scripts (e.g., smoke test, functional test, regression test and performance test) for LIMS .Net applications on Windows XP and Windows 7

Created new builds and installation scripts to install and configure LIMS 8.0 on Windows XP, Windows 7 and Citrix Thin Client (Windows 2003 Server), and iteratively performed automation Smoke Test and Mail Functionality Test (MS Outlook 2003/2007)

Developed and executed SQL queries using Oracle PL/SQL (Oracle 10g Instant Client and VMware Oracle Client) to create and verify test data, such as Leave Requests, Over Time Requests, pre-conditions and expected test results (PRIs, Compensation Specialists and Management Reports) for LIMS 8.0 and Oracle Upgrade Testing (from Oracle 9i to Oracle 11g)

Performed various unit tests, functional testing, exploratory testing, regression testing, integration testing, performance testing, load testing and installation testing for LIMS .Net applications with Oracle 9i/10g/11g database systems across Windows XP, Windows 7 and Citrix Thin Client Server platforms

Set up, initialized and optimized Selenium, Rational Functional Tester, Rational Robot, Quick Test Pro and Quality Center test environments, and created test standards, user guides, criteria, projects, common function libraries and SAP function libraries for automation testing

Iteratively ran load and stress testing, measuring the performance of the system environment (Wi-Fi, Novell network, Ethernet network, Oracle 11g databases and Thin Client Server, local system vs. virtual system, etc.) to ensure that the system meets the performance baseline and fails/recovers gracefully

Analyzed and documented performance testing results, and presented an optimization report to senior management to help development eliminate bottlenecks and establish a baseline for future regression testing

Logged, reported and traced defects and change requests based on test results for each release, and constantly updated the Project Leader on test status and critical issues using bug tracking tools, i.e., ClearCase

Assisted development in system troubleshooting, and submitted quality matrix upon release sign-off

Documented the Training Guide and User Manual of automation test projects using MS-Office suite for new testers, conducted presentations and demos to team members and senior management, and mentored/coached and transferred knowledge to the team members to improve testing skills

Performed MSI installation of production builds on Windows XP, Windows 7 and Citrix Thin Client (Windows 2003 Server) via Application Explorer and CD media.

Senior Software Tester 05/2005–12/2010

IT Testing Section, Quality Management Division, ITSB, Canada Border Services Agency (CBSA), Ottawa, Canada

Project PAXIS, DAS, IPIL, IBQ, AMPS, KIOSK and TITAN

Responsible for the development of test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting to test ICS (Integrated Customs System) Web Applications using Quick Test Pro, Quality Center Server, WinRunner, LoadRunner, Test Director and Rational Functional Tester tools

Facilitated testing and implementation readiness by reviewing and walking through the application systems (i.e., Proof of Test and Stereotype) and documentation (i.e., Project Charters, Test Contents, Change Requests, Impacts, PR logs, Release Schedules and Plans, Models, Prototypes, Business & System Use Cases)

Applied iterative methodologies, such as RUP and STAR methodologies, to develop the whole test package, including testing processes, procedures, strategies, plans, cases and data of test beds, etc., in synchronized test libraries

Attended brainstorm meetings with management, development, IT Support, architect and clients to analyze the system requirements and scenarios for preparing testing documents (test procedures, test data, expected outcomes, software versioning information, release notices, and update procedures, etc.)

Regularly loaded XML files into the IBM MQ system, then accessed the IBM Mainframe system via TSO/ISPF (Time Sharing Option/Interactive System Productivity Facility) to run JCL (Job Control Language) batch jobs using the CA JMR (JOBLOG Management & Retrieval) application to create, populate, customize, refresh and authorize test data in the DB2 database system

Developed and executed SQL queries by running SPUFI SQL batch jobs with Compuware File-AID to verify actual output in the DB2 database against the expected test results

Developed, maintained and executed Quick Test Pro and WinRunner automation test scripts to perform automated tests (e.g., smoke test, functional test, regression test and performance test) for CBSA Web Applications (i.e. PAXIS, DAS, IPIL, IBQ, AMPS, KIOSK and TITAN, etc.) on Windows, Mainframe and Unix/Linux

Performed various unit tests, functional testing, regression testing, integration testing, performance testing, load testing and documentation testing for CBSA SAP & Web Applications with various database systems (i.e., DB2) and IBM MQ across Windows, Mainframe and Unix/Linux platforms

Set up, initialized and optimized WinRunner, Quick Test Pro and Quality Center test environments, and created automation test standards, user guides, criteria, projects and common function libraries

Iteratively ran load and stress testing, measuring the performance of the system environment (network, MQ, databases, server, etc.) to ensure that the system meets the performance baseline and fails/recovers gracefully

Analyzed and documented performance testing results, and reported them to the Re-Engineering team to help eliminate bottlenecks and establish a baseline for future regression testing

Logged, reported and traced PR (Problem Report) based on test results for each release testing and constantly updated test status and critical issues to TLC and release management using bug tracking tools, i.e., SMART, INFOMAN and Eclipse

Analyzed and triaged PRs to assist with troubleshooting, and submitted the quality matrix upon release sign-off

Conducted Quick Test Pro, WinRunner and Rational Functional Tester workshops regularly to enforce best practices for testing strategies, testing methodologies and testing tools to provide on time and high-quality data and software

Documented Training Guide and User Manual using MS-Office suite for new testers, conducted presentations and demos to team members, and mentored/coached and transferred knowledge to the team members to improve testing skills

Updated baseline test plans with retrofits after release sign-off, and archived current test documents, such as test plans, PR list, Test Impacts, Quality Matrix and test scripts using MS-Office suite

Software Tester 01/2005–04/2005

DHRIM 6-5 Team, Department of National Defence (DND), Ottawa, Canada

Project DentIS, DGADR DC, DHH CF Ops, CDA ILP

Responsible for developing test packages in accordance with DND’s in-house test procedures, including test strategies, test plans, test cases, test cycles, automated test scripts, test analysis and reporting, using WinRunner, LoadRunner, Test Director and Rational Robot

Facilitated testing and implementation readiness by reviewing and walking through the application systems and documentation (i.e., Project Charters, Plans, Models, Prototype, Business & System Use Cases, System Requirements Specification)

Applied iterative methodologies to develop the whole test package, including testing processes, procedures, strategies, plans and cases as well as data test beds, etc., with WinRunner, Test Director and Rational Robot

Utilized WinRunner, LoadRunner, Quick Test Pro, Test Director and Rational Robot to develop, maintain and execute test scripts for automation test, such as smoke test, functional test and regression test, on DND web applications (i.e., DentIS, DGADR DC, DHH CF Ops and CDA ILP) across Windows, Mainframe as well as Unix platforms

Attended workshops, client meetings and test brainstorm meetings, and mentored and coached team members on testing methods, best practices and tools

Collected defect information, analyzed test results and coverage to generate written reports, technical and procedural documents, such as Project Defect Report for peers, end users and QC Management

Quality Control Engineer 10/2001–12/2004

Quality Control Team, Cognos Incorporated, Ottawa, Canada

Project Baltic, Atlantic, Amazon and Adriatic (Cognos ReportNet, PowerPlay Multilingual, Upfront and PowerPlay Web Explorer)


