Post Job Free

Project Test

Location:
Raleigh, NC
Posted:
August 25, 2016

Contact this candidate

Resume:

Augusta Davis Hayward

**** ***** ***** ***

Raleigh, North Carolina 27609

Cell 919-***-****

Email to: ***********@*****.***

QA Project Management

Solutions-oriented Information Technology Manager with notable success directing a broad range of IT initiatives, including multi-million-dollar project and vendor management services.

Track record of increasing responsibility in Test Management, Systems Analysis/Development, test management vendor services, and full lifecycle Project Management.

Demonstrated capacity to create and implement test processes and methodologies that promote industry best practices in support of current business processes and the organization.

Hands-on experience leading concurrent stages of system development efforts, including requirements definition, design, testing, and support.

Strong leadership abilities: able to coordinate and direct project-based test efforts while managing, motivating, and leading project teams.

Adept at developing effective test plans, methodologies, procedures, and project documentation.

Effective communicator of strategies, methodologies, and status and progress reports to the EPMO, PMO, and other business executives through a variety of communication modes.

Core Competencies

Test Planning • Strategic Vision • Collaboration

Test Process and Procedure • Negotiating Skills • Vendor Management

Risk Assessment • Organizational Abilities • Facilitation

Decision Making • Interpersonal Skills • PowerPoint Presentations

Technical Proficiencies

Certification: Lean Six Sigma Green Belt

Platforms: SOA, UNIX (Sun Solaris, HP-UX), Sun SPARC, Windows 95/98/NT/2000/XP, OS/2 Warp, VM3270, IBM Tivoli, NetWare, DB2, SQL Server, Web Servers, MacOS.

Networking: TCP/IP, Ethernet, SNA, IPX/SPX, ISO/OSI, Token Ring, LDAP, SNMP, Frame Relay, Switches, Routers, QoS Policy, MIBs, Netscape Directory Servers, VLANs, ProComm, Browser Tools, APPC, APPN, FMI, FTP.

Tools: SharePoint, Silk Test, Test Director (Quality Center), Microsoft Project, MS Office suite, Visio, CMVC, SnagIt, Inet Tracker, PVCS, TeamTrack, Sametime meetings, wikis, online chat tools.

Testing & Software Development Methodologies:

Waterfall, SDLC, User Acceptance Testing (UAT), Business Acceptance Testing (BAT), Functional Testing, System Testing, Regression Testing, Manual Testing, Web Application Testing, Automation, Onshore/Offshore Collaboration.

Academic Experience: PMI Agile Certificate, Java Programming, Structured Data Algorithms, Project Management, SQL, HTML, NIST, ISO 27000, COBOL, RPG, Fortran, Unidata, System Design Analysis.

Professional Experience

NCDHHS

Information Technology Manager/ IT Test Manager

12/2012 – 7/2015

Responsible for oversight of the IT test infrastructure. Led test process and procedure development for software test planning and test execution, and oversaw System and User Acceptance testing activities for the NCDHHS (DMA, DMH, ORHCC, DPH) MMIS Multi-Payer System. Created and implemented test processes and methodologies that supported current business processes within the DHHS organization. Monitored concurrent stages of system test execution by the vendor and test team, including UAT, SIT, REG, FSIT, and Performance testing.

Coordinated and directed project-based test efforts, including gathering business user requirements; reviewing, approving, and tracking deliverables along the critical path impacting schedule; and making adjustments as necessary to remain within the triple constraint. Coached and mentored testers, business users, and subject matter experts in test process and execution, and supported users with process issues or test execution blocks. Collaborated with subject matter experts to define software testing requirements for change requests prior to implementation and customer sign-off on test deliverables. Managed and tracked cost estimates for testing. Reviewed and approved all recommended changes to agreed-upon processes, procedures, and functional system changes. Monitored schedules and resources by assigning and directing the daily duties of testers, vendors, and business users. Troubleshot and resolved application issues escalated from customers and other departments.

Created UAT test plans, the Defect Management Test Plan, Customer Sign-Off Plan, Communications Plan, and Test Process Procedure guides for business users, test teams, and subject matter experts. Obtained approvals from each agency on project testing scope, deliverables, release schedules, and change service requests. Entered, tracked, prioritized, and reported defects. Reported status and progress to the PMO, EPMO, and business users weekly, and daily as needed. Communicated variances to schedule and scope and resource conflicts, and offered solutions. Identified, assessed, communicated, and documented project process-related risks with mitigating solutions. Contributed to the development of project and process standards and monitored their implementation. Updated and maintained the project plan and documentation via a version control process. Managed all test assets in the Silk Test tool.

North Carolina Community College System Office

Business Technology Application Analysts/ System Tester

6/2008 – 12/2012

Promoted to: Business Technology Application Analyst II

As System Tester, responsible for testing SCRs and troubleshooting help desk tickets for the Community Colleges System Office. Entered issues and defects into the tracker database for resolution, assisted programmers in resolving coding errors and defects, and performed other duties as assigned. Mentored junior testers in unit test development and software test process execution.

Responsible for manual testing of Colleague modules and sub-modules, for example: Student (Registration, Admissions), Financials (ARCR, FA), HR, and Web Online Registration for distance learning.

TekSystems, RTP, North Carolina.

QA UAT/BAT/Performance Project Manager

8/2007-10/2007

As QA BAT/UAT/Performance Project Manager, responsible for managing and coordinating UAT, BAT, and Performance testing activities for multiple releases of an online purchasing application in an onshore/offshore environment.

Key Contributions:

In a weak-matrix environment, responsible for coordinating vendor, business, and user test activities for UAT, BAT, and Performance Testing for quarterly releases of two e-commerce web applications (Whole Offer and Whole Offer Foundation).

Defined and implemented QA strategies and procedures for pre-test activities with offshore software development teams. In collaboration with the project team, aligned timelines, roles and responsibilities, and deliverables to meet objectives. Attended daily and weekly status team meetings. Tracked, monitored, and escalated issues that could impact critical timelines. Consulted with PMs and team members on process improvements to software test planning and UAT/BAT/Performance testing. Facilitated meetings and PowerPoint presentations to communicate status and QA process to offshore teams, business team members, and the vendor. Attended spec review meetings and provided feedback on the quality of design documents. Tracked and communicated key milestones for release deliverables. Using Microsoft Project, created a project plan with tasks for both QA and business users and to track milestones and deliverables. Ensured the Performance testing strategy would verify that all Service Level Agreements had been met. Maintained lessons learned for future QA strategy development.

North Carolina Department of Health and Human Services, Raleigh, North Carolina 27604

User Acceptance Project Team Lead

10/2005- 8/2007

Promoted to UAT Team Lead to establish and manage OMMIS State User Acceptance Test phase for the NCMMIS+ Replacement System. Managed and coordinated concurrent project related State User Acceptance activities while overseeing vendor project management efforts to plan and implement User Acceptance testing via a collaborative partnership between Vendor, State, and Providers. Acted as liaison between State, Vendor and other stakeholders to define and implement State user test requirements while managing overall client/stakeholder expectations. Recruited and managed experienced test analysts and built a cohesive team to support and implement the State’s business UAT testing objectives.

Key Contributions:

Responsible for defining and implementing the UAT test phase for the State of NC and coordinating UAT pre-test activities with stakeholders.

Advocated and defined a risk-based testing approach to identify critical functionality for testing and effectively utilize limited resources within project constraints (budget, scope, and resources). Developed and implemented a risk assessment process to identify business process impacts. Facilitated quality peer review sessions to improve consensus among the teams and assure the quality of processes and deliverables.

As team lead, acted as SME in test process and methodology development for State user testing of the MMIS. Monitored vendor contract performance by establishing agreed-upon defect thresholds, acceptance criteria, and reporting requirements. Established test processes and procedures for State user teams to identify and document test scenarios. Used Microsoft Word to document test plans, test processes, and procedures, and Visio to create process flows within procedure documents. Defined a strategy and initiated the coordination of Providers to participate in State UAT. Defined software testing techniques (Exploratory, Functional) to be used for the State UAT test phase. Used Microsoft Project to create the WBS for test team resources and to track schedule and milestones.

Provided State training in industry testing practices via round-table discussions, presentations, and executive meetings to build comfort with the suggested testing approaches and to improve State awareness of Information Technology for better decision making. Consulted with and maintained communications with the OMMIS PMO.

OMMIS Program Planning: defined State requirements and Statements of Objectives for all Testing and Turnover phases. Collaborated with stakeholders on the overview of business processes and system functionality for the “Current System” and the “To Be” system functionality. Responsible for creating the Change Management Plan, Communications and Issues Management Plan, and User Acceptance Test Plan. Advocated industry best practices for quality software development to State business teams to influence design methods and testing.

OMMIS Artifacts and Review

Gathered and analyzed artifacts from subject matter experts. Identified gaps in requirements and processes, documented findings, and submitted them for State review. Reviewed and analyzed testing requirements for all test phases. Facilitated interviews with business owners to gather and define requirements for business system functionality. Analyzed source data requirements for building test scenarios for State UAT testing. Established a change history process for documentation control of updated requirements spreadsheets. Identified, monitored, and communicated risks via the Risk Register log.

Affiliated Computer Services

Lead ARCR Systems Test Analyst

12/2001 - 9/2004

Key Contributions:

Convinced management to acquire an automated testing tool to speed up the regression test effort. Established proof-of-concept sessions with competing automated-tool vendors to support the buy/no-buy decision.

Promoted to Accounts Receivable Cash Reporting (ARCR) Test Lead for the software testing effort for the Accounts Receivables Cash Reporting release for the North Carolina State Community Colleges. Managed software development and test management planning activities to test the Cash Reporting module in full system, functional, regression, integration, and automation test phases.

Responsible for creating and implementing the system test plan for the ARCR project. Managed, communicated, and enforced software testing processes between vendors and programming teams. Responsible for the acquisition of an automated testing tool and created an ATLM plan. Attended code reviews with programmers and vendor teams and followed all issues to closure.

Consulted in the implementation of the software automation tools Test Director and WinRunner. Used Test Director to create test cases and link them back to system requirements for a coverage baseline, to create test scenarios, to enter actual testing results, and to report status on test cases executed, pending, failed, etc. Used WinRunner to automate test scripts for regression testing of stabilized code. Troubleshot failed scripts and restarted test execution.

Attended weekly meetings and reported overall testing status for the project. Communicated risks and schedule slippage due to limited resources to the Project Manager. Responsible for training clients on the use of new enhancements in a classroom setting. Collaborated with the technical writer in creating and editing user procedure documents and low-level and high-level software design documents.

ACS Systems Test Analyst

Manually tested Colleague application enhancements for the State Community College System. Testing consisted of a multilevel approach to new enhancements added to the (Datatel) Colleague Student System, involving data entry, maintenance screens, batch processing, calculations, and reporting. The following test techniques were used: functional, regression, integration, system, and relational database testing of existing modules.

Performed web application testing of WebAdvisor, a Student Registration module with staff maintenance screens, by gathering test requirements from SLDC, HLDC, specifications, and user documentation to create test cases covering all data field components, data integrity, and integration between the application and the relational database. Used the Unidata query language and existing reports to verify output data.

Collaborated with programmers and customers to evaluate issues, application design, and usability concerns outside of the scope requirements, and participated in User Acceptance Testing. Logged and tracked all issues using the PVCS INET Tracker system. Created, maintained, and updated test plans, specifications, and user documentation using PVCS Version Manager for document control.

Alcatel

Software Test Engineer

9/1999-10/2001

Key Contributions:

Suggested design enhancements to the QoS application that improved recognition of network alert statuses.

Managed test administration and configuration of test environments. Identified and compiled existing project information and software components for testing. Determined high-level test requirements from System Design, Product Requirements, and Functional Specification documents, design reviews, and use cases.

Performed peer reviews of technical documentation, including design specifications, feature descriptions, and customer documentation. Created comprehensive test plans with test models, including hardware/lab requirements, test equipment requirements, and network distributed subsystems. Created risk analyses to assure adequate test coverage of high-risk areas.

Planned and executed system, functional, regression, and GUI testing of Network Management Systems applications to validate usability and user requirements. Backend testing included Directory Server databases, Switch Client software (Quality of Service and Policy-Based Management), schema checking, and the Internet.

Executed test strategies using manual and automated testing methodologies. Configured switches and routers using Ethernet, Frame Relay, and TCP/IP to validate QoS and HRE-X actions and conditions.

Installed and configured Directory Servers to accept client data from the target of test. Configured the LDAP client on the switch. Used browser tools to validate data in directory schemas.

Suggested enhancements that were incorporated into the application to improve its usability. Used Scopus for defect tracking and resolution while providing test results back to management. Served as liaison between the remote SQA team and developers during Quality Assurance testing, providing application updates and test configuration requirements.

Created and designed defect reports and test cases using Test Director, Microsoft Word, PowerPoint, Excel, and Access.

International Business Machines

Software Testing Consultant

5/1995 - 8/1998

Provided testing support for components of MVS/ESA S/390 NetView under Tivoli Management Environment. Executed test scripts for stress and regression testing of application menu items. Reviewed, validated, and retested code fixes using CMVC. Maintained test case databases on internal web sites (Lotus Notes).

Provided manual testing support for IBM’s 3270 Emulation Communications product, with some exposure to QA Partner for automating test cases. Performed interactive testing on the emulation product to establish connectivity to an AS/400 mainframe using various connections. Connectivity testing consisted of configuring interfaces and attachments for LAN, SDLC, TCP/IP, Twinax, IPX/SPX (Microsoft and Novell), COM port, SNA/IP, and WAC Modem. Link configuration consisted of 802, TCP/IP, APPC, APPN, FMI, and SNA networks, and NWSAA.

Validated functionality of the emulation product after connections were established. Performed load balancing and failover testing on Novell NetWare servers. Used CMVC and a Lotus Notes database to document and track defects. Attended status reviews.

End User Support Specialist, 2nd-Level Support / Contractor for IBM

Analyzed and resolved hardware and software failures. Provided second-level internal on-site support of IBM shelf products for users. Set up workstations and devices, and backed up and restored systems. Installed operating systems, applications, and network configurations per user requirements.

PC Support Specialist for IBM, Charlotte NC

Ghosted users’ current systems to save data and upgrade operating systems. Implemented system migration of down-level operating systems. Upgraded hardware on PCs to prepare for network connectivity. Installed operating systems, applications, printers, and network configurations per user requirements.

Additional Work Experience

Weck, RTP NC, Computer Operator, 1/1995 – 3/1995

Nortel, RTP NC, JIT Documentation Operator, 6/1994 – 12/1994

Memorex Telex, Raleigh NC, Computer Operator, 2/1993 – 1/1994

Wake Technical Community College, Raleigh NC, Computer Lab Asst, 6/1991 – 9/1991

Hillhaven Hospital, Raleigh NC, Ward Clerk, 5/1991 – 12/1995

Holly Hill Hospital, Raleigh NC, Unit Secretary, 3/1985 – 6/1989

Women’s Center, Raleigh NC, Surgical Scrub Tech, 2/1987 – 7/1987

Education and Credentials

Regent University, 8/2012 – 5/2016: BSIT, Information Security. Graduated 5/2016.

Regent University, 8/2016 – 5/2017: MA, Business and Design Management.

Professional Training and Certificates

Western Wake Community College, Cary NC.

Lean Six Sigma Green Belt Certification, 1/2016 – 6/2016

Project Management Exam Certificate, 36 hrs, completed 3/14 – 3/17/08

Project Management Training, NCCCSO, 36 hrs, completed 12/01/08

International Institute of Software Testing (IIST), 2006

Testing in a Rapid Application Development Environment, IIST, 2007

Evaluating Business Requirements, IIST, 2007

Principles of Software Automation, IIST, 2007

Risk-Based Testing, IIST, 2007

Microsoft Project, Global Knowledge, 2006

Understanding Networking Fundamentals, Global Knowledge, June 2, 2000

Principles of Functional Testing, Rational University, March 28, 2000

Smart Window, Smart Flow, and SAI, Spirent Communications, November 2, 2000

Switch Expert Training, Alcatel IND Division, October 17, 1999

Internet Security Seminar, Alcatel IND Division, September 21, 1999

Accounts Receivables Cash Reporting Training, Datatel, August 2003

Accounting Principles, Durham Technical Community College, August 2003

Professional Affiliations

Member- PMI

Student Member - Institute of Electrical and Electronics Engineers (IEEE)

ISACA

NCACPA

Professional references available upon request.


