
Quality Assurance Microsoft Office

Bayonne, New Jersey, United States
January 04, 2018



Bayonne, NJ ***** 901.***.****


Creative and Persistent in Producing the Highest-Quality Outcomes

Recognized for thorough development, integration, and execution of strategic functional, performance, and security test plans. Design, maintain, and execute automated regression tests for new and existing products to assure integrity. Diligently resolve problems while meeting milestones and keeping projects on track. Operate remotely and independently, communicate effectively in a team environment, and readily help teams achieve their goals. Work simultaneously on multiple projects and tasks. Adapt flexibly to new procedures and technologies. Passionate about ensuring customers receive the highest-quality products.

Quality Assurance Management

Application Development

Test Progress & Results

Testing / Bug Management

Development & Maintenance Management

Business Analysis

Production Support

User Interface Management


Platforms: Microsoft Windows 98/NT/XP/8.0/8.1/10, Microsoft Windows Server 2003/2003 R2/2008/2008 R2/2012/2012 R2, Unisys A-Series, IBM iSeries

Applications: Microsoft Office XP/2003/2007/2010/2013/2016, TFS

Tools: Beyond Compare, SoapUI, TFS, HP Quality Center and UFT/QTP


FISERV, Lincoln, NE 1998 - 2017

Senior Quality Assurance Analyst (2006 – 2017)

Provided in-depth specialized knowledge while leading onshore and offshore teams testing software against defined requirements. Effectively presented project status reports to senior project management and peers. Participated and provided input in technical walkthroughs, coding, unit testing, system testing, QA, and UAT support.

Served as a consultant in refining project work-estimate and requirement processes.

Calculated work estimates and testing requirements that met project milestones and client commitments.

Set up and maintained testing environments, ensuring servers met the application's minimum system requirements and configurations mimicked customer settings.

Utilized HP Quality Center (QC) to create test plans and document results. Test plans consisted of detailed functional and nonfunctional steps with the expected result of each step; unexpected results were tracked until resolved. Each test plan was marked with the platform, browser, and device on which the test should be performed. While trained on how to create automated tests, did not have the opportunity to create automated scripts in QC.

Created automated regression test scripts in HP Unified Functional Testing (UFT) to verify that fields, buttons, and data existed on pages, to build repositories of data for verification, and to confirm that new or changed applications did not introduce issues. When an issue was found, used the script test results to help research its cause.

Collaborated with and trained staff on resolving known issues in new applications and enhancements to existing software, reducing installation and customer-support issues.

Led onshore and offshore teams for more than five years, continuously meeting and exceeding expectations through team involvement and ownership of projects.

John Layman

Ensured offshore personnel took an active role in product development; regularly monitored productivity through conference calls, email, and instant messaging, confirming they had a clear understanding of project expectations.

Assisted the documentation department in creating application help documents and procedures, aiding installations by customers or support staff, and providing user guides that helped clients learn the product in greater detail and troubleshoot common mistakes.

Developed and tested beta platforms prior to full product launch, producing trackable data confirming that sites tested all application enhancements and that the software had been thoroughly analyzed.

Worked with beta clients, implementing, training, and supporting new / updated versions of software.

Quality Assurance Analyst (2003 – 2006)

Created test plans based on the Software Development Life Cycle (SDLC). Articulated task progress to executive management and worked with all members of development and operational teams. Identified, tracked, and reported potential performance, infrastructure, and capacity issues before applications reached production. Interfaced with System Engineering and Software Development teams on system- and application-level issues.

Performed automated regression testing, maintaining application integrity.

Executed automated performance test scenarios, emulating user traffic under load.

Reduced verification time 75% by implementing a new defect-verification tool and process that remained compliant with company procedures and policies.

Digitized the process for reporting application changes and testing results, allowing products to be tracked through every stage and shortening turnaround time from development to release.

Increased integrated testing and improved the process of maintaining and upgrading bases by improving procedure documentation and developing an automated regression team to keep pace with the continuous integration (CI) and continuous delivery (CD) of products.

Upgraded and enhanced automated regression bases for new platform releases, verifying that application integrations remained compatible with new features.


FISERV, Lincoln, NE

Administration Support (1998 – 2003)


Bachelor of Science (BS), Technology Management


Professional Development

Lean Six Sigma Yellow Belt Certification, 2017
