
BA and QA Analyst - 22 Years' Experience

Sterling Heights, Michigan, United States
February 15, 2018



Professional Summary

I am an Information Technology (IT) professional with over 22 years of industry experience. I have performed numerous roles, including Software Quality Assurance Analyst and Team Lead, Technical Business Analyst, Change Control Analyst, and System Analyst. I have been responsible for data mining, business requirements gathering, QA testing, managing and developing QA strategies, introducing and managing change control processes, evaluating and documenting inconsistencies with clients, and making improvement recommendations. I have excellent communication and interpersonal skills and work well in a team environment or independently.

Technical Skills

SQL Developer, TeamViewer, Audatex, SharePoint, CarsAdmin, JIRA, Confluence, HP Quality Center (ALM), MYCO Administration, Microsoft Office Suite (including Visio, Excel, Project, PowerPoint, Outlook, and Access), Paint Shop Pro, Remote Backup, Adobe Reader & Writer, Adobe Eforms, Dashboards, DOS, Novell, Windows, UNIX

Charles Proxy, Omnibug, Omniture tags, C, C++, Visual Basic, HTML, Java

Lotus Notes, Test Complete, QA Complete, Bugzilla, Open Text’s Process Administrator and Forms, Microtech’s replication software and hardware

Basic Network knowledge: Firewall, Switches, DNS, Routers, Domain Swaps

Citrix virtual environment

Professional Experience

Alliance Inspection Management (AIM) 8/2016 – 7/2017

Technical Business Analyst

A multi-role position in which I worked as a Business Analyst, QA Analyst, and Data Analyst, using my skills to assist my Product Owner and my team in any way I could.

Created queries in SQL Developer to analyze data, confirming that inspection data was correct or identifying records that needed correction.

Used SQL Developer against several database schemas, along with Audatex, OEM automotive sites, and data input from inspection partners, to update pricing and part information for AIM’s inspection partners. In one project, updated more than 550,000 records.

Wrote technical tickets and documentation in JIRA so that administrators could fix production issues.

Administered daily checks and balances to confirm production applications were performing correctly. When necessary, dove deep into issues found in production, initiated a JIRA ticket explaining the issue and a possible fix, and, once the issue was fixed, reviewed, tested, and adjusted documentation as required.

Used data mining techniques and SQL queries to confirm data for automotive partners.

Produced release notes for inspection software upgrades; inspectors used the release notes when performing vehicle inspections.

When needed, performed QA activities: creating and running test cases, creating testable data, and approving JIRA tickets to QA and to Production.

Supported inspectors when they had issues with AIM’s mobile inspection application, using TeamViewer to remote into inspectors’ laptops to replace the database or adjust other settings within the application or database.

Met with inspection partners to discuss new functionality they would like to see in the inspection application, incorporated what we learned into the requirements documentation, and wrote technical requirement tickets so the developers could make the necessary upgrades. Used a hybrid Agile process for writing and executing test cases so the new functionality would be ready for production.

Gathered and shredded Personally Identifiable Information (PII) mail per AIM’s PII processes and procedures.

VectorForm 3/2016 – 5/2016

QA Technology Team Lead

Executed tests across browsers (IE, Firefox, Chrome, Safari) and mobile applications on Android and iOS, including mobile devices and tablets.

As QA Technology Lead, was responsible for all aspects of testing: test plans, test cases, execution metrics, data setup, environment setup, and the requirements traceability matrix (RTM). Assigned workload to other analysts, helped them when needed, and managed the execution schedule among analysts.

Worked with Account Managers on the best way for clients to run their UAT cycles, which included listening to the clients and finding out what their expectations for UAT were. Designed and reviewed UAT test scenarios with clients.

Evaluated and then documented inconsistencies in the client’s narrative and project workflows. Discussed these inconsistencies with the client, allowing improvements to the requirements and the development of a better product.

Reviewed outstanding issues with the business team to clarify business rules and requirements.

Analyzed and created test cases using system and user acceptance requirements, along with copy and comps, for an automotive website. This helped ensure better test results and, therefore, a better product.

Reviewed and suggested several different QA test tools for the team to use.

Ally 8/2012 – 3/2016

Senior User Acceptance Test Analyst- QA Team Lead

Served as Project QA Team Lead on the SmartAuction auction application, responsible for all aspects of testing: test plans, test cases, execution metrics, data setup, environment setup, and the requirements traceability matrix (RTM). Assigned workload to other analysts, helped them when needed, and managed the execution schedule among analysts.

Executed tests across browsers (IE, Firefox, Chrome) and mobile applications on Android and iOS, including mobile devices, tablets, and the Perfecto emulator tool.

Responsible for all defects: ensured all defects were reported correctly, attended triage meetings to discuss open defects, and created defect reports for management.

Helped facilitate the new processes and procedures for reviewing and approving test cases through Quality Center (ALM).

Trained others in the daily procedures for writing defects, creating user IDs, and using the technical tools needed in testing.

Established the testing processes and procedures for testing Omniture tags, using Omnibug for web applications and Charles Proxy for mobile applications.

Worked with Citrix admins to get the team access to desktops with multiple different browser types, facilitating browser compatibility testing.

Reported on and managed downtime reports for the UAT environment.

Successfully tested a legacy application that had no documentation by reverse engineering the system and writing test cases based on that analysis.

Worked with my manager to help set the processes and procedures for creating all test cases in Quality Center.

Reviewed outstanding issues with business team to clarify business rules and requirements.

Developed "smoke test" procedures for the test environment.

Used Quality Center to create test cases, execute test runs, and create and manage defects.

Analyzed and created test cases using system and user acceptance requirements for SmartAuction, the DPSR project, and others. This helped ensure better test results and, therefore, a better product.

Ford Direct 12/2011 – 8/2012

Quality Assurance Lead Analyst / Business System Analyst

Used JIRA’s bug-tracking system to create, manage, and report on all defects related to dealership website testing. Captured, reviewed, and responded to dealer functional requests.

Responsible for all aspects of quality assurance for Ford dealers’ 3,500 websites, including functional testing, user acceptance testing, Akamai testing, and system integration testing in an Agile environment. Created quality assurance test cases and test lists to ensure the quality of the dealers’ websites.

Preserved, generated, and tested dealership domain names and sub-domains, and handled other domain management tasks.

Administered the Lead Processing System (LPS), testing all types of leads to make sure they were precise and accurate. Developed several types of lead reports for Ford Direct and other software vendors.

Designed and developed several forms using Microsoft Word and InfoPath; these forms helped speed up the process and created a standard for others to follow.

Mentored others in quality assurance methodologies, establishing standards and reducing the chance of items being missed during testing.

Monitored and reported on data feeds into the dealer websites, ensuring accurate data and pricing for each of the 3,500 dealer websites.

Reviewed and explained dealers’ functional requests with the development staff. Supervised and supported the Software Development Life Cycle on all dealership-related projects.

Performed MYCO updates for all new vehicles listed on the dealership websites, including all images, vehicle options, exterior and interior colors, vehicle specs, and pricing.

Responsible for testing GES Online, a service appointment system that allows dealership customers to schedule service appointments.

Performed iPad and iPhone dealer website testing.

Strategic Business Partners 08/2010 – 11/2011

Senior Software Quality Assurance Analyst / System Change Control Manager

Served as QA Test Lead on a procurement forms and workflow project for a large government entity. This included system integration testing, functional testing, data warehouse testing, requirements testing, and user acceptance testing, as well as the creation of defects in QA Complete. Supported the procurement workflow once it went into production.

Researched several automated testing tools: created a research matrix of automation testing tools, evaluated software demos, and created reports that helped identify the best automation tool for Strategic Business Partners.

Put Test Complete into operation and used it to create several automated tests, including load tests on our workflows, forms, and websites.

Implemented the use of QA Complete and developed the current standard for how test libraries and test cases are entered. The tool managed not only test cases but also defects, requirements, and automated tests.

Evaluated and then documented inconsistencies in the client’s narrative and project workflows. Discussed these inconsistencies with the client, allowing improvements to the requirements and the development of a better product.

Encouraged others on the QA team to use consistent processes and procedures.

Researched and developed test data that mimicked the client’s data warehouse. Used SQL queries to maintain the consistency of the test data warehouse so that all tests ran against the same data.

Served as QA Lead on 15 different projects and created over 500 test cases.

Managed and developed User Acceptance Testing strategies that demonstrated to the customer how the application works and gathered constructive criticism in a beneficial manner.

Introduced and managed the Change Control process. Responsible for the policies and procedures, the creation and documentation of forms, and training individuals in how to use the Change Control process.

Flagstar Bank 04/2001 – 03/2010

Change Control Analyst / System Analyst

Established, designed, and implemented a Change Control form in Adobe Eforms, which enabled a workflow in which multiple individuals could review the form at the same time.

In an IT department of 300 professionals, reviewed and discussed 3,000 or more changes per year, covering all types of operating systems, internal and external applications, networking, and security administration.

Instrumental in getting change control and incident management adopted within IT. Trained management and discussed potential issues with them, mediated between different IT teams when necessary, and ran SQL queries against databases to produce weekly and monthly statistical reports. Created processes and procedures for IT and audit.

Crucial in maintaining 99.999% uptime for all production systems. Worked with IT teams, project managers, and their managers to find appropriate times to make changes to production.

Ensured Change Control and Incident Management met the requirements of SOX and OTS audits. Produced and analyzed reports, documented updates to processes and procedures, and helped others with their SOX and OTS items.

Worked with the Applications Development Teams to determine the best route in which to test products before implementing a change into the production environment.

Supervised and analyzed data exports valued in excess of $10 billion.

For each version of the Change Control system, was responsible for functional, user acceptance, and regression testing of the system. Managed the beta test teams and reviewed all defects before releasing them to the applications development team.

In a disaster recovery scenario, saved Flagstar Bank from losing billions of dollars by orchestrating the delivery of diesel fuel. Without it, the computer rooms would have had to shut down, and all mortgage business would have stopped.

Reviewed, selected, and then supported two different Microtech Replicator systems for company media needs. These devices allowed us to produce hundreds to thousands of different labeled CDs and DVDs. Supporting these systems included updating software, fixing hardware issues, and purchasing CDs, DVDs, and cases.

Responsible for creating and revising maps of Flagstar Bank’s computer rooms using Microsoft Visio. The maps gave IT management a way to plan where server racks and servers would go.

Created and designed media labels using Paint Shop Pro for promotional items, motivational items, data exports, and for sending audits to SOX and OTS.

Administered Remote Backup software used to back up depositories and other branches around the country.

Health-Media Inc. 09/2000 – 02/2001

Quality Assurance Analyst

Designed and implemented the format in which test plans and test cases were created.

Reviewed and employed new procedures for automated test scripts.

Evaluated and interviewed potential QA testers with management. Created daily and weekly test loads for testers. Ensured the application was tested in a timely manner and that the project met its timeline goals.

Produced and executed test plans for internal and external websites, hardware, operating systems, printers and CPUs, in-house and vendor software applications, and firewall accessibility.

McKesson Pharmacy Systems 07/1995 – 09/2000

Quality Assurance Analyst

Served as QA Analyst responsible for completing all testing on time. This included automated testing, white- and black-box testing, unit testing, incremental and full regression testing, system testing, smoke testing, acceptance testing, load testing, beta testing, install/uninstall testing, compatibility testing, and requirements testing.

Tested internal and external applications, hardware, in-house software, vendor software applications, websites, and printer functionality.

Educated and trained customer support on any new software applications. Customer support was then able to handle any concerns coming from clients.

Supervised technical writers on manuals and supporting documentation provided to customer support. Gave the technical writers the necessary direction, information, and format needed to complete the documentation, ensuring the information within it was correct and precise.

Created test plans, wrote documentation for customers, testers and for the helpdesk support, and reviewed test plans created by other QA analysts.


Education

Associate’s in General Studies (with a focus in Computer Science)

Oakland Community College


Professional Development

Delivering Outstanding Customer Service, Oakland University, MI
