Brandon O’Brien
Senior Performance Engineer
Contact
Atlanta, GA 30316
ad30mc@r.postjobfree.com
Profile
A disciplined performance engineer whose extensive experience working with stakeholders and business owners has honed his talents in ensuring that scheduling, testing, and reporting are accurate and on point. Developed management delivery schedules covering system management and preparation of performance test executions; provided assistance, reporting, troubleshooting, and operational support; and assumed responsibility for developing performance test strategies, performance testing methodologies, tool infrastructure, and processes, executing projects efficiently against a defined set of quality standards and measurement metrics.
Key Skills
LoadRunner
JMeter
SQL
SharePoint Development
JavaScript
PHP
SAFe Agile
C#
HP LoadRunner/VuGen Scripting
DevTest Scripting
Data Mining
UNIX
Cloud Computing
HTML Development
CI/CD Methodologies
Experience
June 2021 – February 2024
Lead Performance Engineer • General Motors • Roswell, Georgia
As a Performance Engineering lead, my responsibility was to lead a small team focused on improving customer-facing applications. I worked directly with customers, from group managers to the CIO level, assisting them in improving their IT strategies. I operated mainly in the e-commerce and CRM application spaces.
Working as an individual contributor, responsible for defect detection, code optimization and performance testing.
Lead CRM domain which involved customer/dealer transactions and record storage.
Created and implemented comprehensive reports for the business and stakeholders.
Troubleshot issues and performed application tuning.
Tested Azure cloud applications.
Executed tests using the LoadRunner Enterprise (LRE) suite.
Completed SAFe Agile certification.
Executed tests involving API and HTML requirements
Analysis of response time and scalability requirements
Creation of work-load characterization
Planning and execution of performance tests prior to production release
Development of test scenarios, test data, scripts, and exit criteria
Analysis of test results to expose potential performance issues
Definition of performance monitoring and alerting requirements
Set-up and customization of application performance monitoring tools
Proactive monitoring of enterprise applications in production
Activation of alerts when key performance thresholds have been exceeded
Custom-developed monitoring solutions to fill gaps in industry toolsets
Conducted extensive load simulations of high-volume user traffic against HTTPS APIs in JMeter, rapidly scaling tests to obtain precise measurements of individual API load times.
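The JMeter-style load simulation described above can be sketched in Python. This is an illustrative stand-in, not the actual test plan: a minimal concurrent harness that fires a batch of calls from simulated virtual users and summarizes per-call latency, the same measurement JMeter reports per API sampler.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def timed_call(api_fn):
    """Invoke one API call and return its elapsed time in seconds."""
    start = time.perf_counter()
    api_fn()
    return time.perf_counter() - start

def load_test(api_fn, users=20, iterations=5):
    """Simulate `users` concurrent virtual users, each making
    `iterations` calls, and summarize individual call latencies."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(timed_call, api_fn)
                   for _ in range(users * iterations)]
        latencies = [f.result() for f in futures]
    return {
        "samples": len(latencies),
        "avg_s": statistics.mean(latencies),
        "p95_s": sorted(latencies)[int(0.95 * len(latencies)) - 1],
    }

if __name__ == "__main__":
    # Stand-in for an HTTPS API call (hypothetical endpoint).
    fake_api = lambda: time.sleep(0.01)
    print(load_test(fake_api, users=10, iterations=3))
```

In a real run, `api_fn` would issue an HTTPS request to the system under test; scaling the load is just a matter of raising `users`, analogous to increasing a JMeter thread group.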
October 2018 - June 2021
Performance Engineer II • InComm • Alpharetta, Georgia
I executed performance test scripts for gift card applications used by multiple Fortune 500 companies, ensuring SLA requirements were met for our customers while using multiple cloud technologies.
Monitored CPU, memory, and network utilization on UNIX servers using SiteScope monitors.
Participated in walkthroughs and meetings with the performance team to discuss related issues.
Performed testing in a step-wise manner using Performance Center.
Used ClearCase for version control; each version of the application was stored in ClearCase, where necessary modifications, updates, and analysis were performed.
Extensively used correlation, parameterization, goal-oriented planning, and manual scenarios while creating multiple Vusers in test scripts for testing the system.
Included rendezvous points in Vuser scripts generated by the Virtual User Generator (VuGen)
Executed scripts using DevTest
Monitored transactions and debugged issues using Dynatrace
Mentored test analysts on testing processes, methodologies, and application knowledge
Led coordination and execution of release performance testing
Determined and met time estimates and schedules for performance testing efforts
Created performance test plans and approach documents for various releases across multiple projects
Developed, updated and maintained quality performance testing standards and procedures
Created test models for release performance testing which included test plan documents, test approach documents and test cases and expected results
Assisted in the planning, creation and control of the test environments for both UNIX and Windows platforms
Communicated effectively with customers and software vendors as appropriate
Identified, gathered and created test data
Defined and utilized entry and exit criteria as it related to performance expectations
Participated in lessons learned as appropriate
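Two of the LoadRunner techniques named above, parameterization and correlation, can be illustrated with a short Python sketch. The data and names here are hypothetical: each virtual-user iteration draws the next row from a test-data pool (parameterization), and a dynamic value captured from one response is reused in the next request (correlation).

```python
import itertools
import re

# Parameterization: each Vuser iteration draws the next test-data row,
# like a LoadRunner parameter file cycling through unique card numbers.
card_numbers = itertools.cycle(["4111-0001", "4111-0002", "4111-0003"])

def next_param():
    return next(card_numbers)

# Correlation: capture a dynamic server value from one response so it
# can be injected into the following request (here, a fake token).
def correlate(response_body, pattern=r'token="(\w+)"'):
    match = re.search(pattern, response_body)
    return match.group(1) if match else None

def build_followup_request(card, token):
    return {"card": card, "session": token}

fake_response = 'HTTP/1.1 200 OK token="abc123"'
token = correlate(fake_response)
request = build_followup_request(next_param(), token)
# `request` now carries both the parameterized card number
# and the correlated session token.
```

In VuGen terms, `correlate` plays the role of a `web_reg_save_param`-style capture, and the cycling data pool mirrors a parameter file set to "Unique" per iteration.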
June 2014 - October 2018
Performance Tester • General Motors • Roswell, Georgia
I designed and executed test scripts for web applications used in-house and by General Motors customers. I ensured our applications could support the number of users specified by our developers and business users.
Spoke with business owners and project managers to understand project scope
Ran virtual users using HP LoadRunner
Designed and executed test scripts
Advised corrections to the system when bottlenecks, data leaks, or system failures occurred
Wrote scripts for web-based applications
Created tools that helped increase the efficiency of my team (pacing calculator and new-hire website portal)
Built a managerial project-projection tool using SiteScope
Met with client groups to determine performance requirements, goals and test strategies based on system requirements and architecture
Wrote quality test plans and communicated information to the project team
Implemented performance test strategies using Mercury Interactive's LoadRunner
Performed automated load and stress tests to identify and isolate system performance bottlenecks and track performance metrics, such as CPU, Disk and Memory utilization
Reviewed computer logs and reports to identify program processing errors and possible improvements
Used performance metrics and average transaction timings to determine system scalability and establish benchmarks to compare against future releases
Prepared reports for end users, which contain graphs and charts that depict system improvements versus system degradation issues
Logged, tracked and retested software defects that are inconsistent with user requirements
Assisted clients during User Acceptance Testing
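The "pacing calculator" mentioned above likely implements the standard load-testing pacing formula; the exact inputs are assumptions, but the core arithmetic is: pacing interval = (virtual users × 3600) / target transactions per hour, with response and think time subtracted to get the remaining delay per iteration.

```python
def pacing_interval(vusers, target_tph):
    """Seconds between iteration starts per Vuser needed to hit
    target_tph transactions per hour with `vusers` concurrent users."""
    if target_tph <= 0:
        raise ValueError("target_tph must be positive")
    return vusers * 3600.0 / target_tph

def remaining_delay(vusers, target_tph, avg_response_s, think_s=0.0):
    """Extra wait after accounting for response and think time
    (clamped at zero when the script already runs slower than pacing)."""
    return max(0.0, pacing_interval(vusers, target_tph) - avg_response_s - think_s)

# Example: 50 Vusers targeting 6,000 transactions/hour means each
# Vuser starts an iteration every 50 * 3600 / 6000 = 30 seconds.
```

With a 5-second average response time and 10 seconds of think time, the script would pause a further 15 seconds per iteration to hold the target rate.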
Education
May 2014
Clark Atlanta University
Computer Information Science