Benjamin Mao
Issaquah, WA *****
******@*****.***
A high-performing, driven, hands-on, and customer-focused performance test expert with a proven track record of increasing responsibility in performance testing and improvement of web applications. Success stories in performance assessment and improvement come from close collaboration with business, architecture, dev, and DevOps teams on planning, execution, analysis, and tuning to deliver high-performing web applications to real users and third-party partners.
• With the goal of the best user experience, drives application performance across systems, modules, and user experience through a deep understanding of reliability, performance, and scalability
• Promotes an accurate performance testing approach that correctly simulates real user access patterns and workload models in scaled test environments. Manages multiple performance project tracks closely with SRE and project teams
• Hands-on experience developing load test scripts and determining realistic scenarios to match real user access patterns and workload models in test environments. Drives performance CI across projects for high-quality app releases in agile cycles
• Manages and mentors onshore and offshore test engineers. Discovers and calibrates accurate performance workload models on an ongoing basis with business stakeholders
• Implemented and administered Dynatrace for enterprise web app performance monitoring and improvement
• Enthusiastic, motivated, analytical, and results-oriented team player
Authorized to work in the US for any employer
Work Experience
Lead Performance Engineer
REI - Kent, WA
October 2018 to Present
Leading REI.com performance testing efforts for all critical projects, ensuring high reliability, performance, and scalability in cloud environments. Promoting a comprehensive performance testing framework to platform and project teams. Integrating backend performance testing with frontend end-user experience assessment for project teams. Engaging project teams with a shift-left mindset.
Key Achievements:
• Identifying real user access patterns at peak times on an ongoing basis for accurate performance testing. Applied R statistical analysis to workload discovery and calibration to ensure realistic production workloads are simulated in the test environment
• Hands-on experience developing load test scripts in JMeter/StormRunner and synthetic end-user scripts in JavaScript. Demonstrated the benefits of a comprehensive performance testing methodology to project teams from both backend and frontend perspectives.
• Driving performance test automation with project teams to ensure high-performing app releases to production.
• Monitoring app performance in prod and stage environments. Profiling backend and frontend bottlenecks to provide root cause analysis of performance issues.
Technologies: JMeter, StormRunner, Taurus, New Relic, AWS CloudWatch, JavaScript, Kibana, Grafana, Jenkins, R, Dynatrace, Lighthouse, Puppeteer
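The workload discovery described above can be illustrated with a minimal sketch (shown here in Python rather than R; the transaction names and log format are hypothetical, not from the actual REI analysis): counting logged requests per transaction type yields the traffic mix that load scripts should reproduce.

```python
# Hypothetical sketch: derive a transaction mix (workload model) from a
# sample of access-log entries so load-test scenarios can be weighted to
# match real peak-hour traffic. Transaction names are illustrative only.
from collections import Counter

def workload_mix(requests):
    """Return each transaction's share of total observed traffic."""
    counts = Counter(requests)
    total = sum(counts.values())
    return {txn: n / total for txn, n in counts.items()}

# A toy log sample: 70% browse, 20% search, 10% checkout.
sample = ["browse"] * 70 + ["search"] * 20 + ["checkout"] * 10
mix = workload_mix(sample)
print(mix["browse"])  # 0.7
```

In practice the input would come from parsed production access logs filtered to peak windows, and the resulting fractions would be recalibrated on an ongoing basis as traffic shifts.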
Test Architect
McGraw-Hill Education - Seattle, WA
September 2017 to October 2018
Gathered and analyzed NFRs on web app performance along with accurate workload modeling. Provided accurate performance testing results to MHE project teams. Developed a test framework combining app functional regression testing and page usability time monitoring in Prod and PQA. Managed and mentored onsite and offsite performance team members. Led test automation solutions and implementations within the CI/CD pipeline. Provided in-depth frontend performance analysis via self-developed Performance Assessment Services on AWS and Catchpoint.
Key Achievements:
• Provided guidelines and best practices on accurate performance assessment to management and project teams. Led test automation solutions with CircleCI for the CI/CD pipeline
• Managed and mentored offshore performance teams to meet accurate performance testing requirements and deadlines
• Designed probability functions in LoadRunner and SOASTA to implement accurate workload models in performance testing. The test results provided far more valuable performance assessment feedback to project teams
• Developed a headless Chrome performance diagnosis solution in prod and perf environments to detect potential performance issues proactively
Technologies: StormRunner, HPE ALM, New Relic, JavaScript, Puppeteer, Jenkins, Sumo Logic, Catchpoint, R
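The "probability functions" mentioned above can be sketched as follows (a minimal Python illustration, not the actual LoadRunner/SOASTA script logic; scenario names and weights are assumptions): each virtual user draws a uniform random number and maps it onto the cumulative workload model, so scenario frequencies in the test match the calibrated mix.

```python
# Hypothetical sketch of a probability function for scenario selection:
# map a uniform draw u in [0, 1) onto a cumulative workload model so
# each scenario runs at its calibrated share of traffic.
import random

MODEL = [("browse", 0.70), ("search", 0.20), ("checkout", 0.10)]

def pick_scenario(u):
    """Select a scenario by inverting the cumulative distribution."""
    cum = 0.0
    for name, weight in MODEL:
        cum += weight
        if u < cum:
            return name
    return MODEL[-1][0]  # guard against floating-point round-off

print(pick_scenario(random.random()))
```

The same inverse-CDF idea works inside any load tool that exposes a random-number call; only the weights need recalibrating as production traffic changes.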
Performance Test Architect
Nike, Inc.
July 2016 to August 2017
Ensured optimal end-user experience by validating performance, scalability, and reliability of Nike Digital Commerce. Conducted performance assessments and advised on improving front-end user experience across global markets. Developed automated tests, continually improving performance test automation practices. Reviewed and analyzed business requirements and use cases. Collaborated with cross-functional teams to ensure accurate performance testing practices. Provided hands-on implementation of the continuous integration (CI) process. Developed performance trend reports for executive, architecture, and DevOps teams outlining performance, revenue, user access, and system resource usage.
Key Achievements:
• Improved performance maturity model for Nike Digital, ensuring performance, scalability, and reliability of apps.
• Consolidated performance metrics from separate live data sources by architecting Data Rover bridge application.
• Migrated R-based performance analytical services server onto AWS as a cloud solution for project teams.
Technologies: Splunk, New Relic, Catchpoint, mPulse, Groovy, Java, SQL Developer, Elasticsearch, Jenkins, Dynatrace, R, Selenium
Performance Team Manager
Nike Inc - Beaverton, OR
July 2015 to June 2016
Managed the onshore/offshore Nike brand performance team, including training, employee development, and performance evaluations. Provided guidelines to agencies and development teams on application performance assessment best practices. Coordinated resources, priorities, and tasks. Provided hands-on load test script development for time-critical performance needs. Monitored web app performance trends to calibrate performance testing workloads. Conducted root cause analysis of performance issues. Provided performance trend and capacity planning analysis to upper management, architecture, and DevOps support teams.
Key Achievements:
• Laid out the accurate performance test strategy for project, QA, and vendors teams.
• Enhanced web-based performance analysis on CentOS VM, enabling effective performance assessments.
• Presented research paper titled “Using R to Discover True Web System Performance” at CMG 2015.
Technologies: Jira, Confluence, LoadRunner, Gatling, Splunk, New Relic, Jenkins, SQL Developer, R
Performance Test Architect
ulta.com - Bolingbrook, IL
September 2014 to June 2015
Managed onshore/offshore performance teams and mentored performance engineers on accurate performance assessment approaches. Analyzed peak real-user access, server memory/thread usage, and end-to-end performance for accurate performance modeling. Provided root cause analysis of web app and system performance issues in production and staging. Administered performance test servers and Dynatrace APM tool servers. Managed APM tools, licenses, patches, and updates.
Key Achievements:
• Reestablished performance best practices for front- and back-end web apps.
• Directed performance test strategy for website performance improvement, from architecture to code release.
• Presented research paper titled “Calibrate Workload Model for Accurate Performance Testing” at CMG 2014.
Technologies: LoadRunner, SOASTA CloudTest, Dynatrace, Jenkins, Sumo Logic, Gatling, Java, Groovy, Eclipse, SQL Developer, YourKit, R
Senior Performance Engineer
ticketmaster.com - Rolling Meadows, IL
July 2012 to August 2014
Led performance engineering efforts to improve the end-user experience. Validated architecture and enforced best practices for web application redesigns and enhancements, using continuous performance testing and code profiling. Reproduced subtle concurrency errors in the load test environment with accurate performance test modeling. Presented tool usage tips and updates to development, QA, and performance teams. Diagnosed web app and system performance bottlenecks using open source and commercial toolsets.
Key Achievements:
• Improved synthetic workload accuracy of performance tests by up to 80%.
• Developed web usage pattern auto-recognition programs to analyze real user access patterns.
Technologies: LoadRunner, JMeter, Java, Groovy, Eclipse, Dynatrace, Splunk, SQL Developer, Jenkins, YourKit, R, QTP
Performance Engineer
E-Commerce Technology, Follett Higher Education Group - Westmont, IL
June 2008 to July 2012
Directed the performance testing lifecycle alongside the project software development lifecycle (SDLC). Managed onshore and vendor performance teams based on project needs. Performed detailed statistical and analytical calculations on workload models. Correlated performance data with server, network, and database utilization to detect bottlenecks. Provided root cause analysis of performance issues to development and operations teams.
Key Achievements:
• Designed random-walk selection LoadRunner scripts to simulate real user access more accurately and reduce test data needs by 80%
• Developed set of C APIs in LoadRunner C for reusability, maintainability, and extensibility of load test scripts.
• Created and maintained functional/regression test automation scripts.
Technologies: LoadRunner, SiteScope, Java, Groovy, JMeter, JUnit, JProfiler, JConsole, Jenkins, FindBugs, PMD, CheckStyle
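The random-walk selection idea above can be sketched as follows (a minimal Python illustration, not the original LoadRunner C code; page names and transition probabilities are invented for the example): instead of replaying fixed recorded paths, each virtual user walks a transition matrix estimated from real navigation logs, which both diversifies access patterns and shrinks the test data sets needed.

```python
# Hypothetical sketch of random-walk page selection for load testing:
# each virtual user navigates according to a transition matrix rather
# than a fixed script path. Pages and probabilities are illustrative.
import random

TRANSITIONS = {
    "home":     [("category", 0.6), ("search", 0.4)],
    "category": [("product", 0.7), ("home", 0.3)],
    "search":   [("product", 0.8), ("home", 0.2)],
    "product":  [("cart", 0.3), ("category", 0.7)],
    "cart":     [("home", 1.0)],
}

def random_walk(start, steps, rng=random.random):
    """Generate one user's navigation path by walking the matrix."""
    path = [start]
    for _ in range(steps):
        u, cum = rng(), 0.0
        for page, p in TRANSITIONS[path[-1]]:
            cum += p
            if u < cum:
                path.append(page)
                break
        else:  # floating-point round-off fallback: take the last option
            path.append(TRANSITIONS[path[-1]][-1][0])
    return path

print(random_walk("home", 5))
```

In a real script the transition weights would be fitted from production clickstream data, and each visited page would trigger the corresponding HTTP request rather than just appending to a list.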
Systems Engineer III
Enterprise Solution Delivery, Sears Holdings - Hoffman Estates, IL
April 2000 to May 2008
Led middleware application development in OOD/OOP to handle sears.com online transactions efficiently. Analyzed and corrected performance issues in middleware applications. Developed load test models from user access patterns. Detected memory leaks; set up and managed web monitoring. Administered the Silk Performer license server. Managed tool server patches and upgrades. Mentored performance testers on performance test scripting skills.
Key Achievements:
• Boosted online backend system performance by 35% using code improvement and performance testing.
• Improved performance testing efficiency via performance testing process automation.
• Instrumental in effectively processing sales transactions, which grew from $200M to $1B in five years.
Technologies: AIX C++, Standard Template Library, Java, IBM WebSphere, Informix SQL, Rational Purify, LoadRunner, CA Introscope
Additional experience as Senior Consultant at Meijer Inc. and as DBA at Mitel Inc.
Education
M.S. in Systems Science
University of Ottawa - Ottawa, ON, Canada
B.S. in Electrical Engineering
Nanjing University of Technology
Skills
Java (10+ years), C++ (10+ years), Groovy (3 years), Splunk (3 years), New Relic (2 years), Dynatrace (5 years), Catchpoint (less than 1 year), Jenkins (8 years), LoadRunner (8 years), JMeter (2 years), HP Diagnostics (2 years), Oracle (5 years), Rational Purify (3 years), VirtualBox (6 years), CentOS (6 years), Selenium (3 years), R (3 years), CA Introscope (2 years), SOASTA CloudTest (3 years), JavaScript (1 year), Linux
Publications
Using R to Discover True Web System Performance
https://edas.info/p19899
May 2015
Calibrate Workload Model for Accurate Performance Testing
http://edas.info/p17053
November 2014
Additional Information
Benjamin Mao's Performance Assessment Services on AWS
http://54.202.94.39:3838/