Name: Vinay B
*****.**********@*****.***
Quality Assurance Performance Engineer
7+ years of experience as an IT Quality Assurance professional, primarily in performance testing and performance engineering, including tuning, analysis, and recommendations, with full software development life cycle experience: designing, developing, and implementing test plans, test cases, and test processes that drive corrective actions and significant cost savings. Multifaceted QA experience spanning software functional, automation, regression, mobile device, performance, and user acceptance testing, backed by a strong command of various testing frameworks, Agile/Scrum methodologies, and cross-platform skills on Windows, UNIX, and Mainframe.
CORE COMPETENCIES
Software Testing Process · Quality Engineering · Performance Testing · Performance Engineering · Agile · Project Scheduling · Stakeholder Communication · Leadership · Team Building · Resource Allocation · Problem Solving · Conflict Resolution · Review of Requirements Documents and Functional Specifications · SQL · Test Plans · Test Strategies · Test Scripts · Custom Coding · Monitoring · Tuning · Retesting · Test Reports · Defect Management · Test Metrics · Production Rollout
Professional and Industry Experience
Proficiency in QA Methodologies, Quality Assurance Life Cycle, Defect Management Life Cycle and the Software Development Life Cycle (SDLC).
Experienced in conducting meetings with key stakeholders: business representatives, project managers, developers, DBAs, infrastructure leads, architects, and middleware teams.
Expertise in preparing Performance Test Strategies, Test Plans with risk assessments, and Test Closure Reports.
Expertise in testing methodologies including Agile, Scrum, and Waterfall.
Expertise in testing Web, Java, .NET, middleware, web services, APIs, and customer-facing applications.
Expertise in building performance assurance practices from scratch (tool selection through process implementation).
Expertise using HP tools (LoadRunner, Performance Center, ALM) and the open-source JMeter tool for performance testing.
Experienced in System Performance Testing Methodologies (Load/Spike/Stress/Endurance Tests).
Extensive expertise in LoadRunner 9/9.5/11.5/12.02/12.5, especially with protocols such as Web (HTTP/HTML), Citrix, MQ, Web Services, Ajax, Web (Click & Script), RTE, Siebel Web, Oracle 2-Tier, DB, SOAP, REST, and TruClient.
Hands-on experience with monitoring tools such as HP SiteScope, CA Wily Introscope, Dynatrace, and AppDynamics.
Expertise in performing root-cause analysis and identifying bottlenecks.
Experienced with defect tracking tools like HP Quality Center, VSTS, IBM ClearQuest, and JIRA.
Technical Skills:
Tools
HP LoadRunner (9.0/9.5/11.0/11.50/12.02/12.5), HP Performance Center 11.0/11.5/12, ALM, HP Quality Center, JMeter 2.9/2.10/2.11/2.13/3.0/3.1/3.3, SOAP UI, QTP, JIRA v6.1.3, Toad, JProfiler, XML Spy, PuTTY, vmstat, top, sar, PerfMon, tcpdump, Wireshark
Monitoring Tools
Performance Center, Wily Introscope, HP SiteScope, Dynatrace, HP Diagnostics, Transaction Viewer, Splunk, OEM & AppDynamics
Cloud Testing Tool
HP StormRunner (cloud-based load testing)
Web Servers
Tomcat J2EE, E-Portal, IIS, Sun One Server, Microsoft Team Foundation Server, IBM WebSphere Portal
Application Server
Jakarta Tomcat, IBM WebSphere MQ Series, BEA WebLogic, Java Web Server, Messaging.
Networking
Firewalls, Routers, Hubs, VPNs, Wireless Internet Products
Database
Oracle, Sybase, MS SQL Server, DB2
Languages
Microsoft C#, C, C++, Visual Basic, PHP
Markup/Scripting
DHTML, CSS, JQuery, JavaScript, XML, HTML, Java/J2EE, TSL
Web Technologies
HTML, CSS, JQuery, WordPress
Packages
MS Office, Adobe Photoshop CS5, Dreamweaver, Flash, Illustrator, InDesign
RDBMS
MS SQL, Microsoft Access, SQL Server, Oracle Database
Operating Systems
Windows 98, Windows Server 2003, Windows NT/2000/XP
PROFESSIONAL EXPERIENCE
Client: AT&T, Los Angeles, CA Duration: Aug 2017 to present
Position: Performance Lead Engineer
Responsibilities:
Interacted with business leads, solution architects, and application teams to develop and mimic production usage models by collecting non-functional requirements for a multi-year rollout of large-volume applications.
Conducted meetings across all areas of the product organization to identify, prioritize, and mitigate risks to the responsiveness and scalability of our offerings.
Followed the Agile (Scrum) process; performance validation followed a 'Work Done & Ready to Go' approach, release to release and, more specifically, sprint by sprint.
Organized status meetings with the stakeholders for Performance Testing in the project to ensure processes and content of all Performance testing artifacts are documented, maintained and transitioned to the client teams as per the Client's Retention and Transition policy.
Created Performance Test plans and Test Case Design Documents with the input from developers and functional testers.
Integrated Performance Testing with various applications as well as within a cloud environment.
Used HP LoadRunner 12.55/12.53/12.56 and JMeter 3.0/3.1 for performance testing.
Extensively used LoadRunner's Virtual User Generator to script and customize the performance test harness with protocols like Web (HTTP/HTML), Web Services, Siebel Web, and RTE (a representative sketch follows this responsibilities list).
Generated and associated different IP addresses to Virtual users to emulate real time scenarios for load balancing issues using IP Spoofing.
Executed Baseline, Load, Stress, and Endurance Testing using HP Performance Center.
After test execution, collaborated with the development, solution engineering, technical architecture, and release management teams in the client organization to analyze performance results, identify potential bottlenecks, and drive fixes for the findings.
Reported performance analysis graphs and reports collected from various performance tools and discussed bottlenecks such as memory leaks, JVM heap usage, CPU utilization, network time, page refresh time, and page rendering time.
JVM Performance Tuning: GC and Heap Analysis, Thread dumps, Heap dumps, Memory Leaks, Connection Leaks, Core Dump.
Utilized application server, database, network, and WebLogic monitors during execution to identify bottlenecks, bandwidth problems, and infrastructure, scalability, and reliability benchmarks.
Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing by analyzing response times and profiling the application to locate the source of performance issues.
Set up user profiles and configured and added application servers in Dynatrace.
Added headers to the scripts and monitored script execution using the Dynatrace Client.
Knowledge of Java Virtual Machine internals including class loading, threads, synchronization, and garbage collection.
Profiled slow-performing areas of the application and system resources and identified bottlenecks and opportunities for performance improvements using the Wily Introscope tool.
Performed back-end testing to verify data and application integrity by writing SQL queries.
Conducted application profiling and JVM tuning for all builds delivered per each agile sprint.
Reviewed and profiled Java application code across several dozen J2EE applications to optimize performance and eliminate risks to stability, capacity, and high availability.
Conducted application performance profiling at the method-call level and set priorities for code performance optimization; rooted out inefficient SQL calls and indexing issues for the DBA group.
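As referenced in the VuGen scripting bullet above, the following is a minimal, hypothetical sketch of a Web (HTTP/HTML) Action showing manual correlation and transaction timing with the standard LoadRunner C API; the URLs, boundary strings, and parameter names are illustrative placeholders rather than AT&T systems, and the code assumes the VuGen runtime headers (lrun.h/web_api.h via globals.h).

/* Minimal VuGen Action() sketch: Web (HTTP/HTML) protocol.
   All URLs, boundaries, and parameters are hypothetical placeholders. */
Action()
{
    /* Manual correlation: capture the session token returned by the login page */
    web_reg_save_param("pSessionId",
                       "LB=name=\"sessionId\" value=\"",
                       "RB=\"",
                       "NotFound=error",
                       LAST);

    lr_start_transaction("T01_Login");
    web_submit_data("login",
                    "Action=https://app.example.com/login.do",   /* placeholder URL */
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=username", "Value={pUser}", ENDITEM,    /* parameterized test data */
                    "Name=password", "Value={pPassword}", ENDITEM,
                    LAST);
    lr_end_transaction("T01_Login", LR_AUTO);

    lr_think_time(5);

    /* Replay the captured token on a subsequent request */
    lr_start_transaction("T02_ViewAccount");
    web_url("viewAccount",
            "URL=https://app.example.com/account?session={pSessionId}",
            "Resource=0",
            "Mode=HTML",
            LAST);
    lr_end_transaction("T02_ViewAccount", LR_AUTO);

    return 0;
}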
Environment: Performance Center 12.02, ALM, HP LoadRunner 12.56/12.53/12.52, JMeter 3.0/3.1, Dynatrace 6.1, Splunk 6.2, CA Wily Introscope, WebLogic 11g, Java 8, Amadeus Interface Systems, Web Cloud, JBoss, SQL DB, SOAP, REST, Wireshark, PL/SQL, AppDynamics, HP Diagnostics, JProfiler, Tomcat, PuTTY, Windows NT, TCP/IP, AIX (UNIX), Linux, Toad.
Client: Bank of America, Charlotte, NC, Duration: Nov 2016 to Jul 2017
Position: Performance Test Lead
Responsibilities:
Developed performance test scripts using HP LoadRunner 12.50/12.02 and JMeter 2.12/2.13 as per project needs.
Developed the Load Test scripts using the Load Runner Virtual User Generator (VuGen) and enhanced the scripts to test the new builds of the application by including transactions, parameterizations and correlations.
Extensively used most of the LoadRunner protocols (HTTP/HTML, RTE, and Ajax TruClient) for testing client-server applications (internal and external facing).
Extensively used JMeter for performance testing of GUI, SOA, and web services; built web service test scripts, test plans, JMS test plans, and web test plans in JMeter.
Tested the re-designed site on servers instantiated on Amazon EC2 as part of the cloud computing effort.
Debugged web traffic using Fiddler and triaged issues using Firebug, IE Developer Tools, and other web developer tools.
Supported troubleshooting of Production related issues by targeting the problem areas.
Tracked QA issues using JIRA and Rally, logging defects covering technical, logical, and functional issues.
Analyzed the results of the load tests using the LoadRunner Analysis tool, reviewed the online monitors and graphs, analyzed the response times of various business transactions and login times under load, and developed reports and graphs to present the test results.
Analyzed metrics such as transaction response time, transactions under load, transaction summary by Vusers, hits per second, and throughput to understand application behavior under full load (a small worked example of these calculations follows this list).
Created comprehensive test results and summary reports to be shared with the project team and management and pointed out the areas which exceeded SLA.
Responsible for monitoring application behavior using tools like Dynatrace and AppDynamics.
Generated performance analysis reports based on SQL Server, IIS, and WebSphere metrics.
Analyzed heap behavior, throughput, and garbage collection pauses, and tracked down memory leaks while executing duration tests using Java VisualVM.
Involved in defect tracking, reporting, and coordination with various groups, from initial finding of defects to final resolution, using Quality Center (QC).
Worked with vendor teams to resolve environment, data-related, and performance-related issues.
Involved in the Implementation review meetings.
Worked closely with Development and Business team to get an understanding of the system architecture, system component interactions, application load pattern and the Performance SLA.
Developed and maintained a performance test library with a focus on potential reuse.
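As referenced in the metrics bullet above, the following small, self-contained C example is not code from any client system; it is an illustration of how average response time, a simple nearest-rank 90th-percentile approximation, and throughput can be derived from sampled transaction timings (the sample values and the 60-second window are made up).

/* Worked example: derive average, 90th percentile, and throughput
   from a hypothetical sample of transaction response times. */
#include <stdio.h>
#include <stdlib.h>

static int cmp_double(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    /* Hypothetical response times (seconds) for one business transaction */
    double rt[] = { 0.8, 1.1, 0.9, 2.3, 1.0, 1.7, 0.7, 1.2, 3.1, 0.9 };
    size_t n = sizeof rt / sizeof rt[0];
    double test_duration_sec = 60.0;      /* assumed measurement window */
    double sum = 0.0;
    size_t i, p90_index;

    qsort(rt, n, sizeof rt[0], cmp_double);
    for (i = 0; i < n; i++)
        sum += rt[i];

    /* Simple index-based approximation of the 90th percentile */
    p90_index = (size_t)(0.9 * (n - 1));

    printf("Transactions     : %zu\n", n);
    printf("Average response : %.2f s\n", sum / n);
    printf("90th percentile  : %.2f s\n", rt[p90_index]);
    printf("Throughput       : %.2f transactions/s\n", n / test_duration_sec);
    return 0;
}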
Environment: LoadRunner 12.50/12.02, Performance Center 9.5/11.0, JMeter 2.12/2.13, Informatica 9.6.1, Wireshark, Dynatrace, AppDynamics SaaS, SOAP UI, ALM, .NET, MQ, MU, SAPGUI, JMS, Web Services, XML, HTML, MongoDB, MS IIS Server, WAS, UNIX, JBoss.
Client: American Express, Phoenix, AZ Duration: Mar 2015 to Oct 2016
Position: Senior Performance Engineer
Responsibilities:
Conducted meetings with the application team, business analysts, and DB teams to understand the application and gather non-functional requirements for performance testing.
Provided Test estimates for various phases of the project.
Analyzed requirements and translated them into performance test plans, gathered all volume metrics to mimic production load on the system in the performance testing.
Developed detailed test cases and sent them to the project team for review and authorization to make sure they met the business requirements.
Designed and developed performance test scenarios and test data for the company's applications, APIs, and data processing engine.
Participated in the weekly project meetings and involved in the preparation of weekly status reports to track test execution and defect fixes.
Developed high level test strategy, Test Scenarios and detailed test plan in coordination with the business analysts, development team and project manager.
Performed effective load testing using HP LoadRunner/JMeter.
Installed and configured HP LoadRunner, including custom installation of load generators on host machines, for performance testing efforts.
Wrote scripts using various protocols such as Web, Web Services, Ajax (TruClient), and Ajax (Click and Script); a web-service style sketch follows this list.
Used Fiddler for web testing verification after scripting phase.
Used the SoapUI tool to test the functionality of the new web services.
Prepared test environment, tools and resources for performance test execution
Executed load, stress and endurance tests with different scenarios.
Used profiling tools like HP SiteScope and HP Diagnostics extensively to monitor all tiers and determine performance bottlenecks.
Set up HP Diagnostics monitors to collect application performance metrics during test execution.
Used HP Diagnostics performance data for problem solving, trend analysis, and capacity planning.
Profiled slow-performing areas of the application and system resources and identified bottlenecks and opportunities for performance improvements using HP SiteScope.
Monitored graphs such as transaction response time, and analyzed server performance status, hits per second, and throughput.
Logged defects in Test Director and worked with development team on fixing the defects as per priority.
Coordinated with the offshore team on project issues and test executions.
Involved in maintenance of the application after it rolled over to production.
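As referenced in the protocols bullet above, the following hypothetical VuGen sketch shows a web-service style call with a text checkpoint, using standard LoadRunner web API functions (web_custom_request, web_reg_find); the endpoint, payload, and expected response text are illustrative placeholders, not American Express systems.

/* Hypothetical VuGen Action() sketch: JSON web-service call with a checkpoint. */
Action()
{
    web_add_header("Content-Type", "application/json");

    /* Checkpoint: fail the step if the expected field is missing from the response */
    web_reg_find("Text=\"status\":\"OK\"", "Fail=NotFound", LAST);

    lr_start_transaction("T01_GetQuote");
    web_custom_request("getQuote",
                       "URL=https://api.example.com/quotes",   /* placeholder endpoint */
                       "Method=POST",
                       "Resource=0",
                       "EncType=application/json",
                       "Body={\"customerId\":\"{pCustomerId}\",\"product\":\"card\"}",
                       LAST);
    lr_end_transaction("T01_GetQuote", LR_AUTO);

    return 0;
}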
Environment: LoadRunner 12.00, JMeter 2.9/2.10/2.11, HP SiteScope, HP Diagnostics, TFS, Fiddler, JavaScript 1.4 & 1.5, JProbe, WebLogic 8.1 & 10.3, iPlanet 6.1, WebSphere 6.1 & 7.0, Sybase DBMS, Python, TeamQuest Tool (Foglight), Windows NT, MS Office, MS Visio, AIX (UNIX), Linux, CSS (Cisco) Load Balancer, Wily Introscope 8.0, Quality Center 9.0, IE, Netscape, Firefox, TCP/IP.
Client: T-Mobile, Frisco, TX Duration: Dec 2013 to Mar 2015
Position: Performance Engineer
Responsibilities:
Worked as a Performance Test Engineer and executed various performance test conditions using LoadRunner 11.0/11.50.
Created and coded very flexible HP LoadRunner scripts using C functions that allowed for fast configuration changes during testing (sketched after this list).
Developed VuGen Scripts and executed the same from multiple Load Generators in Controller.
Parameterized large and complex test data to accurately depict production trends.
Parameterized cookies, stored dynamic content in LoadRunner functions, and used client-side secure certificates.
Parameterized unique IDs, stored dynamic content in variables, and passed the values to web submit steps under the HTTP protocol.
Validated scripts to ensure they have been executed correctly and meet the scenario description.
Analyzed the Business Requirements Document (BRD), created Test Plans and prepared detailed Test Cases.
Involved in writing module level test plans.
Performed client-side and server-side verification and tested the functionality of the application.
Validated test results through the UI and through the analysis of various system/ application error logs as well as database queries.
Traced bugs and reported them to the developers using Rational ClearQuest.
Worked very closely with developers to recreate defects found and also to verify fixes.
Analyzed CPU and memory stats on web servers, application servers, and DB servers using Transaction Viewer.
Monitored the Garbage collections, JDBC connections and Timeouts during the Test Execution.
Analyzed the LoadRunner reports to calculate Response time and Transactions Per Second
Developed performance analysis reports and graphs (including LoadRunner built-in graphs and custom MS Excel graphs).
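As referenced in the flexible-script bullet above, the following hypothetical VuGen sketch shows one way such configurability can be achieved with standard LoadRunner functions (lr_get_attrib_string, lr_save_string, web_add_cookie); the attribute, cookie, and host names are placeholders and not taken from the actual client scripts.

/* Hypothetical VuGen Action() sketch: the target host and a cookie come from
   run-time attributes/parameters so scenarios can be re-pointed without editing code. */
Action()
{
    /* Read -host from the run-time settings / command line, with a fallback default */
    char *host = lr_get_attrib_string("host");
    if (host == NULL)
        host = "staging.example.com";
    lr_save_string(host, "pHost");

    /* Pre-set a parameterized cookie before the first request */
    web_add_cookie("channel={pChannel}; DOMAIN={pHost}");

    lr_start_transaction("T01_Home");
    web_url("home",
            "URL=https://{pHost}/home",
            "Resource=0",
            "Mode=HTML",
            LAST);
    lr_end_transaction("T01_Home", LR_AUTO);

    return 0;
}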
Environment: HP LoadRunner 11.0/11.50, JMeter 2.6/2.7/2.8, Transaction Viewer, Java, J2EE, VBScript, UNIX, shell scripting, HTML, WebSphere, .NET, IIS, Oracle Database, SQL Server, WebLogic, MQ Series (IBM and MS), ClearQuest.
Client: Ecentric Solutions Pvt. Ltd., India Duration: May 2012 to Nov 2013
Position: Performance Tester
Responsibilities:
Involved in analyzing the business requirements and environment specifications.
Involved in manual and automation testing; also identified non-functional requirements, after understanding the applications, as part of performance testing.
Performed load testing against internal applications and services using LoadRunner scripts to emulate users and monitor systems performance.
Enhanced and modified the scripts according to the test case scenarios.
Designed scenarios for Performance Testing, Generating scripts and handling Correlation as well as parameterization using LoadRunner VuGen, executed scenarios using Controller and analyzed the results using LoadRunner Analyzer.
Created and coded a very flexible LoadRunner script that allowed for fast configuration changes during testing.
Wrote custom functions and programs to support the load testing efforts.
Identified functionality and performance issues, including: deadlock conditions, database connectivity problems and system crashes under load.
Used rendezvous points, start/end transactions, parameterization, and correlation features in LoadRunner's Virtual User Generator (illustrated in the sketch after this list).
Used web_reg_save_param functions to correlate the scripts manually.
Enhanced scripts by inserting checkpoints to verify that virtual users were accessing the pages they were supposed to reach.
Wrote scripts in which each virtual user accesses different data in the application.
Performed Load Testing against an application that is on Mainframe.
Created manual and goal-oriented scenarios with a set number of Vusers, configuring ramp-up, ramp-down, and run time in the LoadRunner Controller.
Conducted performance regression testing after upgrading the hardware and software.
Provided management with analyzed test results and recommendations for performance improvements as needed.
Monitored and analyzed server performance by generating reports on CPU utilization and memory usage.
Performed performance testing using HP Business Availability Center (BAC).
Used LoadRunner reports to calculate response time, transactions per minute, hits per second, and throughput.
Met with managers, team leaders and developers on LoadRunner, Scripting, Stress and Performance Testing.
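As referenced in the VuGen features bullet above, the following hypothetical sketch combines a rendezvous point, a text checkpoint, and per-Vuser unique data using standard LoadRunner functions (lr_rendezvous, web_reg_find, lr_whoami); the transaction names, URL, and data format are illustrative placeholders, not the actual client scripts.

/* Hypothetical VuGen Action() sketch: rendezvous point, checkpoint, and per-Vuser data. */
Action()
{
    int  vuser_id, scid;
    char *group;
    char order_ref[64];

    /* Build a value unique to each virtual user so no two Vusers reuse the same data */
    lr_whoami(&vuser_id, &group, &scid);
    sprintf(order_ref, "ORD-%d-%d", vuser_id, rand() % 10000);
    lr_save_string(order_ref, "pOrderRef");

    /* Hold all Vusers here so the checkout load hits the server at the same moment */
    lr_rendezvous("checkout");

    /* Checkpoint: verify the confirmation page actually rendered */
    web_reg_find("Text=Order Confirmed", "Fail=NotFound", LAST);

    lr_start_transaction("T01_Checkout");
    web_url("checkout",
            "URL=https://shop.example.com/checkout?ref={pOrderRef}",
            "Resource=0",
            "Mode=HTML",
            LAST);
    lr_end_transaction("T01_Checkout", LR_AUTO);

    return 0;
}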
Environment: HP LoadRunner 9.50/9.10, JMeter 2.5, MS Office 2007, Windows Server 2003, HP SiteScope, HP BAC, HP Diagnostics, Quality Center.