Name: MITESH MADVIYA    Employer: Naresh
Email: m************@*****.***
Email: hr@theguruinc.com
Direct: 505-***-****    Ph: 800-301-0176 x101
PROFESSIONAL SUMMARY
• 8+ Years of diverse experience in Performance testing with expertise in requirements
gathering, test planning, test execution and detailed performance analysis of Web based
applications, Client Server and Mainframe applications.
• Developing/enhancing Vuser scripts; strong experience in customizing LoadRunner scripts.
• Hands on experience in implementing LoadRunner, Developing Load Test Conditions
and Test Scenarios.
• Designing and implementing performance testing scenarios.
• Solid Experience with installing LoadRunner Controller, Analysis and Generator on
Windows platform.
• Installing, maintaining and administering LoadRunner PC software.
• Used JMeter in TCS America to test the application and record the scripts.
• Used Quality Center for tracking and reporting bugs.
• Activating/configuring monitors using SiteScope.
• Performance testing experience with Web services, Winsock, J2EE, Oracle, and Mainframe
applications using HTTP/HTML5, DHTML, CSS and backends (e.g. ODBC, Web Click & Script,
Oracle NCA, TCP/IP) as well as multi-protocol scripts.
• Experience with creating Vuser scripts, Vuser groups, manual and goal oriented scenarios
using LoadRunner and with the use of various performance monitors for load test
analysis.
• Experienced in tuning ERP jobs and time/CPU-consuming SQL queries.
• Extensive Knowledge on Web, Web services and Winsock Protocol.
• Online monitoring of graphs/monitors using SiteScope.
• Created performance scenarios and scripts for various types of tests (load, stress,
baseline, benchmark, endurance, capacity).
• Knowledge of Java Virtual Machine internals including class loading, threads,
synchronization, and garbage collection
• Creating Performance scenarios and scripts for doing multiple iterations.
• Proficient in putting loops into LoadRunner scripts to run them for multiple iterations
(see the sketch after this summary).
• Well versed in the behavior of online monitors, the techniques used to fix monitoring
issues, and monitoring Vuser status.
• Utilized various performance tools such as Oracle Enterprise Manager, pmon, nmon, top
and weblogic console for monitoring database cluster contention, I/O, User, CPU
activities and overall server(s) performance.
• Utilized Database, Network, Application server and WebLogic Monitors during the
execution to identify bottlenecks, bandwidth problems, infrastructure problems, and the
scalability and reliability benchmarks.
• Configured and used SiteScope performance monitors to monitor and analyze server
performance, generating reports on CPU utilization, memory usage, load average, etc.
• Analyzed LoadRunner metrics and results from other performance monitoring tools during and
after performance testing of the application and database servers, and generated various
graphs and reports.
• Proficient in analyzing LoadRunner results.
• Configuring Run-Time settings for Vugen and Controller
• Well versed with all functionality of Virtual User Generator, Controller, and Analysis
• Configuring Pacing and Think Time to meet load objectives
• Conducted performance testing, stress testing using LoadRunner.
• Performed IP Spoofing using LoadRunner
• Analyzing performance, graphs and reports
• Working with different Vuser types and groups
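A minimal illustrative VuGen (C) sketch of the scripting techniques listed above (transactions, correlation, parameterization, loops, think time). The host, boundaries, and parameter names are hypothetical, not taken from any project script:

```c
/*
 * Illustrative VuGen Action(): correlation via web_reg_save_param, a
 * parameterized login ({UserName}/{Password} would come from a data file),
 * transaction markers, a loop for multiple iterations, and think time.
 * URL, boundaries, and parameter names below are placeholders.
 */
Action()
{
    int i;

    /* Capture the dynamic session token from the next response (correlation). */
    web_reg_save_param("SessionToken",
                       "LB=sessionId=",   /* left boundary  - assumed */
                       "RB=\"",           /* right boundary - assumed */
                       "Ord=1",
                       LAST);

    lr_start_transaction("Login");
    web_submit_data("login",
                    "Action=http://example.internal/app/login",  /* placeholder URL */
                    "Method=POST",
                    ITEMDATA,
                    "Name=user", "Value={UserName}", ENDITEM,    /* parameterized */
                    "Name=pass", "Value={Password}", ENDITEM,
                    LAST);
    lr_end_transaction("Login", LR_AUTO);

    /* Loop to replay the search step for multiple iterations within one action. */
    for (i = 0; i < 3; i++) {
        lr_think_time(5);                          /* pace the virtual user */
        lr_start_transaction("Search");
        web_url("search",
                "URL=http://example.internal/app/search?session={SessionToken}&q={Query}",
                "Resource=0",
                "Mode=HTML",
                LAST);
        lr_end_transaction("Search", LR_AUTO);
    }
    return 0;
}
```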
Areas of Expertise:
• LoadRunner v9/9.5/11/11.52
• Extensive usage of various protocols such as HTTP, Winsock, Citrix, Click & Script, etc.
• ALM Performance Center 11.0/11.52
• Performance, Load and Stress Testing
• Baseline, Benchmark, Soak & Network Testing
• Performance Center v9.5
• Performance Analysis of Systems
• Creating and maintaining Performance test environments
• Developing Performance Test Strategies, Test Plans & Test Scripts
• Executing and Validating Tests
• Managing Multiple Projects
• Activating/configuring monitors
• Analyzing scenario performance, graphs and reports
• UNIX Environment
• IP Spoofing using LoadRunner
• Identifying Performance Bottlenecks
• Parameterization and correlation of Vuser scripts
• Testing of GUI and Web based applications
• Configuring Run-time settings for VuGen and Controller
• Configuring Shunra tool for WAN Emulation/Network Testing
TECHNICAL SKILLS:
Methodologies: RUP, Performance Testing, CMM, TQM, Quality Assurance, Agile
Protocols: HTTP/HTML, Web Services, Citrix, FLEX, .NET, Siebel, SAP GUI, AJAX Click and Script
Operating Systems: Windows Server 2003-2008, Windows 2000/2003/Vista/7/8, Unix, Linux
Programming Languages: C, C++, Java, HTML, CSS, JavaScript, PHP, XML
Testing Tools: LoadRunner 11.0, Performance Center, ALM Performance Center 11/11.5, HP Quality Center 9.0/9.2, QTP, WinRunner, Shunra, JMeter
Web/Application Servers: IBM WebSphere, BEA WebLogic 7.x/8.x/10.x, Tomcat 5.0/5.5, Apache
Monitoring Tools: AppDynamics, HP SiteScope, HP Diagnostics, Wily Introscope, TeamQuest, BMC Patrol, DynaTrace
Databases: Oracle, DB2, MySQL, SQL Server
Domain Knowledge: Financial, Health Insurance, Retail, Telecommunications
PROJECTS:
BMS, Mt. Vernon, IL    May 2013 – Present
ROLE: Sr. Performance Engineer
Responsibilities include the following:
• Involved in various meetings with customers/clients to gather the performance requirements and
SLAs before testing.
• Involved in writing test cases and test plan, Test Scenarios, Test Summary Reports and Test
Execution Metrics.
• Worked closely with clients and interfaced with developers, project managers, and management
during development.
• System resource utilization (CPU, memory, threads, etc.) and JVM heap size were monitored
using Wily Introscope.
• Carried out stress testing by introducing rendezvous points in the script.
• Participated in regular meetings with developers for reviews and walk-throughs.
• Used Wily Introscope extensively to troubleshoot application bottlenecks and identify which
modules were consuming excessive resources.
• Analyzed the results of the Load test using the Load Runner Analysis tool, looking at the online
monitors and the graphs and identified the bottlenecks in the system.
• Used Performance center for scheduling and execution of load tests.
• Executed Soak Test for 8 hours to identify the environment stability, potential issues &
bottlenecks.
• Used web_custom_request to record web service requests in the HTTP protocol (see the sketch
below this list).
• Enhanced LoadRunner scripts with correlation, parameterization, loops, and custom functions.
• Responsible for conducting Benchmark, Volume, Baseline and soak testing
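Illustrative only: the kind of web_custom_request call used to drive a web-service (SOAP) request over the HTTP protocol, as referenced above. The endpoint, SOAPAction, and payload are hypothetical placeholders:

```c
/*
 * Hypothetical web_custom_request() for a SOAP call over HTTP; the URL,
 * operation name, and body are placeholders, not from the actual project.
 */
Action()
{
    web_add_header("SOAPAction", "\"getAccountStatus\"");   /* assumed operation name */

    lr_start_transaction("WS_GetAccountStatus");
    web_custom_request("GetAccountStatus",
                       "URL=http://example.internal/services/AccountService",  /* placeholder */
                       "Method=POST",
                       "Resource=0",
                       "EncType=text/xml; charset=UTF-8",
                       "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                             "<soapenv:Body><getAccountStatus><accountId>{AccountId}</accountId>"
                             "</getAccountStatus></soapenv:Body></soapenv:Envelope>",
                       LAST);
    lr_end_transaction("WS_GetAccountStatus", LR_AUTO);
    return 0;
}
```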
Scholastic, NYC, NY Mar 2011 – April 2013
ROLE: Sr. Performance Engineer
Responsibilities include the following:
• Involved in various meetings with customers/clients to gather the performance requirements and
SLAs before testing.
• Designed and executed performance test plans and test cases.
• Worked with business team in establishing benchmarks to use during deployment.
• Created a number of Load testing scripts for Data seeding purposes.
• Responsible for creating the Load Distribution tables for various scripting modules involved.
• Responsible for creating the work load model along with various run time settings.
• Conducted Load testing from different locations (WAN simulation) and for various browsers and
bandwidth simulations using Shunra.
• Used Performance center for scheduling and execution of load tests.
• Responsible for conducting Benchmark, Volume, Baseline and soak testing
• Developed and maintained Performance test strategies and plans based on functional
requirements, use cases, user interface designs, system design documents and domain knowledge.
• Created high level strategy documentation and detailed test documents.
• Wrote test cases and continually enhanced them for the application.
• Worked with the database team on the submission and processing of data requests.
• Conducted Load Testing using LoadRunner for response time monitoring.
• Used Wily Introscope for Monitoring J2EE Applications
• Used HP Quality center (QC) for defect management and test management
• Generated virtual users to verify multi-user and multi-session logins, and analyzed the
results.
• System resource utilization (CPU, Memory, Threads, etc.) was monitored with the help of
Sitescope and Perfmon.
• Monitored and configured JVM heap size using Wily Introscope.
• Managed and edited additional information for the bugs in the Defect Tracking system and helped
developers to track the problem and resolve technical issues.
• Maintained strong relationships with developers which helped in better triaging and narrowing
down the bugs.
• Participated in regular meetings with developers for reviews and walk-throughs.
• Responsible for ensuring the usability of the application, Navigation and graphical interfaces and
the database integrity by performing extensive smoke test, functional testing, integration testing,
regression testing, and data driven testing.
• Tested SOA Based Applications using web services protocol.
Verizon, Tampa, FL Jul 2009 – Feb 2011
Role: Sr. Performance Engineer
Responsibilities include the following:
• Involved in analyzing System Requirements and developing test plans for Functional and
Regression testing.
• Executed manual and automated test cases and verified actual results with expected results.
• Involved in writing test cases and test plan, Test Scenarios, Test Summary Reports and Test
Execution Metrics.
• Worked with the business team to establish benchmarks to use during deployment.
• Responsible for creating the Load Distribution tables for various scripting modules involved.
• Developed and maintained Performance test strategies and plans based on functional
requirements, use cases, user interface designs, system design documents and domain knowledge.
• Used Wily Introscope extensively to troubleshoot application bottlenecks and identify which
modules were consuming excessive resources.
• Performed QA and pre-stage monitoring with Wily Introscope and SiteScope.
• Carried out stress testing by introducing rendezvous points in the script.
• Developed Vuser Scripts in Web\HTTP, Web Services, ODBC, Winsock and Click & Script
Protocols.
• Created load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced
the scripts by adding transactions, parameterizing constant values, and correlating dynamic
values.
• Developed test scripts using LoadRunner VuGen to reproduce the individual steps or actions
that an actual user of the application would take, and ensured they played back without errors.
• Conducted Web services testing using SOAP UI.
• Automatically discovered and aggregated transactions across user, server, database, and
back-end tiers using the HP Diagnostics tool.
• Identified the critical transactions to be load tested and baselined their performance using
ALM Performance Center 11.
• Produced performance test execution summary reports and presentations for management using
LoadRunner Analysis and PowerPoint.
• Provided tuning recommendations and future memory requirements to the primary DBA team for
database changes such as table reorganization and allocating sufficient space.
• Updated test matrices, test plans, and documentation at each major release and performed
Regression testing using an automated script.
• Analyzed the results of the Load test using the Load Runner Analysis tool, looking at the online
monitors and the graphs and identified the bottlenecks in the system.
COMCAST, West Chester, PA May 2008 – Jun 2009
ROLE: Performance Engineer
Responsibilities include the following:
• Created scripts using Web HTTP/HTML, Web Services, JDBC, and Java Record/Replay protocols.
• Used web_custom_request to record web service requests in the HTTP protocol.
• Enhanced LoadRunner scripts with correlation, parameterization, loops, and custom functions.
• Ran test on SAP based application using the SAP GUI protocol.
• Executed load tests and monitored the end-to-end system for performance bottlenecks.
• Executed a soak test over 3 days, 10 hours each day, to assess environment stability and
identify potential issues and bottlenecks.
• Executed scalability tests, increasing the load in increments of 25% of production volumes.
• Identified issues across different tiers of the application related to throttling, high
response times, high CPU utilization, and high thread utilization; analyzed them and resolved
them with support from the Admin team.
• Performed Component testing of each backend call by calling the web service, using the SOAP
UI.
• Monitored the response in Edifices (Transaction Management), which provides the status of
the Eligibility request.
• Used in-house tool for monitoring the performance counter of various servers.
• Monitored Websphere App server resources, i.e. active threads, Errors, timeouts.
• Created dashboard in Wily console for the monitoring purpose.
• Monitored application transactions and quickly identified application bottlenecks using the
HP Diagnostics tool.
• Automatically discovered and aggregated transactions across user, server, database, and
back-end tiers using the HP Diagnostics tool.
• Used HP Performance center for executing various performance tests.
• Performed analysis of load test results and created defects for issues identified during
load tests.
• Performed root cause analysis on failures and resolved performance-tuning related issues
and queries.
• Wrote database queries to capture / gather the data.
• Checked server logs and identified the root cause of 400/500 errors received during the test.
• Reviewed & analyzed the results & the data and published the final report after every test with all
the issues identified with the potential causes of performance bottlenecks.
Bank of America, Tampa, FL June 2007 – Apr 2008
Role: Performance Tester
• Involved in developing a test plan that included all necessary elements for load testing
(objectives, scope, resources, scenarios, test cases, schedule).
• Analyzed Test Requirements and objectives of the Load Test provided by the Application Group.
• Ran test on JAVA based application using JAVA RMI Protocol.
• Analyzed the latency and load test cases and collected the required test data.
• Created and Executed LR Scripts for testing the performance of various web applications.
• Parameterized and correlated the scripts and added necessary text/image checks.
• Involved in scripting; worked with both HTML and URL modes of recording.
• Recorded Scripts using VuGen with web HTTP/HTML, CITRIX and Oracle NCA protocols.
• Thorough understanding of assigning test checks, rendezvous points, parameterization, and
correlation (capturing dynamic values such as session IDs/cookies), irrespective of the application.
• Extensively worked with the SQL tuning team to identify the DB bottlenecks.
• Created dashboards in Introscope to monitor JVM out-of-memory conditions, CPU usage, number
of threads used, etc.
• Identified bottlenecks, poor response times, hardware capabilities (CPU, Memory)
• Worked with the Load Runner Analysis tool to generate and analyze load and performance
results.
IBM, Charlotte, North Carolina Oct 2006 – Apr 2007
Role: QA Analyst
Responsibilities include the following:
• Involved in Manual testing using Test Director to develop Test cases, Test scripts, executing the
scripts and logging the defects.
• Worked effectively with developers, business analysts, project managers, and the QA team to
gather requirements.
• Gathered user requirements and designed the Test Plans and Test Scenarios accordingly, which
involved creating Business flow diagrams in Quality Center.
• Created Test Plan and Detailed Design Document (Test Scenarios) for the load & Performance
testing to perform testing using different load levels.
• Used Actions in QTP to parameterize specific components of a test. Used reusable Actions to
streamline and modularize the scripts.
• Created and executed SQL queries to validate test data and automate validation of results at the
Back end.
• Developed excel macros to compare the Risk Numbers of the Static reports in QA environment
with the numbers generated in the production environment.
• Analyzed the business requirements and developed the RTM (Requirements Traceability Matrix)
for the project.
• Extensive experience in writing and executing SQL queries for verification and validation of
data for backend testing on databases such as Oracle and MySQL.
• Extensively tested Web Based application used internally by the company which was developed
using J2EE technology.
• Worked closely with developers to fix defects.
• Involved in functional retesting, regression testing, compatibility testing, and server log
testing.
• Measured response times at the sub-transaction level across web, application, and database
servers using Optimal Application Expert. Focused heavily on transactions per second during testing.
• Responsible for automation testing of different products of the application using Selenium,
executing tests on different browsers.
• Proactively involved in testing for any escalation release.
• Maintenance of the automation scripts for enhancements and modifications to the product.
• Involved in Agile testing of functionality using methodologies like Scrum.