Joseph
Summary:
Performance testing experience in Web, ERP (Oracle Applications), commercial-grade, and multi-tiered web-based applications developed in ASP.NET, J2EE, and client/server environments.
Performance Tester with 7 years of experience in performance testing using Performance Center, LoadRunner, JMeter, and IXWLAN, with performance monitoring expertise on AIX, WAS, WMQ, and Oracle DB using NMON, Splunk, VisualVM console, and AWR reports, and network traffic monitoring experience using Wireshark and WebTrends.
Expert in testing HTML/HTTP, Citrix, Tuxedo, RTE, WinSock, and webMethods protocols.
Adept in preparing detailed schedules with target dates for deliverables, prioritizing and monitoring activities to meet target dates, and making commitments on delivery.
Efficient in managing resources and tracking progress against the schedule to meet overall schedule and performance benchmarks.
Proficient in coordinating teams in the design, development, and performance testing of web-based and client/server applications on Windows, UNIX, and iSeries.
Innovative leadership in the acquisition and implementation of improved performance techniques and defined procedures to find the best options to support business needs.
Extensive knowledge in creating performance test strategies that increase performance coverage, validate scalability, stability, failover, and recovery, and improve overall application quality.
Ability to articulate ideas and performance architecture concepts, providing oversight that allows multiple projects to proceed expeditiously.
Hands-on experience with all phases of the Software Development Life Cycle (SDLC).
Proficient in different types of performance testing, including scalability, stability, volume, stress, database failover, network delay, and longevity testing.
Analyzed post-test performance results and identified bottlenecks in a wide array of applications, interfaces, and adapters in n-tier architectures.
Adept in using automation tools such as LoadRunner, WinRunner, Rational ClearQuest, Test Director, and Rational Suite.
Expertise in preparing performance test strategies, test plans, test summary reports, test cases, and test scripts for automated and manual testing based on User Requirement and System Requirement documents, covering Performance, Functional, System, Integration, Regression, GUI, UAT, Security, Load, Database, Smoke/Sanity, and Usability testing, with VuGen used for load scripting.
Extensive experience implementing QA methodologies such as Spiral and Waterfall, and in preparing test plans, test cases, test scenarios, and test deliverables.
Knowledge of and implementation experience with quality assurance, testing principles, and configuration and change management principles.
Well versed in test scripting languages such as C, TSL, SQA Basic, and VBScript.
Strong analytical, logical, presentation, and communication skills.
TECHNICAL SKILLS:
Testing Tools: LoadRunner, Silk, WinRunner, Test Director, QTP, Rational
Scripting: TSL, SQA Basic, 4Test, VBScript, JavaScript
Environments: Windows, UNIX, Linux, MS-DOS, Solaris, AIX, HP-UX
Languages: C, C++, Java, Visual Basic, SQL, PL/SQL, XML/XSLT, SOAP, HTML, CL, RPG, COBOL
Databases: Oracle 8i/9i, SQL Server 2000, MS Access
ERP: Oracle Applications 11i (GL, AP, AR)
App/Web Servers: WebSphere, MS IIS, iPlanet, Mainframe
Networking: TCP/IP, UDP, DNS, HTTP, SSL
EXPERIENCE:
Bank of America, NC Mar ’12 to Present
Lead Performance Analyst
Responsibilities:
• Involved in writing detailed performance test plans, test scripts, and test cases based on requirements using Performance Center.
• Designed and executed scenarios in Performance Center and used the Controller to perform load and stress tests.
• Tested performance of the web application “Tax Symphony,” developed on Vignette Portal, using LoadRunner.
• Involved in developing and maintaining scalable, reusable performance test scripts using VuGen.
• Designed both manual and goal-oriented scenarios for load generation.
• Performed stress testing through different loading patterns in the scenario.
• Designed, identified, and executed stress-test transactions to benchmark the application break point.
• Uploaded scripts, created timeslots and scenarios, and ran load tests in the Controller.
• Enhanced scripts with LR and protocol-specific functions, custom C functions, and decision/control loops for Vuser exception handling, and customized scripts with timer logic, dynamic transaction names, and multiple action files (see the sketch after this list).
• Designed and configured test scenarios using VuGen scripts in ALM Performance Center.
• Used the Citrix protocol with the Bank of America card generation application.
• Developed reports using the URL commands available through Crystal Reports and integrated Crystal Reports with Java/JSP.
• Tested an application developed with J2EE 1.4 (Servlets and JSPs) and JDBC.
• Managed Performance Test Activity listing and Resource Allocation, Environment Setup,
Vuser scripting, Test Execution, Test result Reporting, Tuning and Optimization.
• Applied custom-built parameterization and correlation rules to parameterize form POST and query string parameters.
• Collaborated with team members in defining test strategy and in documenting and refining test plans, test cases, test scripts, and the traceability matrix against business requirements to validate performance test coverage.
• Generated metrics and checklists to assess current and potential risks and developed mitigation strategies.
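The following is a minimal VuGen-style sketch (in C) of the correlation, parameterization, and dynamic transaction naming described in the bullets above. The URL, parameter names, and boundary strings are hypothetical placeholders for illustration, not details of the actual Bank of America application.

    Action()
    {
        char txn_name[64];

        /* Correlation rule: capture a dynamic form token before the POST
           (left/right boundaries are assumed for illustration). */
        web_reg_save_param("FormToken",
                           "LB=name=\"token\" value=\"",
                           "RB=\"",
                           "Ord=1",
                           LAST);

        web_url("LoginPage", "URL=https://example.bank.test/login", LAST);

        /* Dynamic transaction name built from a parameterized user id. */
        sprintf(txn_name, "Submit_%s", lr_eval_string("{UserId}"));
        lr_start_transaction(txn_name);

        /* Parameterized form POST reusing the correlated token. */
        web_submit_data("SubmitForm",
                        "Action=https://example.bank.test/submit",
                        "Method=POST",
                        ITEMDATA,
                        "Name=user",  "Value={UserId}",    ENDITEM,
                        "Name=token", "Value={FormToken}", ENDITEM,
                        LAST);

        lr_end_transaction(txn_name, LR_AUTO);

        lr_think_time(5);   /* think time between iterations */
        return 0;
    }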
Environment: LoadRunner, Performance Center, Quality Center, Citrix, C/C++, SQL Server, Java, Visual Studio Team System 2008.
Scholastic, NJ June ’11 to Feb ’12
Lead Performance Analyst
Responsibilities:
Managed Performance Test Activity listing and Resource Allocation, Environment Setup,
Vuser scripting, Test Execution, Test result Reporting, Tuning and Optimization.
Analyzed User/Business requirements, documenting performance
specifications/standards to be achieved.
Involved in scripting and testing Oracle Applications using Performance Center.
Coordinated with program managers and development leads to prioritize and scope load
and performance testing projects, driving projects to completion prior to target/release
dates.
Allocated responsibilities and coordinated load testing efforts among peers for all releases.
Mentored performance test engineers in developing and executing LoadRunner scripts using the best methodology.
Evaluated testing needs for new and existing products and developed an overall testing strategy for the group.
Performed test services with the team, including analysis, test script design and preparation, execution, and evaluation tasks.
Assisted the team with daily performance tasks. Attended project meetings, walkthroughs, and performance status meetings.
Communicated Analysis, Results, bottlenecks and solutions to the management team.
Documented performance issues in detail to handle disaster recovery scenarios.
Thorough knowledge of XML, SOAP, HTTP, web services, and other Internet technologies.
Proficient in creating Load Runner VUser scripts for load and performance testing.
Modified VUser scripts through Correlation and Parameterization.
Used HTML- and URL-based recording to capture the client/server communication (illustrated in the sketch after this list).
Performed stress testing through different loading patterns in the scenario.
Designed both manual and goal-oriented scenarios for load generation.
Set up performance monitors on servers to obtain throughput and utilization information.
Scheduled batch jobs using Windows Task Scheduler.
Developed artificial workload scenarios to perform Isolation testing.
Documented project performance issues with the help of QA Metrics.
Interacted with developers to report and track performance issues.
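Below is a small illustrative VuGen fragment (in C) of the kind of HTML-mode step and data-file parameterization referred to above; the URL, form field, and {Account} parameter are assumptions made for the example, not artifacts of the Scholastic project.

    Action()
    {
        /* HTML-mode recording produces high-level steps such as web_submit_form;
           URL-mode recording produces lower-level web_url/web_custom_request calls. */
        web_url("Home", "URL=https://store.example.test/", LAST);

        /* {Account} is drawn per iteration from a parameter file (e.g. accounts.dat),
           so each Vuser submits different data. */
        web_submit_form("Search",
                        ITEMDATA,
                        "Name=query", "Value={Account}", ENDITEM,
                        LAST);

        lr_output_message("Searched account %s", lr_eval_string("{Account}"));
        return 0;
    }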
Environment: LoadRunner, Oracle Applications 11i, UNIX, Mainframe, ASP.NET, C#, TFS, Mercury Diagnostics, iPlanet, MS SQL Server 2000.
JPMorgan Chase Bank, NJ June ’10 to May ’11
Sr. Performance Analyst
Responsibilities:
Performed stress testing to verify that the system met performance expectations and could operate satisfactorily with large/peak production volumes of data, concurrent users, and transactions under extreme conditions.
Generated Vuser load on the system under maximum conditions to determine the failure point, ramping the load until failure and then identifying bottlenecks and scalability limitations.
Created Vuser (virtual user) scripts with the LoadRunner development tool and executed them during scenario runs to replay the actions a real user would perform during a test scenario.
Wrote Vuser scripts that include functions to measure and record the performance of the application’s components, depending on how the performance test is configured.
Scoped the performance/stress test to verify that the system meets performance expectations and does not unexpectedly degrade the performance of the business as a whole.
Verified that the system can operate satisfactorily with large/peak production data volumes and 100 concurrent users and transactions under extreme conditions.
Stress-tested the Oracle Server transactional replication user functions on the TMS system using automated test scripts generated with LoadRunner.
Analyzed various graphs generated by Load Runner Analysis including Database
Monitors, Network Monitor graphs, User Graphs, Error Graphs, Transaction graphs and
Web Server Resource Graphs.
Conducted load testing using LoadRunner to simulate the load generated by hundreds of users and verify optimum performance.
Verified firewall checkpoints with the HP Diagnostics tool.
Identified defects, assessed root causes, and prepared detailed information for developers and business stakeholders.
Responsible for producing weekly defect status and project status reports.
Environment: ASP.NET, TFS (Team Foundation Server), Test Director, LoadRunner, HTML, JavaScript, Silk Performer, SQL Server 2005, Windows 2000/XP, HP-UX.
NBTY, NY Nov ’08 to May ’10
Sr. Performance Analyst
Responsibilities:
A very good understanding of the Performance Project Life Cycle methodology.
Led a performance engineering project from start to finish: gathering requirements, creating scripts, running and analyzing tests, tuning/recommending, and final reporting in a rapid, dynamic environment.
Thorough understanding of database, network (Internet/intranet), and operating system behavior from a performance perspective.
Hands-on experience with the Mercury SiteScope monitoring tool and with correlating graph results to failures in the application and back-end database.
Highly motivated to solve problems quickly and completely; proven troubleshooting skills and the ability to analyze problems by type and severity and to understand complex relationships between infrastructure components; a team player with high energy and a positive attitude.
Analyzed application for testing automation requirements and implementation using
automated testing tools.
Designed Load Testing scenarios by analyzing Transaction Profiles and Task Distribution
Diagram.
Analyzed business requirements and developed a performance strategy for benchmarking the new application, along with effort estimates for the team.
Excellent analytical skills, troubleshooting experience, presentation, communication, and
time management skills.
Identified and prepared test cases for performance-critical business transactions.
Created Vuser scripts using LoadRunner by recording, incorporating transactions, rendezvous points, and think time, and customizing them per the test requirements.
Adept in Parameterizing and Correlating Load Runner VUser scripts to handle dynamic
data values.
Designed test scenarios; identified and categorized monitoring parameters to verify performance requirements.
Created scripts that randomized test data and parsed responses for session ID information in order to dynamically update virtual user scripts while running load tests (see the sketch after this list).
Configured monitors on servers to obtain information for Performance analysis.
Extensively used Sniffer to identify Network bottlenecks.
Analyzed test results and identified Performance issues of software and hardware.
Developed artificial workload scenarios to perform isolation testing.
Recommended Performance tuning and Optimization techniques.
Used Quality Center for Test Management process.
Provided ongoing reporting to senior management, planned for appropriate resources (H/W, S/W, and people), and coordinated with relevant groups for setting up test data.
Executed tests using LoadRunner to ensure performance meets or exceeds previous performance benchmarks.
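The following is a minimal VuGen-style sketch (in C) of randomizing test data and parsing a session ID out of a response, as described above; the endpoint, boundary strings, and parameter names are assumptions for illustration only.

    Action()
    {
        char order_id[32];

        /* Randomize test data per iteration; srand() would typically be called
           once in vuser_init() to vary the sequence per Vuser. */
        sprintf(order_id, "ORD-%05d", rand() % 100000);
        lr_save_string(order_id, "OrderId");

        /* Parse the session ID from the next response so later requests
           can reuse it dynamically during the load test. */
        web_reg_save_param("SessionId",
                           "LB=JSESSIONID=",
                           "RB=;",
                           "Ord=1",
                           LAST);

        lr_start_transaction("CreateOrder");
        web_submit_data("CreateOrder",
                        "Action=https://shop.example.test/order",
                        "Method=POST",
                        ITEMDATA,
                        "Name=orderId", "Value={OrderId}", ENDITEM,
                        LAST);
        lr_end_transaction("CreateOrder", LR_AUTO);

        lr_output_message("Captured session: %s", lr_eval_string("{SessionId}"));
        return 0;
    }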
Environment: Quality Center, LoadRunner, HTML, JSP, JavaScript, iPlanet, WebSphere, Oracle, AS/400, Windows 2000/XP, HP-UX, VSS
CARGILL, MN Jan ’07 to Oct ’08
Performance Engineer
Responsibilities:
Coordinated performance tests on applications to ensure that the capacity and stability of the applications meet requirements for production deployment.
Monitored all impacted systems for network connection disruptions as well as any
obscure application behavior resulting from a network anomaly.
Measured resource consumption of components and processes and latency at measurable
points in the systems under test.
Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL queries against the SQL Server database.
Identified application shortcomings and bottlenecks and assisted engineers with improving application performance and reducing latency.
Implemented and ran the HP Diagnostics tool for component analysis.
Conducted walkthroughs with project stakeholders, including technical users, to design complete and detailed performance test plans and test cases for the application driving existing automation equipment to increase sample screening throughput.
Designed both manual and goal-oriented scenarios for load generation.
Set up the monitors and Controller for protocols such as ODBC for performance testing and analyzed the performance results.
Performed batch job testing; monitored elapsed time and CPU usage, collated statistics, and identified bottlenecks.
Set up performance monitors on servers to obtain throughput, utilization, and latency information.
Developed the Requirements Traceability Matrix and defined the input requirements to
document the Business Requirements Specifications.
Used the ramp-up and ramp-down features in LoadRunner to initiate virtual user actions at specified time intervals to emulate real-world scenarios.
Used IP spoofing to ensure that each Vuser uses a unique IP address.
Extensively used the ODBC protocol and was involved in resolving related issues.
Involved in troubleshooting the Load Runner scripts and scenarios.
Created report formats and generated reports in Test Director.
Environment: LoadRunner, Test Director, UNIX, MS Office, WebSphere, iSeries, Windows NT, Visual Basic, Oracle
Deutsche Bank, NY Jun ’06 to Dec ’06
Performance Engineer
Responsibilities:
Involved in testing AP, AR, and GL applications.
Generated web virtual load using the Web (HTTP/HTML) protocol, baselined test scripts and test data, executed tests and reported results, and provided adequate supporting information for bottleneck analysis.
Analyzed test requirements; provided ongoing reporting to senior management; planned for appropriate resources (H/W, S/W, and people); coordinated with relevant groups for setting up test data; was responsible for ensuring test environment availability, issue resolution, and coordination with other groups; prioritized testing efforts based on the release plan; analyzed and interpreted test results; and finalized the test plan and obtained approvals.
Identified, prioritized, and designed test scenarios. Designed a test harness for simulating external interfaces, provided inputs for performance test data setup, participated in the test team developing test cases and test scripts (using an automation tool), sequenced performance test efforts (individual transaction performance, mixes of transactions with constant variations, stress tests, and endurance tests), and supported the test team in identifying bottlenecks and recommending solutions.
Involved in test Script creation, Environment Setup, Test Execution, Test Result
Reporting, Test Result Analysis, Application and System Tuning of various systems,
applications, servers and databases.
Used extensive correlation and parameterization techniques in generating and standardizing LoadRunner test scripts.
Verified test environment stability and accuracy, set up specific test data, performed backup and restore of the database prior to and during testing, and supported monitoring and identifying bottlenecks in Oracle.
Performed batch job testing; monitored elapsed times and CPU usage, collated statistics, and identified bottlenecks.
Executed test plans and test scripts using QTP.
Used Test Director for documentation and bug tracking.
Environment: WinRunner, QTP, Oracle Applications 11.5.7 (AP, AR, GL), Silk, AS/400, Test Director, Oracle, HTML, JavaScript, JSP, Windows 2000, WebLogic
United Nations, New York Jan ’98 to March ’06
Software Developer/Analyst
Responsibilities:
Involved in the development of Software Requirement Specifications, which included
functional and system design specifications and constraints.
Reviewed, Tested and Verified the Standard Operating Procedures written for testing the
application.
Generated graphs and studied them to monitor the software performance and network
bottlenecks.
Prepared a technical background report of the system documenting the tools, the environments required to install them, installation procedures, and configuration.
Created and modified test scripts by inserting logical commands to handle complicated
test scenarios for performance testing.
Worked as a member of the QA team, responsible for system testing before deployment.
Documented test requirements for Perfect Care using Test Director.
Configured the project settings for Perfect Care project, created user group and user ids
for the team members using Site Administrator.
Participated in walkthroughs and technical reviews throughout the testing phase.
Created and executed detailed test cases using Test Director.
Generated test scripts for functional and regression testing using WinRunner.
Enhanced test scripts using user defined functions, parameterization and logical
statements.
Conducted date based testing, extracted test data from tables by using database functions.
Created Batch Scripts and performed batch testing.
Used the Test Director bug tracking tool to report application bugs and enhancement requests and discussed them with developers to resolve technical issues.
The software allows only the admin user to design an identity card of their choice, with user authentication provided at every level.
The admin can create fixed fields and variable fields and design the ID card layout with the card software.
The designed card's variable data, including images, can be stored in the database server with the help of a data entry package.
Conducted Regression testing for defect fixes and enhancements.
Prepared a Feasibility and Evaluation report for the modules and plans of the software.
Environment: Test Director, Win Runner, Visual Basic, SQL Server, Windows NT, JavaScript,
ASP, UNIX, AS/400
Education:
Bachelor of Engineering in Computer Science, India
Master's in Information and Systems.