
Performance Test Engineer

Location: Mechanicsburg, PA


Devi Chunduri

484-***-****

adg83d@r.postjobfree.com

Visa Status: US Citizen

Summary:

Around 10 years of performance testing experience in domains including the public sector, e-commerce, healthcare, banking (credit cards), insurance, and the federal sector.

Proficient in performance testing and analysis of a variety of applications, including mainframe, SAP, web, ERP, and client-server applications.

Expertise with automated tools such as LoadRunner, NeoLoad, HP Performance Tester, and HP Quality Center, and in testing applications developed in the .NET Framework, Java, VB, Oracle, and SQL Server on Windows and Linux platforms.

Expertise in preparing master test plans, formulating test scenarios, and preparing traceability matrices from requirements and test case documents.

Expertise with monitoring tools such as Wily Introscope and SiteScope.

Worked closely with developers and business analysts to create scripts that emulate actual business processes for documentation and testing.

Good knowledge and understanding of various testing methodologies, such as Agile and Waterfall, and their life cycles.

Strong experience with client/server and web-based business applications, including test planning, test case development, test case reviews, test data preparation, test setup, test execution, test analysis, and defect reporting and tracking.

Expertise in System Testing, Functional Testing, Integration Testing, Performance Testing, Regression Testing and User Acceptance Testing (UAT).

Experience working in close association with release management, DB, server, and application development teams.

A dedicated, hardworking individual with the interpersonal and communication skills to work with all levels of an organization, along with excellent documentation skills.

Certification/Training

Neotys (NeoLoad) Certified Professional

https://www.linkedin.com/in/jyothsna-chunduri-1210041ab

Professional Experience:

Deloitte Consulting, Harrisburg, PA

Client: Pennsylvania Department of Human Services

September 2014 – Present

Performance Testing SME

Project:

The PA Department of Human Services offers a wide range of public services, including client eligibility across Medicaid/Medical Assistance, SNAP, TANF, child care, child welfare, the school lunch program, LIHEAP, child support, and more, and is aimed at improving the quality of life for Pennsylvanians. IT Shared Services (ITSS) provides a comprehensive approach to delivering centralized IT services to DHS in a consistent, efficient, and financially attractive manner. Load testing services are an integral part of ITSS, providing a wide range of services across application performance and the identification and resolution of performance bottlenecks, with the ultimate goal of helping DHS improve their applications' end-user experience and increase user efficiency. ITSS supports various application suites across CIS, HCSIS, PELICAN, the PA Child Support Enforcement System (PACSES), CWIS & CAPS.

Environment:

Neotys NeoLoad 5.1, Oracle 11g/12c/19c, .NET, IIS 8.0, LoadRunner 12.6/11.5, Microsoft TFS, SightLine, Windows 2012/2016, Linux, JIRA, SmarterTrack, ServiceNow, Documentum, ImageTrust, Captiva, MTM, Splunk, JMeter, JSON, AviCode, Salesforce, SOA web services

Responsibilities:

Responsible for performance testing activity across multiple projects (the CIS, HCSIS, PELICAN, PACSES, CWIS & CAPS applications), spanning both work orders and large maintenance releases.

Responsible for planning of test executions with multiple stakeholders by providing information regarding technical constraints, dependencies or obstacles to successful execution of load tests.

Responsible for driving the triage of performance testing issues encountered during execution.

Prepared PowerPoint decks and presented load test results to the client.

Active participant in all phases of the project lifecycle. Work with project stakeholders during project definition to help the team understand risks, dependencies, and opportunities; participate in requirements definition and review; lend subject matter expertise to projects.

Attended and drove weekly calls with client stakeholders, providing agendas, status updates, next steps, etc.

Worked on both waterfall and Agile projects (weekly and two-week sprints), with technologies ranging across .NET, Java/J2EE, Salesforce, etc.

Worked closely with Application team developers, DBAs, Network team, Business analysts and TESS (Testing shared services team)

Responsible for client coordination on DB refreshes, load-balancing configuration changes, firewall changes, rollbacks, etc.

Responsible for analyzing the performance test needs of various applications and initiatives, including deliverables, timelines, etc.

Provided inputs for Performance testing risk analysis and mitigation strategies.

Reviewed, monitored, and summarized the progress of performance testing cycles, and prepared reports on project status.

Worked with the offshore team on scripting, execution, and other activities as applicable.

Created and maintained numerous scripts in LoadRunner and NeoLoad using the Web (HTTP/HTTPS/HTML) and Web Services protocols across the CIS, HCSIS, PELICAN, PACSES, CWIS & CAPS applications.
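
As a minimal illustration of the Web (HTTP/HTML) scripting described in the previous bullet, a LoadRunner VuGen action typically wraps each business step in a transaction. This is a sketch only; the transaction name and URL below are hypothetical placeholders, not actual DHS endpoints.

    Action()
    {
        // Time the landing-page step as its own transaction (name is illustrative)
        lr_start_transaction("CIS_01_OpenHomePage");

        web_url("HomePage",
            "URL=https://cis.example.state.pa.us/",
            "Mode=HTML",
            LAST);

        lr_end_transaction("CIS_01_OpenHomePage", LR_AUTO);

        return 0;
    }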

Created and executed numerous load tests across various major, upgrade, and patch releases using the NeoLoad Controller.

Used Micro Focus Performance Center for building scenarios and executing load tests.

Responsible for data setup for various performance testing needs

Responsible for validation of test scripts post Application code deployments.

Performed execution of larger integrated performance tests involving multiple team members and stakeholders.

Facilitated collaboration between performance testing team members and development teams.

Reviewed, aggregated, and synthesized data collected in test executions into a format that non-technical stakeholders can use to make informed decisions based on test outcomes.

Responsible for providing input on relevant performance test scenarios

Designed, created, and maintained assets for performance test scenarios and load test profiles to support a variety of load test executions.

Used SightLine and Wily Introscope to perform deep-dive diagnostics on system resources.

Held working sessions with the development, database, and network teams to understand the systems, modules, and application flows.

Maintained and monitored the status of issues discovered as a result of performance testing.

Executed pre-test “smoke testing” to ensure successful performance test executions.

Reviewed test metrics and post-test data conditions to identify issues (e.g. functional errors, data corruption, SLA violations, etc.)

Performed correlation of JSON responses using regular expressions.
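
A minimal sketch of JSON correlation with a regular expression in LoadRunner, assuming the web_reg_save_param_regexp API (available in LoadRunner 11.x and later); the sessionId field, parameter name, and endpoints are illustrative only.

    // Register the capture before the request that returns the JSON body
    web_reg_save_param_regexp(
        "ParamName=sessionToken",
        "RegExp=\"sessionId\":\"(.*?)\"",
        LAST);

    web_custom_request("GetSession",
        "URL=https://app.example.gov/api/session",
        "Method=GET",
        "Resource=0",
        LAST);

    // Replay the captured value in a later request
    web_custom_request("GetCaseList",
        "URL=https://app.example.gov/api/cases?session={sessionToken}",
        "Method=GET",
        "Resource=0",
        LAST);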

Used JIRA for task management and defect tracking

Participated in daily scrum calls and sprint ceremonies to share performance results as needed.

Monitored the performance of the web/app/DB/SVC/COR servers as part of performance testing.

Created/tracked defects in TFS (Team foundation server) based on load test findings and agreement across stakeholders

Responsible for coordinating the baseline and rollback of the databases for load test runs.

Responsible for creating and maintaining load test data for various load test runs.

Responsible for sending status updates & results reports during and after the test runs

Analyzed the results of load test runs and generated KPI and workflow reports for the load tests executed.

Performed analysis and identification of application and infrastructure performance bottlenecks

Responsible for generating load test results and publishing them to stakeholders.

USPTO (United States Patent and Trademark Office), Alexandria, VA

July 2013 – September 2014

Performance Testing Analyst

Project:

The United States Patent and Trademark Office (PTO or USPTO) is an agency in the U.S. Department of Commerce that issues patents to inventors and businesses for their inventions and registers trademarks for product and intellectual property identification. There are a number of applications developed on the Java J2EE platform covering patent search, patent processing, classification, litigation, statistics, etc.

Environment:

HP LoadRunner 11.52, HP Performance Center 11.52, HP Quality Center 11.5, HP SiteScope 11.24, CA Wily Introscope 9.0, TeamQuest, OpenNMS, Citratest, VMware, Linux, PuTTY, Java J2EE, SAP HR, FICO, Wireshark, Oracle 11, Windows 2013, UNIX

Responsibilities:

Prepared test plans and test cases based on requirements.

Part of a shared services team serving a number of application teams and projects at any given time.

Created a number of scripts using the Web (HTTP/HTML), ERP (SAP GUI, SAP Web), Citrix, Web Click & Script, and Web Services protocols.

Created and executed various load tests using HP Performance center

Responsible for creating and monitoring various monitors for multiple applications, including script monitors, URL monitors, URL content, URL sequence, port monitoring, and media file monitoring.

Performed manual correlation of dynamic values across various applications
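
A minimal sketch of the manual correlation pattern referred to in the bullet above, using LoadRunner's web_reg_save_param with left/right boundaries; the boundary strings, parameter name, and URLs are illustrative assumptions, not USPTO specifics.

    // Register the capture before the response that contains the dynamic value
    web_reg_save_param("csrfToken",
        "LB=name=\"csrf_token\" value=\"",
        "RB=\"",
        "Notfound=warning",
        LAST);

    web_url("SearchPage",
        "URL=https://patents.example.gov/search",
        "Mode=HTML",
        LAST);

    // Use the captured value in the follow-up request
    web_submit_data("RunSearch",
        "Action=https://patents.example.gov/search/results",
        "Method=POST",
        ITEMDATA,
        "Name=csrf_token", "Value={csrfToken}", ENDITEM,
        "Name=query", "Value=classification", ENDITEM,
        LAST);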

Adopted and updated the performance testing SOP (Standard Operating Procedures) document based on changes and application needs.

Monitored the performance of the application and web servers, including IBM WebSphere, Tomcat, and JBoss, in terms of garbage collection, front ends, back ends, JSPs, servlets, EJBs, JDBC, etc.

Used TeamQuest for monitoring system resources such as CPU by workload, memory, network, and disk I/O.

Responsible for Data generation for various applications

Monitored the scenario runs using various online monitors in LoadRunner.

Responsible for baseline and rollback of the databases after the load test run

Conducted several load tests, such as one-hour peak production load, reliability, and stress tests, to identify performance issues.

Responsible for installation of all necessary patches for VM boxes

Maintained an exclusive Load Test Database instance dedicated to Load Testing activity.

Analyzed the results of load test runs.

Generated KPI and workflow reports for the load tests executed

Conducted endurance tests to identify memory leaks.

Conducted scrum calls and triages for defects under resolution

Responsible for generating load test results and publishing them on the internal portal.

Executed groups of scripts to run different types of tests, such as load tests, stress tests, and duration tests.

Gathered the results from each test run and conducted in-depth analysis on the transaction response times, network latency and the performance of each server

Found performance degradation issues such as "Out of Memory" errors, memory leaks, and transaction rollbacks, and improved thread pool utilization and JDBC connection pool sizing.

Entered defects in QC and updated status for current defects

Involved in setting up the test data for the individual modules for the load test

Walt Disney World, Orlando, FL

April 2012 – July 2013

Performance Testing Lead

Project:

Team Disney is the IT department for Walt Disney World. There are about 40 different applications deployed across multiple platforms, ranging from online reservations and ticketing to resorts. Applications are built and tested across J2EE, SAP, Siebel, etc. The Operational Readiness Testing (ORT) team performs testing activities across four different environments and is a shared service that does performance testing and sign-off for all of the applications involved.

Environment:

HP Performance Center 11.52, LoadRunner 11.52, Quality Center, CA Wily Introscope, SiteScope, PerfMon, Java J2EE, IBM WebSphere 7.x, Oracle 11, QTP, .NET, Windows 2008 servers, UNIX

Responsibilities:

Adopted and updated the performance testing SOP (Standard Operating Procedures) document based on changes and application needs.

Worked on multiple Applications at any given time (6-8)

Created a number of scripts using the Web (HTTP/HTML), Ajax, SAP GUI, SAP Web, and Citrix protocols.

Executed different scenarios using HP Performance center

Generated and passed dynamic data across scripts using VTS (Virtual Table Server).
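
A minimal sketch of passing data between scripts through VTS, assuming the lrvtc_* Virtual Table Server API; the server address, port, column name, and parameter names are placeholders.

    // Producer side: connect to VTS and push a value into the "OrderID" column
    lrvtc_connect("vts.example.com", 8888, VTOPT_KEEP_ALIVE);
    lrvtc_send_message("OrderID", lr_eval_string("{newOrderId}"));

    // Consumer side: pop the next value; it becomes available as the {OrderID} parameter
    lrvtc_retrieve_message("OrderID");
    lr_output_message("Consuming order id: %s", lr_eval_string("{OrderID}"));

    lrvtc_disconnect();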

Correlated large dynamic values such as __VIEWSTATE, __EVENTVALIDATION, etc.
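
A minimal sketch of correlating the ASP.NET __VIEWSTATE and __EVENTVALIDATION hidden fields with boundary-based captures; the boundaries and page URL are illustrative and depend on the actual HTML of the application under test.

    web_reg_save_param("ViewState",
        "LB=id=\"__VIEWSTATE\" value=\"", "RB=\"", "Notfound=warning", LAST);
    web_reg_save_param("EventValidation",
        "LB=id=\"__EVENTVALIDATION\" value=\"", "RB=\"", "Notfound=warning", LAST);

    web_url("ReservationPage",
        "URL=https://reservations.example.com/Default.aspx",
        "Mode=HTML",
        LAST);

    // Replay both captured values in the postback
    web_submit_data("SubmitReservation",
        "Action=https://reservations.example.com/Default.aspx",
        "Method=POST",
        ITEMDATA,
        "Name=__VIEWSTATE", "Value={ViewState}", ENDITEM,
        "Name=__EVENTVALIDATION", "Value={EventValidation}", ENDITEM,
        LAST);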

Created a number of Load scripts for Data seeding purposes.

Created scripts using web_custom_request.
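
A minimal web_custom_request sketch for a raw JSON POST, of the sort used when recorded steps cannot be replayed with the standard form functions; the endpoint and body fields are hypothetical.

    web_custom_request("CreateTicketOrder",
        "URL=https://tickets.example.com/api/orders",
        "Method=POST",
        "EncType=application/json",
        "Body={\"guestId\":\"{guestId}\",\"ticketType\":\"ParkHopper\",\"quantity\":2}",
        LAST);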

Performed manual correlation without relying on the Correlation Studio feature of LoadRunner VuGen.

Involved in performing volume testing based on the production volumes and cycles.

Responsible for creating the Load Distribution tables for various scripting modules involved.

Responsible for coordinating the Batch processes alongside the Online performance testing efforts.

Responsible for creating the scenario mix and various runtime configurations for the individual scripts that are part of the mix.

Responsible for monitoring the web/app servers, database servers, Java servers, etc.

Maintained an exclusive Load Test Database instance dedicated to Load Testing activity.

Responsible for generating load test results and publishing them on the internal portal.

Created and maintained monitors in Sitescope for Web/App/DB Servers.

Executed groups of scripts to run different types of tests, such as load tests, stress tests, and duration tests.

Gathered the results from each test run and conducted in-depth analysis on the transaction response times, network latency and the performance of each server

Monitored CPU, Memory and throughput using different monitoring tools such as Wily IntroScope, PerfMON while running different types of tests

Tuned servers for memory and CPU as per the requirement and test result analysis

Found performance degradation issues such as "Out of Memory" errors, memory leaks, and transaction rollbacks, and improved thread pool utilization and JDBC connection pool sizing.

Entered defects in QC and updated status for current defects

Monitored the scenario runs using various online monitors in LoadRunner.

Analyzed the results using LoadRunner Analysis Graphs.

Coordinated with developers to discuss bugs and enhance the application.

Involved in setting up the test data for the individual modules for the load test

Conducted several load tests, such as one-hour peak production load, reliability, and stress tests, to identify performance issues.

AmerisourceBergen, Valley Forge, PA

November 2011 – April 2012

Performance Test Analyst

AmerisourceBergen Corporation is one of the world's largest pharmaceutical services companies, serving global markets with a focus on the pharmaceutical supply chain. Servicing both pharmaceutical manufacturers and healthcare providers, the company provides drug distribution and related services designed to reduce costs and improve patient outcomes. There are a number of applications built on Microsoft .NET, Java J2EE, and mainframes. Headquartered in Valley Forge, PA, AmerisourceBergen has locations around the world. In the US, ABC handles about 20% of all pharmaceuticals sold and distributed throughout the country.

Environment:

LoadRunner 12.6/11.52/11.0/9.5/9.0/8.1/8.0/7.5, HP Quality Center 9.0, HP Performance Center 9.5, DB2, HP QTP 9.x, HP BAC, Microsoft .NET, ERP (SAP ECC, SD, PS, BI), Java J2EE, Mainframe, 7.1, BEA WebLogic 8.1, J2EE Diagnostics 3.5, Web Services, JavaScript, IIS 6.0/5.0, COM+, CA Wily Introscope 7.x, Oracle 10g/9i, DB2, TOAD, Windows 2003K, PC Anywhere, Terminal Services Manager and Client.

Responsibilities:

Worked closely with Business Analysts and Developers to gather Application Requirements and Business Processes in order to formulate the test plan.

Hands-on experience in web testing, including modules with a high level of security enabled.

The performance testing and monitoring framework was implemented mainly using HP LoadRunner.

Developed Vuser scripts using the HTTP/HTML, Web Services, and SAP protocols in VuGen within Performance Center.

Involved in updating QA test plans based on functional requirements, use cases, user interface designs, system design documents and domain knowledge

Created high level strategy documentation and detailed test documents.

Continually enhanced the test cases for the whole system.

Conducted Load Testing using LoadRunner for response time monitoring.

Performed Usability Testing Manually.

Deeply involved in unit testing, integration testing, performance testing, system testing, and User Acceptance Testing (UAT).

Generated virtual users to ensure multi-user and multi-session logging, and analyzed the results.

Used Quality Center for bug reporting, tracking and documentation on the Bug tracking System.

Managed and edited additional information for the bugs in the Defect Tracking system and helped developers to track the problem and resolve technical issues.

Maintained strong relationships with developers which helped in better triaging and narrowing down the bugs.

Used HP SiteScope for monitoring URLs, servers, networks, etc.

Participated in regular meetings with developers for reviews and walkthroughs.

Responsible for ensuring the usability of the application, the navigation and graphical interfaces, and database integrity by performing extensive smoke, functional, integration, regression, and data-driven testing.

Reported and reviewed defects in Mercury Quality Center.

Responsible for preparation of Test results reports

Generated Deliverables including scripts, Results, analysis reports and Execution matrix

Worked on SOA-based applications; tested SOAP-based web services in SoapUI.

L3 Communications, Salt Lake City, UT

July 2010 - Oct 2011

Title: Performance QA Analyst

Encompass: This project includes a number of supply chain and logistics applications deployed across Java J2EE and SAP. The implementation covers SAP ECC, SAP ME, PS, MM, SAP FI, OpenText, Dassian, and Teamcenter Unified. Each of these systems will be a data source, and approximately 60 reporting gaps have been identified for this project. Many of these gaps will require data from more than one system in the report.

Environment:

HP LoadRunner 11.0/9.5, HP Performance Center 11.0/9.5, Java J2EE, IBM WebSphere, Web Services, SAP ECC, SAP ME, PS, MM, SAP MRP, BW, BOBJ, OpenText, Dassian, Teamcenter, HP Quality Center 10.0, CA Wily Introscope, SiteScope 10.x, Oracle 11g/10g, Windows 2003K.

Roles & Responsibilities:

Worked with Business Analysts, SMEs and track leads and got an understanding of the business and various workflows, performance requirements and SLAs.

Checked the release deployed to the performance testing environment and its associated parameters before starting test runs.

Understood the functionality of various SAP Modules including SAP PS, MM, SD, FI etc.

Understood and digested the business requirements and prepared the performance test plan and test scenarios.

Created Data seeding scripts for Data generation

Created Test Agenda based on the Performance testing goals for various sub systems

Created and updated scripts for various modules across protocols ranging from Web HTTP, SAP, Web Services, and Web Click & Script to Citrix.

Updated the BPPs (business process documents) when necessary to reflect the test scripts and requirements.

Incorporated the necessary logic using C functions and captured the output from the load test scripts.
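
A minimal sketch of the kind of C logic embedded in a VuGen script, assuming a value was captured earlier into an {orderCount} parameter; the parameter and message names are illustrative only.

    int order_count;
    char summary[128];

    // Convert the correlated value (captured earlier) into an integer
    order_count = atoi(lr_eval_string("{orderCount}"));

    if (order_count == 0) {
        lr_error_message("No orders returned for user %s", lr_eval_string("{userId}"));
    } else {
        // Build a summary string and expose it as a parameter for later steps
        sprintf(summary, "Processed %d orders", order_count);
        lr_save_string(summary, "resultSummary");
        lr_output_message("%s", summary);
    }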

Ran smoke tests to make sure the code and migration of the Release in question is acceptable

Executed a number of Performance tests based on the goals set

Paged the respective application and middleware teams regarding issues seen during performance tests and environment availability issues.

Uploaded and saved the scripts to HP Quality Center.

Tracked the status of scripts and assignments in the SharePoint portal.

Performed monitoring of various servers and databases and responded to Sitescope alerts for performance of Java applications and SOA

Created and updated defects in HP Quality Center.

Set up tests and test sets in HP Performance center for various testing activities

Truncated logs on various servers before the start of a new performance test

Generated test status reports with KPIs and defects, along with performance goals and call logs.

Worked closely with the Functional SMEs, DBAs and the Developers group in tuning the application and the Database.

Helped at times in scheduling and kicking off load tests via HP Performance Center/Controller, involving a variety of load combination scenarios.

Followed best practices and performed peer reviews.

Responsible for communicating testing status for various engagements.

Responsible for reviewing load test results and ensuring proper communication across various stakeholders.

Worked weekends and night hours to support various testing activities.

Responsible for analyzing the load tests run and presenting the results to management.

Generated load test reports and handled their distribution and publishing.

Shared knowledge and helped the team with troubleshooting of scripting and other performance testing activities

Technical Skills:

Testing Tools: HP Performance Center 12.62/11.52/11.0/9.5, LoadRunner 11.52/9.5/8.0, WinRunner 7.0/7.5, SiteScope 10.x, Citratest

Management Tools: HP Quality Center 11.5

Operating Systems: MS-DOS, UNIX, Linux, Windows NT/2000/9x/XP

Application Software: Oracle, SQL Server, MS Access

Web Development: HTML, VBScript

Languages: C, SQL, PL/SQL

Databases: DB2, Oracle 11i/10/9/8i, Sybase 11.x, SQL Server, MS Access

ERP: SAP, PeopleSoft

Tools/Applications: SiteScope, Wily Introscope, TeamQuest, OpenNMS, SharePoint, Visio

Business Tools: MS Word, MS Excel, MS PowerPoint, MS Project

Web Technologies: J2EE, HTML, JavaScript, ODBC, ActiveX, VBScript, ASP, LDAP, SiteMinder, SOAP, Web Services

Application/Web Servers: IIS 6.0/5.0, Apache, WebLogic 8.x/7.x/6.x/5.x, IBM WebSphere 7.x, JBoss, Tomcat

Education

Sri Subbaraya & Narayana College, A.P., India

Bachelor's in Computer Science

Post Graduate Diploma in Computer Applications
