Sameera S
Performance Engineer
********.**@*****.***
PROFESSIONAL SUMMARY:
An IT professional with 7+ years of experience in Performance Testing & Engineering of web-based, SOA, client-server, and middleware applications. Hands-on experience in software testing of web-based, client-server applications and web services. Highly proficient in performance testing using HP LoadRunner, Performance Center, JMeter, and BlazeMeter.
TECHNICAL EXPERIENCE:
Worked across various domains, including Financial, Insurance, Airlines, and Retail/Travel, with a unique combination of skills in solving complex quality assurance challenges and implementing solutions that work.
Comprehensive experience using HP LoadRunner and JMeter for Performance, Stress, Load, Longevity/Endurance, Benchmark/Scalability, and other performance-related testing.
Strong knowledge of the Performance Test Life Cycle and Software Development Life Cycle (SDLC), and well acquainted with the Software Testing Life Cycle (STLC).
Rich experience in gathering non-functional requirements (NFRs), test planning, scripting, performance test execution, and collating, analyzing, and publishing results.
Experience with the components of JMeter and of LoadRunner, i.e. VuGen, Controller, Load Generator, and Analysis.
Experience working with various protocols such as HTTP/HTML, Web (Click & Script), Web Services, AJAX Click and Script, Siebel Web, RTE, FLEX, SAP-Web, SAP-GUI, and Citrix.
Widespread experience with Enterprise Java (J2EE), JSP, JavaScript, jQuery, Spring, and Hibernate. Used JMeter, HP LoadRunner, SOASTA CloudTest, LoadStorm, Performance Center, and HP Quality Center (QC).
Hands-on experience with monitoring tools such as HP SiteScope, CA Wily Introscope, OpTier, Splunk, Teleran, Dynatrace, and AppDynamics.
Good skills in SQL statements, database connectivity, Oracle 10g, configuring TNS files, and connecting through TOAD.
Good at debugging scripts by running them within VuGen with Runtime Settings, and at performing IP spoofing using LoadRunner to replicate real-world, production-like scenarios.
Hands-on experience creating dashboards in Splunk as an administrator and in other monitoring tools.
Pinpointed issues and bottlenecks and presented them to the development team.
Expert knowledge of identifying and analyzing performance testing bottlenecks: web throughput, server response time, and network latency.
Expertise in testing Web/J2EE technologies, .NET, middleware, web services, APIs, and customer-facing applications.
Experience using JConsole and VisualVM for Java monitoring to identify bottlenecks in heap memory, GC behavior, and JVM sizing.
Excellent understanding of the functionalities of QA in Development, Enhancement and Maintenance of software applications and Software Development Life Cycle (SDLC).
Experienced in using LoadRunner and JMeter to develop scripts and execute various performance tests such as baseline, benchmark, load, stress, endurance, and failover tests.
Good experience using APM tools such as Dynatrace, Wily Introscope, HP SiteScope, and AppDynamics to monitor applications and identify performance issues.
Used Performance Center/ALM to manage LoadRunner scripts and scenarios. Customized LoadRunner scripts in C using string-manipulation functions, C libraries, and other custom coding.
Experienced in monitoring CPU, memory, and network, and in analyzing performance bottlenecks such as long-running SQL queries and memory leaks.
Experienced in analyzing scenario results using LoadRunner, viz. online graph analysis and reporting, network delay, client delay identification, I/O delays, transaction time, CPU and memory usage, and miscellaneous server-level issues.
Performed JVM tuning of garbage collection, a key aspect of Java server performance, and revised JVM heap sizes after analyzing application performance.
Excellent oral and written communication skills, including the ability to read and process complex technical information.
Expert in deliverables such as Test Reports and Test Analysis (Weekly Status Report, Work Breakdown Structure, Defect Trend, etc.).
Good experience in engaging with business contacts and stakeholders for requirements gathering, architecture review and results analysis.
EDUCATIONAL QUALIFICATIONS:
Bachelors in Information Technology from JNTU: Aug 2006 – July 2010
Masters in Engineering Management from SCSU: Aug 2013 – May 2015
TECHNICAL SKILLS:
Operating Systems:
Windows XP/Vista/7/10, iOS, AIX, UNIX, Linux
RDBM/Databases:
Microsoft SQL Server, Microsoft Access, Oracle Database, DB2.
Cloud Testing:
HP StormRunner Load cloud testing, AWS & Azure, SOASTA CloudTest Lite.
Programming Languages:
C#, Java, C, C++, Visual Basic, .NET, PHP, Go
Tools:
Micro Focus LoadRunner 8.0/9.5/11.0/11.50/12.02, HP Performance Center 11.0/11.5/12, HP ALM, HP Quality Center, JMeter 2.0–2.10, SoapUI, QTP, JIRA v6.1.3, TOAD, JProfiler
Monitoring tools:
Performance Center, Wily Introscope, HP SiteScope, Dynatrace, OpTier, Teleran, HP Diagnostics, Transaction Viewer, Splunk, OEM, AppDynamics
Application Packages:
MS Office, Microsoft SharePoint Server, Adobe Photoshop CS5, Dreamweaver, Flash, Illustrator, InDesign.
LoadRunner Protocols:
Web (HTTP/HTML), Web Services, MQ Client/Server, Citrix ICA, Web/WinSock dual protocol, SAP Web, Ajax TruClient, GUI, Click & Script, Siebel, Oracle.
PROFESSIONAL EXPERIENCE:
Client: 3M-HIS, Salt Lake City, UT Duration: Feb 2018 – Present
Role: Performance Engineer
Responsibilities:
Design, configure and implement the Performance Test environments to conduct Stress/Load/Endurance testing.
Coordinate with Project Teams & Stakeholders to gather Performance Testing Requirements.
Create scripts using the LoadRunner Web HTTP/HTML, Mobile Web, and Web Services protocols for testing multiple applications.
Modify existing LoadRunner VuGen scripts to accommodate new builds of the application.
Execute different performance test scenarios like Load, Stress, Volume and Endurance tests according to Test Plan document.
Extensively use JMeter for web services testing as well as web application testing.
Conduct work group meetings across all areas of the product organization to identify, prioritize, and mitigate risks to the responsiveness and scalability of our offerings.
Follow the Agile (Scrum) process; performance validation follows a 'Work Done & Ready to Go' approach from release to release and, specifically, sprint by sprint.
Organize status meetings with stakeholders for performance testing in the project. Ensure the processes and content of all performance testing artifacts are documented, maintained, and transitioned to the client teams per the client's retention and transition policy.
Responsible for monitoring infrastructure behavior using AppDynamics during load test execution to identify performance bottlenecks, if any. Use AppDynamics to monitor memory pools, transactions, stack traces, and other performance counters across all tiers of the architecture.
Use AppDynamics to monitor end-user experience, overall application performance, business transaction performance, and application infrastructure performance.
Worked on building Load Test Scripts from scratch based on Business Requirements.
Worked on modifying Load Test scripts using Correlation, Parameterization, C Programs etc. to replicate Business Functionalities.
Use JVisualVM to monitor the JVM for CPU, heap, GC, and thread behavior, and monitor I/O using UNIX commands such as top, vmstat, nmon, and netstat while the system is under test.
Schedule test result review meetings with project teams to walk through test reports and discuss the performance bottlenecks identified.
Involve in defect tracking, reporting and coordination with various groups from initial finding of defects to final resolution using JIRA defect tracking tool.
Prepare Final Performance Report by consolidating all the data gathered from the Performance Tests.
Create progress reports on projects and report status to the Project Manager in a timely manner as part of the team's process.
Publish the final reports to all stakeholders of the project.
Environment: HP LoadRunner 12.50/12.53, JMeter 3/5, HP Performance Center, Quality Center, HP ALM 12, JVM, AppDynamics, UNIX, vmstat, nmon, netstat, Application Servers, Tomcat Servers, WebLogic, Web Servers, Oracle 11g, TOAD, SQL Developer, Message Queue Servers.
Client: United Airlines, Houston TX Duration: May 2016 - Jan 2018
Role: Performance Engineer
Responsibilities:
Organize business meetings to understand the application and decide whether or not to conduct a performance test.
Analyze Business Requirements, Non-Functional Requirements and Functional Requirement Documents to develop Performance Test plan and Identify Business Scenarios for Performance Testing.
Responsible for preparing the workload model by grouping together various critical business processes for performance testing, and for getting sign-off from the stakeholders.
Develop VuGen scripts using the LoadRunner Web HTTP/HTML, TruClient, SAP, and .NET protocols for testing web applications, and the Web Services protocol for testing web services, Oracle, SOA, mobile web applications, and mobile native applications.
Created and analyzed heap and thread dumps, helping the development team fine-tune application performance.
Highly involved in monitoring middleware application server's performance metrics like Thread count, JVM heap size, queue size etc.
Execute JMeter scripts for high-capacity load tests. Extensively used JMeter for performance testing of SOA, web services, and APIs.
Analyzed results using the LoadRunner Analysis tool based on transactions per second, average response times, and resource usage to meet the SLAs (Service Level Agreements).
Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test report.
Develop and implement Load and Stress tests with LoadRunner and present performance statistics to application teams, and provide recommendations on the issues that impact performance.
Monitor and administer hardware capacity to ensure the necessary resources are available for all tests.
Performed online monitoring of Disk, CPU, Memory and Network usage while running the load test.
Reported various performance analysis graphs and reports collected from performance tools and discussed bottlenecks such as memory leaks, JVM heap, CPU utilization, network time, page refresh time, and page rendering time.
Utilized application server, database, network, and WebLogic monitors during execution to identify bottlenecks, bandwidth problems, and infrastructure, scalability, and reliability benchmarks.
Configured and used Dynatrace for performance monitoring; troubleshot bottlenecks found during performance testing by analyzing response times and profiling the application to find where the performance issue originates.
Performed infrastructure monitoring using Dynatrace and application monitoring using End User Management and Business Availability Center (BAC). Responsible for setting up user profiles and for configuring and adding application servers in the Dynatrace tool.
Proficient in QA methodologies, the Quality Assurance Life Cycle (QALC), the Defect Management Life Cycle, and the Software Development Life Cycle (SDLC).
Managed a team of 10 to 12 members, with the ability to implement and drive the performance testing process across multiple projects.
Ensure sufficient level of stakeholders’ participation in all phases of the Performance Testing life cycle. Ensured appropriate stakeholders’ signoff is obtained where required on test artifacts and exceptions.
Environment: Performance Center 12.02/ALM, LoadRunner 12/12.02, HP StormRunner, JMeter, Dynatrace 6.1, Splunk 6.2, Introscope, WebLogic 11g, Java 8, Amadeus Interface Systems, Web Cloud, JBoss, SQL DB, SOAP, REST, Wireshark, PL/SQL, AppDynamics, HP Diagnostics, JProfiler, Tomcat, PuTTY, Windows NT, TCP/IP, AIX (UNIX), Linux, Toad.
Client: Ally Financial, Detroit, Michigan. Duration: May 2015 - April 2016
Role: Performance Tester
Responsibilities:
Gathered business requirements, collected Service Level Agreement information from Business Analysts and developers, and identified the test cases for performance per client requirements.
Performance Tested SOA Based application using Web Services Protocol. Executed tests using HP Performance Center.
Designed a variety of scenarios for baseline, benchmark, load, regression, stress, steady-state, and endurance testing.
Performed load and stress testing using JMeter to assess and improve the application performance.
Created scripts for the application using HP LoadRunner and JMeter; parameterized the data and enhanced the scripts according to the requirements and scenario design.
Analyzed results and provided developers, system analysts, and application architects with information resulting in performance tuning of the application.
Analyzed database AWR reports and identified queries consuming excessive resources or exhibiting high response times.
Worked with Database Admin team in tuning the Database Queries by providing all the findings from load test execution.
Ran performance tests from the standalone Controller machine.
Extensively involved in distributing load across the load generators while running tests on the Controllers, taking the frequency of business transactions, number of users, number of mdrv processes to launch, number of scripts, and hardware configuration of each LG into consideration.
Dynamically updated scripts and data files while tests were running.
Executed web services tests using the Web and Java protocols, and wrote custom code to capture the associated request and response, which helps in tracking issues when a service fails.
Responsible for analyzing applications and components behavior with heavier loads and optimizing server configurations.
Created scenarios and ran various tests (baseline, peak hour, duration, breaking point, failover/fallback) in Performance Center.
Analyzed results using the HP LoadRunner Analysis tool and analyzed Oracle database connections, sessions, and WebLogic log files.
Deeply involved in reviewing application server logs to identify issues and performance engineering concerns.
Responsible for analyzing average response time, TPS, and throughput for each test.
Assessed the severity and priority of bugs and reported them to developers using JIRA as the bug tracking tool.
Interacted with developers, DBA, Networking, Infrastructure and other concerned teams to solve Performance issues.
Involved in conducting benchmark, stress, and volume tests against the application using LoadRunner VuGen.
Maintained defect status and reported testing status weekly and monthly using defect tracking tools.
Interacted with developers during testing for identifying and fixing bugs for optimizing server settings at web, app and database levels.
Updated stakeholders on performance results by generating reports using the Analysis tool.
Environment: HP LoadRunner 9.52, JMeter 2.8, VuGen, Performance Center, Transaction Viewer, Solaris, Java, J2EE, Oracle 10g, WebLogic, Citrix, SiteMinder, EG Monitor, CA Wily Introscope, TDA, SecureCRT.
Client: State Farm Insurance, IL Duration: Aug 2011 - July 2013
Role: Performance Tester
Responsibilities:
Prepared Test Strategies, Test Plan and Test Cases as per the business requirements and Use Cases.
Developed load test scripts using the LoadRunner Virtual User Generator (VuGen) and enhanced the scripts to test new builds of the application each release.
Extensively used the SAP Web, Web (HTTP/HTML), SAP GUI, Citrix client, and Web Services protocols.
Collected HttpWatch, Wireshark, and Fiddler logs to work through various scripting challenges in VuGen.
Created LoadRunner scripts for BAC alerts and worked with HP to migrate them into production for external customer-facing applications.
Prepared Test plans which specify testing overview, testing approach, testing strategy, roles and responsibilities, scope of testing, Architecture landscape.
Conducted performance tests on the application servers using HP LoadRunner to establish the load capacity of the servers.
Used HP Performance Center 9.52 to create scenarios and run load tests, and HP LoadRunner 9.5/11.0 and JMeter to write Vuser scripts.
Using LoadRunner, analyzed the response times of various business transactions and module login times under load, and developed reports and graphs to present the test results.
Involved in performance, stress, and duration testing of .NET and Java applications.
Produce a test report correlating poor performance to bottlenecks and make recommendations for improvements.
Used Introscope to monitor critical transactions and isolate performance bottlenecks.
Used Perfmon for Windows monitoring, and vmstat, iostat, mpstat, top, etc., for UNIX monitoring.
Extensively worked with Shunra team for network testing with various bandwidth and latency tests.
Executed endurance tests over long periods to assess database server resource usage.
Gathered SQL queries, Java object classes, web service calls from Wily Introscope.
Environment: Windows Server 2005/2008, Oracle 10g, Windows, UNIX, PerfMon, Nmon, ASP, HTML, SQL, TOAD, PL/SQL, HP Quality Center, HP LoadRunner, J2EE, WAS, IIS, Apache.