Performance Engineer

Location:
United States
Posted:
January 11, 2018

Resume:

Vinusha Ravooru

346-***-****

ac3zzc@r.postjobfree.com

SUMMARY

Around 8 years of experience in Software Quality Assurance, Test Automation and Performance Engineering, including insurance applications (client/server and web-based), with proficiency in manual, automation, performance and SAP testing.

Experience with industry-standard processes such as the Software Development Life Cycle (SDLC) and Software Test Life Cycle (STLC), and development methodologies such as Agile, V-Model and Waterfall.

Skilled in preparing and reviewing test plans, test scenarios and test scripts, and actively involved in manual testing of various projects.

Performed automation feasibility analysis, designed and implemented automation frameworks using Selenium, and prepared automation scripts.

Developed scripts to meet load-testing requirements according to the agreed-upon Service Level Agreement (SLA).

Participated in Project Kickoff and Signoff meetings.

Involved in all the phases of Software Testing Life Cycle.

Good command of test scripts, documentation, reviews, defect management and status reporting.

Extensive testing knowledge of SAP R/3 CRM, MDM, FI/CO, MM, SD, HR and PP modules.

Enhanced Vuser scripts in LoadRunner with parameterization, correlation, transaction points and checkpoints.

Very strong in custom coding, error handling, file handling, etc.

Excellent knowledge of software development and testing life cycles, SQA methodology, test process documentation and end-user training.

Good experience handling desktop, web, Flex and Flash objects with Sikuli.

Developed Vuser scripts using the Web (HTTP/HTML), Citrix and Web Services protocols.

Experience in understanding business processes from the requirements.

Extensive experience with load, stress and performance testing using LoadRunner, including developing VuGen test scripts.

Expert in finding performance bottlenecks on both the client and server side and making recommendations for performance profiling or tuning.

Analyzed CPU utilization, memory usage, garbage collection and DB connections to verify the performance of applications.

Experience in working on Releases, Cycles, Services, Requirements, Business Components, BPT (Business Process Testing), Test Plan, Test Lab, Defects, Report and Document Generation in Quality Center.

Hands-on experience creating RESTful services to keep requests from reaching production during performance tests.

Basic knowledge of JMeter.

Expertise in writing reusable modular scripts for automation testing for various Business Applications in various domains.

Comfortable with various Industry Leading Operating systems (Windows NT/98/95/2000/XP/Vista/Windows 7 and UNIX).

Experience in Installation and Configuration of Software and Hardware in testing environment.

Excellent inter-personal abilities and a self-starter with good communication & presentation skills, problem solving skills, analytical skills and leadership qualities.

Good leadership qualities, with experience training developers and advising technical groups on coding best practices.

Experience in coordinating onshore and offshore resources.

TECHNICAL SKILLS

Testing Tools

HP LoadRunner, Selenium, JMeter, SOAP UI, Quality Center, Performance Center, Bugzilla, JIRA, Robotium, HP ALM, WinRunner 6.0/7.2, SAP R/3 4.6C, ECC 6.0 FI, SD, MM, PP, HR.

Languages

Core Java, C, C++.

Database

MySQL, DB2, Oracle, SQL Server

Web Technologies

HTML, VBScript, JavaScript.

Environment

Windows 7/2003 Server/95/98/NT/2000/XP, UNIX, SOAP UI.

Communication

MS Outlook 2003, MS Office.

Methodologies

Waterfall, Agile methodologies (Scrum).

PL/SQL

Oracle PL/SQL.

EDUCATION

Bachelor of Technology in Electronics and Telecommunication, Vinayaka Missions University, India.

PROFESSIONAL EXPERIENCE

Senior Performance Engineer

CVS-EPH, Lincoln, RI Jul 2017 – Present

Enterprise Patient Hub (EPH) is an application for tracking patient records, used by members of CVS. EPH provides access to a portion of CVS services and features via the internet in a highly available, reliable, secure, navigable and consistent manner. This environment provides access via presentation logic only; the underlying services and features are provided by business domains running in other environments.

Responsibilities:

Prepared the test plan and test specifications based on Functional Requirement Specifications and System Design Specifications.

Responsible for creating test scripts in LoadRunner using various protocols, including Web HTTP/HTML, Web Services and Citrix.

Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization and custom functions (the sketch at the end of this section illustrates the correlation and parameterization idea).

Worked on DB queries and prepared shell scripts.

Addressed the challenge of creating large volumes of test data to avoid using production data in the test environment.

Used Linux commands via PuTTY to start and stop services for the performance tests.

Wrote SQL queries to retrieve statistics and data from the database.

Used the HP ALM tool to track test cases and test execution status.

Parameterized large and complex test data to accurately depict production trends.

Scheduled scenarios using the LoadRunner Controller and analyzed the results using the Analysis tool.

Monitored the servers and logged metrics using monitoring tools.

Identified disk usage, CPU and memory for web and database servers, and how the servers were being loaded, using SiteScope.

Worked with the DBAs to ensure the databases were re-pointed to their original environments once the load-test environment in question was no longer needed.

Created test cases based on the requirements and test conditions in Mercury Quality Center, and identified test data to match the requirements.

Executed SQL queries for backend testing of the application to ensure business rules were enforced and data integrity was maintained.

Performed usability and navigation testing of web pages and forms.

Analyzed cross results, cross scenarios and overlay graphs, and merged different graphs.

Responsible for rolling back the database after the load tests were completed.

Independently executed the test scenarios and analyzed execution statistics by monitoring the online graphs.

Coordinated with Technical Teams to monitor Database Query, CPU Utilization and Memory.

Worked closely with the Development team in the performance tuning efforts of the various sub systems.

Analyzed Performance test results, and prepared detailed Performance Test Reports including recommendations.

Environment: LoadRunner 12.53, Performance Center 12.02, Java 8, SQL Server 2014, Mercury QC 12.53.
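
Illustrative sketch of the correlation and parameterization steps above, written in plain Java (java.net.http, Java 11+) rather than generated VuGen C; the URL, token boundaries and credentials are assumptions for illustration only.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CorrelationSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Step 1: fetch the login page and capture the dynamic token
        // (conceptually what VuGen's web_reg_save_param boundaries do).
        HttpResponse<String> loginPage = client.send(
                HttpRequest.newBuilder(URI.create("https://test-env.example.com/login")).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        Matcher m = Pattern.compile("name=\"csrfToken\" value=\"([^\"]+)\"").matcher(loginPage.body());
        if (!m.find()) {
            throw new IllegalStateException("Correlation failed: token not found in response");
        }
        String token = m.group(1);

        // Step 2: parameterize the user data (a .dat file row in LoadRunner)
        // and replay the correlated token in the next request.
        String user = "testuser01"; // hypothetical test-data row
        String form = "username=" + user + "&password=secret&csrfToken=" + token;
        HttpResponse<String> loginResult = client.send(
                HttpRequest.newBuilder(URI.create("https://test-env.example.com/login"))
                        .header("Content-Type", "application/x-www-form-urlencoded")
                        .POST(HttpRequest.BodyPublishers.ofString(form))
                        .build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println("Login status: " + loginResult.statusCode());
    }
}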

Senior Performance Engineer

TaxHug, Boston, MA Feb 2016 – Jun 2017

Client Web Platform (CWP) is an online trading application used by members of Cyberonics; it replaced the previous Cyberonics web trading app. CWP provides access to a portion of Cyberonics services and features via the internet in a highly available, reliable, secure, navigable and consistent manner. This environment provides access via presentation logic only; the underlying services and features are provided by business domains running in other environments.

Responsibilities:

Prepared the test plan and test specifications based on Functional Requirement Specifications and System Design Specifications.

Responsible for creating test scripts in LoadRunner using various protocols, including Web HTTP/HTML, Web Services and Citrix.

Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization and custom functions.

Worked with Dynatrace to obtain response times at the server level.

Created RESTful stub services to keep requests from reaching production during performance tests (a minimal stub sketch follows at the end of this section).

Used Linux commands via PuTTY to stop and start services after performance tests.

Hands on experience in executing SQL queries.

Good experience in using JIRA.

Parameterized large and complex test data to accurately depict production trends.

Scheduled scenarios using the LoadRunner Controller and analyzed the results using the Analysis tool.

Monitored the servers and logged metrics using monitoring tools.

Identified disk usage, CPU and memory for web and database servers, and how the servers were being loaded, using SiteScope.

Worked with the DBAs to ensure the databases were re-pointed to their original environments once the load-test environment in question was no longer needed.

Created test cases based on the requirements and test conditions in Mercury Quality Center, and identified test data to match the requirements.

Executed SQL queries for backend testing of the application to ensure business rules were enforced and data integrity was maintained.

Performed usability and navigation testing of web pages and forms.

Analyzed cross results, cross scenarios and overlay graphs, and merged different graphs.

Responsible for rolling back the database after the load tests were completed.

Independently executed the test scenarios and analyzed execution statistics by monitoring the online graphs.

Coordinated with Technical Teams to monitor Database Query, CPU Utilization and Memory.

Worked closely with the Development team in the performance tuning efforts of the various sub systems.

Analyzed Performance test results, and prepared detailed Performance Test Reports including recommendations.

Environment: LoadRunner 12.53, Performance Center 12.02, Dynatrace 6.2, Java 8, SQL Server 2014, Mercury QC 12.53.
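
A minimal sketch of the kind of RESTful stub used to keep load-test traffic away from production, built on the JDK's com.sun.net.httpserver; the port, endpoint path and payload are assumptions.

import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class StubService {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8081), 0);

        // Hypothetical endpoint standing in for the real production dependency.
        server.createContext("/api/quote", (HttpExchange exchange) -> {
            byte[] body = "{\"symbol\":\"TEST\",\"price\":101.25}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        server.start();
        System.out.println("Stub service listening on port 8081");
    }
}

During a test run the application or the Vuser scripts are pointed at a stub like this instead of the production endpoint.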

SAP QA Tester

RevSW, Houston, TX Oct 2014 – Jan 2016

A multinational chemicals manufacturing corporation active in fiber intermediates, nylon and melamine fibers, resins and plastics, performance products and agricultural products, and fine chemicals through to crude oil and natural gas. RevSW aimed at migrating from the 4.6C R/3 system to the SAP ECC 6.0 environment.

Responsibilities/Deliverables:

Involved in the preparation of test plans specifying the testing overview, testing approach, testing strategy, roles and responsibilities, and scope of testing.

Analyzed the Business Process Procedures (BPP) document and prepared test scenarios and test scripts.

Involved in testing the Unicode conversion from 4.6C to ECC 6.0.

Developed test scripts and executed them with the HP Mercury Quality Center tool.

Tested screen formats, navigation and functionality of the Accounts Payable (AP), Accounts Receivable (AR) and Treasury modules as per the BPP.

Manually performed integration end-to-end testing of AR module consisting of Customer Master Records, Customer Groups, Terms of Payment, Interest Calculation, Cash Discounts and Credit management, Over/Under Payments, Dunning Procedures as per business requirement.

Manually performed integration testing of AP module consisting of Vendor Groups, Vendors Master data, Dunning Procedures, Parking Document, Posting Park Document, Payment Program Configuration- company code data, paying company code data, country payment methods, company code payment methods, bank selection.

Manually performed integration end-to-end testing of the Treasury (TR) module based on BPP requirements, covering House Banks, Lockbox, Lockbox file formats, Cash Management (CM), Cash Forecasting (CF), Cash Concentration (CC) and Electronic Bank Statement (EBS).

Manually performed integration testing of the controlling area and operating concern.

Manually executed integration testing of automatic cost element creation and manual cost element creation.

Manually performed integration testing of cross-company order settlement and CO-to-FI balances.

Documented and reported the defects and interacted with developers to follow up on defects/issues.

Manually tested interface programs and scheduled jobs, verifying that they ran as set up in the system.

Performed functionality and regression testing with the automation tool QTP.

Conducted training and developed documentation for end users for UAT.

Environment: SAP R/3 ECC 6.0, Oracle 9i, UNIX, Manual Testing, Quality Center 9.1, QTP.

Performance Engineer

Ironshore Insurance Inc, NY Mar 2014 – Sep 2014

The project at Ironshore Insurance Inc involved enhancements to their online payment features. The main focus was to upgrade the existing system by enhancing functionality such as online bill pay, user interface changes, search, and online money transfers between different accounts.

Responsibilities:

Gather performance test requirements from the application team based on the performance test request submitted.

Work closely with the development team to identify the performance test needs and its deliverables.

Design performance test strategies and performance test plans considering the performance test requirements.

Design performance test scripts in LoadRunner VuGen or JMeter, enhance the scripts to meet the test scenarios, troubleshoot any runtime errors and set runtime settings as required.

Create correlations for the dynamic values in the script based on the response from the server to enhance the Script.

Parameterize the test with multiple users to simulate multiple Vusers logging into the system in real time.

Organize the test cases and test plans in JIRA.

Responsible for taking thread dumps and using profilers for CPU and memory profiling (see the thread-dump sketch at the end of this section).

Schedule and execute the test scripts for performance testing, assigning multiple load generators through Performance Center to meet the test needs.

Design and execute performance test scenarios in Performance Center.

Collate the execution results and analyze them in LoadRunner Analysis.

Monitor the performance of the application and database servers during the test run using tools like Dynatrace.

Drill down into failed transactions, HTTP errors, memory spikes, JVM memory, CPU usage, web requests, PurePaths, application processes and the database.

Analyze the metrics after the run and prepare the report.

Share the report with all stakeholders.

Environment: Windows 2008, IIS 8, HP LoadRunner 12.02, Performance Test Center 11, JMeter 2.13, JProfiler, JVM, VisualVM, HttpWatch, AWR reports.
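
A minimal sketch of taking a thread dump and reading heap usage programmatically through the JDK management beans; in practice the same information typically comes from jstack or the profilers noted in the environment above.

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class ThreadDumpSketch {
    public static void main(String[] args) {
        // Dump all live threads, including lock and synchronizer details.
        ThreadMXBean threadBean = ManagementFactory.getThreadMXBean();
        for (ThreadInfo info : threadBean.dumpAllThreads(true, true)) {
            System.out.print(info.toString());
        }
        // Heap usage at the moment of the dump.
        System.out.println("Heap used (bytes): "
                + ManagementFactory.getMemoryMXBean().getHeapMemoryUsage().getUsed());
    }
}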

Performance Tester

Coca Cola Company, Atlanta, GA Dec 2012 – Feb 2014

The TMMC (The Minute Maid Company) migration involved enhancements and the movement of applications to the corporate location, and validated that applications, servers, interfaces and data in production were not adversely impacted. The migration included front-end applications, back-end servers, batches running at different periods, and online transactions.

Responsibilities:

Involved in preparing high level scenarios based on Agile Methodologies for each Scrum.

Developed the test plan and a traceability matrix mapping requirements to test cases.

Developed load test scripts using LoadRunner for the entire site, including parameterization, pacing and correlation.

Responsible for setting runtime settings in Load Runner.

Correlated the dynamically created session data in the load test scripts in VuGen to synchronize with the application.

Responsible for performance testing using LoadRunner and JMeter.

Developed Load/Stress scenarios for performance testing using the Load Runner Controller.

Configured the Tomcat server, database server, Apache server and static servers in SiteScope to monitor memory utilization, CPU utilization, throughput, network connections, etc. in LoadRunner.

Defined and configured SLAs for hits/sec, throughput, transactions per second in Load Runner.

Responsible for monitoring different graphs such as Throughput, Hits/Sec, Transaction Response time and Windows Resources while executing the scripts from Load Runner.

Analyzed the results of the Load test by using Load Runner Analysis tool to identify bottlenecks.

Configured production server system settings on load test servers and created load/stress testing scenarios for performance testing using the LoadRunner Controller with 500 to 1000 virtual users.

Prepared detailed Performance Test Analysis Report with Graphs and the application bottlenecks from the scripts execution.

Performed backend testing by integrating SQL queries within the scripts and validated the backend workflow under load (a backend validation sketch follows at the end of this section).

Developed and executed complex SQL Queries and Procedures to perform database testing.

All the bugs were tracked and updated in defect tracking tool JIRA.

Environment: LoadRunner 12.02, QC 8, Performance Center 12.02, JMeter 2.13, HP Quality Center, SiteScope, UNIX, Windows 8, Wily Introscope, Java, JBoss, WebLogic, Oracle, XML, SQL Server 2014, network analysis, MS Access.
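
A sketch of the backend SQL validation described above, using plain JDBC; the connection URL, credentials, table, columns and expected count are assumptions, and the Microsoft SQL Server JDBC driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BackendValidation {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://testdb.example.com:1433;databaseName=TMMC_TEST";
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT COUNT(*) FROM orders WHERE batch_id = ? AND status = 'POSTED'")) {
            ps.setString(1, "LOADTEST_BATCH_01"); // hypothetical batch injected during the load test
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                int posted = rs.getInt(1);
                System.out.println("Orders posted: " + posted);
                if (posted != 1000) { // expected count from the test scenario
                    System.err.println("Backend validation failed: expected 1000, found " + posted);
                }
            }
        }
    }
}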

Automation Test Analyst

Lloyd's Banking Group, India Sep 2010 – Nov 2012

This project aimed at creating a web application for internal users that maintains a large amount of information about various payers and providers. The financial departments of payers would use this application, and providers can track deposit and remittance balances. The application also had features that enable users to export all transactions to an Excel sheet and to import data from CSV files directly into the system for further tracking.

Responsibilities:

Analyzed the Use Cases and various documents to prepare Test Plan.

Highly experienced in designing and applying test methodologies to ensure products meet required specifications and Performance expectations.

Developed the Test strategy for short term and long term automation.

Prepared manual testing scenarios and test cases for the system design of advanced builds of the application.

Developed automation infrastructure and error-free test scripts using Selenium WebDriver methods and features.

Used the Eclipse IDE with Selenium to support functional testing of the client-side application.

Created Requirement Traceability Matrix (RTM) while preparing the test cases.

Organized test cases in HP ALM/Quality Center (QC) for manual test execution, and generated reports and graph-type documents for results using QC.

Wrote test scripts in Java with Selenium to automate GUI and functional test cases (a minimal WebDriver sketch follows at the end of this section).

Created test scripts with automation tools and executed the automation scripts on various builds.

Performed risk analysis of critical areas of the application from a customer perspective and supported a multi-tier application.

Enhanced the written test scripts for global execution.

Executed data-driven testing (DDT) with the test data.

Tracked user stories using JIRA and tracked defects using HP ALM/Quality Center (QC).

Involved in weekly status review meetings and generated summary reports of the test executions.

Environment: Windows 8, Selenium IDE 1.9.0, Selenium WebDriver, Java, J2EE, Scrum Master Pro, HP Quality Center, JIRA, Eclipse IDE and FirePath.
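
A minimal sketch of the Java/Selenium WebDriver scripts described above; the URL and locators are hypothetical, and the browser-driver setup of that Selenium generation is assumed.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class LoginSmokeTest {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("https://test-env.example.com/portal/login");
            driver.findElement(By.id("username")).sendKeys("qa_user");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("loginButton")).click();

            // Simple functional check: the landing page should show the dashboard.
            String heading = driver.findElement(By.cssSelector("h1.dashboard-title")).getText();
            if (!heading.contains("Dashboard")) {
                throw new AssertionError("Unexpected landing page: " + heading);
            }
            System.out.println("Login smoke test passed");
        } finally {
            driver.quit();
        }
    }
}

In practice the same flow would be wrapped in a TestNG or JUnit test and fed rows of test data for the data-driven runs mentioned above.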

Associate Test Engineer

TechVedika, India Jun 2009 – Aug 2010

The purpose of this project was to allow customers to create, update and share purchases from a list of gifts for any occasion. This functionality would be available on the mobile web and in native apps. The project also created a new gift shop that could be reused for different marketing events.

Responsibilities:

Analyzed & studied various Project Artifacts such as Business Requirements, System Requirements and Use Cases to develop and execute Manual Test cases.

Designed build acceptance tests, regression tests, functional test scripts and ad hoc test scenarios.

Worked with the QA lead to prepare a detailed master test plan providing a detailed list of conditions under which the system was tested.

Initial testing was conducted manually and later phase was executed using the Mercury Tools.

Performed Manual Testing of the web application on Unix Platforms.

Developed Test Plans and Test Cases.

Documented the Test cases, Test results and Test procedure.

Performed Manual Testing.

Participated in design reviews and the Software Development Life Cycle (SDLC) process from the requirement and specification gathering stage through the final release (go-live).

Involved in planning and execution of the Regression and IST testing for various releases.

Mapped requirements to test cases, and logged and tracked defects in the Defects module.

Provided support for end-to-end testing.

Performed Positive and Negative Testing.

Tested documents for Cross Browser Compatibility.

Environment: Manual testing, Mercury tools, UNIX, IST testing, Windows 2000/XP/Vista.


