
Test Engineer

Location:
Nanaimo, BC, Canada
Posted:
January 21, 2020


Resume:

Rakesh Gaddala

Contact No: +1-910-***-****

Email ID : ***************@*****.***

SUMMARY

***** years of experience in Quality Assurance methodologies using Micro Focus/HP/Mercury Interactive tools, mainly HP LoadRunner and Quality Center (QC).

Expert in Manual and Automation Testing.

Evangelist of good testing practices, like pair programming, code reviews, Test Driven Development (TDD), Integration Testing, Exploratory Testing, Behaviour-Driven Development (BDD), Continuous Integration (CI), and Continuous Delivery (CD)

Hands-on experience in Waterfall and Agile methodologies; worked actively in all phases, engaging with developers, business analysts, and other stakeholders.

Extensive experience with the LoadRunner tool for monitoring and testing of web-based as well as client/server systems on multiple platforms such as .NET, Java, and SQL.

Developed Scripts to meet load-testing requirements according to the SLA (Service Level Agreement) agreed upon.

Excellent working knowledge in developing and implementing complex Test Plans, Test Cases, and Test Scripts using automated test solutions for client/server and web-based applications.

Enhanced Vuser scripts in LoadRunner with parameterization, correlation, transaction points, and checkpoints (a VuGen sketch of this work appears at the end of this summary).

Excellent knowledge of software development and testing life cycles, SQA methodology, test process documentation, and end-user training.

Experienced in User acceptance testing, Performance, Load, and Stress Testing.

Developed Vuser scripts using the Web (HTTP/HTML), Ajax (Click and Script), Web Services, TruClient, and Oracle NCA protocols.

Experience in understanding Business Process from the requirements.

Experience in Manual and Automated testing of applications developed on Windows environment.

Worked on multiple performance testing tools: LoadRunner and JMeter.

Extensive experience with load, stress, and performance testing using LoadRunner; developed VuGen test scripts.

Expert in finding performance bottlenecks on both the client side and the server side and making recommendations for performance profiling or tuning.

Proven ability to check Network Bottlenecks using Network Delay Time and Vuser Graphs.

Expert in analyzing results using the HP LoadRunner Analysis tool; analyzed Oracle database connections, sessions, and log files.

Expertise in writing reusable, modular scripts for automation testing of business applications in domains such as Banking & Finance, Health Care, and Retail.

Comfortable with various industry-leading operating systems (Windows NT/95/98/2000/XP/Vista/7 and UNIX).

Experience in installation and configuration of software and hardware in testing environments.

Excellent interpersonal abilities; a self-starter with good communication, presentation, problem-solving, analytical, and leadership skills.

Experience in coordinating onshore and offshore resources.
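
A minimal VuGen (Web HTTP/HTML) sketch of the parameterization, correlation, transaction, and checkpoint work described above; the host, parameter names, and correlation boundaries are hypothetical and would be adapted to the application under test.

/*
 * Action.c - minimal LoadRunner VuGen (Web HTTP/HTML) sketch.
 * Host, parameter names, and correlation boundaries are hypothetical.
 */
Action()
{
    /* Text checkpoint: fail the step if the login page does not render. */
    web_reg_find("Text=Sign In", LAST);

    /* Correlation: capture the dynamic token returned by the server. */
    web_reg_save_param("SessionToken",
                       "LB=name=\"token\" value=\"",
                       "RB=\"",
                       "Ord=1",
                       LAST);

    lr_start_transaction("T01_Launch");
    web_url("Launch", "URL=https://app.example.com/login", "Mode=HTML", LAST);
    lr_end_transaction("T01_Launch", LR_AUTO);

    lr_think_time(5);

    lr_start_transaction("T02_Login");
    /* {UserName} and {Password} come from a VuGen parameter file (data pool). */
    web_submit_data("Login",
                    "Action=https://app.example.com/login",
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=username", "Value={UserName}",     ENDITEM,
                    "Name=password", "Value={Password}",     ENDITEM,
                    "Name=token",    "Value={SessionToken}", ENDITEM,
                    LAST);
    lr_end_transaction("T02_Login", LR_AUTO);

    return 0;
}

Correlation replaces the recorded token with the live value, so every iteration replays with valid data rather than the value captured at record time.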

Technical Skills:

Testing Tools: SOASTA, VSTS, HP LoadRunner, JIRA (bug tracking), Mantis (bug tracking), Quality Center, Performance Center
Languages: Java, J2EE, SQL, PL/SQL, C, C++
Build Tools: Ant, Maven, Jenkins, Hudson
Databases: MySQL, DB2, Oracle, SQL Server, Sybase
Web Technologies: HTML, VBScript, JavaScript
Environment: Windows 7/2003 Server/95/98/NT/2000/XP, UNIX, SoapUI
Communication: MS Outlook 2003, MS Office
Methodologies: Waterfall, Iterative model, Rational Unified Process, Agile (SCRUM)
Others: EditPlus, JUnit, TestNG, SVN, TOAD, SQL Developer
PL/SQL: Oracle PL/SQL
Browsers: IE 8/9/10/11, Firefox, Chrome, Safari

Professional Experience:

Sr. Performance Test Engineer

Tilray, Nanaimo, BC Apr 2018 – Present

Roles and Responsibilities:

Gathered Test Plan and Test Specifications based on Functional Requirement Specifications and System Design Specifications.

Advised developers on building automation scripts (unit, integration, end-to-end, UI), ensuring code coverage and quality.

Created and maintained documentation in Confluence and SharePoint.

Responsible for creating test scripts in LoadRunner using various protocols, including Web HTTP/HTML, Web Services, and Oracle NCA.

Planned and generated Vuser scripts with VuGen and enhanced them with correlation, parameterization and functions.

Expert in developing workload models for performance testing (a sample workload-sizing calculation follows this list).

Used ramp-up/ramp-down, rendezvous points, start/end transactions, parameterization, and correlation features of LoadRunner.

Maintained the Defect Log, Test Log, Status Report, and Traceability Matrix, which give a clear indication of the quality and stability of the product for UAT.

Executed load, stress, and endurance tests simulating processes with more than 1,000 virtual users.

Parameterized large and complex test data to accurately depict production trends.

Responsible for implementing LoadRunner, Performance Center, and JMeter-based infrastructure; architected the load-testing infrastructure and the hardware/software integration with LoadRunner.

Scheduled scenarios using Performance Center and analyzed the results using the Analysis tool.

Monitoring the servers and logging the metrics using the monitoring tools.

Identified disk usage, CPU, and memory for web and database servers and how the servers were being loaded.

Worked with the DBAs to ensure that databases were re-pointed to the original environments once the load-test environment was no longer needed.

Studied application performance and maximum scalability, and critical parameters such as number of users, response times, hits per second (HPS), and throughput, using LoadRunner.

Created test cases based on the requirements and test conditions in Mercury Quality Center and identified test data to match the requirements.

Proficient in different types of testing, such as black-box, white-box, and functionality testing, and in writing test cases from complex requirements in both black-box and white-box environments.

Executed SQL Queries for backend testing of the application to ensure business rules are enforced, and data integrity is maintained.

Performed usability and navigation testing of web pages and forms.

Analyzed cross-results and cross-scenarios, overlaid graphs, and merged different graphs.

Responsible for getting the database rolled back after the load tests are completed.

Independently executed test scenarios and analyzed execution statistics by monitoring the online graphs.

Coordinated with Technical Teams to monitor Database Query, CPU Utilization and Memory.

Worked closely with the Development team in the performance tuning efforts of the various sub systems.

Produced accurate, regular project status reports for senior management to ensure on-time project launch.

Actively participated in Defect Review meetings involving Test Coordinator, Developers, Business Analysts and Project Managers to report the status of defects to the management.

Prepared weekly testing status reports.
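
A sketch of the workload-model arithmetic referenced above, using purely illustrative target figures; it applies Little's Law to translate a target throughput, response time, and think time into a Vuser count and pacing.

/*
 * workload_model.c - back-of-the-envelope workload sizing (Little's Law).
 * The throughput and timing targets below are illustrative placeholders.
 */
#include <stdio.h>

int main(void)
{
    double target_tps     = 50.0; /* target transactions per second (assumed SLA) */
    double avg_response_s = 1.2;  /* expected average response time, seconds */
    double think_time_s   = 8.0;  /* scripted think time between transactions, seconds */

    /* Little's Law: concurrent users = throughput * (response time + think time) */
    double vusers = target_tps * (avg_response_s + think_time_s);

    /* Pacing per Vuser so the scenario sustains the target rate */
    double pacing_s = vusers / target_tps;

    printf("Required Vusers : %.0f\n", vusers);
    printf("Pacing per Vuser: %.1f s between iterations\n", pacing_s);
    return 0;
}

With these figures the model calls for roughly 460 Vusers paced at about 9.2 seconds per iteration; real scenarios would substitute SLA-driven numbers.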

Environment: LoadRunner 11.52/12.00/12.53, JMeter, Performance Center 11.52/12.00/12.53, Windows XP Professional, Introscope, C#, Java, SQL, Mercury QC

Performance Engineer

Krispy Kreme, Winston, NC Oct 2016 – Apr 2018

Roles and Responsibilities

Assisted the team lead in the preparation of the Test Plan and Test Strategy documents.

Responsible for load-testing coordination with the various other projects involved in the load-testing activity.

Developed scripts in the HTTP/HTML and Web Services protocols for LoadRunner (a Web Services call sketch appears after this list of responsibilities).

Analyzed graphs and reports to check where performance delays occurred: network or client delays, CPU performance, I/O delays, database locking, or other issues at the database server.

Analyzed performance bottlenecks using LoadRunner monitors, Dynatrace, HP SiteScope, and HP Diagnostics.

Also involved in Vuser settings for different scenarios and business processes in the Controller; analyzed graphs to evaluate the performance tuning of the system.

Developed scripts using NeoLoad.

Presented the results to the team, analyzed the bottlenecks, and worked with the owning teams to resolve the issues.

Responsible for scheduling load tests using HP Performance Center involving a variety of load-scenario combinations.

Using LoadRunner, executed multi-user performance tests with online monitors and real-time output messages.

Executed different Scenarios for different applications in controller and created Load Runner Analysis Reports and Graphs.

Interacted with developers during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web and App levels.

Analyzed results using HP Load Runner Analysis tool and analyzed sessions and log files.

Responsible for generating load test results and publishing them in SharePoint.

Worked closely with development team to narrow down defect reproduction cases and scenarios.

Responsible for performance monitoring and analysis of response time & memory leaks using throughput graphs.

Responsible for configuring and installing the Performance center Infrastructure for executing and scheduling the load tests.

Performed backend testing using complex SQL queries on Oracle database.

Gathered user requirements and designed the Test Plans and Test Scenarios accordingly.

Responsible for coordinating the new Transports to the Performance testing environments.

Analyzed the server resources such as Available Bytes and Process Bytes for Memory Leaks.

Analyze, interpret, and summarize meaningful and relevant results in a complete Performance Test Report.

Participated in daily stand-up meetings with management and reported day-to-day activities and updates.
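
A sketch of the Web Services scripting mentioned above, posting a SOAP envelope through LoadRunner's web_custom_request; the endpoint, SOAPAction, and message fields are hypothetical placeholders.

/*
 * ws_call.c - LoadRunner sketch of a SOAP call via web_custom_request.
 * Endpoint, SOAPAction, and message fields are hypothetical.
 */
Action()
{
    lr_start_transaction("WS_GetOrderStatus");

    /* SOAPAction header expected by many SOAP 1.1 services */
    web_add_header("SOAPAction", "\"urn:getOrderStatus\"");

    web_custom_request("getOrderStatus",
        "URL=https://services.example.com/OrderService",
        "Method=POST",
        "EncType=text/xml; charset=utf-8",
        "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             "<soapenv:Body><getOrderStatus><orderId>{OrderId}</orderId>"
             "</getOrderStatus></soapenv:Body></soapenv:Envelope>",
        LAST);

    lr_end_transaction("WS_GetOrderStatus", LR_AUTO);
    return 0;
}

The same call could also be built with the dedicated Web Services protocol; web_custom_request is shown because it works from a plain Web (HTTP/HTML) script as well.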

Environment: LoadRunner, Quality Center, Agile, Performance Center, Oracle, SQL

Performance Tester

Wells Fargo, San Francisco, CA Sep 2015 – Oct 2016

Roles and Responsibilities

Involved in developing the scripts to check the Servers connectivity.

Imported the WSDLs and performed unit tests using the SoapUI project.

Worked with business analysts to gather requirements and SLA details from the client, and wrote test cases from complex requirements.

Conducted testing of complex hotel business applications.

Experience in black-box and white-box testing with a complete QA cycle covering testing, defect logging, and bug verification.

Developed Load Runner test scripts according to test specifications/ requirements.

Involved in full life-cycle of the project from requirements gathering to transition using Agile Methodology.

Developed performance test suites using the JMeter testing tool.

Developed and implemented load and stress tests with LoadRunner, presented performance statistics to application teams, and provided recommendations on how and where performance could be improved.

Developed and enhanced scripts using Load Runner VuGen and designed scenarios using Performance Center to generate realistic load on application under test.

Coordinated with Functional Teams to Identify the Business Process to be Performance Tested.

Inserted transactions and checkpoints into Mercury LoadRunner web VuGen scripts, and parameterized and correlated the scripts.

Monitored and administered hardware capacity to ensure the necessary resources were available for all tests.

Customized scripts for error detection and recovery (see the error-handling sketch after this list).

Worked in shared environment and tested different applications.

Independently executed the Mercury Load Runner test scenario, analyzed the execution statistics by monitoring the online graphs.

Involved in planning and coordination effort throughout QA life cycle.

Designed tests for Benchmark and load testing.

Diagnosed and troubleshot web/app server performance issues using LoadRunner J2EE Diagnostics/Deep Diagnostics.

Worked with Load Runner in analyzing application performance for varying loads and stress conditions.

Responsible for generating the LoadRunner Analysis files from the LoadRunner results file produced by the load test, and for filtering the analysis data based on the required durations.

Generated detailed test status reports, performance/capacity reports, web trend analysis reports, and graphical charts for upper management.

Assisted the QA Lead with administrative tasks such as meeting notes and defect-database clean-up.

Provided daily/weekly application availability reports to management.
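
A sketch of the error detection and recovery customization noted above: a SaveCount checkpoint detects a missing confirmation page, logs the failure, and abandons the iteration cleanly. The page text, transaction names, and parameters are hypothetical.

/*
 * error_recovery.c - sketch of error detection/recovery in a VuGen script.
 * Page text, transaction names, and parameters are hypothetical.
 */
Action()
{
    /* Count occurrences of the confirmation text instead of failing outright. */
    web_reg_find("Text=Order Confirmed", "SaveCount=ConfirmCount", LAST);

    lr_start_transaction("T03_SubmitOrder");
    web_url("SubmitOrder",
            "URL=https://app.example.com/order/submit",
            "Mode=HTML",
            LAST);

    if (atoi(lr_eval_string("{ConfirmCount}")) == 0) {
        /* Recovery path: flag the failure and skip the rest of this iteration. */
        lr_error_message("Order confirmation not found for user %s",
                         lr_eval_string("{UserName}"));
        lr_end_transaction("T03_SubmitOrder", LR_FAIL);
        lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
    }

    lr_end_transaction("T03_SubmitOrder", LR_PASS);
    return 0;
}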

Environment: LoadRunner 11.5/11, HP Performance Center, Layer 7, WebLogic App Server, XML, Web Services, JMeter, CA Wily Introscope 7.x, Oracle 10g/9i, SharePoint, Windows 2000/XP/Vista.

Quality Assurance Analyst

Aviva, Hyderabad, IN May 2013 – Sep 2015

Roles and Responsibilities:

Profiled legacy policies that met the testing requirements for conversion (to Guidewire) using criteria such as underwriting rules, data refresh (internal claims, VDM, DMTI, MVR, Autoplus, TransUnion, iClarify, MPAC), policy change, coverage decline, and delinquency. Used complex SQL queries against the Oracle database for profiling. Validated data consistency and mapping documents.

Identified, analyzed, reported and documented defects, questionable functions and inconsistencies in product content and outlook during the design and implementation phase

Reviewed code commits and ensured the right level of tests was included.

Conducted manual testing when required, and led and conducted exploratory testing sessions with the team.

Performed automation audit and clean-up: validated accuracy/efficiency, removed duplication with other types of testing (unit tests/contract tests), removed overlap with other E2E testing, respected digital code standards, and validated that tests are solid.

Performed end-to-end and SIT testing of profiled legacy policies (after loading into Guidewire) for all workflows, such as time-based testing from legacy to Guidewire on renewal, UW rules, and data-refresh criteria, and validated all screens of the Policy, Billing, and Claim Centers (vehicle, driving record, claims, discounts, coverages, and document validation).

Creating and executing test cases for Guidewire Policy Center.

Working on data conversion (ETL) from General Insurance to Guidewire.

Creating SQL queries on source and target databases.

Validated conversion renewals (from legacy to GW) up to the fifth-year renewal in GW.

Analysis of requirement specifications.

Executed test cases, analyzed results, and reported bugs and errors to development teams.

Created high-level diagrams using the Visio tool.

Experience in mobile, branch, distribution and foundation testing

Testing in Agile and waterfall project methodologies

Created detailed test scenarios and test cases based on project requirements and project change requests.

Performed Functional, Smoke, Integration, Regression, Ad-hoc, SIT, QAT and UAT testing

Created Test Data by creating various commercial and personal accounts and performed end to end testing, as well as verified the workflow of the application

Tested different modules of the application such as document capturing, Remote Desk Capture, Virtual Image Capture, and Cheque Image Hub; created and executed appropriate test cases to ensure the integrity of the app prior to business-unit and user acceptance testing using HP Quality Center 10.00 and ALM.

Detected, monitored, and investigated defects to determine the nature of each problem and verified that problems were consistent and reproducible.

Reported and managed software concerns and test activities throughout the software development lifecycle

Attended project meetings to report on the status of testing, communicate issues, implement QA standards and policies, and communicate the QA schedule.

Reviewed EDD documents and design specifications to ensure full understanding of individual deliverables.

Involved in writing test cases and test scenarios and executing them in HP Quality Center.

Testing web-based insurance quotes.

Involved in Data validations, Data integrity testing and Database testing in Oracle database.

Tested asynchronous interfaces using XML files

Reported defects in a timely manner in the online SharePoint issue tracker, as per defined project guidelines.

Education: Bachelor of Technology, GITAM University


