
Engineer Data

Location:
Minneapolis, MN
Posted:
March 01, 2021


PROFESSIONAL SUMMARY

Over * years of experience in performance testing using HP/Mercury LoadRunner and JMeter

Experience with LoadRunner and Performance Center: creating VuGen scripts, monitoring runtime transactions, and analyzing test results.

Knowledge of the Software Development Life Cycle (SDLC), the Software Testing Life Cycle (STLC), and associated methodologies and test processes.

Involved in gathering business requirements, studying the application, and collecting information from developers and the business.

Created Vuser scripts covering tasks performed by each Vuser, tasks performed by all Vusers, and tasks measured as transactions.

Developed Vuser scripts in the Web (HTTP/HTML), Web Services, Java, TruClient, and Citrix protocols.

Designed benchmark, stress, and volume tests.

Parameterized large and complex test data to accurately reflect production trends.
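
The parameterization described above can be sketched in Python; in VuGen or JMeter this would be a parameter file or a CSV Data Set Config rather than code, and the file contents here are invented for illustration:

```python
import csv
import io
import itertools

# Stand-in for a parameter (.dat/.csv) file; a real file would hold
# thousands of production-like rows (usernames, accounts, etc.).
param_file = io.StringIO("username,account\nuser01,ACC-100\nuser02,ACC-200\n")

rows = list(csv.DictReader(param_file))
next_row = itertools.cycle(rows).__next__  # "sequential, wrap around" policy

# Each Vuser iteration consumes the next row instead of a recorded constant
iteration_data = [next_row() for _ in range(3)]
```

With only two rows and three iterations, the third iteration wraps back to the first row, mirroring the "wrap around when out of values" parameter policy.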

Validated the scripts to make sure they executed correctly and met the scenario description.

Created single-user, baseline, and soak test scenarios; introduced random pacing between iterations to achieve the desired transactions per hour.
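
As a rough sketch of how random pacing relates to a transactions-per-hour target (assuming one transaction per iteration; the formula and the ±20% spread are illustrative, not LoadRunner internals):

```python
import random

def pacing_seconds(vusers: int, target_tph: float) -> float:
    """Mean pacing interval so that `vusers` together produce
    `target_tph` transactions per hour (one transaction per iteration)."""
    return vusers * 3600.0 / target_tph

def randomized_pacing(mean: float, spread: float = 0.2) -> float:
    """Random pacing in [mean*(1-spread), mean*(1+spread)], in the spirit
    of LoadRunner's 'random interval' pacing setting."""
    return random.uniform(mean * (1 - spread), mean * (1 + spread))

# e.g. 50 Vusers targeting 6,000 transactions/hour -> 30 s mean pacing
mean = pacing_seconds(50, 6000)
```

Randomizing around the mean desynchronizes the Vusers while keeping the aggregate rate near the target.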

Experience in monitoring servers using tools such as HP SiteScope, CA Wily, Splunk, AppDynamics, and Dynatrace.

Added performance measurements for Oracle, WebLogic, and IIS in LoadRunner Performance Center.

Analyzed results using the LoadRunner Analysis tool, including Oracle and SQL database connections and sessions and WebLogic log files.

Responsible for analyzing application and component behavior under heavier loads and optimizing server configurations.

Maintained test matrix and bug database and generated monthly reports.

Extensively used tools such as ALM, VersionOne, and JIRA for reporting metrics per release train, and across trains, based on defect leakage.

Interacted with developers and architects during testing for identifying memory leaks, fixing bugs and for optimizing server settings at web, app and database levels.

Used LoadRunner and JMeter for testing and monitoring; actively participated in enhancement meetings focused on making the website more intuitive and interesting.

Strong experience in API testing of various API calls using Postman and SoapUI.

Strong application knowledge, excellent interpersonal skills, and good verbal and written communication capabilities.

Also performed functional testing: writing test cases, validating scenarios, and tracking defects through ALM.

TECHNICAL SKILLS

Testing tools

LoadRunner, Performance Center, SiteScope, Wily IntroScope, Dynatrace, AppDynamics, Splunk, Diagnostics, BMC Patrol, TOSCA, Google Analytics

Scripting Languages

TSL, Shell script, VB script, SQA Basic

Bug Reporting Tools

Quality Center, Test Director, Clear Quest

Operating Systems

Windows NT/XP, Unix, Linux

Programming Languages

C, C++, Java, VB, SQL

Databases

MS SQL Server, MS Access, DB2, Oracle, Liferay 6.0/6.2

Web Technologies

HTML/DHTML, XML, ASP, JSP, VBScript, JavaScript, FLASH

Others

ALM, Fiddler, Quality Center, Version1, JIRA, CA Rally

EDUCATION

Master’s in IT & Management, Campbellsville University, KY.

WORK EXPERIENCE

Mar 2019 - Present, PERFORMANCE TEST ENGINEER

US Bank, Saint Paul, MN

Actively involved in gathering the non-functional requirements of the application from the business and architects.

Develops the performance test plan, outlining the objectives, scope, and approach for performance testing and describing the process for verifying that software and hardware upgrades of an application meet its performance requirements.

Provides the preliminary test schedule and specifies the acceptance, entry, and exit criteria, deliverables, risks, issues, and assumptions required for the performance testing activity to succeed.

Assists the infrastructure team in creating and maintaining a test environment that replicates real-life scenarios for efficient performance testing, so bugs are resolved before the application is deployed to production.

Uses LoadRunner 12.50 to create test scripts that simulate realistic virtual users over protocols such as Web (HTTP/HTML), Java, and Web Services; creates, maintains, and executes load test scenarios, assigning virtual users and load generators to conduct the load tests.

Analyzes test execution results through the transaction summary and the generated reports and graphs.

Develops and modifies performance test scripts in Virtual User Generator, handling correlation, parameterization, think time, iterations, pacing, and logging options and preferences based on the test scenarios and business requirements.
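
The correlation step mentioned above (capturing a dynamic server value and substituting it into the next request, as `web_reg_save_param` does in VuGen) can be illustrated in Python; the response body and token name are hypothetical:

```python
import re

# Hypothetical first response containing a dynamic, per-session value
response_1 = '<input type="hidden" name="csrf_token" value="a1b2c3d4">'

# Correlate: capture the dynamic value between left/right boundaries,
# the same idea as web_reg_save_param("token", "LB=value=\"", "RB=\"")
token = re.search(r'name="csrf_token" value="([^"]+)"', response_1).group(1)

# Substitute the captured value into the next request instead of the
# hard-coded value from recording, so the script replays correctly
next_request_body = f"action=submit&csrf_token={token}"
```

Without this substitution, a replay would send the stale recorded token and the server would reject the request.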

Conducts different types of performance testing, such as capacity, load, stress, volume, soak, and spike testing, per the test scenarios and based on business-critical transactions and requirements.

Monitors and analyzes application and web servers such as JSE (Java), IIB (IBM Integration Bus), WAS (WebSphere Application Server), MQ (message queues), and the database during test execution, and collects the data generated by the servers to identify performance bottlenecks and issues.

Analyzes JVM-level parameters such as method execution, thread execution, object creation, and garbage collection to find bottlenecks and gain a finer view of the target application's execution and resource utilization.

Collects performance metrics such as CPU utilization, memory usage, transaction response time, wait time, average load time, and error rate, along with server statistics such as hits per second, throughput, Windows resources, and database server resources, using the monitoring tools AppDynamics and NMON (AIX) to report performance bottlenecks.
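
A minimal sketch of how such metrics are derived from raw samples (the nearest-rank percentile used here approximates, but need not exactly match, what LoadRunner Analysis or AppDynamics reports; the sample data is invented):

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile over a list of response times."""
    ordered = sorted(values)
    idx = max(0, math.ceil(pct / 100.0 * len(ordered)) - 1)
    return ordered[idx]

# Hypothetical samples: (response_time_seconds, passed) over a 10 s window
samples = [(0.8, True), (1.2, True), (0.9, True), (3.5, False), (1.1, True)]
duration_s = 10.0

times = [t for t, _ in samples]
avg_rt = sum(times) / len(times)                      # average response time
p90 = percentile(times, 90)                           # 90th percentile
error_rate = sum(not ok for _, ok in samples) / len(samples)
throughput = len(samples) / duration_s                # transactions/second
```

The 90th percentile is far above the average here because of the single failed 3.5 s transaction, which is exactly why percentiles are reported alongside averages.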

Carries out performance tuning and optimization on application and web servers to resolve performance bottlenecks.

Analyzes test results, creates data reports, and shares them with developers and leadership for defect fixes, then reruns regression tests with the same and different parameters.

Involved in implementing, executing, and monitoring API performance load testing using JMeter.

Analyzed important API performance metrics such as response time, availability, and concurrent users.

Creates and validates test data by writing SQL queries in different environments and executes SQL scripts for retrieving, updating, and purging data according to the test scenarios and issues.
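
The validate-and-purge cycle above might look like the following sketch, using an in-memory SQLite database as a stand-in; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory DB standing in for a test-environment database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test_users (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO test_users (id, status) VALUES (?, ?)",
                 [(1, "ACTIVE"), (2, "ACTIVE"), (3, "LOCKED")])

# Validate: is there enough usable test data before the run?
(active,) = conn.execute(
    "SELECT COUNT(*) FROM test_users WHERE status = 'ACTIVE'").fetchone()
assert active >= 2, "not enough active test users for the scenario"

# Purge: remove data consumed or locked by the previous run
conn.execute("DELETE FROM test_users WHERE status = 'LOCKED'")
conn.commit()
(remaining,) = conn.execute("SELECT COUNT(*) FROM test_users").fetchone()
```

The same SELECT/DELETE pattern applies against the real MS SQL Server or Oracle environment, just through the appropriate client instead of sqlite3.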

Logs and updates defects in VersionOne. Participates in defect triage and performance team meetings to discuss issues and resolve roadblocks, improving the overall approach to the system.

Participates in sprint and retrospective planning, regular Scrum/Agile events for planning the to-do tasks from user stories and defects in backlog refinement.

Utilizes Jenkins as part of the continuous integration/continuous delivery (CI/CD) release train.

Monitors and analyzes the application for production support and retests after bug fixes.

Tools & Environment: Selenium, MS SQL Server, LoadRunner, Jenkins, HTML/HTTP, Java, APIs, MS Office Suite, AppDynamics, NMON, Eclipse IDE, XML, MQ Server, IBM Integration Bus, WebSphere.

July 2018 – February 2019 PERFORMANCE TEST ENGINEER

AON Insurance, Bloomington, MN

Created scripts in JMeter for an Electron app and a mobile device, and also created Selenium automation scripts.

Modified and enhanced scripts by adding encryption and writing BeanShell scripts to run tests in the cloud and write data to files.

Executed tests in Docker on Linux machines, both in Azure Cloud and locally.

Performed load, stress, and endurance tests for the identified business-critical transactions.

Monitored server health of web, app and DB using Dynatrace.

Worked on MQ virtualization, APIs, and SOAP calls with SoapUI, and validated functional test cases of the application in the performance environment.

Analyzed the time taken by calls in order to troubleshoot and reduce it.

Performed root cause analysis of bottlenecks identified in Dynatrace, where PurePath and PureStack provide end-to-end visibility.

Also used Tivoli, alongside Dynatrace, for production server log monitoring.

Created the test strategy, test plan, capacity planning, volume metrics, and lessons-learned documents.

Performed R&D and proofs of concept for open-source tools, assessing their usage and their value and impact on the existing stack, with pros and cons.

Used ELK to gather performance metrics from test runs and Camunda for the build and workflow pipeline.

Attended daily stand-ups and scrum meetings with SMEs and architects to discuss progress.

Worked on finding bottlenecks via JVM profiling, garbage collection analysis, thread dumps, heap dumps, connection pools, and network monitoring, through which we uncovered a memory leak.

Used the JIRA Agile tool for defect tracking and the tasks worked on.

Analyzed and reported on test runs, response-time improvements, and server health based on the results.

Performed network throttling while running tests using tools such as Charles Proxy, NetLimiter, Sitespeed, and online speed tests; also created Selenium scripts and ran end-user automation scripts for regression.

Took a database savepoint before and after running the performance load tests.

Ran failover tests and ran ETL jobs for loading data and creating users.

Working on R&D of Gatling and Taurus for monitoring the APIs and REST calls that JMeter hits; Gatling has neither a web UI nor a GUI and requires Scala scripting, while Taurus could fill the gaps among JMeter, Gatling, and Selenium.

R&D on Kubernetes, as our vision is to execute Selenium and JMeter together, which Docker does not support but Kubernetes does.

Worked with a team of 8 onsite and 4 in Naples on a project integrating existing and legacy systems and creating an entirely new application running on Azure Cloud.

Widely used SQL for data extraction and Excel for reporting; maintained a log of the runs performed to track response-time metrics.

Environment: JMeter, Selenium, Docker, SQL, JIRA, Dynatrace, Charles Proxy, SoapUI, Web Services, PuTTY, Azure Cloud, WildFly, WAS, Tomcat, CMS, MQ, Oracle DB, Messaging Services (SuperMario), Bridge Server, Talend, Voltage Servers, Eclipse Neon, Excel, Tivoli, Kibana, Elasticsearch, Maven, Tortoise Git, Bamboo.

Dec 2016 – June 2018 PERFORMANCE TEST ENGINEER

Caesars Entertainment, Inc, Las Vegas, NV.

Experience in Agile, working with multiple cross functional teams and product owners.

Created scripts using the Web (HTTP/HTML) and Web Services protocols in VuGen.

Performed load, stress, Endurance tests for the business critical transactions identified.

Monitored the load balancer, JDBC connection pool, garbage collector, ForgeRock servers, Liferay, and the database, based on which performance tuning was performed.

Scheduled and attended weekly status meetings with SMEs and a few other BSAs to provide updates on performance workarounds.

Maintained the performance SharePoint by updating test plans, reports, and available tools, and managed tasks for each tool, including access to new resources and tools across all projects.

Worked on finding the bottlenecks by JVM Profiling, Garbage Collector, Thread Dumps, Heap Dumps, Connection pool, Network monitoring.

Experience in executing SQL queries to validate the data in the back end.

Manual test cases were written and executed.

Performed functional testing against APIs by writing test cases, executing them in ALM, and validating them.

Performed automation testing using TOSCA; used AQT to build, validate, and modify SQL queries.

ALM was used extensively for logging defects and pulling cross-train process metrics.

Created user stories in CA Rally and linked them to defects in ALM.

Involved in analyzing test results and addressing the respective defects via HPQC.

Monitored tests using Dynatrace: customized dashboards, API distribution, runtime diagnostics, the PurePath tree, error analysis, and incidents & alerts.

Monitored the application using SiteScope counters, Splunk for production, and AppDynamics.

Worked with Jenkins integration tool.

Prepared capacity assessment, risk assessment, and performance documents covering the results, defects raised during testing, and defect resolution, and provided sign-off as one of the SMEs with the respective risks noted.

Environment: HP LoadRunner, HP Performance Center, HP ALM, Fiddler, CA Introscope, SiteScope, Splunk, AppDynamics, Dynatrace, Liferay DB, DB2, MS SQL Server, WebLogic, F5 Load Balancer, WebSphere, Java, ForgeRock, OpenAM, OpenDJ, ServiceNow, Jenkins, Mainframes, AQT, Google Analytics

June 2014 – Dec 2015, PERFORMANCE TESTER

Pro Soft Technologies, Hyderabad, IN

Created scripts using the Web (HTTP/HTML), SAP Web/GUI, and Web Services protocols in VuGen.

Involved in creating performance data that is required during load and stress test.

Scheduled and attended weekly status meetings with managers and provided updates on performance workarounds.

Performed baseline, stress, endurance, load, and volume tests for the identified business-critical scenarios.

Involved in analyzing the test results and address the respective defects via HPQC.

Monitored the application using Wily Introscope, Dynatrace, and various SAP monitoring T-codes.

Prepared the final performance test and risk assessment documents, covering the results, defects raised during testing, and defect resolution, and obtained sign-off from the respective SMEs and project managers.

Environment: HP LoadRunner, HP Performance Center, HP QuickTest Professional, Fiddler, SAP GUI, SAP R/3 ECC 7.0, SAP CRM 7.0, XML, Wily Introscope, BMC Patrol, Oracle, MS SQL Server, WebLogic, WebSphere, Load Balancer, Java, Linux, VuGen, J2EE Diagnostic Tool, Windows XP, AIX

Dec 2013 – May 2014, PERFORMANCE TESTER

Novartis, Hyderabad, IN

Responsible for the successful delivery of performance testing projects.

Met with business end users and product developers to document and understand the product performance expectations.

Analyzed the requirement and design documents.

Used ClearQuest as a repository and for reporting, tracking, and following up on resolved bugs.

Monitored resources to identify performance bottlenecks.

Provided recommendations to the application owner on steps to meet performance goals.

Tested performance of web application and generated automation test scripts for different scenarios.

Environment: LoadRunner, Performance Center, MS SQL Server, WebLogic, IIS, WebSphere, F5 Load Balancer, SiteScope, Wily IntroScope, Java, FIX, Test Director, J2EE Diagnostic Tool, Windows XP/NT Server, AIX, HTTP/HTML.
