
Quality Assurance Scrum Master


Rama Chandra Reddy P

Address ● Charlotte, NC ***** ● 980-***-**** ● ad3cm9@r.postjobfree.com

SUMMARY:

16 years of overall IT experience, including 15 recent years focused on performance management, architecture, troubleshooting, monitoring, and analysis.

Possess solid experience in coordination activities, including planning, resource administration, process setup, and quality assurance. Resourceful in end-to-end testing, including:

o Requirements gathering, including performance risk assessment

o Test planning, workload modeling, and data setup, including any responder setup

o Test execution and reporting, including bottleneck analysis and performance recommendations

Rich experience in testing tools such as LoadRunner 2020 SP1 and Microsoft VSTS 2017; defect tracking using Quality Center; and monitoring tools such as Introscope 10.5, Splunk 8.2.3.3, AppDynamics 4.4.3.20091, Dynatrace 7.1.10, and Appwatch for MQ monitoring. ETL batch testing using Autosys 11.3, Informatica 9.6.1, and IBM WebSphere DataStage 11.7, with OpsCenter for Cassandra DB monitoring. Proficient in leading and motivating individuals to maximize productivity. A customer-centric professional who also motivates large workforces to exceed customer expectations in the delivery of committed services.

Knowledgeable Industries: Technology, Financial Services, Banking, Insurance, Cable / Telecom.

Strong leadership, interpersonal, analytical, organizational, negotiation, motivational, written & oral communication skills.

KEY SKILLS:

IT infrastructure Management

Leading Local & Offshore Test Teams

Project Management

Process Improvement

Budget Management

Staff Development and Training

Customer Support/Customer Relations

SDLC Methodology (Scrum Master)

Risk Management

Data Collection and Analysis

Meeting Facilitation

Process Audits

Monitoring Tools (Wily Introscope, AppDynamics, Splunk, and Dynatrace)

ETL & Batch Performance Testing

Automated Testing

HP LoadRunner 12.60, Performance Center 12.60 / Quality Center 12.53

Test Status Preparation and Presentation

WebSphere Application Server

Microsoft Office Suite of Products

TECHNICAL SKILLS:

Hardware Knowledge: x86 (Windows, Linux), Active Directory, VMware, PCF Cloud, AWS Cloud.

Software Testing Knowledge: Dynatrace 8.2.3.3, AppDynamics 4.4.3.20091, Introscope 10.5, IBM Memory Profiler, Splunk 6.6.3, Micro Focus Quality Center 12.60, Micro Focus LoadRunner 2020 SP1, Micro Focus Performance Center 2020 SP1, Micro Focus ALM 12.60, ITKO LISA for responders, Autosys 11.3, Informatica 9.6.1 (ETL batch jobs), JMeter 5.2.0, Gremlin (Chaos Engineering), Swagger API tool, Jenkins, GitHub, Bitbucket.

Protocols: Web (HTTP/HTML), Web Services, JSON/REST calls, Citrix, TruClient, Siebel.

Database Knowledge: Database Testing, DB2, Oracle, SQL Server, Mongo DB.

Programming Knowledge: C, C++, HTML, Java, Unix, SQL, Python.

Networking Knowledge: Wire sock Networking

Web Server Knowledge: Apache, IBM HTTP Server (IHS), Enterprise Server, Microsoft Internet Information Server (IIS)

PROFESSIONAL EXPERIENCE:

Hire IT People

Performance Architect, TIAA, Charlotte, NC June 2022 – Present

Meet with client groups to determine performance requirements and goals and determine test strategies based on requirements and architecture.

As a lead, actively planned and conducted the daily and weekly performance-testing status meetings with the respective leads.

Gather and analyze the Performance Testing Requirements.

Understanding business transactions (Both UI and API) of the application.

Responsible for reviewing the Performance Test Plan and traversal flow document.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Smoke test to validate the scripts, environment, and data.

Executed test runs in the AWS cloud environment to measure the scalability and response times of the identified transactions; the system was stressed to identify the maximum TPS and user load sustainable by the application.
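For illustration only — a minimal sketch, not project code, of how TPS and response-time numbers of that kind can be derived from a simple concurrent load driver. The endpoint URL, virtual-user count, and iteration count below are assumptions, not values from the actual engagement.

# Illustrative sketch: drive concurrent requests and report TPS and p90 response time,
# the same metrics gathered from Performance Center, Splunk, and Dynatrace in the real tests.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor
import requests

TARGET_URL = "https://example.internal/api/health"  # hypothetical endpoint
VIRTUAL_USERS = 50                                   # assumption
REQUESTS_PER_USER = 20                               # assumption

def run_user(_):
    timings = []
    for _ in range(REQUESTS_PER_USER):
        started = time.perf_counter()
        requests.get(TARGET_URL, timeout=30)
        timings.append(time.perf_counter() - started)
    return timings

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
    all_timings = [t for user in pool.map(run_user, range(VIRTUAL_USERS)) for t in user]
elapsed = time.perf_counter() - start
tps = len(all_timings) / elapsed
p90 = statistics.quantiles(all_timings, n=10)[8]  # 90th percentile
print(f"requests={len(all_timings)} elapsed={elapsed:.1f}s TPS={tps:.1f} p90={p90 * 1000:.0f} ms")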

Using Performance Center for running the scenarios on AWS Cloud.

Used Splunk 8.2.3.3 and Dynatrace monitoring tools to identify and resolve performance issues.

Analyzed results and scheduled the results walkthrough meeting; the final report consisted of the key metrics (response time, CPU, memory, and errors) along with Dynatrace and Splunk links.

Performed resilience testing using the Gremlin tool (network, memory, CPU, and TPS attacks) to identify issues such as bottlenecks early, ahead of AWS cloud production.

Conducted Monthly meetings with Project Head, Business, and development teams.

Worked closely with software developers to isolate, track, debug, and troubleshoot defects.

Performed the Break Point/Capacity test to find the Maximum number of concurrent users on the system based on the Response times, CPU and Memory utilization.

Ensure the timely delivery of different testing milestones

Prepared/updated the metrics dashboard at the end of each phase or at project completion.

Worked with AppDynamics, Introscope, Dynatrace, Splunk, Toad, Autosys, Informatica, WinSCP, LogMiner, memory profiles, and heap dumps.

Hire IT People

Performance Test Engineer, Charles Schwab, Denver, CO June 2021 – April 2022

Scheduled and hosted kick-off meetings with all the stakeholders of the performance requirements engagement.

Met with client groups to determine performance requirements and goals, and defined test strategies based on requirements and architecture.

Implemented the Agile (Scrum) methodology and actively participated in daily and weekly capacity-testing status meetings.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application and ETL Batch jobs.

Responsible for preparing and reviewing the Performance Test Plan and traversal flow document.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Created Business flows, Test Strategy, Test Plan and Workload model based on the business requirements to meet SLA.

Created and executed JMeter scripts using the Web (HTTP/HTML) and Web Services protocols and API calls (JSON and REST).

Created performance test simulation scripts using JMeter and integrated them with GitHub/Bitbucket for CI/CD.

Worked closely with the development and testing teams to understand all the ETL batch jobs and test data that were part of performance testing.

Using the PCF Cloud environment to run the Batch jobs.

Used Python scripts to load the data files to the NAS/landing server path via WinSCP.

Used Python commands to generate batch IDs and load the data into MongoDB in unprocessed status.
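A minimal sketch of the kind of Python helper described above, assuming a pymongo client; the connection string, database, collection, and document fields are illustrative assumptions, not the project's actual schema.

# Sketch only: generate a batch id and load records into MongoDB in "unprocessed"
# status so the downstream batch job can pick them up. All names are assumptions.
import uuid
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://perf-mongo.example.internal:27017")  # hypothetical host
collection = client["billing"]["batch_records"]                      # hypothetical db/collection

batch_id = str(uuid.uuid4())
records = [
    {"batchId": batch_id, "recordNumber": i, "status": "unprocessed",
     "loadedAt": datetime.now(timezone.utc)}
    for i in range(10_000)
]
collection.insert_many(records)
print(f"loaded {len(records)} records under batchId={batch_id}")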

Used Swagger to run the batch jobs, providing the COMM request type.
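The trigger itself is an HTTP call to the Swagger-documented batch endpoint; a hedged sketch follows, with the URL, payload fields (including the COMM request type), and auth header as assumptions rather than the real service contract.

# Sketch only: kick off a batch run through a Swagger-documented REST endpoint.
import requests

BATCH_API = "https://batch.example.internal/v1/jobs/run"                # hypothetical URL
payload = {"commType": "COMM", "batchId": "replace-with-generated-id"}  # assumed fields
headers = {"Authorization": "Bearer <token>"}                           # placeholder credential

response = requests.post(BATCH_API, json=payload, headers=headers, timeout=60)
response.raise_for_status()
print("job accepted:", response.json())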

Measured how much load each node could process, the total number of threads, memory/CPU utilization, and the total time to complete, and calculated records per second (Rec/Sec).

Used MongoDB to validate the status of all batch job records and to obtain the total time and total processed-record count.
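A companion sketch of that post-run validation: counting processed records for a batch and deriving Rec/Sec from timestamps. The field names (status, processedAt) and connection details are assumptions consistent with the sketch above.

# Sketch only: validate batch completion in MongoDB and derive Rec/Sec.
from pymongo import MongoClient

client = MongoClient("mongodb://perf-mongo.example.internal:27017")  # hypothetical host
collection = client["billing"]["batch_records"]                      # hypothetical db/collection
batch_id = "replace-with-generated-id"

total = collection.count_documents({"batchId": batch_id})
processed = collection.count_documents({"batchId": batch_id, "status": "processed"})
first = collection.find_one({"batchId": batch_id}, sort=[("processedAt", 1)])
last = collection.find_one({"batchId": batch_id}, sort=[("processedAt", -1)])
elapsed = (last["processedAt"] - first["processedAt"]).total_seconds()
rec_per_sec = processed / elapsed if elapsed else float("nan")
print(f"processed {processed}/{total} records in {elapsed:.0f}s -> {rec_per_sec:.1f} Rec/Sec")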

Responsible for debugging the script by increasing the number of iterations.

Smoke test to validate the scripts, environment, and data.

Executed test runs to measure the scalability and response times of the identified transactions; the system was stressed to identify the maximum user load sustainable by the application.

Used AppDynamics and Splunk monitoring tools to identify and resolve performance issues.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Conducted weekly meetings with Project Head, Business, and development teams.

Worked closely with software developers to isolate, track, debug, and troubleshoot defects.

Performed endurance tests by executing the test over longer durations to find memory leaks and slow resource-consumption problems.

Performed break point/capacity tests to find the maximum number of concurrent users the system could sustain, based on response times and CPU and memory utilization.

Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application; vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.

Preparing final executive summary report and Closure documents at the end of the engagement.

Ensure the timely delivery of different testing milestones

Prepared/updated the metrics dashboard at the end of each phase or at project completion.

Worked with AppDynamics, Introscope, Dynatrace, Splunk, Toad, Autosys, Informatica, WinSCP, LogMiner, memory profiles, and heap dumps.

Collabera

Performance Architect, Bank of America, Charlotte, NC OCT 2019 – May 2021

Scheduled and hosted kick-off meetings with all the stakeholders of the performance requirements engagement.

Met with client groups to determine performance requirements and goals, and defined test strategies based on requirements and architecture.

Implemented the Agile (Scrum) methodology and actively participated in daily and weekly capacity-testing status meetings.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application

Responsible for reviewing the Performance Test Plan and traversal flow document.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Created Business flows, Test Strategy, Test Plan and Workload model based on the business requirements to meet SLA

Created and executed Vugen scripts using Web (HTTP/HTML), Web services protocol, API Calls (Json and REST), RTE and Ajax Truclient.

Created Performance Test simulation scripts using the HP Load Runner 12.60.

Responsible for debugging the script by increasing the number of iterations.

Smoke test to validate the scripts, environment, and data.

Executed test runs to measure the scalability and response times of the identified transactions; the system was stressed to identify the maximum user load sustainable by the application.

Used Performance Center for running the scenarios.

Used AppDynamics 4.4.3.20091, Splunk 7.3.1, and Dynatrace 7.0.10 monitoring tools to identify and resolve performance issues.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Conducted weekly meetings with Project Head, Business, and development teams.

Worked closely with software developers to isolate, track, debug, and troubleshoot defects.

Performed endurance tests by executing the test over longer durations to find memory leaks and slow resource-consumption problems.

Performed break point/capacity tests to find the maximum number of concurrent users the system could sustain, based on response times and CPU and memory utilization.

Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application; vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.

Preparing final executive summary report and Closure documents at the end of the engagement.

Ensure the timely delivery of different testing milestones

Prepared/updated the metrics dashboard at the end of each phase or at project completion.

Worked with AppDynamics, Introscope, Dynatrace, Splunk, Toad, Autosys, Informatica, WinSCP, LogMiner, memory profiles, and heap dumps.

Tanu Infotech July 2019 – Sep 2019

Performance Test Lead, AVANGRID - Mobile Application, Rochester, NY

Scheduled and hosted kick-off meetings with all the stakeholders of the performance requirements engagement.

Met with client groups to determine performance requirements and goals, and defined test strategies based on requirements and architecture.

Implemented the Agile (Scrum) methodology and actively participated in daily and weekly capacity-testing status meetings.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application and Batch job details.

Responsible for creating and reviewing the Performance Test Plan, traversal flow document, and batch job list.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Created Business flows, Test Strategy, Test Plan and Workload model based on the business requirements to meet SLA

Used the open-source JMeter 5.1.1 tool for script creation and execution; scripts covered Authentication, Update and Add Account, Help, and Outage Reporting (37 API calls, JSON-style service calls, were identified in total).
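JMeter plans of that kind are typically executed in non-GUI mode for load runs; a minimal Python wrapper sketch is below. The -n/-t/-l flags are standard JMeter command-line options, while the plan and results file names are hypothetical.

# Sketch only: run a JMeter test plan in non-GUI mode and report the exit code.
import subprocess

cmd = [
    "jmeter",
    "-n",                                # non-GUI mode
    "-t", "outage_reporting_api.jmx",    # hypothetical test plan
    "-l", "results.jtl",                 # results log for later analysis
]
result = subprocess.run(cmd, check=False)
print("JMeter exit code:", result.returncode)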

Used the Cognizant in-house (Mobile Clinical) tool to simulate the mobile application flows.

Responsible for debugging the script by increasing the number of iterations.

Smoke test to validate the scripts, environment, and data.

Executed test runs to measure the scalability and response times of the identified transactions; the system was stressed to identify the maximum user load sustainable by the application.

Performed the Break Point/Capacity test to find the Maximum number of concurrent users on the system based on the Response times, CPU and Memory utilization.

Using Performance Center for running the scenarios.

Used the Windows Perfmon monitoring tool to identify and resolve performance issues.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Conducted weekly meetings with Project Head, Business, and development teams.

Worked closely with software developers to isolate, track, debug, and troubleshoot defects.

Performed endurance tests by executing the test over longer durations to find memory leaks and slow resource-consumption problems.

Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application; vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.

Preparing final executive summary report and Closure documents at the end of the engagement.

Ensure the timely delivery of different testing milestones

Prepared/updated the metrics dashboard at the end of each phase or at project completion.

Infosys Ltd Nov 2012 – July 2019

Performance Test Lead, AB Billing, Jacksonville, FL Oct 2018 – July 2019

Scheduled and hosted kick-off meetings with all the stakeholders of the performance requirements engagement.

Met with client groups to determine performance requirements and goals, and defined test strategies based on requirements and architecture.

Implemented the Agile (Scrum) methodology and actively participated in daily and weekly capacity-testing status meetings.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application and Batch job details.

Responsible for creating and reviewing the Performance Test Plan, traversal flow document, and batch job list.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Created Business flows, Test Strategy, Test Plan and Workload model based on the business requirements to meet SLA

Created and executed Vugen scripts using Web (HTTP/HTML), Web services, Ajax TrueClient and Citrix protocol

Created Performance Test simulation scripts using the Micro Focus Load Runner 12.60.

Responsible for debugging the script by increasing the number of iterations.

Smoke test to validate the scripts, environment, and data.

Smoke tested the batch jobs to validate that they updated the DB as expected.

The application's web interface was used for loading the data, selecting the batches, and executing the jobs.

Executed test runs to measure the scalability and response times of the identified transactions; the system was stressed to identify the maximum user load sustainable by the application.

Using Performance Center for running the scenarios.

Used Jenkins for the CI/CD process to automatically pick up any changes to performance scenarios, results, and scripts; the objective was for all teams to be able to see previous and current results.

Used Splunk 6.6.3 and Dynatrace 7.1.10 monitoring tools to identify and resolve performance issues.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Conducted weekly meetings with Project Head, Business, and development teams.

Worked closely with software developers to isolate, track, debug, and troubleshoot defects.

Performed endurance tests by executing the test over longer durations to find memory leaks and slow resource-consumption problems.

Performed break point/capacity tests to find the maximum number of concurrent users the system could sustain, based on response times and CPU and memory utilization.

Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application; vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.

Preparing final executive summary report and Closure documents at the end of the engagement.

Ensure the timely delivery of different testing milestones

Prepared/updated the metrics dashboard at the end of each phase or at project completion.

Performance Test Lead, PWC_LLC, Tampa, FL June 2018 – Sep 2018

Scheduled and hosted kick-off meetings with all the stakeholders of the performance requirements engagement.

Met with client groups to determine performance requirements and goals, and defined test strategies based on requirements and architecture.

Implemented the Agile (Scrum) methodology and actively participated in daily and weekly capacity-testing status meetings.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application

Responsible for reviewing the Performance Test Plan and traversal flow document.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Created Business flows, Test Strategy, Test Plan and Workload model based on the business requirements to meet SLA

Created Performance Test simulation scripts using the VSTS 2017 – Web Performance test.

Responsible for debugging the script by increasing the number of iterations.

Smoke test to validate the scripts, environment and data.

Executed test runs to measure the scalability and response times of the identified transactions; the system was stressed to identify the maximum user load sustainable by the application.

Used Load Test (Cloud) for running the scenarios.

Used Introscope 10.5, Splunk 6.6.3, and Dynatrace 7.0.10 monitoring tools to identify and resolve performance issues.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Conducted weekly meetings with Project Head, Business and development teams.

Worked closely with software developers to isolate, track, debug, and troubleshoot defects.

Performed endurance tests by executing the test over longer durations to find memory leaks and slow resource-consumption problems.

Performed break point/capacity tests to find the maximum number of concurrent users the system could sustain, based on response times and CPU and memory utilization.

Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application; vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.

Preparing final executive summary report and Closure documents at the end of the engagement.

Ensure the timely delivery of different testing milestones

Prepared/updated the metrics dashboard at the end of each phase or at project completion.

Performance Test Lead, Bank of America, Charlotte, NC Nov 2012 – May 2018

Scheduled and hosted kick-off meetings with all the stakeholders of the performance requirements engagement.

Met with client groups to determine performance requirements and goals, and defined test strategies based on requirements and architecture.

Implemented the Agile (Scrum) methodology and actively participated in daily and weekly capacity-testing status meetings.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application

Responsible for reviewing the Performance Test Plan and traversal flow document.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Created Business flows, Test Strategy, Test Plan and Workload model based on the business requirements to meet SLA

Created and executed Vugen scripts using Web (HTTP/HTML), Web services protocol, API Calls (Json and REST) and Ajax Truclient.

Created Performance Test simulation scripts using the HP Load Runner 12.50.

Responsible for debugging the script by increasing the number of iterations.

Smoke test to validate the scripts, environment, and data.

Executed test runs to measure the scalability and response times of the identified transactions; the system was stressed to identify the maximum user load sustainable by the application.

Used Performance Center for running the scenarios.

Used Introscope 10.5, Splunk 6.6.3, and Dynatrace 7.0.10 monitoring tools to identify and resolve performance issues.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Conducted weekly meetings with Project Head, Business, and development teams.

Worked closely with software developers to isolate, track, debug, and troubleshoot defects.

Performed endurance tests by executing the test over longer durations to find memory leaks and slow resource-consumption problems.

Performed break point/capacity tests by increasing the volume to 3x/4x to find the maximum number of concurrent users the system could sustain, based on response times and CPU and memory utilization.

Provided management and the vendor with analysis reports and recommendations, which resulted in tuning of the application; vertical scaling and garbage collection were performed. Communicated with the vendor to resolve issues.

Preparing final executive summary report and Closure documents at the end of the engagement.

Ensure the timely delivery of different testing milestones

Prepared/updated the metrics dashboard at the end of each phase or at project completion.

Worked with Introscope, Dynatrace, Splunk, Toad, Autosys, Informatica, WinSCP, LogMiner, memory profiles, and heap dumps.

Cognizant Technology Solutions July 2008 – Oct 2012

Performance Test Lead, Key Bank, US Jan 2011 – Oct 2012

Scheduled and hosted kick-off meetings with all the stakeholders of the performance requirements engagement.

Gathered the client-side requirements and compiled them into the Performance Testing Requirements (PTR) document.

Involved in creating and maintaining projects using HP Performance Center 9.50.

Lead the teams to support testing strategies and activities and to ensure overall integrity of the performance testing strategy.

Created Business flows, Test Strategy, Test Plan and Workload model based on the business requirements to meet SLA.

Handling the end-to-end testing of delivery support for testing engagements.

Mentored and coached team members on application.

Prepared Vugen scripts, load tests, and test data; executed tests, validated results, managed defects, and reported results in TFS.

Created and executed Vuser scripts using the Siebel Web, Web (HTTP/HTML), and Web Services protocols.

Extensively used HP Performance center to run and execute the performance Tests.

Architected the load-testing infrastructure and the hardware and software integration of the application.

Performed impact analysis on changes to the existing architecture, work processes, and systems.

Analyzed results of Transactions Response times, Transactions under load, Transactions Summary by Vusers, Hits per Second and Throughput.

Architected and set up workload modeling and performance-monitoring mechanisms for enterprise applications using UIM.

Used Introscope tools to identify and resolve performance issues.

Monitoring system resources such as CPU Usage, Disk, Memory and Database for all the servers configured for the system.

Interacting with developers, project managers, and management in the development, execution and reporting of test automation results.

Identifying performance bottlenecks and finding out Root cause analysis about the issues.

Coordinated with the onshore team on project issues and executions.

Accurately produced regular project status reports for senior management to ensure on-time project releases.

Identified and monitored servers by creating dashboards in UIM monitoring tool.

Verified and validated the new or upgraded application to meet the specified performance requirements.

Prepared performance test reports based on the SLAs.

Performance Test Lead, Comcast Corp., US Nov 2010 – Dec 2010

Prepared Test Cases, Test Strategy, and Test Plan based on the non-functional business requirements to meet SLA

Prepared Vugen scripts, Load Test, Test Data, execute test, validate results, manage defects, and report results.

Responsible for implementing Load Runner, Performance center.

Architected the load-testing infrastructure and hardware/software integration with LoadRunner.

Analyzed results of Transactions Response times, Transactions under load, Transactions Summary by Vusers, Hits per Second and Throughput.

Architected and set up workload modeling and performance-monitoring mechanisms for enterprise applications using Dynatrace.

Interacting with developers, project managers, and management in the development, execution and reporting of test automation results.

Identifying performance bottlenecks and finding out Root cause analysis about the issues.

Coordinating with Offshore team on project issues and executions.

Accurately produce regular project status reports to senior management to ensure on-time release of project.

Performed Black Box, Automation, Integration, System, Smoke, Regression, Performance testing and Validation testing during the testing life cycle of the product release.

CA Wily Introscope was used extensively to monitor server metrics, and in-depth analysis was performed to isolate points of failure in the application.

Identified and monitored queries to improve performance; also checked for degrading performance by examining resources such as Available Bytes and Private Bytes.

Prepared documentation for tracking, maintaining, creating, and escalating daily issue logs.

Verified and validated the new or upgraded application to meet the specified performance requirements.

Prepared performance test reports based on the SLA.

Performance Test Lead, Amex GEM, UK Aug 2010 – Oct 2010

Scheduled and hosted kick-off meetings with all the stakeholders of the engagement.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application

Involved in review of the Performance Test Plan & Traversal flow document

Developed simulation scripts using the Web (HTTP/HTML) protocol in LoadRunner 9.52.

Debugged the scripts by increasing the number of iterations.

Executing Smoke test to validate the scripts, environment, and data.

Executing test runs to find the scalability and response time of the identified transactions. The system will be stressed to identify the maximum user load sustainable by the application.

Using Performance Center for running the scenarios.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Preparing final executive summary report and Closure documents at the end of the engagement.

Performance Test Lead, WellPoint US. Jan 2010 - July 2010

Scheduled and hosted kick-off meetings with all the stakeholders of the engagement.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application

Involved in review of the Performance Test Plan & Traversal flow document

Developed simulation scripts using LoadRunner 9.52.

Debugged the scripts by increasing the number of iterations.

Executing Smoke test to validate the scripts, environment and data.

Executing test runs to find the scalability and response time of the identified transactions. The system will be stressed to identify the maximum user load sustainable by the application.

Using Performance Center for running the scenarios.

Analyzing the test results and preparing the preliminary report consisting of the response time, scalability point and threshold level of the application.

Preparing final executive summary report and Closure documents at the end of the engagement.

Performance Test Lead, Dun & Bradstreet UK Jan 2009 - Dec 2009

Scheduled and hosted kick-off meetings with all the stakeholders of the engagement.

Gather and analyze the Performance Testing Requirements

Understanding business transactions of the application

Involved in review of the Performance Test Plan & Traversal flow document

Developed simulation scripts using the Web (HTTP/HTML) protocol in LoadRunner 9.52.


