

Maryland City, Maryland, United States
November 27, 2017



Uma Macharla



A highly competent, results-oriented Software Quality Assurance Engineer with 5+ years of experience in Functional, Performance, and Load Testing, with proven ability in Performance Center load testing and web service testing. Excellent experience with performance testing methodologies, SOA designs, and performance test strategy and planning. Solid domain experience in Healthcare and Banking and Finance.


Extensive experience in Functional Testing, Black Box Testing, System Testing, Integration Testing, Regression Testing, Interface Testing, Load Testing, SOAP Testing, User Acceptance Testing, UI Testing, and Sanity Testing.

Expertise in data-driven tests using SQL and database/backend testing; extracted data from Oracle and DB2 databases and validated it.

Proven skills in Structured Query Language (SQL).

Expertise in problem solving and bug reporting using bug-tracking tools such as Bugzilla and JIRA.

Solid experience in requirement analysis, test case creation, test execution, defect life cycle, test case maintenance, execution of automation test scripts for Functional, Performance, regression, system and integration testing.

Solid experience with tools including HP LoadRunner, HP Performance Center, SOAP UI, SiteScope, Wily Introscope, and ALM 12.5.

Proven skills in Performance/Load Testing, Stress Testing, Volume Testing, Endurance Testing, Scalability Testing, Benchmark Testing, and Failover Testing.

Ability to successfully manage performance testing for multiple projects at the same time.

Strong experience in working with Healthcare, Banking and Financial domains.

Experienced in implementation of different QA methodologies/policies, strategies and plans in all stages of SDLC.

Experienced in Agile and Waterfall methodologies and release management.

Proven skills in Performance Test requirement analysis and designing scenarios accordingly.

Gathered Performance Test requirements from the client and designed Performance tests.

Demonstrated excellence in creating and developing Effort Estimations, Test Plans, Test Specifications, Test cases and Automation Test scripts.

Experienced in designing multiple LoadRunner (VuGen) scripts with protocols such as Web (HTTP/HTML) and Web Services for load testing different applications.

Planned and customized Vuser scripts in LoadRunner and enhanced them with parameterization, manual correlation, think times, transaction points, and rendezvous points.

Created scenarios using the LoadRunner Controller and Performance Center, analyzed load test results, and created reports using LoadRunner Analysis.

Monitored application transactions and identified application bottlenecks using Wily Introscope and SiteScope.

Good understanding of the web services principles and technology. Involved in testing of Web services (SOAP and REST) using SOAP UI.

Experience in using SOAP UI for testing of SOA environment.

Created the Execution result sheet, Execution plan, Result matrix and Traceability matrix for functional testing.

Excellent analytical and problem-solving skills; thrives on building robust software quality assurance systems and processes that ensure the delivery of high-quality applications.
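
As a hedged illustration of the database/backend validation work listed above, here is a minimal Python sketch using sqlite3 as an in-memory stand-in for the Oracle/DB2 databases; the claims table, column names, and values are hypothetical:

```python
import sqlite3

# In-memory stand-in for the Oracle/DB2 connection; schema and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT PRIMARY KEY, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("C001", "PAID", 120.50), ("C002", "DENIED", 0.0), ("C003", "PAID", 75.00)],
)

def validate_paid_claims(conn):
    """Backend check: every PAID claim must carry a positive amount."""
    rows = conn.execute(
        "SELECT claim_id, amount FROM claims WHERE status = 'PAID'"
    ).fetchall()
    failures = [claim_id for claim_id, amount in rows if amount <= 0]
    return len(rows), failures

checked, failures = validate_paid_claims(conn)
print(checked, failures)  # → 2 []
```

In practice the same SELECT-then-assert pattern runs against the real database after a UI or batch operation, comparing extracted rows to expected values.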


Performance Testing Tools

HP Performance Center 12.5, HP Load Runner 12.5/11.0

Performance Monitoring Tools

Wily Introscope, SiteScope


Programming Languages

C, C++

Bug Tracking Tools

JIRA, Bugzilla, ALM 12.5

Test Management Tools

ALM 12.5


Databases

MS SQL Server 2008/2008R2/2012, MySQL, Oracle 10/11

Scripting Languages

VB Script, Shell Script

Operating Systems

Windows 7/8, Windows Server 2008/2008R2/2012, Mac OS, Linux/Solaris/HP-UX

Domain Knowledge

Healthcare, Insurance, Banking


Bachelor’s degree in Microbiology from Osmania University, India.

Master of Arts in Literature (English) from Osmania University, India.


Client: CareFirst (FEPOC), DC Sep 2016 – Present

Role: Performance Test Engineer

The Federal Employee Program Operations Center (FEPOC) is responsible for implementing strategic tools and supporting BCBSA initiatives. Member 360 is one of its applications, built to replace the legacy customer care portal for plans; its purpose is to let customer representatives search contract and claim information. The Member 360 project combines related and linked customer information from different sources (Enrollment, Claims, and others) into a consolidated portal view that enables Customer Service Representatives (CSRs) to:

Facilitate service to Members and providers

Track and manage Member inquiries

Take actions on behalf of Members

This project will also provide a suite of services enabling systems integration between local Plan applications and FEP-wide Member 360 data.


Involved in application architecture review and in defining the critical business scenarios in scope for Member 360 performance testing.

Responsible for analyzing the performance requirements and gathering basics such as SLAs, transaction volumes, concurrent users, and key transactions.

Responsible for creating the workflow and transaction names document.

Created manual test cases and executed them in ALM for cycle 1 testing while the performance test plan was being created.

Responsible for recording VuGen scripts and enhancing them with manual correlation, parameterization, transaction names, think time, and run-time settings.

Responsible for creating test scenarios in Performance Center per the user load profile and executing multiple cycles.

Discussed performance issues uncovered during the tests with the development and technical support teams; tuned and re-tested.

Responsible for collecting peak load, yearly volumes, and expected response times, and for creating the workload model.

Responsible for preparing the workload model for complex workflows and different combinations of user roles, and for getting it approved by key stakeholders of the system (Director's Office), such as architects and business analysts.

Designed and executed standalone web service tests before the application was integrated with the client-end application.

Coordinated with development and DBA teams in meetings to monitor and troubleshoot issues while execution was in progress.

Responsible for monitoring graphs such as Throughput, Hits/Sec, Transaction Response Time, and Windows Resources while executing the scripts from LoadRunner.

Recommended solutions for application-server and database-server tier performance issues, such as increasing the number of connections to improve performance.

Used Wily and LPAR to monitor tiers performance and troubleshoot the performance issues.
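
The workload modeling described above is commonly grounded in Little's Law, N = X × (R + Z): concurrent Vusers equal target throughput times the sum of response time and think time. A minimal sketch, with all figures illustrative rather than taken from the engagement:

```python
def vusers_needed(target_tps, avg_response_s, think_time_s):
    """Little's Law: N = X * (R + Z).
    target_tps    -- desired transactions per second (X)
    avg_response_s -- expected average response time in seconds (R)
    think_time_s  -- scripted think time between iterations in seconds (Z)
    """
    return target_tps * (avg_response_s + think_time_s)

# Illustrative only: 10 TPS target, 2 s responses, 8 s think time.
n = vusers_needed(target_tps=10, avg_response_s=2.0, think_time_s=8.0)
print(n)  # → 100.0
```

The same relation is used in reverse during analysis: measured throughput and Vuser count imply the effective per-iteration cycle time.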

Environment: Mainframe DB2, IBM WAS AIX, WMQ 8, LPARs, Performance Center 12.5, Wily Introscope 10.5, Java 9.

Client: QSSI, Columbia, MD

Role: Performance Test Analyst Jan 2015 – August 2016

QSSI was the application testing implementation partner for the Affordable Care Act project, responsible for all E2E testing activities for Federally Facilitated Marketplace (FFM) applications. The FFM helps individuals and small employers shop for, select, and enroll in high-quality, affordable private health plans that fit their needs at competitive prices. The Federal Marketplace Program System (FMPS) is a conglomeration of several interacting systems. The Centers for Medicare & Medicaid Services (CMS) is developing the new FFM system to support FMPS.


Responsible for designing manual test cases covering business processes for online consumer insurance application creation, and for maintaining them in ALM.

Considered various scenarios, such as single- and multiple-member applications with all possible combinations.

Responsible for test case walkthroughs and getting test cases reviewed by business analysts in meetings.

Involved in preparing master test plan and test strategy document.

Executed test cases, created and tracked defects in ALM, and attended defect triage calls to troubleshoot issues.

Joined working sessions to reproduce defects and troubleshoot issues.

Attended daily scrum meetings to provide status, blocking issues, and support needed from team members.

Responsible for preparing the traceability matrix to ensure that requirements are covered by test cases, and for tracing the test cases for any change in the requirements.

Prepared the test summary report and shared it with the client and stakeholders.

Designed Performance Center VuGen 11.52 scripts using the HTTP protocol that were robust enough to execute in multiple environments.

Responsible for validating, updating, and debugging scripts and for repairing broken scripts on the new code base.

Responsible for performance shakeout test execution and result reporting after code deployment into the performance environment.

Coordinated with development and DBA teams in meetings to monitor and troubleshoot issues while execution was in progress.

Responsible for monitoring graphs such as Throughput, Hits/Sec, Transaction Response Time, and Windows Resources while executing the scripts from LoadRunner.

Created detailed test status reports and error reports with graphical charts and tables explaining test execution for upper management, using the LoadRunner Analysis component.

Set run-time parameters (think time, pacing, replay options, etc.), ramp-up, and load distribution.

Participated in discussions with the QA manager, Developers and Administrators in fine-tuning the applications based on the Load Test Results.
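
Load distribution of the kind set above (splitting a total Vuser count across scripts according to a transaction mix) reduces to a small allocation routine; the script names and percentages below are hypothetical:

```python
def distribute_vusers(total, mix):
    """Split a total Vuser count across scripts by percentage mix,
    assigning any rounding remainder to the largest share so the
    counts always sum back to the total."""
    counts = {name: int(total * pct / 100) for name, pct in mix.items()}
    remainder = total - sum(counts.values())
    largest = max(mix, key=mix.get)
    counts[largest] += remainder
    return counts

# Hypothetical mix for a shakeout run.
mix = {"search_plan": 50, "create_application": 30, "submit_enrollment": 20}
print(distribute_vusers(100, mix))
```

Rounding the remainder into the largest group keeps odd totals (say 33 Vusers) exactly accounted for, which matters when the Controller scenario must match the approved workload model.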

Environment: Oracle DB, MarkLogic DB, Apache, Tomcat, Java, JMS, Windows 7, PC 11.52, MQC 11.0, New Relic.

Client: Bank of America, Concord, CA September 2013 – December 2014

Role: Performance Tester

Online Service: Many customer requests are initiated by phone, fax, or mail, which significantly limits the opportunity for straight-through processing and can create duplication of effort. To provide smoother and faster customer service, BofA decided to develop an external portal that enables customers to self-manage their commercial banking account services.


Involved in writing test plans and test cases using requirements and use case documents and business requirement documents.

Created Test Strategy and Test plan for the testing effort.

Conducted Smoke, Non-Functional, Functional, System, and Integration testing.

Developed virtual user scripts using the Siebel Web and HTTP protocols.

Recorded Vuser scripts, implementing parameterization both manually and via data-driven wizards in VuGen.

Designed scenarios in the LoadRunner Controller, executed load tests in the Controller, and analyzed the results in the reporting tool.

Ran automated QTP regression suites, shared the results, and maintained the existing scripts.

Prepared load test analysis reports (% disk, CPU utilization, throughput, % page breakdowns, response times, web server monitor counters, captures, system performance counters, and database performance counters).

Analyzed load test results and created reports using LoadRunner Analysis.

Used various parameterization techniques with Data Table, Random, Environment Variable and Action parameters.

Modified the automation scripts by inserting checkpoints to verify object properties.

Created Data Driven test phases by creating different data tables.

Parameterized various links in the application for Functional/Integration testing.

Created and maintained Requirement Traceability Matrix.

Performed User Acceptance Testing for each UAT release build.

Provided back end testing for database auditing and data validation using SQL scripts.

Tracked bugs using Bugzilla and generated defect reports for review by the client and management teams.
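
The data-driven parameterization noted above (feeding each iteration a different row of test data) can be sketched in Python; the parameter-file contents and URL pattern are hypothetical:

```python
import csv
import io

# Stand-in for a LoadRunner parameter file; account IDs and actions are made up.
param_file = io.StringIO(
    "account_id,action\n"
    "ACC-1001,view_balance\n"
    "ACC-1002,transfer\n"
    "ACC-1003,view_balance\n"
)

def run_iteration(row):
    """One 'Vuser iteration': substitute the row's values into the request path."""
    return f"/api/accounts/{row['account_id']}/{row['action']}"

# Each CSV row drives one iteration, mirroring Sequential/Each-iteration parameters.
requests = [run_iteration(row) for row in csv.DictReader(param_file)]
print(requests)
```

Random and unique-per-Vuser update policies differ only in how the next row is selected, not in the substitution itself.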

Environment: Oracle DB, IBM AIX, Siebel, Mainframes, Windows 7, LoadRunner 11.0, MQC 11.0, SiteScope 11.2, DNT.

Client: ServiceMaster, Memphis, TN May 2012 – August 2013

Role: QA Analyst

Genesis: Project Genesis release 5 included Siebel/ATG end-to-end core functionality (full scale, customer service, and field service); a partner portal for contractor, real-estate, and third-party vendors; basic customer self-service (support); and back-office JD Edwards (JDE 9.x, and JDE 9.x to JDE 8.x integration) with OBI Analytics.


Responsible for writing test cases for all possible scenarios for every requirement in the requirements document.

Engaged in meetings with clients to gather requirements.

Prepared end-to-end test cases and obtained sign-off before test execution.

Peer-reviewed test scenarios within the team.

Loaded the requirements into ALM and mapped test cases and defects for the traceability matrix.

Estimated the effort for test execution and test results documentation.

Communicated with the team, and with other teams throughout the project life cycle, about change requirements and updates to the test plan and test cases.

Analyzed the test data for every scenario and coordinated with different teams for test data.

Executed SQL queries for data validation and back end testing.

Involved in database testing.

Logged defects where necessary and communicated their status to the Test Lead.

Extensively involved in Validation Testing after development phase.

Modified and executed manual test scripts for different application modules.

Extensively performed system, integration, and functional testing, and executed application functionality and regression test cases both manually and via automation.

Performed health checks on all test systems on a daily basis.

Coordinated with different teams for integration testing.

Coordinated with release teams for every build.

Created and reviewed UAT test cases and supported User Acceptance Testing (UAT).

Environment: RAC, IBM DB2, Siebel 8.2, JDE 9.x (with JDE 9.x to JDE 8.x integration), OBI Analytics, LoadRunner 9.5.
