
Sr. QA Analyst/Tester

Location:
Charlotte, NC
Posted:
September 22, 2020

Contact this candidate

Resume:

Surya Kommireddy (Chavvakula) Phone# 908-***-****

Synopsis of Professional Experience:

More than 15 years of manual and automation (using QTP and LoadRunner) experience in Software Quality Assurance projects. My primary concentrations are in the Telecom, Financial Services, and Pharmaceutical industries. Worked on Manual, Automation, and Load/Performance Testing projects in Client-Server, Mainframe, and Web environments, with various roles and responsibilities including:

Creation of Project Plans, Scheduling Test Cycles and Generating “On Time” Deliverables.

Skilled in Automation of Test Cases for End-to-End Regression Testing.

Well versed with Bug Tracking, Unit Testing, Functionality Testing, System/Integration Testing, Validation Testing, UAT Testing, Automation Testing, Load and Performance Testing.

Administration of Load Runner for developing Virtual User Scripts, Session Monitors, Resource Monitors, Scenarios, and Scheduling for short and long term Load and Performance Testing.

Administration of QuickTest Pro and WinRunner for recording and playback of sub-modular applications. Creating and editing Test Scripts for GUI Functionality Testing and complete Regression Testing to detect, tabulate, and report all levels of software bugs.

Creating GUI maps, synchronizing different tasks, performing Data-Driven Tests, and creating scripts for screen navigation, object functionality, and spatial properties.

Preparation and execution of Test Plans and Test Cases.

Developing Functional Specifications and Test plans from Business and Technical Specifications.

Utilizing various SQL Queries and Unix Commands to generate Test Records and perform Regression Testing.

Developing Korn Shell Scripts to integrate, process and manage data, to submit batch jobs interactively, and to monitor data download Jobs.
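
The Korn-shell batch pattern described above (submit a job, then monitor it until it completes) can be sketched roughly as below. The flag-file convention and the run_batch step are illustrative stand-ins, not from any actual project script.

```shell
# Hypothetical sketch of a Korn-shell-style batch monitor: submit a job in
# the background, then poll for the "done" flag file it drops on completion.
# File names and the run_batch step are invented for the illustration.

WORKDIR=$(mktemp -d)

run_batch() {
    # Stand-in for a real batch job (e.g. a data download).
    sleep 1
    echo "100 records loaded" > "$WORKDIR/batch.log"
    touch "$WORKDIR/batch.done"      # flag file signals completion
}

run_batch &            # submit the job interactively
JOB_PID=$!

# Poll once a second, up to a timeout, for the completion flag.
TRIES=0
while [ ! -f "$WORKDIR/batch.done" ] && [ "$TRIES" -lt 10 ]; do
    sleep 1
    TRIES=$((TRIES + 1))
done

if [ -f "$WORKDIR/batch.done" ]; then
    echo "batch finished: $(cat "$WORKDIR/batch.log")"
else
    echo "batch timed out" >&2
fi
wait "$JOB_PID"
```

A real monitor would typically also tail the job's log and alert on errors; the flag-file poll is just the skeleton of the pattern.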

Back End Testing using SQL Queries, and Unix Logs to ensure Data Validation and confirm Business Rules.
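
A minimal sketch of the log half of such a back-end check, with a fabricated log and input feed; on the real systems the record counts would come from SQL queries against the database rather than a text feed.

```shell
# Illustrative back-end check: scan a (fabricated) application log for load
# errors and reconcile the reported record count against the input feed.

LOG=$(mktemp)
FEED=$(mktemp)

# Fabricated sample data standing in for a real batch log and input feed.
printf 'rec1\nrec2\nrec3\n' > "$FEED"
cat > "$LOG" <<'EOF'
2020-09-22 01:00:01 INFO  load started
2020-09-22 01:00:05 INFO  3 records processed
2020-09-22 01:00:05 INFO  load complete
EOF

ERRORS=$(grep -c 'ERROR' "$LOG" || true)    # grep -c exits 1 on zero matches
LOADED=$(awk '/records processed/ {print $4}' "$LOG")
EXPECTED=$(wc -l < "$FEED")

if [ "$ERRORS" -eq 0 ] && [ "$LOADED" -eq "$EXPECTED" ]; then
    echo "PASS: $LOADED of $EXPECTED records loaded, no errors"
else
    echo "FAIL: errors=$ERRORS loaded=$LOADED expected=$EXPECTED"
fi
```

The same pass/fail shape applies when the expected count comes from a SELECT COUNT(*) instead of wc -l.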

Have extensive knowledge in load, stress, soak, and scalability testing and in writing LoadRunner scripts. Tuned web-based applications, UNIX batch processes, and real-time daemons.

Experience using various databases such as Oracle, SQL Server, and MS Access.

Strong experience in using SQL, SQL*plus, PL/SQL.

Extensive knowledge in Amdocs IAF Rating Engine, Amdocs Telegence, Ensemble, Enabler, Clarify (CRM) applications, EMS (Error Management System) & Convergys Mediation Manager, Geneva Rating- Billing Engine.

Have 2 years' experience on an Operations (Production Support) team, providing 24/7 support for the application.

Technical Skills:

Operating Systems: HP-UX 11i, IBM-AIX, Sun Solaris, Windows 98/ME etc.

Testing Tools: Test Director, WinRunner, LoadRunner, SilkTest, QTP

Web Technologies: HTML, JavaScript, VBScript, ASP

UNIX tools: Glance, vmstat, iostat, mpstat, netstat, ps, top, topas, sar.

3rd Party S/W: TOAD, UltraEdit, XML Spy, Reflection-X, Go Global etc.

Work Experience:

AT&T Labs, Middletown, NJ Senior QA Analyst/System Tester June 2010 – June 2020

I worked on CBUS at AT&T. CBUS (Collaboration Bus) is a notification tool that has been designated as part of the AT&T Service Assurance target architecture. It interfaces with AOTS, as well as other Service Assurance systems, so that notifications can be distributed automatically based on specific ticketing events.

CBUS allows you to define the criteria that must be matched on a ticket for a notification to be initiated, and to specify a list of contacts to receive it. Used microservices and REST API tools.

Responsibilities:

Wrote test plans and test cases for the requirements in QC.

Used Jira and TDP to report and track issues.

Conducted reviews for Test plans and Test cases with Developers.

Developed functional test plan and tested all modules involved in CBUS.

Supported previous releases/products for field problems (Trouble Reports), interacting with production support and UAT support.

Conducted Data Accuracy Testing, Boundary Testing and Performance testing.

Developed test cases in QC and reported bugs to developers via QC.

Extensively used SQL queries to view successful transactions and to validate data.

Developed UNIX shell scripts to automate batch process testing.

Coordinated with Offshore team and did Joint System Testing whenever required.

Followed an Agile process for testing as well.

Telcordia, Piscataway, NJ Senior Product Tester Nov ’09 – May ’10

I worked in the NPG group at Telcordia. NPG (Number Portability Gateway) provides a single, operator-based contact point for mobile operators to manage the complex interfaces, message processing, and transaction data associated with number portability requirements. Supplementing and/or replacing existing operator systems, this gateway can help reduce operator costs related to number portability by increasing the flow-through of port requests and simplifying management of both internal and external system interfaces, while supporting operator growth across all subscriber segments and number types.

Responsibilities:

Analyzed business requirements and created test plan.

Wrote test plans and test cases for the requirements in QC.

Conducted reviews of group Test plans and Test cases with Developers.

Developed functional test plan and tested all modules involved in NPG.

Supported previous releases/products for field problems (Trouble Reports), interacting with field support and customer engineers.

Conducted Data Accuracy Testing, Boundary Testing and Performance testing.

Developed test cases in QC and reported bugs to developers via QC.

Extensively used SQL queries to view successful transactions and to validate data.

Developed UNIX shell scripts to automate batch process testing.

AT&T Labs, Middletown, NJ Senior System Tester Dec ’07 – Nov ’09

I worked on CBUS at AT&T. CBUS (Collaboration Bus) is a notification tool that has been designated as part of the AT&T Service Assurance target architecture. It interfaces with AOTS, as well as other Service Assurance systems, so that notifications can be distributed automatically based on specific ticketing events.

CBUS allows you to define the criteria that must be matched on a ticket for the notification to be initiated, and to specify a list of contacts who are to receive the notification.

Responsibilities:

Analyzed business requirements and created SRS (Software Requirement Specifications).

Wrote test plans and test cases for the requirements in QC.

Conducted reviews of group Test plans and Test cases with Developers.

Developed functional test plan and tested all modules involved in CBUS.

Supported previous releases/products for field problems (Trouble Reports), interacting with field support and customer engineers.

Conducted Data Accuracy Testing, Boundary Testing and Performance testing.

Developed test cases in QC and reported bugs to developers via QC.

Extensively used SQL queries to view successful transactions and to validate data.

Developed UNIX shell scripts to automate batch process testing.

Used the Ramp Up and Ramp Down features provided in LoadRunner to initiate virtual user actions and groups at specified time intervals to emulate a real-world scenario.
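
As a conceptual illustration only (LoadRunner's Controller handles this scheduling natively), the staggered ramp-up idea can be mimicked with background shell jobs started at fixed intervals; the vuser action here is a placeholder, not a real virtual-user script.

```shell
# Conceptual sketch: start "virtual users" one at a time on a schedule
# (ramp up), then wait for all of them to finish (ramp down). The vuser
# function is a placeholder for a real transaction.

OUT=$(mktemp)

vuser() {
    # Placeholder for one virtual user's action (e.g. an HTTP transaction).
    echo "vuser $1 ran" >> "$OUT"
}

# Ramp up: start 5 users, one per interval (0s here to keep the demo fast).
for i in 1 2 3 4 5; do
    vuser "$i" &
    sleep 0          # ramp-up interval between user starts
done
wait               # ramp down: let every user finish before ending the run

echo "$(wc -l < "$OUT") users completed"
```

In a real run the interval would be seconds or minutes, and the measurement (response times, throughput) would come from the tool's monitors rather than a line count.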

Coordinated with Offshore team and did Joint System Testing whenever required.

Online Resources, Parsippany, NJ Senior QA Analyst June ’07 – Dec ’07

Provided testing of the web-based Consumer Card product (payments and service requests online) and the Virtual Collection Agent product, which allows debtors to self-cure online.

Involved in the process of creating test scripts for change requirements of the existing products using DOORS.

Ensured that all test cases are properly added or updated to reflect the changes in the requirements.

Performed impact analysis to determine the extent of regression testing required to properly validate the change request.

Documented and reported defects through the Visual Intercept tracking tool. Worked with business analysts, developers, and content department to resolve issues.

Executed scripts for the final Full Cycle testing. Provided a Test Summary Report confirming that the final build had passed QA.

Vonage, Holmdel, NJ Senior Test Engineer April ’06 – May ’07

CDR and Billing module Testing Type: Manual and Automation

I worked on the CDR and Billing modules at Vonage. CDR (Call Detail Records) comprises an Event Processor (EP), Generator, Rater, and Monitor. The EP receives SIP events from the proxy server in real time, parses those events, converts them into a predefined data format, and sends them to the CDR Generator. The CDR Generator receives the pre-parsed data from the EP and constructs CDRs. The Rating Engine receives completed CDRs from the CDR Generator and rates each CDR based on the customer's plan. Rated CDRs are persisted in the CDR database. The CDR Monitor Console is responsible for monitoring the status of CDR processes and sending alerts if there is a problem with a monitored process. The Billing module has subscribing accounts and provisioning accounts, with a monthly processor, a mid-cycle processor, and a payment manager.

Responsibilities:

Generated and ran web GUI tests for CDR and provisioning databases, using both manual and automated testing tools (QTP).

Developed scripts to test subscriptions and provision accounts.

Conducted reviews of group Test plans and Test cases.

Developed functional test plan and tested all modules involved in CDR and Billing.

Supported previous releases/products for field problems (Trouble Reports), interacting with field support and customer engineers.

Conducted Data Accuracy Testing, Boundary Testing and Performance testing.

Analyzed business requirements and created SRS (Software Requirement Specifications).

Performed VoIP Call Processing and Data Provisioning tests for specific feature areas.

Developed test cases in QC and reported bugs to developers via QC.

Extensively used SQL queries to view successful transactions and to validate data.

Developed UNIX shell scripts to automate batch process testing.

Used the Ramp Up and Ramp Down features provided in LoadRunner to initiate virtual user actions and groups at specified time intervals to emulate a real-world scenario.

Performed correlation for dynamically generated values and URLs, and parameterization for user inputs and for data coming from the database.

Used LoadRunner monitors to measure Transaction Response Time, Throughput, HTTP Response Time, and Hits per Second using LoadRunner Analysis.

Verizon Wireless, Orangeburg, NY Sr. Quality Analyst Feb ’05 – Mar ’06

Billing System Conversion Testing Type: Manual and Automation.

Environment: Oracle 8i/9i, VC++, J2EE, Main Frames, Sun Solaris, WAS, C, C++, PL/SQL programs, TOAD, Test Director, Load Runner, Ultra Edit, XML Spy, Go Global, Reflection, QTP.

Verizon Wireless has multiple billing systems supporting 4 different regions in the United States. This project converted all I2K (East & West) customers to the Vision billing system, integrating all billing systems into one. After migrating I2K customers to Vision, we needed to make sure that customers were billed exactly the same way as they were in I2K. The migration was transparent to customers.

Responsibilities:

Based on requirements created detailed test plan for testing functionality of the migration processes.

Created Use-Case Models after assessing the status and scope of the project and understanding the business processes.

Developed small Java scripts.

Conducted weekly meetings with the teams to manage project schedules.

Tested Front end applications using QTP. Generated scripts using QTP.

Designed, developed, and executed manual test scripts, test plans and test cases.

Tested end to end functionality of Conversion.

Reported defects to the developers and made sure they got fixed.

Involved in designing the Project Framework of this Automation Project.

Involved in automation of Smoke and Regression Test Cases.

Extensively used MS-Access to view successful transactions and to validate data.

Involved in Data-Driven Tests using MS Excel data sheets as external data for parameterization.

Defined Environmental and Global variables in an XML file to execute the scripts on different URLs and environments.

Used VBScript extensively for organizing test flow, conditions, and Exception Handling.

Involved in writing Library Functions, Re-Usable Functions, and Procedures for repeatable Scenarios.

Reported weekly status on Manual and Automation scripts to the QA Manager.

Performed batch job testing; monitored elapsed time and CPU usage, collated statistics, and identified bottlenecks.
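
The elapsed-time half of that batch monitoring can be sketched as a simple timestamp wrapper; batch_step is a stand-in for a real job, and on the actual systems CPU figures would come from tools like sar or vmstat alongside the timing.

```shell
# Minimal sketch of batch-job timing: wrap the job with timestamps to record
# elapsed wall time, the kind of figure collated across runs to spot
# bottlenecks. The batch_step body is a placeholder for real batch work.

batch_step() {
    sleep 1          # stand-in for real batch work
}

START=$(date +%s)
batch_step
END=$(date +%s)
ELAPSED=$((END - START))

echo "batch_step elapsed: ${ELAPSED}s"
```

Collecting this per step across a run, then sorting by elapsed time, is a quick way to rank candidate bottlenecks before reaching for profiling tools.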

802.11 Hotspots (WiFi) Testing Type: Manual and Automation.

Verizon Wireless launched 802.11 hotspots in airports, coffee shops, hotels, etc. In order to bill all WiFi customers we used IAF. This product was launched with 2 price plans, daily unlimited usage and monthly unlimited usage. The network sends the usage via XML files to the JDC/Mediation process. After file- and record-level validation, records were rated based on the customer's price plan.

Responsibilities:

Developed functional test plan and tested all modules involved in 802.11 changes.

Tested end-to-end functionality of 802.11: Provisioning, JDC/Mediation, 802.11 Rating, Billing, Extract to VBS & Customer Care application.

Wrote shell scripts to automate the validation process.

Validated rater results and made sure that the rater rated usage records properly.

If a customer is on a daily price plan, the 24-hour clock starts with the first record, and from then on calls within that 24-hour period should not be charged.
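
That daily-plan rule can be sketched as a small awk pass over timestamped usage; the input layout (epoch seconds plus a record id) is invented for the illustration, not the real CDR format.

```shell
# Hedged sketch of the daily-plan rule: the first usage record opens a
# 24-hour window, records inside the window are not charged, and the first
# record past the window opens a fresh one.

USAGE=$(mktemp)
cat > "$USAGE" <<'EOF'
1000 call1
2000 call2
90000 call3
91000 call4
EOF

RESULT=$(awk '
  window_end == 0 || $1 >= window_end {
      window_end = $1 + 86400      # first record (re)opens a 24-hour window
      print $2, "CHARGED"
      next
  }
  { print $2, "FREE" }             # within an open window: not charged
' "$USAGE")
echo "$RESULT"
```

With this sample data, call1 and call3 are charged (each opens a fresh window) while call2 and call4 fall inside an open window and are free.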

Developed SIT plans and coordinated integration test cases with Vision, I2K, MTAS, AAA, POS etc.

Went to the hotspot locations and generated WiFi usage.

Sent the rated usage to VBS and validated final bill images.

Supported the UAT (User Acceptance Test) phase; processed usage generated by the users and provided reports to the UAT group.

Worked closely with the production support team when this product went into production.

Received sign-off from the Business and UAT teams on testing.

Implemented the SDLC for the testing life cycle and followed the standards process in the application, meeting SEI standards.

Environment: Oracle 8i/9i, HP-UX, JAVA, Sun Solaris (SunOne webserver), WAS, C, C++, PL/SQL programs, TOAD, Test Director, Load Runner, Ultra Edit, XML Spy, Go Global, Reflection, QTP.

LNP (Local Number Portability). Testing Type: Manual and Automation.

The FCC granted LNP for the wireless industry, and at Verizon Wireless every system started enhancing its applications to support LNP functionality. In IAF we enhanced our application to accept MDNs different from MINs. Before LNP, in IAF every MIN was the same as the MDN; to support LNP, the IAF application had to handle a MIN different from the MDN. Some data products are rated on MDN and some on MIN.

Responsibilities:

Developed functional test plan and tested all modules that had changes to support LNP functionality.

Tested end-to-end functionality: Provisioning, JDC/Mediation, Rating, Billing, Extract to VBS, data warehouse, revenue assurance feed, and Customer Care application.

Wrote shell scripts to automate the batch validation process.

Generated usage and tested it through the rater for all data products rated on MIN and all rated on MDN.

Reported defects to the developers and made sure they got fixed.

Sent weekly updates to upper management regarding project status.

Supported UAT (User Acceptance Test) phase, processed usage generated by the users.

Worked with the production support to debug issues found in production.

Replicated production issues on prod test environment.

Environment: Oracle 8i/9i, HP-UX, JAVA, Sun Solaris (worked on Ultra 10, E450 4CPU, E3000 4CPU), TOAD, Test Director, Load Runner, Ultra Edit, XML Spy, Go Global, Reflection.

SBC, Hoffman Estates, IL (now AT&T) System Test Engineer Sep ’03 – Jan ’05

Clarify module Testing Type: Manual

SBC uses Amdocs Telegence telecom software for provisioning and activating products with price plans. As this software does not provide the flexibility to add/change/terminate bill cycles, change price plans, bundle price plans, etc., SBC brought in the Amdocs Clarify product and integrated it with Telegence, customizing the two products to its needs.

SBC Customer Care representatives use this product to provision Voice, Video & Data (the Triple Play) Lightspeed services. The goal of this project is to provide a Telegence data interface to Order One that will support product inquiries (e.g., retrieve products associated with a BAN) and functions to add/update/delete products on a BAN or Corporate ID. Clarify APIs are integrated with Telegence billing functionality so that ASI, AADS, and the combined SBCT and SBCLD can use Telegence to bill the FRS and ATM products that are currently billed using the CRIS billing system.

Responsibilities:

Designed, developed, and executed manual test scripts, test plans and test cases.

Developed test cases in Test Director.

Reported defects through Lotus Notes.

Developed UNIX shell scripts to automate batch process testing.

Used UNIX commands like tail, more, pg, and head to check logs and error files.

Extensively used SQL queries to view successful transactions and to validate data.

Developed many comparison scripts to validate output from different modules.
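
One common shape for such a comparison script is sort-plus-comm over the two modules' extracts; the account/amount file contents here are fabricated for the illustration.

```shell
# Sketch of an output-comparison check: sort both modules' extracts and
# report records present in one but not the other. comm needs sorted input:
# column 1 = only in A, column 2 = only in B, column 3 = common (suppressed).

A=$(mktemp); B=$(mktemp)
printf 'acct1|10.00\nacct2|25.50\nacct3|5.00\n' | sort > "$A"
printf 'acct2|25.50\nacct1|10.00\nacct3|5.10\n' | sort > "$B"

MISMATCHES=$(comm -3 "$A" "$B" | wc -l)

if [ "$MISMATCHES" -eq 0 ]; then
    echo "outputs match"
else
    echo "$MISMATCHES mismatched lines:"
    comm -3 "$A" "$B"
fi
```

Here acct3 carries a different amount in the two extracts, so comm -3 surfaces both versions of that record while the matching rows stay silent.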

Developed PL/SQL Procedures to validate test results.

Wrote test cases in Test Director and automated some of them wherever possible.

Environment: Unix-HP, Oracle 9i, SQL, PL/SQL, TOAD, Exceed, Java, JavaScript, WebLogic Server, Tuxedo, API Links, WinRunner 7.6, TD 8.0, Lotus Notes.

Educational Qualification: Master's in Computer Applications.

E-mail id: ****************@*****.***


