Location: Cary, NC
Posted: March 01, 2020


Ravishankar Potharlanka

Cell: 919-***-****

Email: adb2su@r.postjobfree.com

Work Authorization: GC

Summary

Focused, diligent, and solution-driven QA Manager with 14 years in the software testing industry and strong experience in performance, manual, functional, and web services test automation. Extensive track record of enabling, managing, and motivating high-performance teams to deliver high-quality software.

Highlights

10 years of experience with the Scrum Agile methodology.

5 years of experience in performance testing, covering the definition, design, creation, and execution of performance tests, including load, volume, stress, endurance, and scalability tests.

Experience in manual and automated testing using Selenium, SoapUI, QuickTest Professional, and ALM/Quality Center across diverse operating systems and technologies

Expertise in test engineering and project and release management for a multimillion-dollar revenue account, with a record of shipping high-quality software releases on time and within budget

Experience in the blended (onsite-offshore) delivery model: coordination, client communication, status tracking, and issue resolution

Proven hands-on experience with quality assurance practices, including project plan development, test strategy development, test plan development, test case & test data management

Experience in planning and executing tests ranging from large system-wide tests (SIT, UAT, etc.) to individual functional and regression tests. Methodologies used include waterfall, iterative, and Agile

Mobile application testing experience on iOS, Android, and Windows mobile platforms

Served as focal point and leader for all performance testing deliverables and directed QA activities to ensure maximum quality and effectiveness

Worked in coordination with other departments in implementing the performance benchmarks

Technical Expertise

Operating Systems: Windows NT/98/2000/2003/2005/2012/XP/Vista, Linux 5.6, HP-UX, Sun Solaris

Programming Languages: C, Java, VB, Unix shell scripting

Databases: Oracle 8i, 9i, 10g, 11g, DB2, MS SQL Server

Automation: QuickTest Professional 12.0, LoadRunner 12.50, Selenium, SoapUI, GitLab, Maven

Test Management Tools: Mercury Quality Center 11.0 / TestDirector 7.6/8.0, HP ALM, Jira, ClearQuest, Microsoft Test, Microsoft TFS, Rally, VersionOne

Monitoring Tools: PerfMon, Cacti, SiteScope, AppDynamics, Splunk, Wily Introscope

Profiling and debugging tools: SQL Server Profiler, HTTP Watch, Firebug, Fiddler, Debug Diag

Mobile Platforms: iOS, Android

Testing Methods: Integration, System, Smoke, Regression and UAT

Education

MBA, SMU University, India

Experience

Infosys, USA Nov 2018 to present

Project: Interact Client: Bank of America

Role: Lead Consultant

The BOA Interact application gives agents quick access to customer details and to some of the most common services. It is a hosting environment for components built for associates who handle business functions.

Responsibilities

Mentored and trained team members on using different applications and understanding the testing requirements

Involved in creating and maintaining projects using HP Performance Center 12.52

Responsible for creating the performance test plan and test strategy document and for managing performance test scheduling and logistics after reviewing and verifying performance requirements

Involved in the evaluation of functional and performance testing tools and contributed to the evaluation report

Responsible for giving the development team feedback on possible improvements and discovered issues

Responsible for reviewing and verifying that performance requirements are documented and stated in measurable terms, including reviews of architecture design, software design, and non-functional requirements documents

Analyzed, interpreted, and summarized meaningful and relevant results in a complete performance test report

Monitored and administered hardware capacity to ensure the necessary resources were available for all tests

Prepared test cases, LoadRunner scripts, load tests, and test data; executed tests, validated results, managed defects, and reported results

Created and executed LoadRunner scenarios with failover conditions, high volume, network latency, etc. (an illustrative script sketch follows this list)

Investigated the backend logs generated during LoadRunner script execution via Splunk

Involved in communicating the performance testing process and status to project team members and the test manager to ensure timelines and quality are met

Responsible for performance measurement strategies to observe application and infrastructure performance across diverse hardware, operating systems, application servers, and databases, including client experience monitoring and transaction tracing with diagnostic tools (AppDynamics, Dynatrace, DNT) to detect, isolate, and resolve performance issues throughout the technology stack

Communicated test progress, test results, defects/issues, and other relevant information to project stakeholders and management.
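
The LoadRunner scripting described above is project-specific; the following is only a minimal sketch of how such a web (HTTP/HTML) Vuser action is typically structured, with a timed transaction and a response-text check. The URL, transaction name, and check text are hypothetical and not taken from the Interact application.

```c
/*
 * Minimal LoadRunner web (HTTP/HTML) Vuser action -- illustrative only.
 * The URL, transaction name, and check text are hypothetical.
 */
Action()
{
    /* Register a text check before the request; the step fails if the text is absent. */
    web_reg_find("Text=Account Summary", LAST);

    lr_think_time(5);                               /* simulate user pacing  */
    lr_start_transaction("Interact_Load_Summary");  /* start response timer  */

    web_url("AccountSummary",
            "URL=https://interact.example.com/accounts/summary",
            "Resource=0",
            "Mode=HTML",
            LAST);

    /* LR_AUTO passes or fails the transaction based on the request result. */
    lr_end_transaction("Interact_Load_Summary", LR_AUTO);

    return 0;
}
```

Failover, high-volume, and network-latency conditions of the kind mentioned above are typically layered on top of such scripts at the scenario level in the Controller or Performance Center rather than in the script itself.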

Cognizant, USA June 2014 to Nov 2018

Project: U Portal Client: BB&T

Role: Product Consultant

U Portal provides secure online access to your accounts, credit cards, mortgages, alerts, bill payment services, and more. It is easy and convenient. This application gives you secure access to your accounts at your convenience. Manage your accounts, transfer money, pay bills, and more, from your computer, tablet, or smartphone.

Responsibilities

Managed a team of 25 business and system testers, delivering complete release and test management solutions

Worked with product managers, business analysts, and development team leads to establish requirements, stories, and test cases for new features and bug fixes

Analyzed and verified requirements for completeness, consistency, comprehensibility, feasibility, and conformity to standards

Ensured test planning, test cases, and test approach were accurate, focused on customer needs, and in line with expectations

Created and managed release plans that address project approach, deliverables, change management, and resource and implementation schedules

Triaged issues quickly and accurately, aligned resources, managed escalations and risks, monitored and followed through on tasks to resolution, and adjusted plans to meet the schedule

Conducted daily scrum meetings to understand project impediments and provide mitigation

Oversaw execution of test cases and ongoing updates of results until verification of requirements and product regression was complete

Supported production deployments and monitored and tracked high-priority defects and production issues on a day-to-day basis

Responsible for resource management activities such as resource allocation and de-allocation

Identified and highlighted risks in the project and planned mitigation

Worked closely with the automation team on converting the existing QTP scripts to Selenium

Developed automation feasibility and ROI reports and presented them to the client to initiate in-sprint automation

Conducted UX and design prototype reviews to ensure early detection of defects even before design docs were delivered to the dev team for integration

Proactively identified impediments and resolved them appropriately through a network of peers and formal channels (including escalation if required) by facilitating discussion

Cognizant, Bangalore, India Aug 2011 to June 2014

Project: DCM Client: Corelogic

Role: Sr. Product Specialist

Corelogic is a leading provider of consumer, financial, and property information, analytics, and services to business and government. At present, Corelogic's 2014 business plan is to upgrade and shift the existing platforms, applications, and databases to the cloud.

Responsibilities

Involved in analyzing business requirements and converting them into accurate non-functional requirements

Managed project change requirements and was responsible for handling subsequent amendments as required

Liaised with CoreLogic teams and other vendors to understand the overall program approach, phases, and timelines

Involved in onboarding and managing performance resources

Created and managed release plans that address project approach, deliverables, change management, and resource and implementation schedules

Managed execution of multiple performance testing projects, helping the teams zero in on the workload profile

Monitored load test execution closely to ensure performance SLAs were met

Identified and highlighted risks in the project and planned mitigation

Status reporting and dashboard preparation for presenting the project progress

Ensured setup of an effective governance structure and communication mechanism

Coordinated with onshore teams (CoreLogic and other vendors, as applicable) to ensure quality and timely delivery

Performed testing project estimation, planning, tracking, and risk management for midsize to large projects using VersionOne

Developed performance test scenarios based on functional requirements, general requirements, and system specifications, and uploaded them to the SharePoint server.

Analyzed test results against the benchmark specifications to identify performance bottlenecks and provided recommendations

Configured and monitored performance counters such as physical and logical disk I/O, memory, deadlocks, and procedure/query execution time, and analyzed the logs to find performance bottlenecks (an illustrative counter-sampling sketch follows this list)

Report testing progress (highlighting key risks, mitigation plans, etc) on a periodic basis to CoreLogic Sr. Management
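
The counters listed above were monitored with tools such as PerfMon and SiteScope rather than custom code; purely as an illustration of the same idea, and assuming a Windows host, the sketch below samples one such counter programmatically through the Windows PDH API. The counter path is an example only.

```c
/*
 * Illustrative only: sampling a Windows performance counter (the same
 * counters PerfMon exposes) via the PDH API. Link against pdh.lib.
 */
#include <windows.h>
#include <pdh.h>
#include <stdio.h>

int main(void)
{
    PDH_HQUERY query;
    PDH_HCOUNTER counter;
    PDH_FMT_COUNTERVALUE value;
    int i;

    PdhOpenQuery(NULL, 0, &query);

    /* Example counter path: overall CPU utilization. Disk, memory, or
       SQL Server counters would be read through the same mechanism. */
    PdhAddCounter(query, "\\Processor(_Total)\\% Processor Time", 0, &counter);

    PdhCollectQueryData(query);          /* first sample primes the rate counter */

    for (i = 0; i < 5; i++) {
        Sleep(1000);                     /* sample once per second */
        PdhCollectQueryData(query);
        PdhGetFormattedCounterValue(counter, PDH_FMT_DOUBLE, NULL, &value);
        printf("CPU %%: %.2f\n", value.doubleValue);
    }

    PdhCloseQuery(query);
    return 0;
}
```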

Corelogic, Bangalore, India Dec 2007 to Aug 2011

Project: K2 Realist Client: Corelogic

Role: Sr. Product Specialist

K2 is a property search Flex application. K2 is the market leader in providing robust, feature-driven functionality that enhances the capabilities of Multiple Listing Services (MLS). K2 links directly from a listing to its corresponding public record and auto-populates MLS listings with Realist data

Responsibilities

Interacted with client POCs and the development team to understand and gather the non-functional requirements for each project/release.

Involved in requirements gathering from project stakeholders and understanding the functional specifications.

Designed test scenarios for various modules.

Created and enhanced scripts using parameterization and correlation, and debugged scripts (an illustrative sketch follows this list).

Conducted baseline and load tests with multiple scenarios through Performance Center.

Monitored application server, web server, and database server counters through SiteScope.

Measured performance metrics (response times, throughput, etc.) and monitored resource demand metrics (CPU %, disk I/O, network I/O, memory, etc.).

Prepared high-level and final reports with all recommendations.

Performed deep-dive root cause analysis of performance bottlenecks based on the collected monitoring data.

Coordinated with the development team on iterative execution and analysis while fine-tuning performance issues.

Created the test execution summary report, presented it, and obtained sign-off from the stakeholders
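
As a rough illustration of the parameterization and correlation mentioned above, the sketch below shows the generic LoadRunner web-script pattern: a dynamic value is captured from one response and replayed in a later request, while a data-file parameter varies the input per iteration. The parameter names, boundaries, and URLs are hypothetical and not taken from the K2 application.

```c
/*
 * Illustrative LoadRunner correlation + parameterization -- not K2-specific.
 * {SearchTerm} is a VuGen parameter fed from a data file; "SessionToken" is
 * a hypothetical dynamic value captured from the server response.
 */
Action()
{
    /* Correlation: capture a dynamic token from the next response using
       left/right boundaries so later requests can replay it. */
    web_reg_save_param("SessionToken",
                       "LB=name=\"token\" value=\"",
                       "RB=\"",
                       LAST);

    web_url("Login",
            "URL=https://k2.example.com/login",
            "Mode=HTML",
            LAST);

    lr_start_transaction("K2_Property_Search");

    /* Parameterization: {SearchTerm} varies per iteration/Vuser;
       the correlated {SessionToken} is sent back to the server. */
    web_submit_data("Search",
            "Action=https://k2.example.com/search",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=token", "Value={SessionToken}", ENDITEM,
            "Name=query", "Value={SearchTerm}", ENDITEM,
            LAST);

    lr_end_transaction("K2_Property_Search", LR_AUTO);

    return 0;
}
```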

Infinite Solutions, Bangalore, India July 2006 to Dec 2007

Project: IBM® SERVICEPAC Client: IBM

Role: Sr. Test Analyst

ServicePac is a web application that helps users find the correct list of ServicePacs available for all IBM products. The tool enables administrators and geos to update any information related to a ServicePac. The search pages allow users to search for the needed information in a quick and robust way. The results can be exported to Excel. The tool interfaces with three other legacy systems to get some critical information.

Responsibilities

Worked closely with clients, vendors, and customers to understand their business and requirements

Created test plans and test strategy for the smoke testing, functional testing, system testing and user acceptance testing

Managed daily team activity to meet or exceed commitments for tasks and timelines, and assigned QA tasks to the test team

Prepared test strategies, test methodologies, test estimates, test harnesses, and test plans for running projects

Reviewed test cases to improve test coverage and reviewed defects raised by fellow team members

Estimated backlog items and assigned them to the respective team members

Prepared the test summary report at the end of each test cycle and presented it to the client in the release review


Performed and monitored regression testing activities.

Coordinated patch releases/defects/fixes with the development teams

Coordinated with Other Testing Teams for Integration Testing

Executed and documented manual test results from regression testing of each release

NIIT Technologies, Bangalore, India April 2005 to July 2006

Project: Raiser’s Edge Client: Blackbaud

Role: Test Engineer

Raiser’s Edge is an application used to monitor every aspect of fundraising. It provides an array of tools to track information about donors (called constituents). One can maintain a detailed biographical record for all constituents, including employment, banking, and relationship information. Using this information, the effectiveness of appeals can be increased by accurately matching them to constituents.

Responsibilities

Interacted with team members to resolve technical issues regarding product quality and assigned priority according to the documentation.

Participated in Business/Development meetings to understand user requirements.

Developed Functional Test Cases & Procedures, based on requirements specifications.

Involved in User Interface, Functionality and Navigation Testing

Performed UI Testing and validations.

Performed Sanity testing for each new build of the application

Executed test cases with various test data and reported defects

Conducted cross-browser testing to check the compatibility of the AUT with different browsers

Used Test Director as repository for maintaining test cases, execution and tracking the defects

Attended triage meetings, reported bugs, and tracked bug statuses


