Ravendra Thallapureddy
Cell# 469-***-****
Email:*********@*****.***
Summary:
. 13 years of cumulative QA/test-specific experience as a Project Lead in
Quality Assurance, covering Automation and Manual testing across a wide
variety of projects and environments.
. Proficient in analyzing the SRS (Software Requirement Specifications),
Functional Design Documents and Use Cases to formulate Test Plans, Test
Scenarios and Test Cases for Manual as well as Automated Testing.
. Experience in Functionality, Compatibility, Exploratory/Ad-hoc,
Load/Stress, Usability, Installation, Security, Upgrade, Context-
sensitivity and User Acceptance testing.
. Experience in testing Authentication and Authorization mechanisms.
. Possess experience in testing web applications and client-server
applications across various domains.
. Hands-on experience in automated testing using Mercury tools (Quality
Center and LoadRunner).
. Experience in Selenium and testing frameworks such as TestNG.
. Strong experience in testing Web Service interfaces using JMeter for
SOAP and Postman for RESTful APIs, with working knowledge of SoapUI.
. Extensively executed SQL queries against database tables and views to
verify successful transactions and to validate user data (see the sketch
following this list).
. Hands-on experience in bug tracking and triaging using tools such as
Rational ClearQuest, Bugzilla, and client proprietary tools such as Watson
Express, BLT and Buganizer, to make sure defects are efficiently passed
from one layer to another.
. Knowledge of build deployment on the Tomcat application server.
. Experience in testing applications developed in Java/J2EE, JSP, .NET,
ASP, C++, HTML, XML, AJAX and JavaScript.
. Excellent understanding of the Software Development Life Cycle, Agile
and Waterfall methodologies, and the role of QA.
. Capable of learning new technologies and adapting to new environments
quickly.
. Mentoring, coordination, leadership and managerial roles in the creation
and execution of QA and testing processes: evaluating requirements,
defining testing objectives, managing testing tools and environments, and
handling test preparation, execution and automation.
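The SQL-based validation noted above is typically a handful of short queries run against the application's database after a test scenario completes. A minimal sketch, with hypothetical table and column names and Python's standard sqlite3 module standing in for the MySQL/Oracle client:

    import sqlite3

    # Connect to the database under test (sqlite3 used here only for
    # illustration; the same queries would run through a MySQL or Oracle client).
    conn = sqlite3.connect("app_under_test.db")
    cur = conn.cursor()

    # Every submitted transaction should have reached a terminal state.
    cur.execute("SELECT COUNT(*) FROM orders "
                "WHERE status NOT IN ('COMPLETED', 'FAILED')")
    pending = cur.fetchone()[0]
    assert pending == 0, "%d transactions never completed" % pending

    # User data in the reporting view should match the source table.
    cur.execute("SELECT o.user_id FROM orders o "
                "LEFT JOIN user_summary_view v ON o.user_id = v.user_id "
                "WHERE v.user_id IS NULL")
    missing = [row[0] for row in cur.fetchall()]
    assert not missing, "users missing from the reporting view: %r" % missing

    conn.close()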
Computer Skills:
Scripting Languages: Python 2.6
Testing Tools: Selenium, Quality Center 9.2, LoadRunner, JMeter, Postman, SoapUI
Testing Frameworks: TLC Automation, TestNG
Languages: Java 1.5, SQL
Web Technologies: HTML, DHTML, AJAX, JSON, XML
Databases: MySQL
CMS: Adobe CQ 5.5, World Server 9, CPS
Web Analytics: Site Catalyst 15 (client side)
Tools/IDEs: PyWin, Eclipse 3.x, Toad, Oracle SQL Developer
Web and Enterprise Application Servers: Tomcat 6.0.18, WebSphere 7.0
Operating Systems: Unix, Goobuntu [Ubuntu], Windows, Windows Server 2003 and 2008 R2
Other Skills: SDLC, Agile and Waterfall methodologies
Professional Experience:
Client: Google Inc, Mountain View, CA. [April 2013 to Present]
Project: Google Search Appliance.
Role: SQA/Test Lead.
Description: Google Search Appliance (GSA) Adaptors allow the GSA to crawl
and index non-HTTP content repositories. They provide the repository
content over HTTP/HTTPS and notify the GSA of documents in the repository.
Adaptors are designed to be simple to create and get running, but still
provide security, performance, and scalability. This project provides
adaptors to search cloud content (Drive, Sites), an adaptor to search
content on file systems, and an AuthN adaptor that provides Authentication
and Authorization mechanisms to validate user credentials while serving
content.
Responsibilities: In this project I lead testing efforts for the Adaptors
project and other integrated test areas within GSA. I also coordinate
regression test cycles for internal releases and for the trusted tester
program, which is designed to gather feedback from selected customers on
pre-release builds of the GSA. I work with the Engineering, Deployment and
Support teams.
Application Testing and Bug Tracking.
. Analyzing and understanding the functional and nonfunctional
requirements.
. Developing and executing test plans and test cases for all new features.
. Creating test deliverables at all levels of testing - functional,
upgrade, system, user acceptance and field implementation.
. Responsible for creating Test documentation for Web Services testing.
. Responsible for verifying RESTful web service API requests and
validating response data.
. Tested REST APIs and validated JSON responses, updating specs while
testing (a request/response validation sketch follows this list).
. Running security and performance evaluation tests.
. Maintaining and monitoring all related applications across the QA,
staging and production environments.
. Proactive in conveying release outcomes to fellow testers, engineers,
test managers and project managers.
. Defect tracking and bug analysis.
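A minimal sketch of the request/response validation referenced above, using the third-party requests library; the endpoint URL and JSON fields are hypothetical placeholders, not the GSA's actual API:

    import requests

    BASE_URL = "https://gsa.example.com/api"  # placeholder endpoint

    def test_adaptor_status():
        # Transport-level checks first: status code and content type.
        resp = requests.get(BASE_URL + "/adaptors/filesystem/status", timeout=10)
        assert resp.status_code == 200, "unexpected HTTP status %d" % resp.status_code
        assert resp.headers.get("Content-Type", "").startswith("application/json")

        # Then validate the shape and values of the JSON payload.
        body = resp.json()
        assert body["state"] in ("IDLE", "CRAWLING"), body
        assert isinstance(body["documentsServed"], int)

    if __name__ == "__main__":
        test_adaptor_status()
        print("adaptor status checks passed")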
Quality Control Reporting - Metrics and benchmarking.
. Preparing the Test evaluation reports, bug status reports and sharing
them with all the stakeholders.
. Preparing the test dashboards with various quality metrics and benchmarks
. Involved in test improvement activities: contributing to lessons
learned, creating quality checklists for repeated testing tasks, and
tracking and sharing testing resource, test effort and defect metrics.
Percentage time spent: 10%
Preparation of Technical/Business documentation.
. Quality Assurance, process enhancement and testing services
documentation.
. Prepare Technical and Business process documents for BugFix,
Maintenance, Stabilization and development work.
. Prepare test summary reports and share them with all stakeholders.
Environment: GSA, RESTful API, XML, HTML, JSON, LDAP, Active Directory,
SAML, Apache Tomcat 6, Windows Server 2003/2008 R2, Linux, Goobuntu
[Ubuntu], Perforce, Test Case Manager, Buganizer.
Client: Adobe Systems, San Jose, CA. [Sept'11 to Mar'2013]
Project: Community Help and Learning.
Role: QA Project Lead.
Description: The aim of this project is to redesign the help and support
experience on adobe.com: improved navigation, visuals, and community
content, hosted on CQ servers.
Responsibilities:
. Experience in coordinating efforts with an offshore vendor that supports
development and testing.
. Team administrator for Test Studio (test case management tool) and
Watson (bug tracking tool); creating and managing test suites and run
suites.
. Support ticket creation for enhancement requests and management of
tasks; facilitation of Agile Scrum meetings.
. Served as Scrum Master for many sprints, leading the daily stand-up
meeting.
. Implemented Release and Testing process for new, updated and migrated
pages in CQ system.
. Conducted demos for writers/business users of completed functionality
during the sprint.
. Attend/conduct project meetings and coordinate the testing effort
between the Adobe dotcom team and the CHL team.
. As part of the project reflow, tested the user experience on
smartphones.
. Testing of component optimization for smartphone browsing.
. Worked on functionality, device / browser compatibility and template
integration.
. Participated in Acceptance and Sign-off meetings.
. Creation of Test cases, User stories and Test Matrices.
. Testing of components and templates; Cross-browser/MacOS/Windows.
. Testing compatibility of OOTB (out-of-the-box) components and proposing
changes to back-end developers to modify widgets.
. Testing of the migration of product help and knowledge base articles
from CPS and World Server into CQ 5.5.
. Testing of custom servlet functionality that runs on the publish
instances to make sure received content is published properly.
. Performed "Tough Day" test to measure the performance and scalability of
Authoring servers.
. Involved in and coordinated the automation effort using Selenium and
Java (a minimal cross-browser sketch follows this list).
. Involved in testing the page rating for high-traffic community pages and
the feedback badge rating functionality within the scope of the project.
. Involved in Art handoff schedules for EN and other locales, with
experience in handling production issues.
. Experience in using the Xenu tool to identify broken links on published
pages.
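The project's automation was written in Java; as a compact illustration only, the same kind of cross-browser page check is sketched here with Selenium's Python bindings, using a representative adobe.com help URL and a hypothetical CSS selector:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    PAGE = "https://helpx.adobe.com/"  # representative help page

    def check_page(driver):
        # Load the page and confirm a key component rendered.
        driver.get(PAGE)
        assert "Adobe" in driver.title, driver.title
        fields = driver.find_elements(By.CSS_SELECTOR, ".search-field")  # hypothetical selector
        assert fields, "search component missing on %s" % PAGE

    # Repeat the same check in each browser from the compatibility matrix.
    for browser in (webdriver.Firefox, webdriver.Chrome):
        driver = browser()
        try:
            check_page(driver)
        finally:
            driver.quit()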
Environment: CQ 5.5, Device testing (Android and iOS), Agile/Scrum, World
Server 9, Site Catalyst 15, Jive, Adobe FrameMaker 9, Watson Express, Test
Studio, Manual testing, Xenu tool, JMeter, Selenium 2.0, Java 1.5.
Client: Google Inc, Mountain View, CA. [May'09 to Sept'11]
Project: Google Toolbar.
Role: Technical Lead in QA.
Description: Google Toolbar is an Internet browser toolbar available for
Internet Explorer and Mozilla Firefox in 16 different languages. Toolbar
offers two types of features.
. Basic features such as a search box with helpful search suggestions as
you type; most suggestions come from popular Google searches.
. Advanced features like Sidewiki, Translate, Quick Scroll, PageRank, My
Location, Autofill, Custom Buttons, Bookmarks, SpellCheck.
Custom Buttons: make your own button gallery using the Google Toolbar API,
which provides access to websites in a single click.
Sidewiki: Google Sidewiki is a browser sidebar that lets you contribute and
read helpful information alongside any webpage.
Translate: view instant translations for individual words by pausing your
cursor on an English word, or automatically translate entire websites on
the fly into more than 40 languages.
Responsibilities:
. Involved in Functional, AU (auto update), and System testing of Google
Toolbar for Internet Explorer.
. Integration testing of the Google Toolbar with other products such as
Bookmarks and Protector.
. Created test cases to test various features of Toolbar and automated test
cases using Python.
. Developed test cases for features like Bookmarks, Translate, and Sidewiki
in Google Toolbar.
. Wrote Python scripts for parsing test results, generating test reports,
and sending email notifications (a minimal sketch follows this list).
. Ran all automation scripts for new-functionality and regression testing,
with manual testing as needed.
. Involved in preparing manual test plan, test cases and test execution.
. Analyze the test results to determine the root cause for the failures
before opening defects.
. Worked on reproducing bugs reported by OEM customers.
. Responsible for bug logging and tracking process to ensure defects are
logged in a clear way and are escalated to the concerned team for optimum
defect turnaround time.
. Communicated effectively with the Automation Framework team to exchange
ideas for improving the new TLC Automation framework.
. Followed the Google coding standards and made sure the scripts work
across different Windows OS and Internet Explorer versions.
. Assisting in ongoing efforts to ensure test case Automation development
is effective, coordinated, and integrated.
. Verified regression test coverage, assigned tasks to the team and
tracked them day to day.
. Effectively communicating with offshore team and ensuring the regression
testing activities were done thoroughly.
. Submitting daily and weekly status reports to the manager and performed
timely escalations to the management.
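A minimal sketch of the result-parsing and email-notification scripts mentioned above. The log format, file name and addresses are hypothetical, and the code is written in present-day Python rather than the Python 2.6 used at the time:

    import re
    import smtplib
    from email.mime.text import MIMEText

    def summarize(log_path):
        # Assumed log format: one "PASS <test_name>" or "FAIL <test_name>" line per test.
        counts = {"PASS": 0, "FAIL": 0}
        failures = []
        with open(log_path) as log:
            for line in log:
                match = re.match(r"(PASS|FAIL)\s+(\S+)", line)
                if match:
                    counts[match.group(1)] += 1
                    if match.group(1) == "FAIL":
                        failures.append(match.group(2))
        return counts, failures

    def notify(counts, failures):
        body = "Passed: %(PASS)d  Failed: %(FAIL)d\n\n" % counts + "\n".join(failures)
        msg = MIMEText(body)
        msg["Subject"] = "Toolbar regression summary"
        msg["From"] = "qa-bot@example.com"       # placeholder addresses
        msg["To"] = "toolbar-qa@example.com"
        server = smtplib.SMTP("localhost")       # assumes a local mail relay
        server.sendmail(msg["From"], [msg["To"]], msg.as_string())
        server.quit()

    if __name__ == "__main__":
        counts, failures = summarize("regression.log")
        notify(counts, failures)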
Environment: Python 2.6, Eclipse 3.x, Notepad++, Fiddler, MySQL, WinDbg,
Windows, Linux, Goobuntu[Ubuntu], Perforce.
Client: Ericsson Inc, Plano, TX. [Dec'06 to Mar'09]
Project: Ericsson EBI System.
Role: SQA Analyst.
Description: The main function of the Ericsson EBI System is to complement
and administer the Ericsson PrePaid System (PPS) while offering multiple
machine interfaces based on a Service Oriented Architecture. This allows
integration with Cingular (AT&T) CBS, Cingular's 3rd party providers, and
Ericsson WEB CARE applications. ATTM uses Ericsson's EBI System to manage
accounts and a set of network features.
Responsibilities:
. Communicate effectively with the team, Business Analysts and Managers to
nail down the requirements within the scope of project.
. Actively participated in and conducted meetings to review requirements
and test cases, providing updates prior to or during the review meetings.
. Involved in preparing manual test plan, test cases and test execution.
. Used MARS (Matrix Reference System) to store test cases, test results and
to generate test reports evaluating test results.
. Analyzed and documented the test results for each build.
. Responsible for tracking defects using MHWEB and making sure defects are
efficiently passed from one layer to another.
. Defined scope and test approach/strategy for Extended Regression and
Automation testing.
. Developed the LoadRunner test scripts for performance testing. Estimated
the time required to develop and execute Performance automation suite for
different releases.
. Feasibility study (finding bottlenecks and workarounds using LoadRunner
9.0).
. Experience in analyzing load test results and providing recommendations
to the development team.
. Used JMeter to build tests, generate SOAP requests and validate the
responses from Web Services (an illustrative sketch follows this list).
. Led the team for manual execution of short and large releases.
. Experience in software Deployment and Installation on Tomcat.
. Performed extensive SQL/PLSQL querying for Oracle backend database.
. Helped in creating project related documents such as Installation,
Interface and System Administrative Guide (SAG).
. Participated in meetings and discussed the enhancement and modification
issues.
. Active team lead in tracking and maintaining the project, submitting
daily and weekly status reports to the manager, and performing timely
escalations to management.
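The SOAP checks above were built in JMeter; as an illustration only, the same request-and-assert pattern is sketched here as standalone Python using the third-party requests library, with a hypothetical endpoint, namespace and operation:

    import requests
    import xml.etree.ElementTree as ET

    URL = "https://ebi.example.com/services/AccountService"  # hypothetical endpoint
    NS = "http://example.com/ebi/account"                    # hypothetical namespace

    ENVELOPE = """<?xml version="1.0"?>
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                      xmlns:acc="{ns}">
      <soapenv:Body>
        <acc:GetAccountStatus>
          <acc:accountId>1234567890</acc:accountId>
        </acc:GetAccountStatus>
      </soapenv:Body>
    </soapenv:Envelope>""".format(ns=NS)

    # Post the SOAP envelope and parse the XML response.
    resp = requests.post(URL, data=ENVELOPE,
                         headers={"Content-Type": "text/xml; charset=utf-8",
                                  "SOAPAction": "GetAccountStatus"},
                         timeout=30)
    assert resp.status_code == 200, resp.status_code
    tree = ET.fromstring(resp.content)

    # Assert the response carries the expected status element and value.
    status = tree.find(".//{%s}status" % NS)
    assert status is not None and status.text == "ACTIVE", "unexpected account status"
    print("SOAP response validated")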
Environment: QTP 9.5, LoadRunner 9.0, Quality Center 9.0, JMeter, SOAP,
Oracle 10g, Oracle SQL Developer, Apache Tomcat 6, Unix, and Windows.
Client: Google Inc. [May'06 to Dec'06]
Project: GSA.
Role: Senior QA Engineer.
Description: The Google Search Appliance is an integrated hardware and
software product designed to give businesses the productivity-enhancing
power of Google search. The Google Search Appliance makes the sea of lost
and misplaced data on your web servers, file servers, content management
systems, relational databases and business applications instantly available
from a single familiar search box.
Responsibilities:
. Involved in discussions to develop Test Plans and Test Cases.
. Conducted and attended Walkthrough meetings for Test Plan and Test Cases.
. Developed and executed the test cases for functional and regression
testing of the AUT (Application Under Test).
. Exercised component checklists verifying correctness for each build and
performed Sanity, Functional and Regression testing for each new build.
. Prepared traceability matrix for test cases by mapping them to business
and system requirements.
. Integration testing of the various modules to verify the interfaces and
control flows between the modules.
. Responsible for bug logging and tracking process to ensure defects are
logged in a clear way and are escalated.
. Troubleshooting, problem resolution and timely escalations to the
development team.
. Involved in coordinating different off-shore and on-site teams, leading
daily status meetings with both on-site and off-shore teams.
Environment: Google Search Appliance - Mini, Cluster, Super-GSA, GWS, Unix,
Windows, Perforce, Oracle, Sybase, MySQL.
Client: AOL Inc. [Aug'04 to May'06]
Project: AOL- Strauss and WSP Tools.
Role: Senior QA Engineer.
Description: Strauss is an America Online product that provides many
features that can be accessed online, e.g. News, Entertainment, Sports,
Finance, Travel, Mail, Instant Messenger, Chat, etc. The AOL Quality
Assurance Strauss group is responsible for testing various features in the
AOL product going into Beta release.
WSP Tool is a Big Bowl publishing system designed with the goal of creating
a modern platform for Web content management, with an emphasis on editorial
efficiency, performance, and scalability.
Responsibilities:
. Worked with the team to create feature wise test plan from the master
test plan.
. Planned the test strategy and selected the test cases for Smoke Testing
and Regression Testing.
. Installation of QAR on Unix boxes in the QA environment and clearing the
QAR to Production.
. Downloaded the build from CTS (Client Tracking System - configuration
management tool), performed a Smoke test and released it to the team for
testing.
. Furnished QA/test skills for Functionality, Regression, Integration,
Compatibility, Security and related testing of a wide variety of web-
based applications at America Online and related software products on
Windows (using InstallShield, Ghost, and Microsoft Windows unattended
installation technologies).
. Used BLT (Bug Logging and Tracking) for logging Defects.
. Interacted with developers to resolve defects. Responsible for
validating bug fixes.
. Experience working with a Python automation tool. Benefited from a
comprehensive internal training program in software testing theories,
procedures and methods.
. Used the Pythonwin editor to create Python scripts for test validation
tasks associated with the migration of the AOL and MapQuest toolbars.
. Improved Python testing quality by creating more tests and longer test
cases, adding subtests to cases, using two test groups in parallel,
testing more scenarios and focusing on combinatorial testing.
Environment: Test Case Manager, BLT, Python 2.X, JavaScript, HTML, Apache
Tomcat, Win2K/XP/Me/98 and Mac with IE 6.0/SP1/SP2, Client Tracking
System, Sybase, Internet Explorer, Netscape 8.0, Firefox, Safari
browsers.
Client: Chordiant. [Aug'01 to July'04]
Project: CDMP.
Role: Software Engineer.
Description: The Chordiant Decision Management portal provides a business
solution for the healthcare business. Chordiant offered enterprise
software to help other companies improve customer interaction. The project
involves testing the Internet portal of Chordiant. The application
facilitates easy and efficient contact between companies, third parties
and customers of Chordiant.
Responsibilities:
. Involved in the complete testing life cycle for different modules of the
application. Actively participated in and conducted meetings to review
project-specific documents to create feature-specific test cases.
. Created Test cases for all the Components involved in System/Functional
testing.
. Contributed in performing test activities like Functional, Regression
testing within the group.
. Performed Integration Testing of the various modules to verify the
interfaces and control flows between the modules.
. Led the testing efforts for new product developments and existing
product enhancements for the Chordiant user profile module.
. Used TestDirector to enter and maintain test cases, create test sets for
functional and regression test cycles, and enter test results.
. Defined the defect tracking process to ensure defects are logged in a
clear way and are escalated.
. Mentored entry-level testers on reviewing requirements, preparing and
executing test cases, and reporting software defects.
Environment: TestDirector 7.2, TOAD, Java/J2EE, HTML, Servlets, Oracle 8i,
JDBC, WebLogic 7.0/8.0.