•Diversified experience in Quality Assurance and Software Testing.
•In-depth knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of its phases: Requirements Gathering, Analysis/Design, Development, and Testing.
•Experience in preparing Test Plans, writing Test Cases, and mapping, developing, and maintaining test scripts for manual and automated testing environments.
•Experience with multiple ticketing systems such as Parature, Autotask, and JIRA.
•Perform back-end testing manually and with automated testing tools by executing SQL queries.
•Accomplished at reporting bugs using bug-tracking tools such as HP Quality Center and JIRA.
•Experience in Mobile application testing using simulators and real devices – iOS and Android.
•Documentation and Process Management skills with the ability to effectively understand the business requirements to develop a quality product.
•Superior analytical, troubleshooting, and problem-solving skills.
•Demonstrate a methodical, detail-oriented, and thorough approach to all assignments, completing them on or ahead of schedule even under compressed timelines.
•Liaise with developers, business analysts, and user representatives in application design and document reviews.
•Effective in executing multiple tasks and assignments; a dedicated team player.
•Name of Institution – Information Systems Technologies, Information Assurance
•MIC College of Engineering and Technology – Electrical and Electronics Engineering
•Narayana Junior College – Board of Intermediate Education
•Sri Krishnaveni Talent School – Board of Secondary Education
Client: AAA, Heathrow, FL
Job role: Quality Analyst
Duration: Nov '17 – Present
Project: RSO/DRR (Digital Road Service Request)
RSO/DRR is an upgrade to the Intelematics RSO that provides users with an improved interface, replacing the Rap central server with a new web service and removing dependencies to simplify the system.
•Led QA for Digital Road Service Request (DRR) testing, covering functionality within the AAA app, web, and voice channels (both Google and Alexa).
•Actively participated in the pre-testing that included review of the requirement documents and studying the use cases for developing test plans and collection of test data.
•Analyzed functional specifications and User Requirement documents.
•Conducted sanity testing before beginning actual test execution.
•Utilized Firebase to pull analytics data for record tracking, problem identification, and the overall feed from all mobile devices.
•Created JMeter scripts to measure performance and functionality of web services.
•Monitored and debugged the JMeter scripts and APIs on daily runs to check API status.
•Worked with physical devices, remote devices, and simulators on the Sauce Labs platform.
•Ran full platform compatibility tests (Android and iOS, across smartphones, tablets, iPads, etc.) and cross-browser compatibility tests on Windows and Mac.
•Heavy involvement in Smoke and Regression testing for every sprint cycle.
•Performed manual testing and verified various conditions and business rules using various test techniques like Decision tables and Use cases.
•Created and executed test cases in Zephyr.
•Analyzed and recorded test results and maintained the defect log report.
•Performed Integration and System level testing on the application and worked with different verification points like Standard checkpoints, Bitmap checkpoints and Synchronization points using Quick Test Professional (QTP).
•Responsible for transferring the test cases from Quality management to Zephyr.
•Performed Data-driven testing to read test input data from an Excel File to test the application with different positive and negative data.
•Coordinated production calls with different clubs before giving sign-off, ensuring product quality remained intact for every sprint cycle.
•Participated regularly in review meetings with Project Manager, Product Owner, and QA Manager.
•Reported club-specific issues to the production support team to ensure defects in production were taken care of.
•Used Postman for comparison testing of the REST APIs.
•Used Charles Proxy to debug errors and failures in the app.
Environment: Jira, Quality Management, Zephyr, Postman, Charles Proxy, Jenkins, QTP, SQL, MongoDB, Web Services, Test Plans, Regression Testing, Windows 7/UNIX.
Client: SRK Systems, Naperville, IL
Job role: Quality Analyst
Duration: June '17 – Nov '17
•Executed low- to high-level test scenarios based on user requirement specifications.
•Performed manual testing for web interface, identified the test cases, and documented them in Quality Center
•Tracked bugs, process, and provided updates via JIRA.
•Heavy involvement with SQL queries to perform data verification and immediate high-level data fixes.
•Identified test requirements based on user requests and design specifications using Quality Center.
•Used QC to document tests and defects encountered during testing.
•Loaded test data into QC to prepare for test case execution.
•Provided progress updates and Status Reports for weekly Scrum meetings to scrum lead, managers, and developers
•Performed Functional, Negative, Positive, UAT, Cross-Browser, and Backend Testing.
•Involved in writing the Test plan and Test Cases based on Functional requirements and Design documents
•Well versed in analyzing results, tracking bugs, and providing detailed status reports.
•Identified test cases and documented them in QC, working with developers and the test lead to verify them.
Environment: Quality Center, SQL Developer, and SQL Plus
Client: DXC Technology, Hyderabad, India
Job role: Quality Analyst
Duration: May '14 – Aug '15
•Created Testing Requirements based on functional and technical specifications.
•Testing was done at all stages of the SDLC under both Waterfall and Agile development models.
•Developed test documentation as per project specification: test plans, test designs, test cases
•Involved in User Acceptance Testing (UAT) to validate that the application properly reflects business functions and requirements.
•Involved in system integration and validation of populated data.
•Performed manual testing of a web-based application.
•Provided daily production support and administration of portal operations such as problem resolution, performance management, maintenance upgrades, and production migrations.
•Used Quality Center for reporting and defect tracking.
•Involved in Quality Center administration for the project.
•Ensured that test case coverage included the product and its integrations with other legacy systems.
•Participated and discussed test execution progress with the team at the end of the day during the testing Phase.
Environment: HP Quality Center, Android SDK tools, iPhone Configuration.