
Manual Tester

Location:
Edison, NJ
Posted:
November 02, 2021

Resume:

SREDEVI ROKKAM

ado8ey@r.postjobfree.com 469-***-****

Summary:

7+ years of experience in Quality Assurance and Software Testing across multiple domains.

Strong knowledge of the SDLC, STLC, QA methodologies, and the Software Bug Life Cycle.

Strong skills in performing Manual Testing of Web and Client-Server applications to ensure the functionality, usability, and reliability of the software.

Expertise in Functional Testing, Application Testing, Black Box Testing, Regression Testing, Integration Testing, and Ad-hoc Testing.

Experience in analyzing Business Requirements and Software Requirement Specifications to formulate Test Plans and write Test Cases that ensure technology solutions meet the highest quality standards.

Knowledge of Java programming.

Strong knowledge of Oracle and SQL Server databases and good knowledge of SQL.

Proficient in Front End and Back End testing.

Skilled in identifying, analyzing, and documenting defects, with expertise in bug reporting and defect tracking using Jira.

Strong experience coordinating with Development, Environment, and Release Management teams.

Good experience in prioritizing, ranking, and sizing test cases.

Hands-on experience with SQL, PL/SQL, DB2, and UNIX.

Good experience with Waterfall, Rapid Prototyping, Spiral, Agile, Incremental, and V-Model methodologies.

Exposure to quality-related concepts of ISO and SEI-CMM standards.

Excellent communication, documentation, analytical, and problem-solving skills.

Technical Skills

Testing Tools: LoadRunner, WinRunner, QuickTest Professional, TestDirector, Jira, Rational ClearQuest, Bugzilla.

Programming Languages: SQL, PL/SQL, Java.

Operating Systems: Windows 2000, UNIX.

RDBMS: MariaDB, MS Access, and Oracle.

Web Design / Scripting: HTML, ASP, VBScript, JavaScript, XML, JSON, UNIX.

Education

MCA (Master of Computer Applications), Andhra University, India, 2005.

B.Sc. (Bachelor of Computer Science), Andhra University, India, 2002.

Professional Experience

IHS Markit, NYC, NY May 2019 – Present

QA Analyst

Project: MTA (Market Trading Analytics). IHS Markit is an information technology company with experience across the world's largest industries, leveraging technology and data science to provide the insights, software, and data that help customers make more informed decisions, driving growth, performance, and efficiency across various fields. The current application, MTA, processes and evaluates data for all trading to give clients broader research into their business. Each client registered with Markit can view the analytical information and performance of their business in the form of reports that present all the statistical data.

Responsibilities:

Analyzing Business Requirement Specifications (BRS) and Functional Documents.

Developed a thorough understanding of the requirements and used manual testing procedures to generate test cases.

Managed, created, and executed test cases.

Developed reusable test cases for automated testing of APIs such as Batch Monitoring, Client Setup, and File Batch Management.

Created projects, requirements, and test cases in Jira.

Performed ad-hoc testing to ensure maximum test case coverage.

Coordinated patch releases/fixes with the development team.

Tested the use cases based on the requirements documented in Jira and tracked Jira tickets with the development team to make sure they were resolved within the sprint.

Actively involved in sprint planning meetings to prioritize Jira tickets and helped other team members analyze use cases linked to other modules.

Debugged application logs using UNIX commands to extract the relevant errors and provided them to the development team as part of the testing process.

Worked with the Product team to analyze requirements and created the necessary test cases and flow diagrams to support application testing.

Developed test cases and test scenarios based on business requirements and use-case documents in Jira.

Planned and reviewed test plans and the execution process to increase efficiency and reduce testing costs. Organized Go/No-Go decision meetings with all relevant teams before each major production release to analyze the effect of open bugs on the system.

Participated in walkthroughs and technical reviews throughout the testing phase.

Analyzed and debugged system program flows and recommended improvements and changes.

Tested stored procedures by providing different arguments and checking the impact on the data.

Wrote PL/SQL blocks to replicate the test scenarios for data validation and wrote test scripts to populate the master data for testing (a sketch of this kind of block appears after the Environment line below).

Conducted test case review meetings to obtain sign-off on the test cases.

Documented test results and reported the status of test tasks and issues to the project manager.

Provided recommendations and implemented improvements to test case designs and test automation approaches.

Created Shared Object Repository for different projects and organized the objects in the Repository.

Environment: Java, J2EE, XML, XSLT, Angular, JSON, HTML, AWS SQS messages, AWS EC2, Agile, MariaDB, DB Viewer, Jira, Git, IE 6, Firefox.
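
A minimal sketch, for illustration, of the kind of PL/SQL block described above for data validation and master-data setup; the table names (trade_stage, trade_processed, client_master), columns, and values are hypothetical placeholders, not the actual MTA schema.

    -- Hypothetical example: confirm a batch was fully processed, then seed a
    -- master-data row needed by downstream test cases.
    DECLARE
      v_staged    NUMBER;
      v_processed NUMBER;
    BEGIN
      SELECT COUNT(*) INTO v_staged    FROM trade_stage     WHERE batch_id = 1001;
      SELECT COUNT(*) INTO v_processed FROM trade_processed WHERE batch_id = 1001;

      IF v_staged <> v_processed THEN
        DBMS_OUTPUT.PUT_LINE('FAIL: ' || v_staged || ' staged vs ' || v_processed || ' processed');
      ELSE
        DBMS_OUTPUT.PUT_LINE('PASS: batch 1001 fully processed');
      END IF;

      -- Populate placeholder master data required by the test cases.
      INSERT INTO client_master (client_id, client_name, region)
      VALUES (9001, 'TEST_CLIENT', 'US');
      COMMIT;
    END;
    /

A block like this can be run from SQL*Plus or a DB viewer against the test schema before the dependent test cases execute.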

LionnBridge Technologies, Mumbai, India May 2006 – March 2010

Project: Pearson Phoenix (e1)

Pearson Phoenix is a UK-based product development software company. Pearson Phoenix e1 is a complete education management system at your fingertips that brings together the tools necessary for managing teaching and learning with organizational and administrative MIS functionality. Schools in the UK and Scotland can develop and deliver curriculum plans and monitor and evaluate pupil performance, while parents can use the system to monitor their child's performance and school activities.

Responsibilities:

Analyzed the user/business requirements and developed the test cases based on requirements.

Worked with System Test Review Team and Developers to resolve system test issues and re-test.

Developed test cases and test scenarios based on business requirements and use-case documents.

Conducted test case review meetings to obtain sign-off on the test cases.

Provided documentation on Test results for every phase of Testing Cycle.

Coordinated patch releases/fixes with the development team.

Performed back-end testing to verify front-end data using SQL queries (see the sketch after the Environment line below).

Used ClearQuest as the test management and defect tracking tool.

Wrote test cases for Positive and Negative Testing as per the requirements.

Environment: Quality Center, TOAD (SQL), Oracle, WebLogic, Java/J2EE, XML, JSP, HTML, CSS, UNIX, Windows, Executing macros.
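
A minimal sketch of the kind of back-end SQL check described above, comparing what the front end displays against the database; the pupil table, its columns, and the sample ID are hypothetical placeholders rather than the actual Phoenix e1 schema.

    -- Hypothetical check: the active pupil count shown on the school dashboard
    -- should match the number of active pupil records in the back end.
    SELECT school_id, COUNT(*) AS active_pupils
    FROM   pupil
    WHERE  status = 'ACTIVE'
    GROUP  BY school_id
    ORDER  BY school_id;

    -- Spot-check a single record displayed on the front end.
    SELECT pupil_id, first_name, last_name, year_group
    FROM   pupil
    WHERE  pupil_id = 12345;

The counts and individual rows returned are compared against the values displayed in the application's screens and reports.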

Savi Technologies, Mumbai, India May 2006 – March 2010

Project: CMA

The Smart Chain Consignment Management Application (CMA) is a web-based application enabling logisticians to process consignments from the factory to the end user of a supply chain. It provides real-time visibility of consignments, enabling logisticians to identify precisely where consignments are in the supply chain, and accurately forecast when consignments will arrive at the destination.

Responsibilities:

Understood the business requirements, application and functional specifications, and design documents.

Created Test Plans and Test Cases for testing the application.

Tested the data using SQL and appropriate ad-hoc queries.

Developed test cases to perform functional and regression testing.

Worked on the UNIX server to retrieve source files.

Responsible for Integration Testing, System testing and Regression testing for modified builds.

Tested the Alert System for corporate changes.

Wrote stored procedures to test the data coming from the application (a sketch of this kind of procedure appears after the Environment line below).

Developed both small and complex queries to replicate test environment data.

Developed master data for testing using queries and procedures.

Documented the test progression matrix.

Involved in gap analysis between manual and automation scripts.

Verified navigation elements such as links and online forms.

Carried out User Acceptance Testing after QA sign-off.

Environment: Java/J2EE, XML, XSLT, JSP, Servlets, HTML, Windows XP, Oracle, QTP, TestDirector, VSS, WebLogic.
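
A minimal sketch of the kind of stored procedure described above for checking consignment data coming from the application; the consignment_event table, its columns, and the sequencing rule are hypothetical placeholders, not the actual CMA schema.

    -- Hypothetical procedure: count adjacent consignment-event pairs whose
    -- timestamps are out of sequence, the kind of data check run on CMA feeds.
    CREATE OR REPLACE PROCEDURE check_consignment_events (p_consignment_id IN NUMBER) AS
      v_bad_rows NUMBER;
    BEGIN
      SELECT COUNT(*)
      INTO   v_bad_rows
      FROM   consignment_event e1
      JOIN   consignment_event e2
             ON  e1.consignment_id = e2.consignment_id
             AND e2.seq_no         = e1.seq_no + 1
      WHERE  e1.consignment_id = p_consignment_id
      AND    e1.event_time > e2.event_time;

      DBMS_OUTPUT.PUT_LINE('Out-of-order events for consignment ' ||
                           p_consignment_id || ': ' || v_bad_rows);
    END check_consignment_events;
    /

Called with a consignment ID (for example, EXEC check_consignment_events(4711) in SQL*Plus), it reports how many event rows arrived with out-of-sequence timestamps.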


