
QA Tester Project Manager

Location:
Atlanta, GA
Posted:
February 09, 2023


Resume:

Dipa Patel

QA Tester

PROFESSIONAL SUMMARY:

6+ years of experience in Information Technology with emphasis on Quality Assurance, Database Testing, and Manual Testing of Web and Client/Server based commercial applications.

Experience in Agile-Scrum methodology.

Involved in front-end and back-end/database testing of Web Applications.

Experienced in developing Automation test scripts using Cucumber and Ruby.

Extensive experience with various Testing Methods such as Validation, Verification, Functional Testing, Ad-hoc Testing, Integration Testing, System Testing, Performance Testing, Stress Testing and User Acceptance Testing (UAT).

Extensively worked with large Databases in QA and UAT Environments.

Experience using query tools for MS SQL Server, Oracle (Toad) and IBM DB2 to validate reports and troubleshoot data quality issues.

Extensive experience in understanding the design and high-level architecture of the software and working closely with developers and end users to deliver bug-free, easy-to-use software products.

Good knowledge of RDBMS and access methods (SQL, PL/SQL).

Outstanding Communication, Analytical and Presentation Skills.

Proficient in developing, maintaining and executing test cases for different Black Box and White Box Testing methodologies.

Maintained Daily Status Reports and Weekly Status Reports.

Experienced in writing complex SQL queries on various databases to perform data-driven tests, and involved in front-end and back-end testing; strong knowledge of RDBMS concepts (an illustrative sketch follows this summary).

Strong knowledge of and experience with the various SDLC and STLC methodologies.

Experienced in preparation of Test strategy and Test plan, Test execution, Defect management, Database Testing (Oracle, SQL Server), Integration Testing and other testing activities.

Expertise in problem solving and bug tracking using Quality Center/ALM, including logging defects and linking them to requirements and test cases.

Experienced in developing data driven Automation test scripts using Quick Test Professional, Win Runner and Test Director.

Experienced in developing, maintaining and executing Test Cases, Test Scripts and Test Scenarios from business, technical and/or functional requirement documents on Business Objects.

Expertise in defect tracking using Mercury Interactive Quality Center to report bugs and follow up with the development team.
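Illustrative sketch (hypothetical, not taken from any specific project below): the kind of back-end SQL validation described in this summary can be expressed as a small Java/JDBC check that compares a database count against a value reported on the front end. The connection URL, credentials, table, filter values and expected count are all placeholders, and the appropriate JDBC driver jar is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Hypothetical back-end check: the count returned by a reporting query
// should match the total shown on the front-end report.
public class BackendCountCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//dbhost:1521/QADB"; // placeholder QA database
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass")) {
            String sql = "SELECT COUNT(*) FROM orders WHERE status = ? AND order_date >= ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, "SHIPPED");
                ps.setDate(2, java.sql.Date.valueOf("2023-01-01"));
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    long backendCount = rs.getLong(1);
                    long frontendCount = 1250L; // in a real test, read from the UI report
                    if (backendCount != frontendCount) {
                        throw new AssertionError("Mismatch: DB=" + backendCount + " UI=" + frontendCount);
                    }
                    System.out.println("Back-end count matches front-end report: " + backendCount);
                }
            }
        }
    }
}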

TECHNICAL SKILLSET:

Automation Tools : Cucumber, QTP/ UFT, Quality Center

Defect Tracking Tools : Test Track Pro, Version One, Test Director, Clear Quest, ALM, Jira

Databases : Oracle, DB2, and MS SQL Server

Web Technologies : HTML, DHTML, XML, JSP, ASP, VB.Net, WebSphere

Languages : SQL, PL/SQL, C, C++, Java and VB Script

Office Tools : MS-Word, Excel, Power Point, Adobe Acrobat

Operating System : Windows XP/Vista and UNIX

Methodologies : Agile (SCRUM), Waterfall

Other Tools : TOAD, VISIO, MS PROJECT, SOAP UI, Putty, Web logic

EDUCATION DETAILS:

Master of Computer Applications

PROFESSIONAL EXPERIENCE:

University of Minnesota Physicians, MN Feb 2019 – Present

QA Tester

Responsibilities

Worked closely with business users, developers and business analysts to ensure that the application met the business requirements and business rules, using SDLC 4.0 standards.

Used Agile-testing methodology for achieving deadlines in UAT.

Performed manual testing within the Validation Framework once code was moved into the UAT environment.

Performed GUI Testing and Manual Functional Testing.

Responsible for all software quality assurance activities, including test plan engineering, test case generation, Manual test execution, usability analysis, problem reporting, and documentation analysis.

Responsible for performing various types of process evaluations during each phase of UAT.

Conducted XML file data validation, tested Web Services/APIs using SOAP UI, and utilized XMLSpy.

Planned, developed and conducted unit, function, integration, data quality and UAT tests.

Created requirements traceability matrix in ALM.

Provided data to the business analysts during UAT.

Managed UAT testing for business analytics team when any changes were made to the website feed - created test cases, organized meetings, liaised with SIT testers and developers, reported on progress to project manager and signed off UAT.

Performed application and database testing with SQL queries, and wrote SQL scripts to verify data integrity.

Performed data quality analysis using advanced SQL skills.

Functional responsibilities included Smoke Testing, UAT, User Interface testing, Functional testing, Integration testing, Regression testing, Back-end testing, Parallel testing and Rules testing.

Performed Functional, Regression, End-to-End and UAT testing on multiple browsers; executed features manually for cross-browser testing on Windows in IE, Firefox and Chrome.

Testing the data flows from back end to front end and validating the front-end reports.

Participated in UAT defect triage calls and reported work status to the UAT Manager on a weekly basis.

Utilized Rally to track the user stories and file defects.

Maintained daily and weekly reports using Microsoft Excel keeping track of progress, test results and defects.

Maintained Requirement Traceability Matrix (RTM) to make sure that test plans were written for all the requirements.

Performed functional testing and data validation across different sub-modules after data aggregation.

Responsible for development, documentation, and maintenance of the test data for sub-modules.

Involved in testing/validating and debugging the applications using SQL scripts and appropriate log files.

Provided test estimates, high-level scenarios and Level of Effort (LOE) for various phases of the project and release.

Ran the AutoSys jobs using Putty.

Developed and executed scripts to test data migration using SQL and UNIX Shell Scripting (see the illustrative sketch at the end of this role).

Proficient in UNIX, Shell scripts and used Putty and FTP for file transfer.

Monitoring and tracking the server logs in UNIX for validation of request and response.

Involved in pre-deployment, post-deployment and smoke testing in dry run/production environments.

Environment: UAT, Rapid SQL, SQL Server, DB2, Putty, Web services, MS Excel, VB, XMLSpy, Oracle, AutoSys, Central Customer Database (CCD), SQL Developer, Toad, SharePoint, Rally, JIRA, IBM DOORS, QC, ALM, ClearCase, SQL Plus, Version One
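Illustrative sketch (hypothetical): the data-migration testing mentioned for this role was done with SQL and UNIX shell scripts; the same reconciliation idea, comparing row counts between a source and a target database, is shown here in Java/JDBC for consistency with the other examples. Hosts, credentials and table names are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical migration check: the same table should have the same row count
// in the legacy (source) and migrated (target) databases.
public class MigrationRowCountCheck {

    private static long countRows(String jdbcUrl, String user, String pass, String table) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, pass);
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        long sourceCount = countRows("jdbc:db2://legacyhost:50000/SRCDB", "qa_user", "qa_pass", "CUSTOMER");
        long targetCount = countRows("jdbc:sqlserver://newhost:1433;databaseName=TGTDB", "qa_user", "qa_pass", "dbo.CUSTOMER");
        if (sourceCount != targetCount) {
            throw new AssertionError("Row count mismatch: source=" + sourceCount + " target=" + targetCount);
        }
        System.out.println("Migration row counts match: " + sourceCount);
    }
}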

AutoNation – White Plains, NY July 2018 – Jan 2019

QA/UAT Tester

Responsibilities

Extensively worked in an Agile-Scrum environment and was involved in the entire Sprint life cycle (Sprint Planning, Daily Scrum activities, Product Increment, Sprint Review, Sprint Retrospective and Product Backlog updates).

Developed & executed Test scenarios for User Acceptance Testing (UAT).

Coordinate with end-users to schedule and support User Acceptance Testing (UAT).

Responsible for driving E2E test scenario efforts and identifying test data for UAT.

Designing and Executing Test Cases in HP Quality Center.

Defects were tracked, reviewed, analyzed and compared using HP Quality Center.

Used SQL to access the database and perform back-end testing.

Responsible for creating test cases for End-to-End Testing.

HP ALM was used for Test management and for defect tracking.

Coordinate efforts with Developers to resolve identified UAT defects swiftly.

Used various UNIX commands to perform operations with files.

Analyze the Business Requirement Document (BRD), Functional Specification Documents, Test Plans and Use case documents to prepare Test Cases.

Wrote SQL queries for back-end testing to verify and validate database updates, inserts, deletions, etc.

Performed Black Box Testing of the implementation on the software modules and documented test cases and results in the standard templates of MS Excel and MS Office.

Performed functional, integration, regression, UAT, negative and positive testing.

Worked on writing complex SQL Queries on different databases (Oracle, SQL Server)

Identified defects during execution and managed them in HP QC and JIRA.

Involved in Web Services Testing using SOAP UI and tested the migration of existing APIs to Web Services APIs.

Created and executed automated test scripts using UFT.

Defined user-defined parameters to execute various functions in UFT.

Performed integration testing to verify the interaction between systems and subsystems.

Determined potential issues and risks related to the testing process and assisted with the implementation of an appropriate mitigation plan.

Interacted with developers, business analysts, and the project manager for test case reviews, defects issues, and tests data.

Used Selenium WebDriver to run test cases in multiple browsers and platforms (see the illustrative sketch at the end of this role).

Interacting with developers to resolve the technical issues.

Involved in the creation and maintenance of Test Requirements, Test Plans and Test Cases in the Test Repository using Quality Center.

Environment: JIRA, VISIO, ALM, Quality Center, Version One, Selenium, SQL Developer, Java, Putty, WinSCP, UFT, VB Script, .NET Framework, SOAP UI, Web Services, UNIX, XML, Microsoft Office 2011.
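Illustrative sketch (hypothetical): the cross-browser Selenium WebDriver runs mentioned for this role can be parameterized with TestNG, where the browser name is supplied from testng.xml. The URL, browser list and assertion below are placeholders, and the matching driver binaries are assumed to be on the PATH.

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.ie.InternetExplorerDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;

// Hypothetical cross-browser smoke test. The "browser" value comes from
// testng.xml (e.g. <parameter name="browser" value="chrome"/>), so the same
// test runs against Chrome, Firefox and IE.
public class CrossBrowserSmokeTest {

    private WebDriver driver;

    @Parameters({"browser"})
    @BeforeMethod
    public void setUp(String browser) {
        if ("firefox".equalsIgnoreCase(browser)) {
            driver = new FirefoxDriver();
        } else if ("ie".equalsIgnoreCase(browser)) {
            driver = new InternetExplorerDriver();
        } else {
            driver = new ChromeDriver(); // default browser for the smoke run
        }
    }

    @Test
    public void homePageLoads() {
        driver.get("https://example.com"); // placeholder application URL
        Assert.assertTrue(driver.getTitle().length() > 0, "Expected a non-empty page title");
    }

    @AfterMethod
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}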

Discovery, New York City, NY Jun 2016 – Jun 2018

Jr. Quality Assurance Engineer

Responsibilities

Reviewed the Test Plans and written Test Cases based on Functional Requirements.

Generated Test Cases and executed them to test the application manually, and had them reviewed by the Quality Managers.

Created the Requirement Traceability Matrix (RTM) based on customer Specifications.

Performed back-end database testing by writing and executing SQL queries on relational databases in Toad in order to ensure data consistency on the front end.

Used Test Director as a Test Management (Analysis, Reports & Defect Tracking) Tool.

Used Quality Center for defect tracking and reports generation.

Defined the Defect Tracking Schema and collected data for defect metrics such as Defect Density, Defect Age, Effort Variance and Schedule Variance.

Created Java-based scripts for Selenium WebDriver with TestNG as the automation framework (see the illustrative sketch at the end of this role).

Summarized detailed Test Reports with Defect Logs and submitted them to Developers through the proper channels.

Environment: Manual Testing, Windows XP, Windows 2003, Selenium, Mercury Test Director 7.0, IIS Web Servers, Microsoft Product Studio, Visual Source Safe.
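Illustrative sketch (hypothetical): a minimal Selenium WebDriver test in Java with TestNG, the stack named for this role. The URL, element locators and expected page title are placeholders, and chromedriver is assumed to be available on the PATH.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

// Hypothetical login smoke test driven by TestNG.
public class LoginSmokeTest {

    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        driver = new ChromeDriver();             // assumes chromedriver is on the PATH
        driver.get("https://example.com/login"); // placeholder application URL
    }

    @Test
    public void validLoginShowsDashboard() {
        driver.findElement(By.id("username")).sendKeys("qa_user");
        driver.findElement(By.id("password")).sendKeys("qa_pass");
        driver.findElement(By.id("loginButton")).click();
        Assert.assertTrue(driver.getTitle().contains("Dashboard"),
                "Expected dashboard title after a valid login");
    }

    @AfterMethod
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}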


