

RAMYA REDDY

*** -***-****

QA Analyst

SUMMARY

●16 years of Quality Assurance experience in a variety of industries and environments.

●Experienced in SDLC methodologies - Waterfall, Iterative, Agile Scrum (daily stand-up, sprint planning, and sprint retrospective meetings), and Kanban.

●9+ years of experience in Agile environments.

●Experienced with QA practices, the software test life cycle, test planning, and end-to-end testing.

●Experienced in leading and working with on-site, offshore, and distributed teams.

●Experienced in testing client-server, web-based, mobile, and big data applications.

●Experienced in developing, executing, and maintaining automated tests in CI/CD.

●Experienced in working on Solaris, UNIX, Linux, Hadoop, and Google Cloud Platform (GCP).

●Experienced in web services/API testing using SoapUI and Postman.

●Ability to work independently as well as with highly cross-functional teams, with a focus on deadlines and deliverables.

CORE COMPETENCIES

• SDLC, STLC
• Project & People Management
• QA Strategy Definition & Execution
• Defect Life Cycle Management
• Test Estimation & Release Planning
• Metrics & Status Reporting
• Strategic & Operational Planning
• End-to-End Testing
• Test Environment Setup
• Risk & Issue Management
• Manual Testing
• Exploratory Testing
• Automation
• Security Testing
• Compatibility Testing
• Usability Testing
• Black Box Testing
• Smoke Testing
• Regression Testing
• Integration Testing
• Traceability Matrices
• SQL & Backend Validation
• Processes, Procedures, and Standards Design
• Test Data Automation

TECHNICAL SKILLS

Tools: HP Quality Center, QTP, Test Director, JIRA, qTest, Rally, TestLink, UFT, WCF Storm, WinSCP, FileZilla, SoapUI, Jenkins, Postman, REST, SFTP, PuTTY, PuTTYgen, Cloud SDK, Toad, ServiceNow, TestProject, Zephyr, Maven, TestNG, Selenium, Jira Xray, Beyond Compare, Eclipse, BrowserStack, Sauce Labs, Playwright, Splunk.

Languages: .NET, Java, J2EE, SQL, JavaScript, HTML, CSS, XML, ASP, JSP, VB, UML, Python, Bash

Formats: CSV, Fixed layout, Delimited layout, Avro, XML, YAML, JSON.

Databases: MS SQL Server, Oracle 10g, DB2, MS Access, DBeaver

Packages: MS Word, MS Excel, MS PowerPoint, MS Project, Dev 2000

Operating Systems: Windows 95/98/NT/2000/XP, Solaris, UNIX, Linux, MS-DOS

Version Control: ClearCase, TFS, Visual SourceSafe, Bitbucket, Git, GitHub

Bug Tracking Tools: QC, Bugzilla, JIRA, ExtraView.

Reporting Tools: Business Objects XI R2, Crystal Reports, Visio, MS Office Suite

XML Tools: XML Notepad, Notepad++, XML Editor, XML Validator, XMLSpy, vi Editor.

Cloud Environments: GCP, AWS, Azure.

EDUCATION/ CERTIFICATION

Bachelor of Engineering in Computer Science

AWS Cloud Practitioner

BACKBASE, ATLANTA JUNE 2021 – PRESENT

SR QA Analyst

Backbase provides the leading Engagement Banking platform that empowers banks and other financial institutions to accelerate their digital transformation.

Working with business analysts, project managers, and developers to understand requirements, refine acceptance criteria, and ensure overall quality of coverage.

Participating in sprint planning, grooming user stories, and providing work-effort estimates.

Working with cross-functional teams and gathering test data for the user stories.

Performing functional, regression, automation, and database testing, as well as API testing using Postman.

Identifying, logging, and reporting defects.

Creating defects in Jira and providing developers with the details required to reproduce and resolve them.

Developing, enhancing, and executing automation scripts to verify functionality.

Performing regression testing on a daily basis and troubleshooting and fixing scripts as the application's functionality changes.

Performing database verification in DBeaver for accounts created through Postman.

Working with clients and participating in sprint demos and sprint retrospectives.

EQUIFAX, ATLANTA JAN 2015 – APRIL 2021

SR QA Analyst

Equifax is a consumer credit reporting agency.

Participated in requirements gathering and reviewed requirements and technical design documents. Worked closely with development resources to understand the overall design and streamline testing of new features.

Experienced in all aspects of software testing, including test environments, test data, test automation, the complete software testing life cycle, and Agile methodology.

Defined test coverage and developed test plans and testing scenarios in accordance with the functional requirements and detailed technical specifications provided.

Maintained requirements traceability matrices to track coverage of requirements against designed test cases.

Led and guided other QA team members in developing and evolving standard QA practices; responsible for training offshore QA members and new hires.

Participated in Bi-Weekly User Story Grooming sessions and Iteration Planning meetings with Scrum Master, Product Owners, and Team members.

Created user stories in Jira and assigned them to onsite and offshore QA resources.

Gathered test data or mocked data using the masker and built datasets as needed for testing; experienced in both manual and automated test data generation.

Experienced in working with valid data, invalid data, sensitive data (PII), and tokenization of data such as primary account numbers and credit card information.

Installed the tar file builds provided by developers in QA environments; compiled the modules, copied dependent files as needed for the build, and distributed them to the cluster in the Hadoop environment.

Automation experience with a Python framework; triggered the Rundeck automation regression suite and verified the results.

Experienced in configuring, adding to, updating, and maintaining the automation regression suite.

Executed smoke tests as soon as a build was provided; analyzed, debugged, and sent a report to the team.

Created Automation scripts for building datasets and executing binaries and scheduling batch jobs.

Developed and maintained an automation framework, updating scripts to drive quality and efficiency improvements.

Executed the jobs and performed positive, negative, functional, integration, performance, and end-to-end testing, verifying results, logs, and counter files using Unix commands.

Attended daily stand-ups, daily bug review meetings, and weekly status meetings, and interacted with developers to resolve defects.

Collaborated and communicated with cross-functional team members to achieve project goals.

Updated the CI/CD automation scripts with environment-specific parameters and the build and JAR information.

Executed the CI/CD automation with every build or JAR release and informed developers of any differences found; took the initiative to work with various teams on the observed differences and created defects in their dashboards.

Executed the CI/CD automation (baseline and current) for JAR or build releases before UAT installs.

Added jobs in continuous integration for end-to-end Automation testing.

Performed data analysis for all incoming feeds from different data sources; updated test data in multiple environments; loaded datasets into HDFS and processed them.

Executed MapReduce jobs, downloaded the results from the Hadoop environment to local, and verified them; executed Spark compare scripts as needed.

Troubleshot error messages, checked logs, and created bugs and assigned them to the developers.

Performed web services testing and XML validation by creating test cases; submitted jobs in Postman using our own XML and YAML dependency files, validated web service responses by passing various requests, and verified logs in the cloud environment.

Responsible for maintaining the QA environments: approved/removed user access, reached out to the SRE teams whenever an environment was down, and created incident tickets in ServiceNow, since other teams depended on us.

Responsible for setting up the UAT environments: created install tickets for the UAT environments, verified production install tickets in ServiceNow, and obtained approvals as needed.

Participated in production installs along with the SRE team.

MDA INC Oct 2012 – Oct 2014

SR QA Analyst

Medical Doctor Associates (MDA) is a national medical staffing company providing healthcare services and technology, malpractice coverage insurance, and risk management support.

Actively participated in requirements gathering along with business analysts, and worked with business users and different stakeholders.

Attended weekly IT managers meetings, resource planning meetings, team meetings, and defect review meetings in an Agile environment.

Created test cases and test scenarios, executed test cases in Test Lab, and created defects in Quality Center.

Performed manual, functional, regression, exploratory, integration, smoke, browser compatibility, security, and database testing.

Created SQL statements using TOAD for data verification.

Worked closely with the Dev team and helped them with Defects.

Performed end-to-end testing while upgrading Oracle from 10g to 11i, including but not limited to functional testing, data verification testing, performance testing, and regression testing.

Responsible for Handling multiple projects with changing priorities to complete tasks.

Performed Windows services testing; ran regression tests on deployment timings and reached out to developers about unexpected differences.

Tested the application in different browsers such as IE, Chrome, Firefox, and Safari, and on different devices such as Mac, iPad, and iPhone.

Scheduled one-on-ones with the users and worked on UAT environment setup, test case scenarios, and test data for UAT. Provided end-to-end pre- and post-production deployment support.

Participated in daily status meetings, tracked progress and open issues, and followed up with other team members as needed. Provided weekly status reports.

Delta Airlines, GA Feb 2012 – Oct 2012

Test Team Lead

Delta is one of the major airlines of the United States.

●Actively participated in weekly meetings along with business analysts and project managers, and worked with business users.

●Facilitated customer meetings to present, define, and gather specific business requirements and determine priorities. Held one-on-one meetings with different users to document use cases.

●Attended weekly team meetings, lead status meetings, defect review meetings, and critical-defect review meetings by tower and across towers.

●Led a small team of testers, providing work direction, guidance, and coaching.

●Developed and Executed Test Plans and Test Cases using Quality Center

●Deployed code to QAT or Stage environments as needed during weekly code moves using RET.

●Performed smoke testing; functionality testing - links, forms, cookies, HTML/CSS, and business workflow; usability testing - site navigation, content, and image testing; and compatibility testing with various combinations of operating systems and browsers.

●Performed interface testing - application, web server, and database server.

●Performed mobile app testing, content testing, and security testing - session expiry and unauthorized access to secure pages and downloadable files.

●Performed data, functional, system, GUI, integration, regression, interface, and end-to-end testing manually.

●Documented test failures in defect reports and resolved all issues related to testing.

●Categorized bugs based on severity and interfaced with developers to resolve them.

●Coordinated testing with the offshore team, assigning and tracking their day-to-day assignments and reviewing their results.

●Expertise in determining Manual and Automated test cases and scripts for different phases of testing.

●Involved in UI navigation and functionality testing, creating scenario scripts and providing them to the automation test engineer.

●Performed integration testing for Facebook, Twitter, and Google Plus.

●Tested the Delta application on mobile and iPad devices.

●Provided test scenarios and test data to the performance engineer for load testing using LoadRunner.

●Assisted business users in defining user acceptance testing, test cases, and plans.

●Maintained and executed the defect tracking workflow: creating new defects, reviewing defects, repairing open defects, updating and maintaining recent screenshots, and closing defects using Quality Center.

●Escalated issues to the manager in a timely and appropriate manner.

●Participated in go/no-go meetings and proposed suggestions based on metrics related to pre-agreed release criteria.

Kroll Background Services, TN June 2009 – Feb 2012

QA Analyst

Kroll's Background Screening division supports the end-to-end background screening process, from data collection to the screening report, with status updates.

Responsibilities:

●Implemented Software Test Life Cycle – Requirement Analysis, Test Planning, Estimation, Test Case Development, Environment Setup, Test Execution, Defect Management, Regression and Test Cycle Closure.

●Validated the relationship between requirement definitions, functional system design, and use cases to ensure each document was in sync with the others.

●Managed QA defect process. Coordinated communications in prioritizing issues and performed Root Cause Analysis.

●Used Quality Center for test management, requirements, test plans, and Test Lab.

●Created defects in JIRA and assigned them to the developers.

●Established Requirements Traceability and Test and Defect coverage using Quality Center.

●Created Master Test Plans, Test Strategy, Test Approach documents for both strategic and operational projects. Identified and documented the applications, data and specific functionality associated with each test case.

●Performed smoke, end-to-end, functional, GUI, usability, regression, integration, migration, interface, and system testing manually.

●Gathered test data for test environment. Supported operational issues - coordinated fixes, builds, releases with development and release management teams.

●Extensively used SoapUI to validate web service responses; performed load testing and regression testing using SoapUI.

●Tested web services with multiple parameters (requests and responses); developed a complete data-driven test suite and ran the automated scripts in SoapUI.

●Performed API testing using SoapUI and database testing to confirm that all data was migrated to the specified tables.

●Translated exhaustive requirements specifications into UAT test cases to identify errors and gaps in the requirements; developed and implemented the UAT test strategy and test plan.

●Worked with the development team to prioritize and fix backlog bugs.

●Performed data analysis, validation, and database testing with Microsoft SQL Server Management Studio; developed and executed moderately to highly complex scripted automation tests.

●Coordinated and communicated with IT resources across the US and offshore to support multiple projects.

●Tracked and reported on testing activities including test results, test case coverage, required resources, defect status and performance baselines and milestones.

OHL, TN Oct 2008 – May 2009

EBooking, EFocus

QA Analyst

Ozburn-Hessey Logistics, LLC, is a supply chain management company.

Responsibilities:

●Reviewed requirements and design documents to ensure adequate understanding to develop test cases.

●Created the strategy document defining the test environment, the phases of testing, and the entrance and exit criteria for the different phases of testing.

●Created and executed the master test plan and designed a testing strategy involving various types of testing.

●Developed test cases based on various levels of requirements and business rules, and developed automated test cases.

●Used Quality Center for test creation, execution, defect tracking, and result reporting.

●Designed the test scope, test requirements outline, and requirements tree; associated requirements and defects to establish traceability; and maintained requirements and test traceability matrices.

●Used Quality Center to track and report system defects and bug fixes. Wrote modification requests for bugs in the application and helped developers track and resolve the problems.

●Created reports on a weekly basis, kept the concerned parties notified, and documented the proceedings.

●Provided functional expertise to developers during the technical design and construction phases of the project.

●Created, executed, reviewed, and documented test scripts for manual and automated software testing.

●Executed tests, tracked defects, and performed analysis.

●Performed functional, navigational, and security testing. Developed automation test scripts for functional and regression testing using QTP.

●Identified risks and communicated them to the developers.

●Created basic SQL statements and executed scripts using SQL server for database testing.

●Documented status and final reports for QA activities.

●Worked closely with UAT testers and end users during system validation and user acceptance testing to expose functionality and business logic problems that unit and system testing had missed.


