Resume

QA

Location:
Raleigh, North Carolina, United States
Posted:
September 16, 2019

Experience Summary

Over *+ years of experience in Software Quality Assurance across the software manufacturing and financial industries, covering GUI automation, functional testing, regression testing, platform compatibility testing, and test case development, mainly for web-based applications.

Currently working in the banking domain as a QA Engineer.

Specific areas of expertise include manual and automated test case development, and execution of manual and automated tests for multiple products, in both Agile Scrum and Waterfall SDLCs.

Excellent analytical, problem-solving, and multi-tasking abilities; strong verbal and written communication skills; a solid work ethic; and a goal-oriented personality, along with an education in Computer Engineering.

Tested both Client-Server and Web based applications.

Wrote SQL queries and checked data in the database to verify that data was saved/modified as specified in the test cases.
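
As an illustrative sketch of this kind of back-end check only (the resume names no schema, so the table and column names here are hypothetical, and SQLite stands in for the Oracle databases listed below):

```python
import sqlite3

def verify_saved_record(conn, account_id, expected_status):
    """Back-end validation: confirm the application saved/modified
    the row as the test case expects."""
    cur = conn.execute(
        "SELECT status FROM accounts WHERE account_id = ?", (account_id,)
    )
    row = cur.fetchone()
    return row is not None and row[0] == expected_status

# Stand-in for the application under test writing to the database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (account_id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO accounts VALUES (1, 'ACTIVE')")

print(verify_saved_record(conn, 1, "ACTIVE"))  # prints True
print(verify_saved_record(conn, 1, "CLOSED"))  # prints False
```

In practice the same parameterized SELECT is run against the real database after each test step, with the expected values coming from the test case.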

Performed Integration Testing, System Testing, and Regression Testing.

Participated in monthly production rollouts to perform final end-to-end and acceptance testing.

Experienced in documenting the testing strategy for test scenarios, test cases, and test steps, and in logging bugs.

Knowledgeable in System, Integration, Regression, and End-to-End Testing.

Technical Skills

Databases: Oracle 11g/10g, Teradata V2R6, MySQL

Database Programming Tools: SQL Developer, SQL*Plus, AQT

Operating Systems: Windows 10/8.x/7.x/2003/2000/XP/NT/98/95, UNIX

Software Engineering Methodologies: Agile, Waterfall, Kanban

Reporting Tools: Cognos 8.4 Suite, SSRS

Defect Management Tools: HP ALM/QC, JIRA, Rally

Programming: SQL, PL-SQL, Core Java, UNIX

Testing Tools: Selenium, Informatica

Professional Summary

Union Bank, Phoenix, AZ Jul 2018 - Sep 2018

Test Coordinator

Responsibilities:

Performed conditional testing of constraints based on the business rules

Conducted functionality testing during various phases of the application

Validated reports to ensure all data was populated per requirements

Reviewed the use case and business requirements (BRD) for functional testing

Responsible for creating Test Plan(s), Testing Tracker(s), Test Execution Status Reports, and Defect Reports.

Coordinated testing activities across the project/program and engaged all testing parties

Held testing kickoff meetings and met with all project stakeholders

Developed the test strategy with input from test planners and development teams, focusing on end-to-end test coverage, test scenarios, and interfaces

Ensured ambiguity reviews were completed, either as part of the agile process or through a formal review, confirming that any ambiguities identified were appropriately assigned and resolved

Facilitated the development of test plans by test planners across all areas and bundled them into a comprehensive test plan package with end-to-end coverage confirmed

Facilitated the development of appropriate test cases, scenarios, and scripts by test planners/testers, and supported test data, test ID, and test environment requirements

Identified testing dependencies and risks related to requirements, scope, and application availability, as well as dependencies between testing teams

Reviewed test results summary certification and drove the approvals process

Tested Cognos reports for data quality and cosmetic accuracy according to requirements

Created test case scenarios, executed test cases, and maintained defects in Rally

Performed dashboard/UI testing; wrote and executed test cases

Tested reports, dashboards, and other functionality within the OBIEE suite

Validated reports by writing and running SQL queries

Prepared Test Data and executed Test Cases

Worked as Test Coordinator between BA Team, QA and Dev Team

Environment: Oracle 11g, SQL, Cognos, Kanban, Cisco Jabber, Windows 10, Rally, HP ALM

NTT DATA, Charlotte, NC Jan 2018 – Jul 2018

UAT Tester

Responsibilities:

Worked extensively on SQL

Wrote ETL test cases to compare source and target database systems

Wrote and executed several SQL scripts to validate data and database integrity
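
A minimal sketch of the source-vs-target comparison the two bullets above describe (illustrative tables only; the actual systems were Oracle, flat files, and SQL Server, with SQLite standing in here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Stand-ins for the source (extract) and target (load) tables.
conn.execute("CREATE TABLE src_customers (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE tgt_customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src_customers VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob"), (3, "Carol")])
conn.executemany("INSERT INTO tgt_customers VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob")])

# Rows present in the source but missing from the target:
# a classic ETL completeness check.
missing = conn.execute(
    "SELECT id, name FROM src_customers "
    "EXCEPT SELECT id, name FROM tgt_customers"
).fetchall()
print(missing)  # prints [(3, 'Carol')]
```

The same EXCEPT (MINUS in Oracle) pattern, run in both directions, flags rows dropped or invented by the ETL load.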

Extracted data from various sources like Oracle, flat files and SQL Server

Tested reports generated by OBIEE and verified and validated using custom SQL queries

Scheduled and ran several Job/batches in Autosys on an ad-hoc basis

Executed the Autosys batch processes to test the data quality, data accuracy and data complexity on the numerous files and tables

Worked on Snowflake data warehouse/Schema and wrote the SQL query based on requirement

Exported manual test cases from an MS Excel template directly to HP Quality Center and executed all test cases in HP Quality Center with Pass/Fail/Blocked status

Analyzed and created bug tracking reports and summary reports in Quality Center

Maintained control and data files in WinSCP

Built Tableau dashboard visualizations per end-user requirements

Environment: Oracle 11g, SQL, PL/SQL, HP ALM, Agile, Autosys, WinSCP, Tableau, and Windows

Prime, Minneapolis, MN Jul 2017 – Dec 2017

Manual /ETL Tester

Responsibilities:

Extensively involved in business analysis and requirements gathering

Created and configured Workflows, Worklets, and Sessions to transport data to the target using Informatica Workflow Manager

Tested SQL queries to validate the data and performance of the database

Planned and conducted OBIEE report tests for a wide variety of applications

Tested reports, dashboards, and other functionality within the OBIEE suite

Tested and validated the cube data, ensuring that the data is correct by comparing the data results to comparable source system reports or by querying individual transactions and forms

Tested the source and target databases for conformance to specifications

Performed conditional testing of constraints based on the business rules

Designed and executed test cases on the application per company standards and tracked defects using JIRA

Interacted with senior peers and subject matter experts to learn more about the data

Identified duplicate records in the staging area before data was processed
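
The standard way to surface such duplicates is a GROUP BY/HAVING query on the staging table's business key. A sketch under assumed names (the resume gives no schema; SQLite stands in for the Oracle staging database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                 [(100, 25.0), (101, 40.0), (100, 25.0)])

# Business keys appearing more than once in staging indicate duplicates
# that must be resolved before the load proceeds.
dupes = conn.execute("""
    SELECT order_id, COUNT(*) AS cnt
    FROM staging_orders
    GROUP BY order_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # prints [(100, 2)]
```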

Wrote extensive test scripts for back-end validations

Environment: SQL, Oracle 10g, Informatica 9.1, JIRA, UNIX, Cognos

Synechron, Charlotte, NC Nov 2016 – Jun 2017

Manual Tester

Responsibilities:

Performed manual testing using the test cases for positive and negative testing

Wrote and ran test scripts using Selenium WebDriver; validated the scripts to make sure they executed correctly and met the scenario descriptions

Used Firebug to identify objects in the application

Used Selenium IDE for open-source web testing

Developed test automation in an Agile development environment using a Test-Driven approach

Supported BA and Business Users during UAT phase

Performed defect tracking and bug reporting using Quality Center

Created all test plans, test cases, and manual and automated scripts to provide greater coverage for all assigned initiatives

Generated test result reports at the test case and test suite levels

Reviewed test plans, test cases and test scripts to ensure consistency with strategic direction, goals and objectives of QA resources working on the project with Agile SCRUM methodology

Environment: Selenium WebDriver, Java, Quality Center 9.0, UNIX, CITRIX, Windows 2000, HP ALM, and SharePoint

Accenture, Austin, TX Jul 2016 – Oct 2016

Manual QA Tester

Responsibilities:

Extensively involved in business analysis and requirements gathering

Created and executed test cases, and verified actual results against expected results

Performed functional, regression and end to end testing

Documented and reported all defects found in JIRA

Prepared the End-to-End Testing Document.

Automated test cases using Microsoft Test Manager and Web Test Manager.

Executed test cases and created defects; assigned defects to the appropriate resources and tracked them.

Provided support to users and developers during the entire process from development to release.

Responded promptly to production issues and resolved critical ones within the agreed timeframe.

Coordinated application changes with different functional teams during the development, testing, and release phases.

Tested SQL queries to validate the data and performance of the database

Tested reports, dashboards, and other functionality within the OBIEE suite

Tested the source and target databases for conformance to specifications

Designed and executed test cases on the application per company standards and tracked defects using JIRA

Reviewed the web-based application and identified critical functionality

Environment: Windows, Lync, SQL, Oracle 10g, JIRA, UNIX, Cognos, Excel

EDUCATION QUALIFICATIONS:

MS in Electrical Engineering, USA


