
QA Analyst

Location:
Charlotte, NC, 28262
Posted:
March 19, 2012


Summary

• Strong team player with excellent analytical, organizational, and writing skills. Good interpersonal and customer service skills, solid technical skills, and high-level technical aptitude; team oriented, a quick learner, and driven to deliver the highest quality product to the customer and company.

• 6+ years of IT experience in Software Quality Assurance testing in Mainframe and Windows environments.

• Proficient in manual testing of Mainframe, Client/Server, and Web-based applications.

• Experience with creating Test Plans, writing and executing Test Cases, Manual Testing, and automated Test Execution.

• Extensive knowledge of Quality Assurance methodologies and strategies, with a thorough understanding of the Software Development Life Cycle (SDLC).

• Experience in Financial, Retail and Insurance domains.

• Experience in creating and executing Test Cases (Manual/Automated), Test Procedures, Test Strategies, Test Suites, Test Scripts, Test Data, and Expected Results.

• Extensive working experience in Black Box, Functionality, GUI, Regression, Integration, System, and Database testing in all stages of the Testing Life Cycle (TLC).

• Knowledge and experience in handling every phase of the Quality Assurance Life Cycle (QALC) and the Software Development Life Cycle (SDLC).

• Experienced in communicating with team members, customers, and business and technical groups.

• Ability to handle multiple tasks and work independently as well as in a team under minimal or no supervision.

• Strong analytical, communication, and problem-solving skills.

Core Competencies

Operating Systems: Windows NT/2000/XP, UNIX, LINUX, MVS OS/390: TSO/ISPF

Scripting: JavaScript, VBScript, Shell Script

Database: Oracle, MS Access, DB2, Toad, WinSQL, SQL Query Analyzer, Erwin

Test Management Tools: Test Director, Quality Center 8.0, XML Spy, JIRA, BugGid 2000, Clear Quest, Test Manager, WinRunner 7.6

Protocols: SOAP, FTP, Telnet and TCP/IP

Application Software: MS Excel, MS Word, MS PowerPoint, MS Outlook, SQL Navigator, Visio

Technologies: HTML, DHTML, XML, XSD, XSL, C, Java, ASP, IMS, CICS

Experience

Feb 2011 to March 2012 Wells Fargo, Charlotte, NC

IVR (Interactive Voice Response) System QA Analyst

CIV (Customer Information View)

Responsibilities:

• Performed necessary requirements inspection, design inspection, Test Plan and Test Case creation, test environment/data setup, testing, and reporting during various phases of the testing life cycle.

• Wrote test cases based on functional design documents, set up the testing environment, and tested the application.

• Executed Regression testing to identify errors introduced in each modified build during the testing process.

• Extensive experience with Data Analysis.

• Performed functional testing on Mortimer (Application Programming Interface tool).

• Worked on different kinds of accounts, such as Mortgage, Brokerage, Credit Card, and Debit Card activation.

• Effectively interacted with Users, Developers, and Administrators to translate business requirements into technical specifications and ensure overall quality of the software.

• Responsible for creating the End User Manual and for preparing test cases and test reviews.

• Worked with technical designers and architects to understand the requirements for test environment setup, and assisted in establishing the test environment and associated requirements.

• Successfully implemented Mercury Quality Center for Test Planning, Test Case writing, Test Execution, and mapping Requirements to Test Cases.

• Verified application logs to validate the application auditing and logging requirements in a UNIX environment.

• Verified user interface data against CICS screens and SORs (systems of record).

• Used JIRA to track project tasks and issues.

• Performed GUI, Functionality, Integration and Regression testing.

• Involved in the development of test procedures for various stages, including Integration, System, and User Acceptance Testing, covering both positive and negative testing.

• Interacted with Developers and management to identify and resolve technical issues.

• Prioritized bugs and tracked them through the Defect Tracking Database.

Environment: Quality Center, UNIX, JIRA, WebSphere, J2EE, Web Services, CICS, DB2, Oracle, WinSQL, Windows XP Professional.

Feb 2010 to Jan 2011 Bank of America, Charlotte, NC

ACH Target End State Pariter PPV QA Analyst

PPV is a process by which all client files are validated both in the BAC ACH processing system, through the Pre-Edit and Edit process, and with the new Pariter application: the files are processed in the Pariter pre-production environment, and the results are then compared to ensure that the files process successfully on both platforms.
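As an illustration of the kind of cross-platform comparison run through WinSQL, the query below contrasts per-file entry counts and dollar totals between the two processing runs and returns only the files that disagree. This is a minimal sketch; the DB2 table and column names (PPV_PEP_TOTALS, PPV_PARITER_TOTALS, FILE_ID, ENTRY_COUNT, TOTAL_AMOUNT) are hypothetical, not the actual bank schema.

    SELECT p.FILE_ID,
           p.ENTRY_COUNT  AS PEP_ENTRIES,
           r.ENTRY_COUNT  AS PARITER_ENTRIES,
           p.TOTAL_AMOUNT AS PEP_AMOUNT,
           r.TOTAL_AMOUNT AS PARITER_AMOUNT
    FROM   PPV_PEP_TOTALS p
           JOIN PPV_PARITER_TOTALS r
             ON r.FILE_ID = p.FILE_ID
    WHERE  p.ENTRY_COUNT  <> r.ENTRY_COUNT     -- flag files whose entry counts differ
        OR p.TOTAL_AMOUNT <> r.TOTAL_AMOUNT;   -- or whose dollar totals differ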

• Analyzed Business Requirements, Functional and Design documents, and Use Cases.

• Created Test Plans, Test strategies, Test Cases and Test Scripts.

• Used Test Director for maintaining the test cases and Test scripts for the application.

• Created flat files and sent them to the file processor or Pariter for testing.

• Executed batch jobs on the Mainframe to process the received files.

• Created Test Plan for the PPV Project based on various requirement documents.

• Reviewed the flat files generated by the above process.

• Validated the files against PEP+ (CICS) screens and the DB2 database.

• Extensively used WinSQL to run SQL queries against the DB2 database and validate the data-driven results (see the comparison sketch above).

• Logged defects in Quality Center for defect tracking.

• Worked closely with developers in reproducing bugs reported.

• Troubleshot file processing issues and recommended fixes.

• Generated reports during test execution for use in incident status meetings and resolution.

• Participated in weekly status meetings.

Environment: Quality Center, Mainframe, CICS, Pariter, CA7, TSO, DB2, File-Aid for DB2, WinSQL, Windows XP.

Aug 07 to Dec 09 Food Lion, Salisbury, NC

QA Tester & Analyst.

• Involved in manual testing of various concurrent projects such as Item Life Cycle Management (Vendor Management and Item Management deals) and Expense Payable, including General Ledger and Reconciliation.

• Gathered application-related requirements to create Test Plans, Test Cases, Test Scenarios, and Test Scripts.

• Developed, Implemented and maintained Test cases based on Business Requirements.

• Executed manual tests and verified actual results against expected results.

• Interacted with Development and Analyst teams to ensure overall quality of the software.

• Created detailed Test Scenarios and Test Cases according to the business/functional requirements.

• Gathered Business Requirements and analyzed Functional Specifications, which involved a series of meetings with developers and Business System Analysts (BSAs).

• Created, stored and maintained Test requirements, Test Cases, Test scripts in Quality Center.

• Performed Functional, Integration, and UAT testing during different phases of the Testing Life Cycle.

• Validated data against back end IMS and CICS regions.

• Validated Mainframe datasets to verify Data extractions.

• Worked closely with the Test Data Specialist and Test Environment Specialist to keep the test environment and test execution running smoothly and successfully.

• Extensively used WinSQL to run SQL queries against the DB2 database to validate the data-driven results (see the extraction-check sketch after this list).

• Created, maintained, and updated Regression suites for continuous builds.

• Created user-defined functions to test specific application requirements.

• Performed Integration testing to verify connectivity between systems and their subsystems.

• Logged defects in defect tracking database using QC Test Director with proper severity levels.

• Extensive use of defect tracking tools for logging defects and maintaining the work flow.

• Maintained, monitored, and documented the behavior of the application in different phases of testing.

• Used JIRA for issue and project task tracking.

• Performed rapid iterative testing and evaluation of the prototype web user interface.

• Responsible for providing Status Reports and Final Report of Testing to the QA Manager.
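As a rough illustration of the extraction checks run through WinSQL against DB2, the query below counts the rows loaded per extract date and flags records missing required keys. It is a sketch only; the table and column names (ITEM_EXTRACT, ITEM_ID, VENDOR_ID, EXTRACT_DATE) are hypothetical, not the actual Food Lion schema.

    SELECT EXTRACT_DATE,
           COUNT(*)                                           AS ROWS_LOADED,      -- rows loaded for the date
           SUM(CASE WHEN ITEM_ID   IS NULL THEN 1 ELSE 0 END) AS MISSING_ITEM_ID,  -- rows missing the item key
           SUM(CASE WHEN VENDOR_ID IS NULL THEN 1 ELSE 0 END) AS MISSING_VENDOR_ID -- rows missing the vendor key
    FROM   ITEM_EXTRACT
    GROUP  BY EXTRACT_DATE
    ORDER  BY EXTRACT_DATE;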

Environment: Quality Center, JIRA, WinSQL, DB2, OS/390 (MVS) TSO/ISPF, IMS, CICS, WebSphere, J2EE.

Dec 05 to May 07 CITI Group, St. Louis, MO.

QA Tester & Analyst

• Performed necessary requirements inspection, design inspection, Test Plan and Test Case creation, test environment/data setup, testing, and reporting during various phases of the testing life cycle.

• Defined and implemented the testing approach and strategy.

• Created testing requirements from the functional requirements, system specifications, and design specifications; interacted with user groups, Business Analysts, and Programmers to understand detailed technical/business process flows; managed testing assets in TestDirector.

• Created a Requirements Traceability Matrix and Use Cases to analyze and test critical areas of the application, and organized several brainstorming sessions with Business Analysts and clients in the process.

• Created, stored, and maintained Test Requirements, Test Cases, and Test Scripts in Quality Center.

• Worked with technical designers and architects to understand the requirements for test environment setup, and assisted in establishing the test environment and associated requirements.

• Involved in creating Test Plans, Performance Plans, Test Cases, and Test Scenarios.

• Validated data against backend Mainframe regions IMS/CICS.

• Involved in creating Test plans, Test cases and Test Scenarios for verifying ACH files according to the NACHA Standards.

• Reviewed business documents of ACH processing flow and created test cases for ACH process verification.

• Extensively used Jira to track issues and tasks.

• Validated data against Stand-In DB2 database.

• Conducted iterative usability testing.

• Responsible for using proper naming conventions for the test assets stored in TestDirector.

• Identified performance bottlenecks, hits per second and response time using performance graphs.

• Interacted with Developers and management to identify and resolve technical issues.

• Prioritized bugs and tracked them through the Defect Tracking Database.

• Reported the issues in Quality Center with proper priority and severity.

• Responsible for enhancing scripts and test cases for the latest application versions.

• Partnered with the Software Development and Program Management teams to investigate and resolve product defects.

• Maintained coverage analysis for the automated and manual test suites.

• Troubleshot software issues and recommended fixes.

Environment: Quality Center, Test Director, Jira, WebSphere, J2EE, XML, XSD, IMS, DB2, WinSQL, Windows NT/2000/2000 Advanced Server.

Dec 04 to Nov 05 JP Morgan Chase, Jersey City, NJ.

QA Tester

The JPCC solution is a web-based application used to interface with Advanced Helpdesk, the JP Morgan Chase standard request tracking system. It helps Requestors become proficient in requesting services and reporting problems to the Help Desk without having to make a phone call. The Requestor submits a request through the USCC solution web interface, and US Customer Care processes the service transaction in the advanced customer care system using the Full Seat Client. For requests submitted via the web, the Customer Care desk is responsible for monitoring the frozen-status Request Bin on the Service Desk screen. The Help Desk confirms the request data and performs a detailed analysis, contacting the customer for clarification or further information if necessary. The Customer Care desk then determines which resource is best suited to deliver the service.

• Analyzed Business Requirements, Functional and Design documents, and Use Cases.

• Created Test Plans, Test strategies, Test Cases and Test Scripts.

• Performed Black Box testing for the continuous builds using WinRunner.

• Created detailed test cases and reviewed them to make sure all the requirements were covered.

• Enhanced the Client/Server test scripts with GUI, Functional, Window, and Database checkpoints.

• Created, enhanced, updated, and maintained Regression Test Suites for continuous builds.

• Created automated test scripts for Mainframe, Client/Server, and Web applications.

• Created, Updated and Maintained the Object repository.

• Created, updated, and maintained reusable scripts for continuous builds.

• Enhanced WinRunner automation scripts with synchronization points, checkpoints, and functions.

• Tested the functionality of the Application with positive and Negative testing.

• Performed Build Verification, Integration, System, Performance & User Acceptance testing.

• Analyzed discrepancies and their severity, and tracked and reported them through Test Director.

• Tested the functionality of the software during all stages of the development life cycle.

• Involved in Functionality, GUI testing, Regression, Integration, System, UAT, API, Security, Positive, Negative and Performance Testing.

• Worked closely with developers in reproducing reported bugs.

• Participated in walkthroughs and weekly status meetings.

• Conducted database testing by running SQL queries against the Oracle database (see the sketch after this list).

• Identified performance bottlenecks, hits per second, and response times using performance graphs.

• Logged errors, tracked them using Test Director, and coordinated with the development team to solve the problems.

• Used Test Director for maintaining the test cases and test scripts for the application.

• Used Jira to track issues and releases in development region.

• Exported Test Cases from Excel to Test Director.
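As a rough illustration of the database checks run against Oracle for this application, the query below looks for submitted web requests that never received a matching Help Desk ticket. It is a sketch only; the table and column names (SERVICE_REQUEST, HELPDESK_TICKET, REQUEST_ID, STATUS, SUBMITTED_DATE) are hypothetical, not the actual application schema.

    SELECT r.REQUEST_ID,
           r.STATUS,
           r.SUBMITTED_DATE
    FROM   SERVICE_REQUEST r
    WHERE  r.STATUS = 'SUBMITTED'
      AND  NOT EXISTS (SELECT 1
                       FROM   HELPDESK_TICKET t
                       WHERE  t.REQUEST_ID = r.REQUEST_ID)  -- requests with no matching ticket
    ORDER  BY r.SUBMITTED_DATE;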

Environment: Toad, Test Director, Jira, Oracle, HTML, Java, J2EE, JavaScript, MS Office, Windows NT and 2000.
