
Test Cases Manager

Location:
Ashburn, VA
Salary:
$70 - $80
Posted:
March 22, 2018


Arati Kotah

Sr. Quality Assurance Analyst

Email: ac4wi0@r.postjobfree.com

Cell: 703-***-****

PROFESSIONAL SUMMARY

Over 12 years of diverse experience across the US and India in Information Technology, with expertise in manual and automated Quality Assurance testing and analysis for Banking, Brokerage, Financial, Media, Payments, Telecom, Sales, Healthcare and e-commerce applications, covering business requirements analysis and Functional, System, Load, Regression, Integration and End-to-End testing of GUI and non-GUI (IVR – Interactive Voice Response) systems on cross-platform (web and client-server) applications. Experienced in automation testing using Selenium and QTP. Strong professional and communication skills, with proven results in meeting aggressive timelines and working both independently and as part of a cohesive team.

PROFESSIONAL RESPONSIBILITIES:

Industry experience in Information Technology with a strong emphasis on Software Quality Assurance.

Experience in Manual testing, Backend testing, Automation testing and End-to-End testing.

Performed Pilot tests, Integration, Regression, Smoke, System, Functional, GUI, Database Integrity and User-Acceptance (UAT) testing

Experienced in Selenium, QTP, Quality Center and NeoLoad.

Worked with Selenium Framework and Selenium Scripts.

Experience with Selenium WebDriver and Selenium IDE, and in creating Selenium scripts (a minimal WebDriver/TestNG sketch appears at the end of this section).

Experience with data-driven testing using pom.xml in Selenium.

Worked on estimations, metrics, demos, Planning and scheduling of projects.

Experienced in Software Analysis, Requirements Management, Quality Assurance, Quality Analysis and management, Modeling, Configuration Management and Change Management.

Experience in all phases of the SDLC/QA life cycle and strong understanding of SQA methodology and Web and Client Server architecture.

Experience in working with databases like Oracle 9i and SQL Server.

Experience with Financial, Brokerage Applications, Sales and Billing, Web applications, and Telecom applications.

Experience in Financial applications – portfolio management, position maintenance and custody applications for domestic and foreign assets (CUSIP, SEDOL based), reconcile positions on System of Record (SOR) with Custodians, recon aging, resolve recon breaks.

Coordinated and participated in peer reviews and external walkthroughs with the client at regular project stages, covering Test Plans, HLTC (High Level Test Cases) and DTC (Detailed Test Cases).

Performed troubleshooting to resolve production problems.

Good working knowledge of Software Engineering Methodologies like SDLC, CMM and Agile methodologies (Scrum status, Daily Status and Test Progress Reports).

Performed deployments on QA and UAT servers, along with SIT, risk-based testing, requirement ambiguity testing, database testing using SQL, multi-browser testing, mobile testing, migration testing, sanity/smoke testing, regression testing, etc.

Used Quality Center tool for all the test management activities like Release Planning, Test Planning, Test case Execution, Requirement Traceability and Reporting.

Excellent in driving the onsite and offshore model. Coordinated a 12-member offshore team to successfully implement an offshore-onsite model.

Knowledge of Core Java, HTML and .NET technologies.

Involved in configuration management and new-release requirements meetings.

Escalated major/blocking issues to management in a timely manner and participated in defect triage meetings.

Involved in internal testing for projects covering ATM certification, middle-layer changes, config files and server maintenance.

Worked with Maven and Jenkins for the build integration.

Comfortable with change and tight deadlines and able to adapt quickly to new situations and technologies.

Possesses good communication skills and ability to work well both in a group and individually.
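
The Selenium WebDriver experience listed above can be illustrated with a minimal, self-contained smoke test using TestNG as the harness (the harness used in the Charter Global project below). This is an illustrative sketch only, not code from any of the engagements that follow; the application URL and element locators are hypothetical placeholders, and it assumes the browser driver binary is available to the Selenium version in use.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

// Minimal WebDriver + TestNG smoke test; URL and locators are hypothetical placeholders.
public class LoginSmokeTest {

    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        // Assumes the Firefox driver binary is available to the Selenium version in use.
        driver = new FirefoxDriver();
        driver.get("https://example-app.test/login");   // hypothetical application URL
    }

    @Test
    public void dashboardIsShownAfterLogin() {
        driver.findElement(By.id("username")).sendKeys("qa_user");      // hypothetical locator
        driver.findElement(By.id("password")).sendKeys("qa_password");  // hypothetical locator
        driver.findElement(By.id("loginButton")).click();               // hypothetical locator
        Assert.assertTrue(driver.getTitle().contains("Dashboard"),
                "Expected the dashboard page title after a successful login");
    }

    @AfterClass
    public void tearDown() {
        driver.quit();
    }
}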

RESPONSIBILITIES AS TEAM LEAD:

Actively participated in understanding client requirements and subsequently in preparing estimates, metrics, plans, strategies, schedules and demos from project inception.

Called for and facilitated discussions and meetings with technical and non-technical stakeholders to ensure quick turnaround.

Artifact Review and preparation (Test Cases, Test Plan, Status Report, Summary Plan).

Created the WBS (Work Breakdown Structure).

Assigned resources, tracked progress and mentored the team.

Took complete responsibility for setting up and organizing training for the QA department as a whole.

Responsible for independently learning new tools and techniques, implementing them and training the team.

Actively coordinated with the client on all issues, risks, artifacts, data management concerns and notifications.

Ensured the test environment was in place before test execution and managed it during execution.

Resolution of defects and tickets (setting up triage meetings if necessary)

Successfully delivered all presentations and demos, including the kick-off demo, QA approach demo, QA proposal demo and phase-wise demos.

Proactively identified risks and issues and prepared risk mitigation plans.

Led the QA testing team, managed the testing schedule and communicated testing progress (standard and custom reports).

Maintained the enthusiasm of the team to instill a healthy working environment.

Overall, planned, monitored and controlled the testing work, wearing different hats as Test Manager and Test Coordinator in addition to Team Lead.

EDUCATION DETAILS:

M.Tech (Computer Sciences) - JNTU Campus – Hyderabad.

B.E (Electronics and Telecommunication)-JNEC- Marathwada University.

TECHNICAL SKILLS:

Testing tools

Selenium, QTP 10.0, Quality Center 9.2, Microsoft Test Manager, Rational Suite (ClearQuest), UFT, TestTrack, JIRA, Bugzilla and LoadRunner.

Languages

Visual Basic, C++/C, Java.

Web/Scripting

HTML, JavaScript, TSL and VBScript.

RDBMS

Oracle, SQL Server 2008, SQL and PostgreSQL.

Version Control

Visual SourceSafe, PVCS, ClearCase and SVN.

Methodologies

Agile & Scrum.

Quality Standards

CMMI L3.

Environment

UNIX, Windows 95/98/2000/NT, Mac, IIS, Tomcat and AUTH servers.

Tools/Suite

Jenkins, Maven, TeamViewer, MS Office, MS Outlook, MS Project, Lotus Notes and Assembla.

PROFESSIONAL EXPERIENCE

Charter Global, Atlanta May 2013 - Till Date

Testing Team Lead

Project 1:

Customer Power is a multi-channel e-marketing solution that allows users to send targeted marketing campaigns across the following channels: email, mobile text messages, banner ads and SST.

Responsibilities:

Worked on client requirements for NCR.

Involved in Automation Infrastructure Development using Selenium.

Addressed the challenges of introducing automation to the project.

Set up the automation environment using Eclipse, Java, Selenium WebDriver and TestNG JARs.

Implemented a Selenium framework based on the Page Object Model (POM); see the sketch at the end of this project.

Used Selenium IDE initially to record and develop scripts.

Enhanced the Selenium scripts using WebDriver, Java and Selenium commands.

Developed data-driven testing to retrieve test actions and test data from Excel files.

Worked with add-ons such as Firebug and FirePath.

Configured the Object Repository (OR) and associated data and content for products and prices.

Tested the TestNG emailable reports.

Enhancement and configuration of HTML reports.

Integration with Maven and Jenkins

Conducting cross browser testing and parallel test execution.

Responsible for configuration and integration management of the testware.

Managed work on both Manual and Automation Testing.

Prepared metrics for the client, such as defect density sheets and status reports.

Execution of Selenium Test cases and Reporting defects.

Worked in a highly dynamic AGILE environment and participated in scrum and sprint meetings

Identified weaknesses in QA processes, web testing and Selenium automation; suggested and implemented improvements.

Environment: Java 1.8, Mozilla Firefox 17/38, IE 8/9/10/11, Chrome 47/48, Selenium framework, HTML, XML, Eclipse IDE, Jenkins, Maven and SVN.
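
As a hedged illustration of the Page Object Model framework mentioned in this project (a sketch under assumed names, not the actual framework code), a page class encapsulates locators and actions while tests only call its methods:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Page object for a hypothetical login page; every locator and the URL path are assumptions.
public class LoginPage {

    private final WebDriver driver;
    private final By username = By.id("username");
    private final By password = By.id("password");
    private final By submit   = By.id("loginButton");

    public LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    public LoginPage open(String baseUrl) {
        driver.get(baseUrl + "/login");   // hypothetical path
        return this;
    }

    public void loginAs(String user, String pass) {
        driver.findElement(username).sendKeys(user);
        driver.findElement(password).sendKeys(pass);
        driver.findElement(submit).click();
    }
}

A TestNG test would then call new LoginPage(driver).open(baseUrl).loginAs("qa_user", "qa_password"), keeping locator changes confined to the page classes rather than spread across test scripts.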

Project 2:

The Public Utility Commission (PUC) of the state of Pennsylvania introduced the concept of competition to drive electric prices down for the consumer.

Based on the above, Concert 1.2 was developed for salespeople to order utility service online on behalf of customers, with the business objective of reproducing the Simple Choice and PECO customer changes implemented in ACCORD (the old system) in the Concert application.

Responsibilities:

Involved in creating test plans, test cases and executing the test cases.

Reviewed the test cases and maintained the requirements traceability matrix.

Reviewed the test cases and participated in peer reviews.

Involved in functionality testing, regression testing.

Manually tested the entire application end-to-end.

Reported, tracked and documented defects in Airs.

Adept at stage testing and UAT.

As Team Lead:

Prepared Test Plan and test estimations for review at Onsite and Offshore

Coordinated between onsite and offshore teams on questions, clarifications, daily status calls and emails.

Maintained and updated issues and minutes, recorded and tracked all new requirements and changes, updated work plans and Work status for reporting

Reviewed open questions with the team before and after calls.

Daily status call preparation

Provided scrum status in the morning and evening and regularly held scrum meetings to analyze the team's progress.

Assigning tasks for next day.

Assisted in coordination between developers, onsite and offshore testers, business analysts and users.

Participated in Defect Meeting with Onsite every Monday

Involved in training staff.

Reported daily tasks and duties completed on Alfresco, prepared Test progress reports for onsite and offshore management.

Worked on writing standalone test cases for PECO's Windows version of the app using UFT.

Prepared team for automation of common features using Selenium

Environment: Java 1.6, SOAP 3.6, Oracle, UFT, SQL Developer, WinSCP, pgAdmin III 1.6, PostgreSQL, XML, KDiff3, Mozilla Firefox 21, Selenium RC, Agile methodology.

Project 3:

All-Connect is a privately owned services and sales analytics business focused mainly on price comparison and connection facilitation for power and media services. It offers home utility services such as cable and satellite TV, phone service, high-speed Internet, electricity, natural gas and security services from the biggest providers – Dish, AT&T, Comcast and Verizon.

Responsibilities:

Thorough understanding of web-based technology, call center applications, back-end systems, and SOAP and XML technologies (see the XML validation sketch at the end of this project).

Document the business process by identifying the requirements.

Preparing excellent documentation on business requirements.

Handling various activities of the project like information gathering, analyzing the information gathered, documenting the functional or business requirements.

Worked on the Project plan and the Test plan

Design, develop, and execute High Level Test cases using functional specifications

Point of contact between project management and the appropriate IT groups, from solution planning and sizing to fulfillment, in all phases of the test cycle, including integration/system testing and user acceptance testing.

Used LoadRunner for load testing.

Working and reporting on Agile methodology.

Coordinated with developers, managers, onsite clients and the QA team on behalf of the team.

Participate in defect meetings and understand resolution of defects through Triage meetings

Performed deliverables tracking and reporting as assigned. Prepared the Work Breakdown Structure, estimation hours and risk assessment registers.

Acted as a subject matter expert between other QA teams, management, users and the client whenever necessary.

Environment: Java 1.6, SOAP 3.6, Oracle, SQL Developer, WinSCP, pgAdmin III 1.6, PostgreSQL, XML, KDiff3, Mozilla Firefox 21, LoadRunner, Selenium, Agile methodology.
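
As an illustrative sketch of XML/SOAP response validation of the kind referenced above (not actual project code), the snippet below parses a hypothetical order response with the JDK's built-in parser and checks one field via XPath; the element names and expected status value are assumptions.

import java.io.StringReader;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

// Validates one field of an XML (e.g. SOAP body) response; the element names are hypothetical.
public class XmlResponseCheck {

    public static void main(String[] args) throws Exception {
        String response =
                "<OrderResponse><status>CONNECTED</status><provider>Comcast</provider></OrderResponse>";

        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(false);               // simple sample payload, no namespaces
        DocumentBuilder builder = factory.newDocumentBuilder();
        Document doc = builder.parse(new InputSource(new StringReader(response)));

        XPath xpath = XPathFactory.newInstance().newXPath();
        String status = xpath.evaluate("/OrderResponse/status", doc);

        if (!"CONNECTED".equals(status)) {
            throw new AssertionError("Unexpected order status: " + status);
        }
        System.out.println("Order status verified: " + status);
    }
}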

4. Western Union – New York, USA Dec 2010 to March 2013

Senior QA Analyst / Onsite-Offshore Coordinator

Project: The Western Union (WU) Speedpay application is a payment gateway used by multiple clients in different industries to accept payments from their consumers. Western Union offers the payment gateway service on behalf of clients. The Speedpay application supports different interfaces such as the Internet (online), IVR, Extranet, HTTP Wrapper, and web services. Modes of collecting payments include credit cards, ACH, ATM and check payments, with reporting through BI reports. Cash payments can be remitted at Western Union payment centers, and a payment can be automated using recurring payment plans (scheduled ACH, checking or credit card).

Responsibilities:

Analyze user/business requirements and develop detailed test cases based on requirements.

Analyzed and helped in modifying use cases and created test cases.

Performed QA and User Acceptance Testing for changes/upgrades to Speedpay clients.

Deployments on QA and UAT servers

Actively pursued testing production hot-fixes for critical production issues.

Involved in the Test Cases Inspection and the use cases/requirements review with stakeholders, business and technology teams to refine the business/test scenarios.

Created and executed manual scripts for web, back-end, scrape, IVR and various standard and custom Crystal Reports for various Speedpay clients.

Customized database, server and client settings for the install.

Tested the XML scripts for data content

Extensively used the Quality Center tool for all the test management activities like Release Planning, Test Planning, Test case Execution, Requirement Traceability and Reporting.

Organized the test cases in Quality Center.

Copy selective databases and run database scripts relevant to the current install.

Used SQL queries to retrieve and validate test data for the corresponding test cases (see the sketch at the end of this section).

Update/insert data if required to suit the test cases.

Ran tests and executed reports in Test Manager.

Coordinate and follow up with developers for timely resolution of issues/bugs.

Escalated major/blocking issues to management in a timely manner.

Participated in the review and status meetings on a daily basis.

Coordinated a 12-member offshore team to successfully implement an offshore-onsite model.

Worked as QA testing Point of Contact (SME) and active team player for Regression & coordination activities.

Tested back-end (BE) processing, including notes posting, recurring daemons, processing, MISC processing, the print module and Crystal Reports.

Also worked on critical projects involving one-time emails, recurring emails and SMS notifications.

Was involved in internal testing for projects involving ATM certification, testing middle layer changes, Config files and maintenance of servers

Assisted Project Managers on all critical issues faced during testing

Environment: Java/J2EE, .NET, XML, SQL Enterprise Manager, QC 9.2, VSS, Visio, Lotus Notes, Windows XP, IIS servers, AUTH servers, JIRA, Microsoft Outlook, Excel, MS-Visio, Citrix, alive, EFO (Clientele).
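
The SQL-based back-end validation described above can be sketched as a small JDBC check. This is a hypothetical illustration only: the connection string, credentials, table and column names are placeholders and do not reflect the actual Speedpay schema.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Checks that a payment created through the UI landed in the database with the expected
// amount and status. Connection details, table and column names are hypothetical.
public class PaymentDataCheck {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://qa-db.example.test;databaseName=payments"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT amount, status FROM payment WHERE confirmation_number = ?")) {

            ps.setString(1, "WU-123456");                 // confirmation number from the UI test
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    throw new AssertionError("No payment row found for the confirmation number");
                }
                if (rs.getBigDecimal("amount").compareTo(new java.math.BigDecimal("125.00")) != 0
                        || !"POSTED".equals(rs.getString("status"))) {
                    throw new AssertionError("Payment row does not match the expected values");
                }
            }
        }
        System.out.println("Payment record verified.");
    }
}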

5. Lehman Brothers – New York, USA Dec 2009 - Nov 2010

QA Analyst

Project: The Wealth Management Reporting (WMR) - Data Consolidation Operations Tools suite is intended to be used across business and operations teams to establish further levels of data integrity across all downstream and WMR sub-systems. Data is captured into a staging area after the batch process to build a repository across sub-systems. Settled positions are captured from the data repository. Positions and transactions are reconciled against custodians such as DTCC, BONY, etc. Any transaction/position breaks are reported and tracked (aging) until resolved. Reconciled data is made available for further reporting/presentation.

Any data issues in staging are resolved and uploaded for further processing, and unresolved exceptions are escalated via Exception Summary / Detailed Exception views/reports.

An integral GUI component of the DC operations tool set provides interfaces to the Data Inquiry, Data Correction, Exception Processing, Data Viewer and Portfolio Performance Reporting systems.

Responsibilities:

Made a detailed study of the requirements, performed use-case analysis and carried out gap analysis for SR3 and SR4 exceptions.

Developed Test Plans, Test Cases, Test Scripts, Test Scenarios, Test Beds, and Test Data for DC Prototype Tables for Pilot testing of the exceptions.

Reviewed requirements to reconcile positions for domestic and foreign assets by CUSIP and SEDOL respectively. Reconciled positions at individual account, Omnibus accounts and at asset level.

Recon breaks caused by holiday discrepancies (global vs. US) or trade reversals are aged over the holiday and expected to clear the following day.

Asset transfers (free receipts/delivers) are reconciled for units.

Ran the batches for pilot tests in the Autosys (development) environment to capture raw sets of data.

Raised the JIRAs for resolution of issues and bugs.

Conducted front-end (FE) testing for the UEM (dashboard) and Data Consolidation tools (Data Inquiry, Data Correction, Data Viewer and Hearsay Maintenance).

Conducted pilot tests, integration, regression, smoke and end-to-end testing.

Coordinated team members (Developers and BAs and other QC team members) for reporting to Manager on status reports, defect management and development of tests.

Mapped traceability between the requirements and the test cases using Mercury Quality Center's Requirements > Analysis > Reports > Requirements with Coverage tab.

Mapped the different schemas for the tables as and when required for DB servers and connectivity in DBArtisan 8.1.1.

Worked extensively with SQL queries to validate the batch parameters and the Input, Staging, Persistent, Recon, Correction and Exception data for Journal, Trades, Income, Holdings and Transactions, covering different exception types such as Missing Product, Price, FX Rates, OOB, Accrual Breaks, Unsupported Activity, etc. (see the sketch at the end of this section).

Worked on change requests generated for the SR3 release and retested the changes for sign-off.

Maintained Test Logs, Test Summary reports and participated in defect review / Status / GO-NOGO Meetings with an ability to demonstrate via the UEM dashboard.

Gave final presentations of the UEM dashboard to all users of the system and Offsite coordinators.

Environment: Mercury Quality Center 9.0, Java, SQL, DB2, Oracle 9i, DBArtisan 8.1.1, Autosys, Sybase, Windows XP, IIS servers, JIRA, Microsoft Outlook, SharePoint, BEN, MS Office, Excel, MS Visio and IM-live.
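
As a hedged illustration of the reconciliation checks described above (not the actual WMR schema or queries), the sketch below compares System of Record positions against the custodian feed and prints quantity breaks; every table, column and connection detail is a hypothetical placeholder.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Flags position breaks between the System of Record and the custodian feed.
// The connection string, table and column names are hypothetical.
public class ReconBreakCheck {

    private static final String BREAK_QUERY =
            "SELECT sor.account_id, sor.cusip, sor.quantity AS sor_qty, cust.quantity AS cust_qty " +
            "FROM sor_positions sor " +
            "JOIN custodian_positions cust " +
            "  ON cust.account_id = sor.account_id AND cust.cusip = sor.cusip " +
            "WHERE sor.quantity <> cust.quantity";

    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@qa-db.example.test:1521:RECON";   // placeholder
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(BREAK_QUERY)) {

            int breaks = 0;
            while (rs.next()) {
                breaks++;
                System.out.printf("Break: account=%s cusip=%s sor=%s custodian=%s%n",
                        rs.getString("account_id"), rs.getString("cusip"),
                        rs.getBigDecimal("sor_qty"), rs.getBigDecimal("cust_qty"));
            }
            System.out.println(breaks == 0 ? "No recon breaks." : breaks + " recon break(s) found.");
        }
    }
}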

6. UMG – SONY Corporation, New York, USA Oct 2008 - Nov 2009

QA Tester

Project: The Universal Music Group (a SONY Corporation entity) Shared Royalty Platform delivers the client service for royalty calculation, accounting, maintaining/disbursing royalty payments, generating client reporting statements, MIS reports, accounts and more. Worked on Alpha and Beta test strategies involving calculation of statements, records and contracts. Coordinated with WMG (Time Warner) and Exigent during the design and development phases and during the preparation of scripts, test data and result sets.

Responsibilities

Participated in Requirement Analysis, Business Analysis, Use-Case Analysis and Gap Analysis.

Developed Test Plans, Test Cases, Test Scripts, Test Scenarios, Test Beds, Test Data and Traceability Matrix.

Conducted Integration, System, Functional, GUI, Regression, Smoke, Database Integrity, User-Acceptance (UAT) testing.

Coordinated team members for reporting to Manager on status reports, defect management and development of tests.

Test Management was done using Quality Center.

Managed the requirements using Requirements Plan Manager (Quality Center).

Worked with XML files and messages.

Developed Test Cases and Test Design using Test Plan Manager (Quality Center).

Worked with SQL functions to test the database integrity (Oracle, SQL Server).

Created Traceability between Requirements and Test Cases.

Conducted data-driven testing from flat files to test various sets of data (see the sketch at the end of this section).

Worked with UNIX commands.

Conducted UAT tests along with Regression and Integration Tests

Tracked the defects using Quality Center and generated defect summary reports.

Maintained Test Logs, Test Summary reports and participated in defect review / Status / GO-NOGO Meetings with an ability to demonstrate via the dashboard.

Environment: Mercury Quality Center 8.2, Java, J2EE, SQL, Oracle 9i/10g, TOAD, XML, Windows XP, Web Application Server (WebSphere), SharePoint, Service Port, Lotus iNotes, MS Office, Excel, MS Visio, Unix and TRACS.
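
The data-driven testing from flat files mentioned above can be sketched with a TestNG DataProvider. This is a hypothetical illustration: the file path, the pipe-delimited column layout and the flat 10% royalty rate used for the expected value are assumptions for the sketch, not the platform's actual rules.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

// Feeds royalty-rate test data from a pipe-delimited flat file into a TestNG test.
// The file path, column layout and expected values are hypothetical.
public class RoyaltyRateDataDrivenTest {

    @DataProvider(name = "royaltyRows")
    public Object[][] royaltyRows() throws IOException {
        List<String> lines = Files.readAllLines(Paths.get("testdata/royalty_rates.txt"));
        Object[][] rows = new Object[lines.size()][];
        for (int i = 0; i < lines.size(); i++) {
            // Each line is expected to contain: contractId|salesAmount|expectedRoyalty
            rows[i] = lines.get(i).split("\\|");
        }
        return rows;
    }

    @Test(dataProvider = "royaltyRows")
    public void royaltyIsCalculatedAsExpected(String contractId, String salesAmount, String expectedRoyalty) {
        // Stand-in for the call into the application under test; assumes a flat 10% rate for the sketch.
        double actual = Double.parseDouble(salesAmount) * 0.10;
        Assert.assertEquals(actual, Double.parseDouble(expectedRoyalty), 0.01,
                "Royalty mismatch for contract " + contractId);
    }
}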

7. iPass, CA, USA Jul 2007 - Sep 2008

QA Tester

Project: "iPass connect" is a SSL Virtual Network, User Services network with TCP/IP protocol. The system enables iPass clients to connect using broadband, ISDN, Dial-up, GSM, wired (using DSN and host services) and wireless networks into respective networks globally in a secured and flexible.

Responsibilities

Understanding the functionality of the application.

According to the client requirements, analyzed the application and prepared test cases.

Managed 4 test resources off-shore.

Coordinated between onsite and off-shore teams.

Execution of the test cases and scripts using QC and reporting the bugs to client.

Worked with SQL queries to validate data.

Performed smoke testing and regression testing on 3 languages (English, German, and Japanese) and on different platforms (Win2K Pro, WinXP Pro, WinXP Home, Win ME, Win 98).

Worked with firewalls such as BlackICE on different platforms.

Dealt with third-party tools such as various VPNs, namely Cisco, Nortel, Aventail and PPTP.

Experienced with anti-virus tools such as Symantec and Trend Micro.

Used PVCS for version control management

Prepared status reports on a day-to-day basis and coordinated various status meetings.

Environment: Java, Perl, VB, Test Director, Perforce, VPNs (Cisco, Nortel), firewalls (BlackICE and Sygate), Windows 2000/XP, UNIX, Linux, QC 8.2.

8. SBC, CA, USA Feb 2006 to Jun 2007

Test Planner

Project: The Testing Automation Process (TAP) initiative has been developed to drive down the cost of customer support by automating ‘test procedures’ via Interactive Voice Response (IVR) and other systems before directing the call to a Customer Service Representative (CSR). The proposed system interfaces with multiple back-end systems that maintain the customer profile: maintenance, customer support, billing and system performance monitoring applications. ‘Bridging’ these back-end applications drastically reduces call duration and the number of calls routed to Customer Service Representatives.

Responsibilities:

Analyzed Business requirements, High Level design, and Software requirements specifications and documented the process, call and high-level data flows for every call scenario.

Document management and version control using PVCS.

Specified and established key validation points for signoff.

Identified Software Development specs for specific Business requirements.

Develop and document Test Strategy, Test Plan, and Test Execution with all required hyper-links.

Develop and Document High Level test cases with Expected Test Results, traced the data from Harness Sheets for specific test cases. Identified and Integrated Specific Business Requirements. Developed the Requirements Matrix

Created Detail and Design Steps in Test Plan Tab and mapped the requirements from Requirements Coverage tab in Test Director

Created manual High Level Test Cases (HLTC) and Detailed test cases (DTC) and embedded them into appropriate tabs and levels in Test Director.

Worked with XML test files.

Prepared Test Data to suit the HLTCs.

Coordinated and Participated in Peer Reviews and External Walk-thru with the Client at regular project stages for reviews on Test Plans, HLTC and DTC

Prepared and followed up Peer Review, Defect/Issue logs and Check lists for the Reviews and Walkthroughs

Worked with SQL queries to validate data.

Coordinated weekly QA team meetings and participated in weekly interaction with the Development and Build Teams.

All business/technology issues were logged and pursued through Test Director.

Worked in sync with development team to ensure high quality and timely releases incorporating all latest changes

Supported Test Execution team and coordinated with the development team to fix any test case issues as required

Followed CMM Level 5 standards for content management of test plans, test cases, defect logs, reports, status meetings, etc.

Environment: Java, J2EE, WebLogic, XML, SQL, Oracle, PVCS, Mercury Test Director 7.6, ClearQuest, MS Office, Windows NT 4.0 and VSS.


