
Test Cases Manager

Location:
Mountain House, CA

Lalitha Yanamandra

925-***-****

ac4ddn@r.postjobfree.com

SUMMARY:

Over 12 years of experience in Software Quality Testing.

Experience in analysis of System specifications and Business requirements.

Experience in QA Methodology (Agile and Waterfall) and Software Development Life Cycle (SDLC).

Expertise in preparing documents such as Test Plans, Test Scenarios, and Test Cases.

Expertise in test analysis and in manual and automated testing processes: developing, executing, and maintaining test cases/scenarios using automated testing tools such as Mercury Quality Center, ALM, WinRunner, and QTP, and defect tracking tools such as PVCS Tracker, Vantive, JIRA, BLT (Bug Logging and Tracking), Rational ClearQuest, and ClearCase.

Strong in testing Web and GUI applications, including GUI configuration.

Experience in Black Box, Functional, Integration, Backend, and End-to-End Testing.

Expertise in Data-Driven tests and creating User-Defined Functions.

Involved in Integration, System, and Regression Testing.

Experience working in UNIX environments and with PERL and Shell scripting.

Experience in Mobile Device application testing.

Experience in Web API Services and SOAP UI.

Strong working knowledge of SQL and PL/SQL for testing database integrity.

Experience in working with Databases like Oracle, MySQL and MS Access.

Familiar with testing multi-tier application architectures.

Knowledge in TCP/IP (L2/L4/L7).

Knowledge of Selenium WebDriver, TestNG, Grid, and Frames (an illustrative sketch follows this summary).

Knowledge of Data Warehousing concepts and the ETL tool Informatica PowerCenter.

Familiar with GIS ArcInfo and ArcEdit.

Good knowledge in C, C++, Java, JSP, J2EE, HTML, WEB Applications and XML.

Excellent professional skills in working independently and as a team member.

Creative, determined, well organized, and self-motivated, with strong communication and excellent coordination skills.

Experience in planning test cases and execution timelines and coordinating assignments and issues with offshore teams.
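
To illustrate the Selenium WebDriver and TestNG knowledge noted above, below is a minimal sketch of a browser test. It is illustrative only: the application URL, element locators, and expected page title are hypothetical placeholders, not taken from any project listed in this resume.

// Minimal Selenium WebDriver + TestNG sketch; the URL, locators, and
// expected title below are hypothetical placeholders.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class LoginPageTest {
    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        driver = new ChromeDriver();              // assumes chromedriver is on the PATH
        driver.get("https://example.com/login");  // hypothetical application URL
    }

    @Test
    public void loginWithValidCredentials() {
        driver.findElement(By.id("username")).sendKeys("testuser");  // hypothetical locators
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("loginButton")).click();
        Assert.assertTrue(driver.getTitle().contains("Home"),
                "User should land on the Home page after login");
    }

    @AfterClass
    public void tearDown() {
        driver.quit();
    }
}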

TECHNICAL SKILLS:

Testing Tools

Mercury Quality Center, Win Runner 7.6, Test Director 8, Quick Test Pro, Rational Clear Case, Rational Clear Quest, PVCS Tracker, (Java) Bean Test Tool, Sonic MQ Series, Vantive, ALM

Operating Systems

MS-DOS, Windows 98, Windows NT, Windows 2K/XP, Windows 7, Solaris, and UNIX

Hardware

IBM PC XT/AT and Pentium based desktop and Sun Solaris.

Languages

C, C++, C#, Java 2, J2EE, SQL 3.2, and PL/SQL.

Database

Oracle 7.3/8i, MS-Access, SQL Server, Sybase, MySQL

GUI Tools

VB 5.0/6.0, Developer 2000, Crystal Reports and Visual Studio 6.0.

Scripting Languages

JavaScript, VBScript, PERL, Shell, and TSL.

Web Tools

HTML, XML, XMLSpy, Active Server Pages.

Web/Application servers

Personal Web Server, IIS, Apache, WebLogic, Tomcat, WebSphere, SOAP UI

WEB Browsers

IE, Netscape, Mozilla

Software Packages

MS-Office, Toad, DBArtisan, SQL Developer, Putty, Blue Martini

AMDOCS Suite

Clarify CRM, OMS, AMSS

DW - ETL Tools

Informatica Power Center 8.0

ERP Solutions

Oracle Apps (OM, SC, IB, PO, AP, AR, GL)

GIS

ArcInfo, ArcEdit

Work Status: GC

Educational Qualification:

Post Graduate Diploma in Computer Applications.

Microsoft Certified Professional (V.B 5.0)

Master of Arts (Osmania University, Hyderabad, India)

AT&T, San Ramon, CA Nov’13 – Jun’17

Sr. QA Test Engineer

Project 1: IVR

Responsibilities:

Attend daily Scrum meetings and monthly team meeting.

Review the Business Requirement documents, IVR Call Flows, Routing Sheets etc.

Understand the call flows and Menu variations.

Write Test cases in Rally and update the Tasks for each User Story.

Conduct Test Case review meetings with team members and obtain approvals.

Request and obtain the test data needed to make calls for testing particular tasks.

Execute IVR calls and record them so they can be replayed and attached to results and logs.

Executed test cases manually in ALM and updated test cases with logs.

Log in to EricM and configure data as needed to make IVR calls and reach the respective menu articulations.

Make calls and verify that the menu variations and menu options are articulated as per the requirements and route to the expected department and exit point.

Verify logs in UNIX and verify data in the respective databases using SQL Developer.

Log in to the EricM application to set flags as needed.

Verify the voice articulations heard in IVR as described in EricM.

Verify the call flow, Menus, submenus, menu options in EricM.

Perform various payment activities, such as making a full payment, and verify them in the back-end database.

Perform payments of various types (using CC, DC, etc.) and verify them in the back-end database.

Log in to the billing system and update data as needed.

Log defects in Rally, follow up with Dev, and close the defects after fixes.

Perform regression tests after defects are fixed and update Rally with results and logs.

Communicate directly with Dev and Analysts in designing test cases and resolving issues.

Conduct live Demos and make IVR calls with Clients, Dev, Architects and other team members after every Iteration.

Log in to Telegence and configure customer types and billing options.

Involved in Web Services / API testing.

Responsible for verifying Web service requests, responses, and data validations in SOAP UI (see the illustrative sketch after this list).

Help team members in Test case reviews, getting logs and Configurations.

Conduct KT sessions for new team members.
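
As an illustration of the kind of web service request/response check described above (normally performed in SOAP UI), the sketch below issues a SOAP call in plain Java and asserts on the response. The endpoint URL, SOAPAction, request body, and expected element are hypothetical assumptions, not actual AT&T interfaces.

// Hedged sketch of a SOAP request/response check in plain Java; the endpoint
// URL, SOAPAction, envelope body, and expected tag are hypothetical.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class SoapSmokeCheck {
    public static void main(String[] args) throws Exception {
        String envelope =
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "<soapenv:Body><GetAccountStatus><ban>123456789</ban></GetAccountStatus></soapenv:Body>"
          + "</soapenv:Envelope>";

        HttpURLConnection conn =
            (HttpURLConnection) new URL("https://example.com/ws/AccountService").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "GetAccountStatus");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));
        }

        // The same kind of assertions made in SOAP UI: HTTP 200 plus an
        // expected element in the response body.
        if (conn.getResponseCode() != 200) {
            throw new AssertionError("Unexpected HTTP status: " + conn.getResponseCode());
        }
        String response = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A").next();
        if (!response.contains("<status>ACTIVE</status>")) {
            throw new AssertionError("Expected element missing in SOAP response: " + response);
        }
        System.out.println("SOAP response validated.");
    }
}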

Project 2: uDAS (Uverse Data Access System)

Responsibilities:

Involved in test planning and in writing and executing test cases for all uDAS projects; also actively participated in interface (End2End) testing activities.

Involved in writing test cases based on the IDT (Interface Data Template document for data mapping).

Supported test data preparation using bulk loads from SDP and assisted the data team in validating the data in uDAS from the FIE/PSE environments.

Analyzed data sync-up issues and assisted in correcting the data in the respective systems by running external audits for data sync-up.

Opened tickets in JIRA for any issues.

Created and executed functional tests for web services.

Designed SQL queries to extract data from source tables for validating ETL-based data extractions (an illustrative sketch follows this list).

Provided technical support for the testers as needed for test execution activities.

Provided KT to new team members on uDAS functionality.

Extensively involved in ETL testing for validation of the ETL process and data flows using the Informatica tool.

Extensively used various Informatica client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Workflow Manager, and Workflow Monitor.

Work with different internal and client groups to resolve issues.
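
The sketch below illustrates the kind of SQL-based ETL validation mentioned above: reconciling source extractions against warehouse loads. The JDBC URLs, credentials, and table/column names are hypothetical assumptions, not the actual uDAS schema.

// Hedged sketch of an ETL source-vs-target reconciliation check over JDBC;
// the JDBC URLs, credentials, and table/column names are hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class EtlRowCountCheck {
    private static long count(Connection conn, String sql) throws Exception {
        try (Statement st = conn.createStatement(); ResultSet rs = st.executeQuery(sql)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        try (Connection src = DriverManager.getConnection(
                 "jdbc:oracle:thin:@source-host:1521:SRC", "qa_user", "qa_pwd");
             Connection tgt = DriverManager.getConnection(
                 "jdbc:oracle:thin:@target-host:1521:DWH", "qa_user", "qa_pwd")) {

            // Compare the number of rows extracted from the source table with the
            // number loaded into the warehouse table for the same business date.
            long srcRows = count(src,
                "SELECT COUNT(*) FROM customer_orders WHERE order_date = TRUNC(SYSDATE)");
            long tgtRows = count(tgt,
                "SELECT COUNT(*) FROM dw_customer_orders WHERE load_date = TRUNC(SYSDATE)");

            if (srcRows != tgtRows) {
                throw new AssertionError("Row count mismatch: source=" + srcRows + ", target=" + tgtRows);
            }
            System.out.println("Source and target row counts match: " + srcRows);
        }
    }
}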

Environment: Windows XP, UNIX, SQL, Java, .Net, Rally, ALM, JIRA, WebSphere, Oracle, Putty, EricM, ED, Telegence, Parser Tool, SOAP UI, ASP.Net, Informatica Power Center, TOAD, SQL Developer

AT&T, San Ramon, CA Nov'11 – Sep’13

Sr. QA Test Engineer / Lead

Order Management System: AT&T is implementing the AMDOCS Order Management System, a Java-based application, for the Light Speed project. AMDOCS Order Management contains four functional layers: Front End, Product Management, Order Management, and Mediation.

Project 1: E2E FIE Flow-through

Responsibilities:

Check CTEM for assignments and work on the scenarios assigned.

Provide/Change orders from CRM/OMS/OPUS/MyATT/PDC.

Request data if needed and request equipment.

Create defects in QC/ALM if any error occurs in the flow and follow up.

Validate OMS and BBNMS flows and validate Order status in Telegence and Enabler.

Validate the Scenarios using applications like FORCE, G2, CMS, and Exception Manager

Check logs in Splunk and check orders in PST.

Clean up data and abandon scenarios once completed or no longer used.

Execute Trouble Management test cases in SIL Lab by connecting respective RGs and STBs for specific customer interactive trouble shoot scenarios.

Use complex SQL queries to extract and validate Customer and Billing data in the backend.

Extensively perform the Web Services testing using SOAP UI.

Update CTEM, ALM and upload documents in P8.

Attend meetings, training courses, and Amdocs Business Conduct Training.

Project 2: TDOC/TITAN

Title: QA Lead

Responsibilities:

Create BANs by running scripts with the automation tool QTP for given batch cycles.

Update Family Plans.

Add, Update, Activate, and Disconnect CTNs in CSM (Customer Service Management).

Upload data into CAPM and TDW spreadsheets.

Project 3: CRM

Responsibilities:

Analyze test requirements for the CRs and participate in reviewing the test plan documents.

Actively participate in planning and execution of the test cases.

Work with different internal and client groups to resolve issues.

Run different order flows in CRM/OMS as per scenarios.

Work with different groups and share knowledge with team members.

Validate test scenarios thoroughly in CRM/OMS and validate back-end databases and logs.

Create defects in JIRA during execution, follow up with the responsible teams, and resolve the issues.

Validate request and response XMLs to confirm that the data was processed as per the requirements.

Environment: Windows XP, Windows 7, UNIX, SQL, Java, .Net, ALM, Quality Center 10.0, WebSphere, Oracle, Putty Connection Manager, Enabler, TOAD, Citrix, CTEMP, CRM, BBNMS, CMS, MPS, FORCE, GCAS, P8, OPUS, Smoke Test Tool, Splunk, MyATT, PDC, G2, NumeriTracks, Scenario Tool, TL-EMS, Exception Manager, PST, SSAM Query Tool, DR Viewer, JIRA, QTP, SDP Request Generator, SSAM Tool, SOAP UI 3.6.1

MetroPCS Sep’10 – Nov’11

Richardson, TX

Project: Enterprise / Mobile Device Testing

Sr. QA Test Engineer

Responsibilities:

Review Requirement documents (BRD, IA, Interactive Flow documents) and understand the business flow.

Create TRS (Test Requirement Subject) and TRs (Test Rules) in QC Requirements Tab.

Develop CHLs (Calendar Highlights), high-level scenarios for the CRs.

Participate in client meetings and schedule and conduct reviews of the CHLs.

Regularly interact with internal and client teams to deliver timely and high-quality testing solutions.

Develop Test Cases in QC Test Plan for the scenarios in CHL.

Prepared test data (created new accounts through ASAP/SOAP UI, .com, registered accounts through .com/SOAP UI).

Performed various activities such as BAN creation / activation / suspension / cancel / restore / resume of subscribers.

Performed Activations and payments through IVR.

Create Test sets in QC Test Lab and add test cases from test plan.

Add and Configure Mobiles in database.

Performed activities such as activation, payments, plan changes, adding features, and adding funds from the MyMetro mobile application on various mobiles of both BREW and Android types.

Tested devices LG MN270, Samsung R380, Huawei M635, Kyocera S1350, Huawei M735, LG ms910 Bryce (Android), Samsung R720 (Android).

Load devices (Service Number or MEID) in Test environment.

Validate payments in ASAP (Front End) and Data Base.

Execute test cases, track and report daily on test execution.

Use complex SQL queries to extract and validate Customer and Billing data in the backend (see the illustrative sketch after this list).

Log in to the database using Toad and validate data in back-end tables.

Extensively involved in ETL testing for validation of the ETL process and data flows using the Informatica tool. Used various Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Workflow Manager, and Workflow Monitor.

Log in to the UNIX server and verify logs.

Configure and load Mobiles in UAT Data Base.

Open defects in QC, follow up with development and update defects.

Attend all review meetings, bi-weekly team meetings, KT sessions, and team events.

Prepare documents and give KT sessions on CRs assigned.

Help team members and share knowledge.
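
To illustrate the back-end SQL validation referenced in this list, the sketch below checks a test payment against the billing database over JDBC. The JDBC URL, credentials, and the payment table and its columns are hypothetical assumptions, not the actual MetroPCS schema.

// Hedged sketch of a back-end check run after making a test payment;
// the JDBC URL, credentials, and the payment table/columns are hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class PaymentBackendCheck {
    public static void main(String[] args) throws Exception {
        String ban = "123456789";          // hypothetical billing account number used in the test
        double expectedAmount = 50.00;     // amount paid through the MyMetro / IVR flow

        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@uat-db-host:1521:UAT", "qa_user", "qa_pwd");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT amount, status FROM payment WHERE ban = ? ORDER BY created_date DESC")) {

            ps.setString(1, ban);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    throw new AssertionError("No payment row found for BAN " + ban);
                }
                // Verify the most recent payment matches what was entered in the front end.
                double amount = rs.getDouble("amount");
                String status = rs.getString("status");
                if (Math.abs(amount - expectedAmount) > 0.005 || !"POSTED".equals(status)) {
                    throw new AssertionError("Unexpected payment row: amount=" + amount + ", status=" + status);
                }
                System.out.println("Payment verified in back-end database.");
            }
        }
    }
}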

Environment: Windows XP, UNIX, SQL, SOAP UI 3.5.1, XML, Java, .Net, Quality Center 10.0, WebSphere, Oracle, JIRA, Putty Connection Manager, Informatica Power Center, Enabler, Ensemble, ASAP, cmiweb, TOAD, .com, IVR, AgentWeb

TCS (TeleCommunications Systems) Aug '09 – Sep’10

Seattle, WA

Huawei XLSG Project

Sr.QA Tester / QA Lead

Responsibilities:

Work directly with managers and the development team in an Agile environment, involved from the product design phase.

Attend daily meetings, update status, and discuss issues.

Review Functional Specification document, OM-XML and wireframes and understand the database and table relations and functionality.

Write test cases in QC Test Plan and execute in Test Lab.

Write test cases based on the wireframes and stay involved in the development of the front-end (GUI) application.

Extensively involve in the Database testing in UNIX using SQL.

Connect to Database from UNIX and validate tables using SQL.

Extensively perform Database schema validations, Table validations, Field validations and validating constraints.

Perform Update, Insert, and Delete operations and validate data integrity and table constraints.

Run Batch Jobs and validate data in tables.

Add records from Front End and validate tables from the Back End.

Perform negative testing to validate table and column constraints (an illustrative sketch follows this list).

Verify the functionality and all properties of the respective objects of the GUI/front-end application.

Extensively perform the Web Services testing using SOAP UI.

Update the databases, install the new build, and set up the environment.

Report issues directly to the developers, get fixes promptly, regenerate the database with the new code, and retest to verify that the fix is applied.

Log defects in Clear Quest to keep track of the issues.

Write scripts for protocol call-flow test cases, execute them in UNIX, and verify logs.

Analyze OM files (Operational Measurements) and generate reports on transactions.

Check-In Test scripts in Clear Case.
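
The sketch below illustrates the negative constraint testing referenced in this list: attempting an insert that violates a foreign-key constraint and expecting the database to reject it. The JDBC URL, credentials, and the subscriber/account tables are hypothetical assumptions, not the actual project schema.

// Hedged sketch of a negative constraint test; the JDBC URL, credentials,
// and the subscriber table / foreign-key relationship are hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class ConstraintNegativeTest {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://qa-db-host:3306/xlsg", "qa_user", "qa_pwd");
             Statement st = conn.createStatement()) {

            // Attempt to insert a row that references a non-existent parent key.
            // The test passes only if the database rejects it with a constraint violation.
            try {
                st.executeUpdate(
                    "INSERT INTO subscriber (subscriber_id, account_id, msisdn) " +
                    "VALUES (9001, 999999, '2065550100')");   // account_id 999999 does not exist
                throw new AssertionError("Insert succeeded but the foreign-key constraint should have rejected it");
            } catch (SQLException expected) {
                System.out.println("Constraint violation raised as expected: " + expected.getMessage());
            }
        }
    }
}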

Environment: Windows XP, UNIX, SQL, SOAP UI, XML, Java, .Net, Quality Center 10.0, Rational Clear Quest, Rational Clear Case, TOAD, Tomcat, MySQL, Putty Connection Manager, JIRA, UI testing

AT&T, San Ramon, CA Oct’06 - Jul’09

Light Speed Project (OMS)

QA Analyst/Tester

Responsibilities:

Reviewed technical documents such as BR, TR, SFD, and IA, and prepared test cases.

Created the Test Plan and Test Lab in QC and uploaded test cases.

Understood and changed the product configuration (for the IPTV, HSIA, CVoIP, and Cingular products) and executed test cases in different environments as needed to validate the functionality.

Performed Backend testing, Functional testing, System testing and End to End testing.

Raised defects and worked with different teams to resolve them.

Provided orders online and performed web page testing.

Coordinated with the offshore team and sent a status report every day.

Verified logs in UNIX to check Request and Response XMLs and Exceptions.

Executed SQL queries in TOAD to check DB tables.

Involved in all phases of testing of the product from Release testing to Production Support.

Provided On-Call Support during Releases.

Environments: Windows XP, UNIX, AMDOCS CRM, OMS, AMSS, QTP, Quality Center, Oracle DB, Toad, XML, SOAP, Java, J2EE.

AOL (America Online Inc) Oct’ 05 - Jul ‘06

Dulles, VA.

DMP/Ecommerce Project

OMS (Order Management System)

QA Engineer.

Responsibilities:

Attending scrub meetings and understanding the requirement documents.

Prepare the TSD (Test Strategy Document) and upload test cases in the Mercury Quality Center.

Execute Test cases in XML format in UNIX and verify logs for errors.

Execute the test cases in the Test Lab in Quality Center manually.

Accessing the Databases in DBArtisan and validating the data using SQL queries.

Update the status of the QAR (Quality Assurance Request) in WebTrak from submitted through cleared as testing progresses.

Setting up the environment for testing, which includes configuring the servers.

Move the build to QA environment and perform the Backend, Regression testing.

Give presentations on work in progress in group meetings when required.

Interacting with the managers, team members and developers when needed.

Verified Web API calls using WSDL by configuring Application end points (URL) and inputting request XML and validated the responses.

Enter bugs in BLT (Bug Logging and Tracking tool) and change the status of the bugs through open, working, and closed.

Extensively worked in UNIX environment. Executed PERL scripts and run batch processes.

Create the TSD (Test Strategy Document) and test cases and execute them for the BTRM tool.

Preparing weekly report on the project status and mail it to managers.

Prepare documents on the projects worked on, issues faced, and steps to follow, for future reference and as guidelines for new team members.

Environments: Windows XP, Solaris, UNIX, Mercury Quality Center, BLT, Putty, WebTrak, DBArtisan, BTRM, C++, Java, .Net, XML, PERL Script, Oracle, Sybase, Mozilla, Weblogic, Apache, Blue Martini.

SBC Telecommunications Nov’ 04 - Sep’ 05

San Ramon, CA

Middleware Services

QA Tester/Analyst

Responsibilities:

Participated in Core Team Meetings along with the Project Manager, Developers, Requirement Analysts regularly.

Understanding the Project related documents such as TR (Technical Requirements Document), HLD (High Level Design Document), IA (Interface Assessment) document etc.

Prepared Test Plans, Test Scenarios, Test Cases based on the above mentioned documents prior to the testing.

Tested the applications after they were deployed to the QC environment, logged the defects in Vantive, and discussed with the developers and requirements analysts when needed to resolve the issues.

Analyzed the results for bottlenecks and checked the system database for validation and verification.

Coordinated with various groups for gathering test data for the services tested.

Performed regression, functional, ad hoc, black-box, positive and negative, system, and extensive back-end testing.

Verify the logs online while executing the test, or copy the log later from the Web Log Viewer, and log defects if there is any mismatch with the expected response.

Execute the XML files using the JMS client and analyze the response from the back end.

Environment: Java, EJB, JavaScript, C#, HTML, XML, SOAP, JSPs, Windows NT 4.0, Oracle 9i, UNIX Shell Script, Bean Testing Tool, Vantive, GRANITE, KARMIN, PANAGON, Manual Testing.


