LaxmaReddy Julakanti
Ph: 763-***-**** Email: **********@*****.***
Over 15 years of experience in the IT industry in Quality Assurance of web-based and client/server applications, with strong business knowledge of Telecom, TMS, Healthcare, Supply Chain, Security, Media, e-commerce and Retail WMS projects. Key expertise includes testing and debugging in GUI and multi-application environments, Oracle EBS apps, Oracle OTM and Test Director/Quality Center.
Professional Summary
Expert in Testing of Applications in Windows, UNIX, Mainframes & AS400 Environments.
Good understanding of Business Intelligence (ETL) Test Processes and Test Methodologies.
Proficient in Data warehouse and Backend application Testing Life cycle.
Proficient in SQL, PL/SQL, Procedures/Functions, Triggers and packages.
Expert in test management tools like QC ALM, Test Director, Quality Center, Rally, Silk Test, and Rational Test Manager
Extensive experience in preparing Test Plans, Test Scenarios, Test Cases, Test Reports and Documentation for both Manual and Automated Tests
Proficient in Data Warehouse testing like Control Mechanism of batch loads, error/ rejected records processing, testing of SCD implementations and process dependencies
Strong skills in executing Black Box, White Box, Grey Box Testing, Unit Testing, Integration Testing and Acceptance Testing.
Involved in generating Test Scripts using Quick Test Pro (QTP) for different transactions to execute Functional Testing, Regression Testing, BPM Testing and Smoke Testing
Prepared QTP coding standards and automation approach documents for the current project
Involved in all phases of the SDLC; good knowledge of testing applications developed using RUP and Agile methodologies such as Scrum and Test-Driven Development
Experienced in complete project life cycle development including Requirements Analysis, Design, Coding, Unit, Integration, System Testing, UAT and Maintenance
Highly proficient in using defect-tracking tools like Test Director/Quality Center, Rational ClearQuest, Jira, StarTeam and Bugzilla 2.0
Involved in creating Project Software Process Handbook, Risk Plan, Task Tracking, Project Status reports (Daily, Weekly and Monthly), Training Plan, Resource Plan, MPP and Project Metrics
Excellent knowledge of Software Development Lifecycle including Functional & Non-Functional Test Phases
Excellent management, interpersonal, written and verbal communication, and organizational skills, with strong attention to detail
Involved in Requirements, Test Scenarios, Test Cases, Test Results and Defect reviews
Expert in testing Oracle EBS apps, Transportation (TMS) and Warehouse (Oracle WMS & Red Prairie) Management System applications, JDA
Involved in release planning, test estimations, test project plan, and test metrics
Ability to manage time well, organize and prioritize
Ability to understand Functional Requirements and Design Documents
Flexibility to adjust to multiple demands, shifting priorities, ambiguity and rapid change
Good interpersonal skills; committed, result-oriented and hard-working, with a zeal to learn new technologies
Ability to understand and integrate cultural differences and lead virtual cross-cultural, cross-border teams
Ability to build good relationships both within the QA & Test team and with other teams
Able to provide leadership, participate and be a productive member of the team
Certifications
CSTE (Certified Software Test Engineer) professional (CSTE #11571).
PAHM (AHM-250) certified professional.
Cognizant Certified Professional in Software Testing.
Cognizant Certified Professional in AHM-250 (Introduction to Managed Healthcare)
Brainbench Certified in Software Testing, Software Quality Assurance, WinRunner 6.0, RDBMS Concepts and Programmer/Analyst Aptitude.
Technical Skills
Operating Systems : Windows 95/98/2000/XP, UNIX and AS400
Automation Tools : QTP, WinRunner, LoadRunner and Rational Functional Tester
Defect Tracking Tools : Mercury TestDirector, HP Quality Center, ALM, Rally, Rational ClearQuest, StarTeam, Jira and Bugzilla
Databases : Oracle, MS SQL Server, DB2, Sybase and MS Access
Programming Languages : PL/SQL, SQL, C, C++, HTML, XML, UML, VB Script, and TSQL
BI (ETL) & Scheduling Tools : Informatica, AutoSys, Control-M Enterprise Manager
Testing Methods : Black Box, White Box, Unit, Regression, Integration, System Testing, UAT, End to End Testing and Smoke Testing
Project Details
1. AT&T July 2015 to Present
Role SIT/UAT QA Test Lead / Test Manager
Environment Oracle EBS R12, JDA, Red Prairie WMS, OTM, Linux, QC ALM, Rally (CA Agile Studio), TDP, SoapUI, XML, PL/SQL, Toad
The AT&T Trinity project is a major Oracle Supply Chain implementation that brings multiple VOIP vendors such as Polycom, Adtran and Edgewater on board and aligns them with the existing supply chain solution within AT&T. It is one of the biggest Agile project implementations within AT&T. Trinity consolidates existing business Hosted Voice services on a common virtualized platform and provides the option to integrate with other AT&T IP Transport products (AVPN, MIS, IPBB, LTE) and Bring Your Own Transport (works on any public IP network, over the top), with the goal of scaling to support customers from small business to large enterprise.
Roles & Responsibilities
Worked within a high-performing Agile environment; participated in user story discussions and grooming sessions to make sure user stories are testable and measurable
Participated in Scrum planning, Release/Iteration planning and assisted others in estimating task effort and dependencies
Responsible for test deliverables within the sprints
Responsible for communicating (verbal and written) with Business, Development, senior management and partner systems on the progress of the Release goals, objectives and testing efforts
Responsible for preparing and managing the QA project plan, test estimates and test timelines (Functional, SIT and UAT)
Identify and communicate risks & issues and mitigation plans
Prepare and review test plans, test cases, scenarios and test data requirements
Tracked test instance readiness and connectivity; kept a close eye on code migrations, understood the migrations requested for the project and their impact on testing, and verified the QA environment was available for testing
Worked on JDA Planning Orders and Requisitions for testing
Worked with Red Prairie WMS orders for testing
Report QA findings, generate metrics & reports, and prepare the QA Project Closeout Report
Worked with upstream and downstream application teams to create and execute tests
Supported the development, maintenance and end user training on testing procedures
Worked closely with multiple small to medium sized onshore and offshore teams and 3rd party vendors to deliver projects on time, within cost and with quality
Built strong working relationships with other IT teams, the business and customers to speed up delivery of IT solutions
POC for UAT: handled data creation and coordination with the UAT team; tracked and reported UAT issues
POC for PVT: understood the need for PVT, worked with the program sponsor and IT lead to schedule PVT after code migration to PROD, and updated all stakeholders on PVT status and issues
POC for the project after go-live: maintained the issue and fix list, communicated with stakeholders and updated leadership frequently on issues
2. Verizon (Hughes) Telematics March 2014 to July 2015
Role Sr. QA Lead / QA Manager
Environment UNIX, HP ALM (QC ALM), Excel Reporting (QC OTA API), Java, Web based testing, SOA BPEL, AIA, SQL, Toad, Soap UI, Oracle EBS, Oracle BRM (7.4), Oracle (Siebel) CRM
Verizon Telematics aggregates best-in-class wireless services into robust and adaptable solutions that can be tailored to any platform. It offers after-market safety and diagnostic services for vehicles, an advanced Mobile Personal Emergency Response System (MPERS), wireless fleet management services and a premier white-label suite of services for vehicle manufacturers. My role is to provide quality control services for all the web portal portions of these products. As the Sr. QA Lead for all web-related portions of the various products, I coordinate between stakeholders (BAs, developers and project managers) of different project teams in Atlanta and manage offshore teams based in Hyderabad and Chennai, India. My team is fully responsible for functional testing, regression testing, automation scripting and smoke testing.
Roles & Responsibilities
Test Planning & Management, Estimation, Defect Management, Testing Execution management, Automation
The projects my team provides QA services to are State Farm, Towers Watson, Volkswagen, Mercedes-Benz and Nissan.
Developing and managing all aspects of the testing effort, including plans, interdependencies, schedule, budget, tools, and required personnel; documenting and communicating the status of testing progress against plans and taking corrective action as necessary; participating in Sprint planning sessions and representing QA in these planning calls
Creating proactive proposals for testing strategy or automation framework decisions.
Created an Excel-based estimation framework that linked to QC ALM to fetch real-time test case execution information and used it to produce testing estimates.
Created various Excel-based tools that used VBA and the QC OTA API to download data from Quality Center (test cases, test case execution data, defects, etc.) and generated reports/charts from this data at the click of a button
Performing SOA testing using SoapUI for the Nissan Enrollment project, which replicates the functionality provided by the AMP portal (bypassing the web portals and calling the web services directly).
Providing technical leadership to project resources and the client to meet testing deadlines and objectives
Agile aspects:
Test cases developed strictly based on acceptance criteria found in user stories
Ensure test case creation, execution and revalidation of fixed issues happen within the 4-week Scrum sprint schedule
Be part of the verification for the multiple code drops that happen every sprint and raise issues as observations on Quality Center ALM (QC/Test Director)
Be part of daily scrum calls, where status and issues are discussed among the Scrum Master, QA lead and Product Owner
Ensure all critical and high defects are closed during same sprint and that medium/low defects are planned during subsequent sprints if not during current sprint
Observations were converted to defects on the last day of the sprint after the demo of the user story was accepted by the Product Owner; similarly, issues discovered post-sprint during SIT/Regression/Performance testing were raised directly as defects (not observations)
Reviewing project deliverables for completeness, quality, and compliance with established project standards.
On-time, on-budget and high quality testing delivery
Creating the Test Strategy and Test Plan documents, test scenarios and test cases; executing tests, raising defects in QC, running defect calls and creating test data via SQL queries against user databases (a sample query is sketched below)
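A minimal sketch of the kind of test-data query used in this role, assuming a simplified subscriber/enrollment schema; the table and column names (subscriber, vehicle, enrollment) are hypothetical placeholders, not the actual application schema.
-- Hypothetical schema: pick recent, active enrolled subscribers to reuse
-- as test data for an enrollment regression scenario (Oracle syntax).
SELECT s.subscriber_id, s.first_name, v.vin, e.enrollment_status
FROM   subscriber s
JOIN   vehicle    v ON v.subscriber_id = s.subscriber_id
JOIN   enrollment e ON e.vin = v.vin
WHERE  e.enrollment_status = 'ACTIVE'
AND    s.created_date >= ADD_MONTHS(SYSDATE, -3)
AND    ROWNUM <= 10;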
3. Rheem Manufacturing Company Aug 2012 to Feb 2014
Role Sr. QA Testing Consultant / QA Manager
Employer Rheem, U.S.A. Atlanta, GA
Environment HP Quality Center, PLSQL Developer, OTM, Peoplesoft ERP, Oracle Apps Suite, Business (Fusion) Intelligence (OBIEE), Oracle WMS, JDA, Web Sphere, Salesforce, SOAPUI, VSS, XML, Linux, Red Prairie, AS400
Roles & Responsibilities
Test Planning & Management, Estimation, Defect Management, Testing Execution management
Developing and managing all aspects of the testing effort, including plans, interdependencies, schedule, budget, tools, and required personnel; documenting and communicating the status of testing progress against plans and taking corrective action as necessary; participating in Sprint planning sessions and representing QA in these planning calls
Providing technical leadership to project resources and the client to meet testing deadlines and objectives
Generated test plans and test cases for system, integration and performance testing.
Identified test conditions, reviewed test scripts and test plans for Oracle EBS, WMS-Red Prairie, JDA (Planning & Requisitions) applications
Generated test scripts reflecting step-by-step instructions on how to execute test conditions and set up the test data for specific data transactions
Responsible for executing and validating test scripts
Involved in testing Salesforce portal for Leads & Opportunity Creation
Involved in functional test scripts creation, maintenance and execution using QC
Coordinated testing efforts for Oracle EBS apps and results with development team
Involved in web services testing using Soap UI
Worked in Agile environment, involved in Scrum meetings
Participated with the development team and client with regards to requirement clarifications and defects encountered during testing
Employer: Mastech Inc Dec 2008 – Aug 2012
4. Ascension Health/Accenture Sept 2011 to Aug 2012
Role Sr. QA Testing Consultant / QA Manager
Client Ascension Health/Accenture, St. Louis, MO, U.S.A.
Environment HP ALM, SQL, PL/SQL, Informatica, OBIEE, Oracle EBS, Flat Files, Tidal, UNIX
Type of Testing Business Intelligence - Informatica ETL & OBIEE Reports testing
Roles & Responsibilities
Contributed to developing each test plan and test case based on the high-level and detailed design.
Performed Security testing of the basic security levels in the Client Link application security mechanism.
Have good knowledge of HL7 standards.
Worked with the bug fix team to solve the report & ETL issues.
Contributed to regular status meetings to report bugs, problems and risks
Performed various types of testing, such as functional, regression and user acceptance testing.
Executed the test sets in ALM once testing was completed on the application.
Ran and monitored the ETL loads to load data into the QA environment (a sample reconciliation check is sketched after this list).
Logged defects and followed up on issue resolution ensuring timely customer reporting needs were being met.
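A minimal sketch of the kind of post-load reconciliation check run in the QA environment, assuming a simplified claims subject area; the table and column names (stg_claims, dw_fact_claims, claim_amount, load_date) are hypothetical placeholders, not the actual Ascension schema.
-- Hypothetical schema: compare row counts and amount totals between the
-- staging table and the warehouse fact table after an Informatica load.
SELECT 'STAGING' AS layer, COUNT(*) AS row_cnt, SUM(claim_amount) AS total_amt
FROM   stg_claims
WHERE  load_date = TRUNC(SYSDATE)
UNION ALL
SELECT 'TARGET', COUNT(*), SUM(claim_amount)
FROM   dw_fact_claims
WHERE  load_date = TRUNC(SYSDATE);
-- The two result rows should match; a difference points to rejected or dropped records.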
5. OSCAR Dec 2008 to Aug 2011
Role Sr. QA Testing Consultant/QA Lead
Client Highmark Inc. (BCBS PA), Pittsburgh, PA, U.S.A.
Environment Rational Test Manager, Rational Manual Tester, LoadRunner, ClearQuest, RequisitePro, AS400, DB2, SQL, PL/SQL, Informatica, Peoplesoft (HCM, ESA), Hyperion, Flat Files, Autosys, UNIX.
Roles & Responsibilities
Reviewed business requirement documents and technical specifications.
Designed Test Plans and Test Cases according to Business Requirements documents and Technical Specifications.
Documented test plan and test cases using Rational Test Manager and Rational Manual Tester
Carried out Business Intelligence (ETL) Testing using Informatica.
Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
Executed sessions and batches in Informatica and tracked the log file for failed sessions.
Wrote SQL queries to validate that actual test results match expected results and to check naming standards, data integrity and referential integrity (a sample source-to-target comparison is sketched after this list).
Responsible for monitoring data for porting to current versions.
Analyzed and tested various Hyperion reports for Hierarchies, Aggregation, Conditions and Filters.
Checked the reports for any naming inconsistencies and to improve user readability.
Compared the actual result with expected results. Validated the data by reverse engineering methodology i.e. backward navigation from target to source.
Used Rational Clear Quest for defect tracking and reporting, updating the bug status and discussed with developers to resolve the bugs.
Prepared Defect Reports and Test Summary Report documents.
Have fair knowledge of HL7 and EDI X12 transactions; tested EDI 4010 & 5010 transactions.
Worked with appropriate team leaders to ensure quality of test deliverables & timely completion of individual test plans.
Developed documentation for User Acceptance Testing and trained the users about the system
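A minimal sketch of the source-to-target comparison described above, assuming a simplified member dimension; the table and column names (stg_member, dw_dim_member, current_flag) are hypothetical placeholders.
-- Hypothetical schema: rows present in staging but missing or different in
-- the warehouse dimension. An empty result set means the load matched.
-- (EXCEPT is ANSI/DB2 syntax; use MINUS on Oracle.)
SELECT member_id, first_name, last_name, plan_code
FROM   stg_member
EXCEPT
SELECT member_id, first_name, last_name, plan_code
FROM   dw_dim_member
WHERE  current_flag = 'Y';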
Employer: Cognizant Technology Solutions Sept 2004 – Dec 2008
6. CSS Procurement Card & Ghost Card April 2008 to Nov 2008
Role QA Lead / QA Manager
Client Staples Inc., Framingham, MA, U.S.A.
Environment Silk Test, AS400, DB2, FileNet, SQL Server, TOAD, UNIX, XML, SOAP UI, ETL, Peoplesoft (HCM, ESA), Web Methods, StarTeam, PlanView and .Net
Roles & Responsibilities
Coordinating with the offshore team.
Participated in meetings with business customers to gather the requirements and business naming conventions.
Involved in project planning, coordination and implementation of QA methodology based on the Business requirement and Design documents.
Interacted with System Analysts and the Development team to investigate requirements and resolve issues.
Wrote complex SQL queries to validate target data based on the business requirements (a sample integrity check is sketched after this list).
Used PL/SQL programs for performance testing.
Implemented data verification procedures for Business Intelligence ETL processes in load testing.
Used SQL and PL/SQL scripts to perform backend database testing.
Used Quality Center for bug tracking and reporting, followed up with development team to verify bug fixes and update bug status.
Involved in coordinating the White, Black and Grey box testing for the data warehouse by checking ETL procedures/mappings.
Identified and documented additional data-cleansing needs and consistent error patterns that could be avoided by modifying ETL code.
Effectively distributed responsibilities, arranged meetings and communicated with team members in all phases of the project.
Involved in Integration testing, Functional testing and Regression testing.
Checked for any inconsistent joins and resolved loops in the catalog.
Involved in SOA integration testing using SoapUI, validating XML, XSDs and XPaths
Involved in the System Testing of the OLAP Report Functionality and data displayed in the reports.
Validated that the fields present in the reports are as agreed in the specifications.
Validated drill down features of reports.
Arranged meetings with business customers to better understand the reports by using business naming conventions.
Analyzed system issues and prepared business and technical requirements documents using established standards.
Worked with appropriate team leaders to ensure quality of test deliverables & timely completion of individual test plans
Involved in comparison of data in flat files.
Developed documentation for User Acceptance Testing and trained the users about the system
Documented Test Plans and Test Cases using Mercury Quality Center for the application according to Business Requirements Specification, Use Cases and Design Documents
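A minimal sketch of the kind of referential-integrity check used while validating the warehouse loads mentioned above; the fact and dimension table names (dw_fact_order, dw_dim_product) are hypothetical placeholders.
-- Hypothetical schema: orphan check, i.e. fact rows whose product key has no
-- matching dimension row. Any rows returned indicate a referential-integrity break.
SELECT f.order_id, f.product_key
FROM   dw_fact_order f
LEFT JOIN dw_dim_product d ON d.product_key = f.product_key
WHERE  d.product_key IS NULL;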
7. GPBS / OKC Switch October 2007 to March 2008
Role QA Lead / QA Manager
Client Emdeon Business Services (WebMD), Nashville, TN, U.S.A.
Environment UNIX, Mercury ITG, and Clear case
Roles & Responsibilities
Mastered the OKC switch in a short span of time.
Handled issues in OKC switch effectively.
Gained extensive knowledge in OKC Volume Testing with little KT.
Handled OKC Volume Testing effectively.
Gained extensive knowledge in EDI as well as Clearinghouse concepts.
Have good experience with HL7 standards.
Prepared OKC Volume Testing result reports, OKC Volume Testing activity logs, and status reports effectively.
Prepared OKC Volume Testing SOP documentation.
Involved in test estimations, prepared test preparation documents, test plans and system test scripts.
Successfully trained the Offshore Team on OKC Volume Testing.
8. Definity Health Apps Suite August 2006 to Sept 2007
Role QA lead
Client United Health Group, Minneapolis, MN, U.S.A.
Environment QTP, Mercury Test Director, Load Runner, PLSQL Developer, Peoplesoft ERP (HCM, ESA), Web Sphere, VSS, XML, UML, Control-M and ETL Share Points
Roles & Responsibilities
Coordinating with the offshore team.
Participated in business requirements and test document review & walkthrough meetings.
Identified and designed the Integration, System & Regression Test scenarios.
Worked as a Cross-Application test lead for Integration Testing effort.
Involved in QA Project Estimations and Resource scheduling activities.
Involved in preparing QA Status reports, Issues, Action Item reports, Project closure reports, SOX compliance documents etc.
Worked in FDA Validated Environment.
Implemented a Hybrid Test Automation framework.
Monitored team activities and schedules and assigned tasks to team members.
Developed automated Test Scripts using QTP.
Performed Regression testing for every modification in the application & new builds using QTP.
Performed functionality and regression test.
Wrote, updated and executed manual test scripts using Test Director.
Worked closely with developers to resolve the defects and close them.
Coordinated with Project manager and other point of contacts to gather necessary information.
Performed regression, functionality, integration, positive-negative, grey and black box tests.
Coordinated with Change management to resolve issues with the test builds.
Executed the scripts and modified the scripts as per the enhancements and bug fixes of the application.
Met estimates and timelines for the completion of the project.
Involved in functionality and Sanity checking.
Involved in Web Services Functional Testing.
Assisted in executing Integration Testing.
Involved in Performance Testing of the ETL Batch jobs and comparison of data in flat files.
Used Test Director as defect tracking management tool.
Maintained and updated scripts on a release basis.
9. Employer eServices QA (EesQA) January 2005 to July 2006
Role QA Lead / Senior QA Analyst
Client United Health Group, Minneapolis, MN, U.S.A.
Environment Rational Clear Quest Web 2000, Rational Functional Tester, Rational Test Manager, Rational Clear Case, VSS, XML, UML, Autosys and ETL share points.
Roles & Responsibilities
Involved in functional study of application.
Analyzed the Test plan, which detailed the testing scope, strategy, test requirements and necessary resources.
Developed test related documents including Test Plans, Test Procedures, Test Cases and Test Scripts.
Utilized Rational Unified Process test methodologies and automated the functional testing using Rational Functional Tester.
Involved in manual testing using the Rational Test Suite tools to develop test cases and test scripts, execute the scripts and log defects in ClearQuest.
Participated in all phases of the Software Life Cycle.
Developed test plan and modified the test plan when required in later stages of testing.
Performed Regression Testing and verification of software products.
Wrote SQL queries to extract data from various database tables for testing purposes.
Performed feasibility studies and evaluated system requirements.
Involved in functional, Regression, Load, Performance and Stress testing.
Involved in UAT of the applications by providing users with test cases and scenarios, and guiding them during the testing process.
Tracked Bugs and reported using Defect Tracking (ClearQuest) and Maintenance Tool.
Designed and Developed the Test Scenarios for Renewal System.
Created Test Data to test different scenarios after a thorough requirements analysis of the applications.
Participated as a key member with the QA Team for Regression testing where day-to-day bugs were monitored.
Coordinated with developers and testers for resolution of the reported problems.
Attended daily status/ defects review meetings.
Coordinated and interacted with clients and off-shore team.
Involved in business & design walkthroughs and inspections.
Prepared RTM documents.
Involved in SOX compliance documentation.
10. Voice Enabled Telephone Self Service (VETSS) Sept 2004 to Dec 2005
Role QA Lead / Senior QA Analyst
Client United Health Group, Minneapolis, MN, U.S.A.
Environment WinNT, VSS, Quick Test Professional, Test Director 8.0
Role & Responsibilities
Involved in functional study of application.
Participated in the development of Test Plan, Test cases for various forms in references to Use Cases.
Checked GUI objects, bitmaps and batch tests with the Mercury Interactive tool QuickTest Pro.
Performed Back end testing and End-to-End testing using SQL Queries.
Developed Automated Test scripts to facilitate Regression Testing using QuickTest Pro 6.0.
Wrote SQL statements using Quick Test Pro and performed Back end testing.
Involved in White and Black box Testing. Involved in QA Test plan documentation.
Checked the content retrieved on various windows against the database tables, writing SQL statements, functions and procedures to make sure the process retrieved all the data it was supposed to (a sample verification query is sketched after this list).
Used Test Director 7.0 for test planning, test design, test requirements analysis, test execution, defect tracking and test evaluation.
Developed Test Strategies, Test Plans, Test Cases and Test Data required for IVR QA Process i.e. for the Functional, GUI, Integration, Security, Performance, Regression and Installation testing.
Performed IVR QA and System testing for the IVR implementations.
Supported UAT for all IVR releases.
Implemented speech recognition based Hammer test methodologies.
Worked in Empirix Hammer system development environment.
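A minimal sketch of the kind of back-end verification query described above, used to confirm that the data the IVR announces matches what is stored; the table and column names (member_account, phone_number, account_status) are hypothetical placeholders.
-- Hypothetical schema: look up the account the IVR should have resolved for a
-- test caller's phone number and compare its status/plan with what was heard.
SELECT m.member_id, m.account_status, m.plan_code
FROM   member_account m
WHERE  m.phone_number = '7635550100';  -- made-up test number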
11. 4S e-Log: Warehouse Management System (WMS) Jan 2004 to Sept 2004
Role QA Lead
Environment J2EE, J Developer, Oracle 9i, Windows, CVS, WinRunner and Jira
Role & Responsibilities
Performed web testing for secure messaging. Exercised and tracked test cases to verify new client releases and JAVA development.
Developed Automated Test scripts to facilitate Regression Testing using WinRunner.
Involved in developing test scripts to test the product’s stability
Executed Test Cases and Test Scripts.
Reported the defects using Jira.
Executed test cases for both inbound and outbound WMS transactions.
Coordinated with developers in reviews and in solving problems encountered in the application.
Created Test Data to test different scenarios after a thorough requirements analysis of the applications.
Used Rational Clear Case for version control.
Documented the test cases, test results, test procedures and reported to team lead.
Engaged in assessment of the software development methodology and implemented a modified RUP approach for the complete SDLC.
Documented, analyzed, prioritized and maintained defect logs in Jira.
Executed test scripts and reported defects in terms of Major, Minor & Critical
Involved in Smoke testing, Integration testing, System testing
12. 4S e-Trans: Multi-modal Freight Management System June 2003 to Dec 2003
Role Sr. QA Analyst
Environment J2EE, J Developer, Oracle 9i, CVS, Windows and Jira
Role & Responsibilities
Performed System testing, Regression testing, functional Testing using WinRunner.
Responsible for the full SDLC (REQ, CODE, ALPHA TEST, BETA TEST), including 4 test phases.
Developed checklist (test plan) and verified content with developers.
Exercised component checklists to verify correctness for each build.
Wrote the Test Plan, Test Cases, Test Scripts and Test Steps for all the modules.
Executed all Test scripts, Test cases and validated data.
Performed database testing using SQL queries.
Applied SQL scripts to test the database and executed them against the database.
Tested the application on IE 5.0/5.5/6.0, Netscape 4.73/4.74/6.0 as part of Portability Testing to maintain cross-browser functionality.
Created users, tables, constraints, sequences and scripts for database tables.
Wrote user-defined functions in WinRunner and loaded them into the Function Generator.
Used TestDirector for test planning, test design and test execution, and Jira for defect reporting and test evaluation.
Prepared Test Data and performed Positive and Negative testing.
13. 4S eSupply-EP September 2002 to May 2003
Role QA Tester
Environment J2EE, J Developer, Oracle 9i, CVS, Windows and Jira
Role & Responsibilities
Performed manual tests on JSP, JAVA, and HTML based web applications.
Developed test cases and test scripts for the specific modules.
Performed database testing using SQL queries.
Performed white-box and black-box GUI and web testing.
Created detailed test plans, matrices and summaries on testing process.
Maintained documentation on all existing and new defects using TestDirector Database.
Involved in setting up the environment (installation, configuration).
Involved in unit and Integration testing.
14. 4S eSupply-SP January 2002 to August 2002
Role QA Tester
Environment J2EE, J Developer, Oracle 9i, CVS, Windows and Jira
Role & Responsibilities
Performed manual tests on JSP, JAVA, and HTML based web applications.
Created detailed test plans, matrices and summaries on testing process.
Performed Integration Testing and Acceptance Testing along with Users.
Wrote and tested web applications using the Internet Explorer browser.
Involved in Unit and Integration Testing.
Involved in preparing Test cases.
Executed test cases and reported defects.
Involved in Integration Testing, System Testing.
Professional Education
Master of Computer Applications, Osmania University, India
Bachelor of Science, Osmania University, India