Vani Vemula
Contact : 703-***-**** - Email: add8jj@r.postjobfree.com
Key Qualifications
Quality Assurance Engineer with over 9 years of experience testing Web applications and Client/Server applications, both manually and with automation tools.
Extensive experience developing test strategies, preparing Test Plans and Test Cases, and analyzing test results based on business, system and industry requirements.
Expertise in automation testing using QuickTest Professional, Quality Center, ALM/QC and TestDirector.
Tested Web Services using SOAP UI, XPath and XQuery.
Experienced in Integration Testing, System Testing, Regression Testing, Web Testing, GUI Testing, Functionality Testing and Database Testing.
Well versed with Software Development Life Cycle (SDLC) models like Waterfall and Agile Models.
Developed Test Strategy, Test Plan, Test Cases (Functional and Non-Functional), Test Scenarios from System Requirement Specifications and Business requirements.
Good exposure to .Net, Visual Basic, JAVA, ASP, HTML, Shell Scripts, Oracle, SQL Server development environments.
Solid understanding of Oracle and SQL Server databases and Windows/UNIX operating systems.
Communicated with support, business and development teams to resolve issues during test execution.
Self-starter with the ability to adapt to any industry and learn new technologies and trends.
Highly motivated team player with the ability to work in all environments to meet deadlines.
Trained in CMMI Levels 3 and 5 and the ISO 9001:2008 Quality Management System.
Experienced with testing tools including Testing Anywhere, Selenium, QuickTest Professional, WinRunner, TestDirector, Scarab, FogBugz, IBM ClearQuest, IBM Rational Manual Tester, AccVerifier, Paros and LoadRunner
Experience using ALM/Test Director for execution of test cases and defect tracking systems
Wrote SQL queries to retrieve data from destination tables
Expert in Software Testing Life Cycle (STLC) phases including risk analysis, test planning, test design, test execution, defect tracking and management, and test reporting
Experience designing and writing test strategies for various phases of testing
Expert in mainframe testing, web testing, client/server testing and back-end testing
Strong working experience in Windows NT and UNIX environments
Extensively used PL/SQL queries to interact with ORACLE database
Coordinated between QA, Release Engineering and Documentation teams
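The Web Services testing noted above (SOAP UI, XPath) typically comes down to asserting on fields of an XML response. A minimal sketch of that kind of check, using Python's standard library and an entirely hypothetical response payload and element names:

```python
# Sketch of XPath-style validation of a SOAP/XML response.
# The payload, element names and values are illustrative examples only.
import xml.etree.ElementTree as ET

SAMPLE_RESPONSE = """
<Envelope>
  <Body>
    <GetAccountResponse>
      <Account>
        <Id>12345</Id>
        <Status>ACTIVE</Status>
      </Account>
    </GetAccountResponse>
  </Body>
</Envelope>
"""

def check_field(xml_text, path, expected):
    """Return True if the element at `path` exists and has text `expected`."""
    root = ET.fromstring(xml_text)
    node = root.find(path)
    return node is not None and node.text == expected

# Typical positive/negative checks a tester would automate:
assert check_field(SAMPLE_RESPONSE, "./Body/GetAccountResponse/Account/Status", "ACTIVE")
assert not check_field(SAMPLE_RESPONSE, "./Body/GetAccountResponse/Account/Status", "CLOSED")
```

A dedicated tool such as SOAP UI adds request dispatch and richer XPath/XQuery assertions; the core positive/negative assertion pattern is the same.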
Security Clearance
MBI — (Moderate Risk Background Investigation)
Education
Master of Science (MSc).
Professional Certifications/Licenses/Training
CMMI Level 5 Capability Maturity Model, Software Process Improvement
Security Awareness Training- Certificate of Completion
Privacy Awareness Training - Certificate of Completion
Security Awareness Virtual Initiative (SAVI)- Certificate of Completion
Certified in NIIT Course
Database/Middleware Systems
Oracle 11g Financials, MS SQL Server, DB2, WebSphere, WebLogic
Languages and Standards
Java, .NET, SQL, VBScript, C, TSL (Test Script Language), HTML, XML, UML, Python
Tools and Utilities
ALM, Test Manager, Quality Center, TestDirector, Rational ClearQuest, PVCS Tracker, MS Office Suite, MS Visio 5.0, Testing Anywhere, Jira and AWS.
Professional Experience
Employer Name | Title | Date
FAA (Karsun Solutions) | Sr. Tester | Aug 2018 to Present
FAA (OST Global) | Sr. Tester | Aug 2014 to Aug 2018
Ginnie Mae (Govt National Mortgage Association) | Sr. Tester | June 2013 to Aug 2014
IRS (Deloitte) | Sr. Tester | July 2012 to June 2013
GEICO, Bethesda, MD | QA Tester | Aug 2011 to June 2012
Qualifications and Specialized Experience
Role: Sr. Tester Aug 2018- Present
Client: Federal Aviation Administration (FAA)
Project: Regional Information System (REGIS). REGIS is a financial management and reporting tool designed to provide accurate, standardized and timely tracking of financial data for the Federal Aviation Administration (FAA). REGIS can track all types of funds, facilitates reconciliation with the accounting system used by the FAA, and has been designated as the official FAA Cuff-Record System for Operations and Airport Administration funds.
Worked on Multiple Projects
Project: (OVLTP) - Online Voluntary Leave Transfer Program.
Office of Human Resources Management Information Technology Applications-AHR
Roles and Responsibilities:
Analyzed requirements and developed Test Plans, Test Scenarios and Test Cases from business, technical and functional requirements and design documents for the FAA
Worked on five projects simultaneously, supporting several applications and leading the testing effort
Documented requirements and revised existing system logic as necessary
Reviewed business requirements and software system designs for testability
Performed manual testing to test the usability of the application
Performed positive/negative testing to check the functionality
Performed ad hoc testing, integration testing, end-to-end testing, system testing and functional testing
Documented defects and errors in the application and coordinated corrective actions with developers and system engineers
Created complex SQL queries for back-end database testing & reports testing
Developed and maintained SQL scripts to extract data from the database
Generated test plans, test cases & test steps for manual testing through ALM QC
Reviewed the SRS documents and prepared high-level level-of-effort (LOE) estimates.
Reviewed incident tickets in REMEDY and provided test analysis.
Coordinated with software developers and System Analyst to discuss QA issues.
Performed testing activities related to REGIS, FMS, REDMACS, SAPS, AHR and AMCS applications.
Executed test cases and reported test results in SharePoint.
Prepared defect summary reports and reported results in JIRA for further action.
Attended Sprint Planning sessions to discuss user stories and testing activities.
Joined Retrospective meetings for feedback and project progress.
Worked on UAT defects to verify that all functionality worked after fixes.
Investigated and resolved performance issues per user requests.
Took a proactive approach to improving test coverage and quality control efficiency.
Initiated automated testing procedures to increase efficiency and productivity.
Prepared the bidirectional traceability test results document for test sign-off.
Worked on server migration to Windows 10 and performed related testing activities.
Identified and analyzed requirements, updated the Test Plan, executed test cases and delivered the test results document to the client.
Environment: Business Objects XIR2, Crystal Reports XI, J2EE, JSP, JavaScript, HTML, VSS, DHTML, Rational ClearCase, Rational ClearQuest, ALM/Quality Center, Automation Anywhere, Selenium, SharePoint, Oracle 12c, Toad 9.0, SQL Server 2012, Windows 10
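The back-end database testing described in this role amounts to verifying values the application reports against direct SQL queries. A self-contained sketch of that pattern, using an in-memory SQLite database and hypothetical table and column names (not the actual REGIS/SAPS schemas):

```python
# Sketch of back-end SQL validation: verify a value the application
# reports against a direct query on the database. The schema, data and
# cost-center codes here are illustrative, not from any real system.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE funds (cost_center TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO funds VALUES (?, ?)",
    [("ANM-100", 2500.0), ("ANM-100", 1500.0), ("AEA-200", 900.0)],
)

# Total the application UI/report claims for cost center ANM-100:
reported_total = 4000.0

# Independent back-end check against the database:
(db_total,) = conn.execute(
    "SELECT SUM(amount) FROM funds WHERE cost_center = ?", ("ANM-100",)
).fetchone()

assert db_total == reported_total, f"mismatch: UI={reported_total} DB={db_total}"
```

In practice the same query runs against the production-like Oracle/SQL Server test database (e.g. via Toad), and a mismatch is logged as a defect.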
Role: Sr. Tester Aug 2014 to Aug 2018
Client: Federal Aviation Administration (FAA)- Worked on Multiple Projects
Project: The Staffing and Payroll System (SAPS) application provides staffing and payroll information by individual or grouped Cost Center, Region and Service Area. It is a web-based, business-critical application system consisting of Staff, Payroll, Actuals and Projections modules.
Project: Regional Information System (REGIS), the FAA financial management and reporting tool described above.
Roles and Responsibilities:
Ensured all applications worked properly as customer requirements continuously changed
Reviewed the SRS and prepared high-level level-of-effort (LOE) estimates
Performed different types of testing: Smoke testing, Functional, System, Regression, Performance
Completed tasks and performed duties as expected by the REGIS, FMS, REDMACS, SAPS and AHR teams
Attended daily status meetings, schedule planning meetings and approval meetings
Completed test cases for all maintenance releases per the schedules
Executed test cases and reported test results in SharePoint
Documented test cases and test results, uploaded them to SharePoint and shared the link with the team
Attended peer review meetings and took meeting minutes
Provided estimated defect-fix times to the lead
Worked on UAT defects to verify that all functionality worked after fixes.
Reviewed business requirements and software system designs for testability
Updated test cases per feedback from the lead
Maintained continuous communication with developers to clarify issues
Improved defect reporting processes
Analyzed the outcome of results and reported to QA manager
Responsible for creating the QA sign-off check list
Ensured that test-product documentation was complete
Performed cross browser testing to ensure compatibility of the application on IE for SAPS and AHR Systems.
Environment: Business Objects XIR2, Crystal Reports XI, J2EE, JSP, JavaScript, HTML, VSS, DHTML, Rational ClearCase, Rational ClearQuest, ALM/Quality Center, Automation Anywhere, Selenium, SharePoint, Oracle 11g, Toad 9.0, SQL Server 2005, Windows XP
Role: Sr. Tester June 2013 to Aug 2014
Client: Ginnie Mae (Government National Mortgage Association)
Roles and Responsibilities:
Worked as QA Tester on Pool number request, Commitment Management, Master agreements modules and developed high level test scenarios and presented to the management for review and approval.
Involved in defining and analyzing testing requirements based on application functionality.
Involved in preparation of test data for different modules.
Created and executed test cases and scripts in Quality Center to cover all scenarios of stakeholder requirements, data requirements, technical specifications, business requirements, user interface requirements, system interface requirements and usability requirements.
Analyzed business requirements and technical specifications of application in development and produced test cases for functional testing, integration testing and regression testing.
Used Quality Center (ALM 11) as test reporting tool to maintain and execute test cases.
Met continuously with users and modified test cases to cover new features developed per user requirements as part of UAT.
Worked closely with developers to communicate QA issues and testing status.
Identified and documented the defects in ALM and Celoxis.
Verified and retested the bugs fixed by developers during each phase of Black Box Testing
Performed regression testing to verify performance with the new enhancements
Environment: HP ALM 11, Java, Oracle 11G, HTML, XML, Web Logic, Windows NT, Celoxis, SAP Business Objects.
Role: Sr. Tester July 2012 to June 2013
Client: Internal Revenue Service – (IRS)
Project: ACA - Under the Affordable Care Act (ACA), insurance companies, self-insured companies and large businesses that provide health insurance to their employees must submit information returns to the IRS reporting on individuals' health insurance coverage. Filers submitting 250 or more information returns are required to file electronically.
Created and executed detailed functional test scripts and test cases based on business requirements
Performed Positive/Negative testing to check the functionality
Developed Test Cases, and Test Scripts for Implementation testing
Created test cases and developed Traceability Matrix and test coverage reports.
Managed and conducted system testing, integration testing, functional testing and unit testing.
Tuned systems and ensured optimal performance on multiple platforms for verification, validation and transformation of input data (text files, XML files) before loading into the target database
Worked on SOAP UI with SQL scripts to load data in the tables
Tracked the defects using Sprint Reporting tool and generated defect summary reports
Experienced in testing XML documents
Responsible for all new and existing ETL data warehouse components
Experienced at testing ETLs and flat-file data transfers without relying on GUI layers
Used the Informatica tool for ETL jobs
Attended meetings with developers regarding DML and DDL changes
Ran ETL jobs to extract, transform and load data from source to target database tables
Verified logs after running ETL jobs
Tested Informatica mappings and other ETL processes (data warehouse testing)
Expertise in creating test data for data warehouse applications including web services testing
Validated and debugged components and monitored the resulting executable versions.
Prepared status summary reports with details of executed, passed and failed test cases
Created traceability relationship between requirements in the same module and also between different modules
Peer reviewed the Test Cases and made recommendations.
Verified the quality of requirements, including testability; performed requirement definition, test design, test-script and test-data development, test-environment configuration, test-script configuration management and test execution.
Participated in peer reviews, daily status meetings and scrum ceremonies to optimize test documents
Conducted test-design and test-procedure walkthroughs and inspections.
Ensured that test-product documentation was complete
Executed CAT test cases and documented the test results.
Environment: Rational RequisitePro, Rational Team Concert, SoapUI, SAP Business Objects InfoView, webMethods, XML, XQuery, XPath, XMLSpy, PL/SQL, Rapid SQL Navigator, Windows XP, DataStage 7.5.3
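The ETL testing in this role centers on source-to-target validation: after a job runs, row counts and simple column checksums on the target must match the source. A minimal sketch of that check with an in-memory SQLite database; the table names, columns and data are hypothetical stand-ins for the real warehouse tables:

```python
# Sketch of source-to-target ETL validation: compare row counts and a
# simple column checksum between source and target tables after an ETL
# run. Schema and data are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_returns (id INTEGER, amount REAL);
    CREATE TABLE tgt_returns (id INTEGER, amount REAL);
    INSERT INTO src_returns VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO tgt_returns VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

def table_stats(table):
    # Table name is interpolated directly; acceptable for a fixed test
    # fixture, never for untrusted input.
    count, total = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {table}"
    ).fetchone()
    return count, round(total, 2)

src, tgt = table_stats("src_returns"), table_stats("tgt_returns")
assert src == tgt, f"ETL mismatch: source={src} target={tgt}"
```

Real runs would issue the same queries against the warehouse databases (e.g. via PL/SQL or Rapid SQL Navigator) on both sides of the Informatica mapping, with per-column checksums for the fields the mapping transforms.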
Role: QA Tester Aug 2011 to June 2012
Client: GEICO, Bethesda, MD
Wrote functional test conditions and test cases in ALM 11.0 Quality Center
Performed Smoke, Navigational, Integration, Functional and GUI testing of the application
Involved in back-end and end-to-end testing phases and the corresponding migration process
Worked closely with developers and the testing team to communicate QA issues and testing status
Identified and documented the defects in Mercury Quality Center
Verified the bugs fixed by developers during each phase of Black Box Testing
Communicated defects encountered during regression testing and followed up with developers until all issues were resolved
Contributed to overall testing efforts by performing smoke, functional, back-end and regression testing
Prepared test procedures and test cases; tracked and monitored defects using HP Quality Center.
Performed test planning, test case design, test execution, error reporting, analysis and error verification; coordinated test cycles with the client
Followed Agile methodology with daily stand-ups to track work status
Performed Regression testing using QTP
Worked with development and marketing teams to release a quality product
Analyzed the outcome of results and reported to QA manager
Performed cross browser testing
Implemented the SDLC for the testing life cycle and followed RUP in the application, which met SEI standards.
Environment: Mercury Quality Center, JIRA, Oracle, SQL, Java, WebLogic, UNIX and Windows XP