
Healthcare QA Analyst

Location:
Virginia Beach, Virginia, United States
Posted:
June 02, 2008
Contact Info:
*******.****@*****.***


OBJECTIVE

I am seeking a challenging opportunity to apply my quality assurance expertise in a dynamic institution that is committed to continuous improvement and to providing high-caliber solutions that meet customer needs and requirements. I am motivated by a keen interest in learning the technical side of an application and finding defects in order to assess and improve quality.

PROFESSIONAL SUMMARY

Diverse experience in the field of Information Technology with emphasis on Quality Assurance testing, analysis, and support activities. Experience in testing stand-alone, client/server, and Web-based applications using automated and manual testing techniques.

PROFILE

• Experienced in creating Test Plans, with thorough hands-on experience designing test cases that cover all test conditions and eliminate redundancy and duplication.
• Solid experience in developing high level documents, test plans and test strategy.
• System specification analysis, testing methodology and test plan formulation from Business requirements.
• Strong Knowledge of SDLC (Software Development Life Cycle).
• Strong knowledge of Six Sigma methodologies.
• Expertise in planning and managing projects based on Rational Unified Process (RUP) covering the full range of the software development life cycle (SDLC).
• Vast knowledge and experience in the Health Care industry; experienced in testing different healthcare ERP solutions such as FACETS, MedPlus, EPIC, and other claim-processing solutions.
• Experience in testing Electronic Data Interchange (EDI) transactions for HIPAA compliance.
• Experienced in Black Box testing, Database and Backend testing.
• Good experience in creating, modifying, and enhancing both manual test scripts and test scripts created by test automation tools (Load Runner, Quick Test Pro, Rational Robot).
• Extensive working experience in Web-based environments.
• Extensively used Black Box testing techniques.
• Good working experience in analyzing changes and identifying areas of applications to be regression tested.
• Strong communication skills, both verbal & written, with particular emphasis on the production of clear & detailed written Test Plans, Business Requirements, & Functional Specification.
• Detailed knowledge of Object Oriented language environments.
• Ability to understand & analyze business processes & workflows with the objective of providing recommendations for the best use of technology to improve the process.
• Work closely with cross-functional teams to thoroughly test requirements and functionality.
• Extensively involved in Performance Testing, Load Testing, Stress Testing.
• Experienced in Data Driven Testing, Batch Testing, Functionality Testing, GUI Testing, Regression Testing.
• Performed Security Testing, Usability Testing, Positive Testing, and System Testing.
• Experience in Backend Testing using SQL queries and Unix Shell Scripts. Experienced in using Test Central, Test Director, ClearQuest and other bug reporting tools.
• Experienced in Upgrade and Configuration Testing.
• Excellent Communication and Documentation Skills.
• Good working knowledge of major operating systems; tested applications developed in a wide variety of environments, viz. Windows 2000/NT, UNIX, Mac OS 8.
• Experience in interacting with business analysts, developers, and technical support, and helping them baseline the requirement specifications.
• Proven ability to work cooperatively & effectively with business, team, & systems partners.

TECHNICAL SKILLS

Testing Tools: Quick Test Pro, Win Runner, Load Runner, Rational Test Suite
Test Reporting Tools: Test Central, Rational ClearDDTS, Test Director, Rational ClearQuest
Operating Systems: UNIX, Windows 9x/NT/2000/XP, Mac OS 8
Languages: Advanced COBOL, C, SQL, Java, Quick Basic, VBScript, SQA Basic, T-SQL, PL/SQL
Web Languages: VB.Net, Java Applets, HTML, DHTML, ASP, JSP, Perl, JavaScript, XML
Networking: TCP/IP, WAN, LAN
Databases: Oracle, Sybase, SQL Server 2000
Project Management Tools: MS Office 2003/XP, Microsoft Project 2000


WORK EXPERIENCE


Client: AMERIGROUP CORPORATION, Virginia Beach, VA March’08 – Present
Project: Claim Auth UM Mismatch (WS7)
Sr. Test/Claim Analyst

AMERIGROUP is the leading publicly traded company dedicated exclusively to caring for the financially vulnerable, seniors, and people with disabilities through publicly funded programs. AMERIGROUP currently serves around 1.7 million members across 10 states and the District of Columbia.

Project Description:

AGP had 331,907 Texas Medicaid claims that pended due to authorizations between 3/2/06 and 2/28/07. Pended claims require additional work in several departments to be processed correctly. The estimated COPQ (cost of poor quality) related to the additional work for all FACETS markets during this time period is approximately $400,000.

The project goals are to improve the Sigma level to greater than 3.2, reduce the number of claims pended due to authorizations to less than 4%, and reduce the COPQ by 50%.

Environment: Quality Center, QTP, UNIX, C, C++, Perl, PL/SQL, Oracle, SQL Server, FACETS, AMISYS, Manual Testing.

• Served as the lead for the Facets testing effort on WS7; involved in the review of the Requirements Specification with the functional manager and technical specialists.
• Developed the testing strategy and test plan; outlined various capabilities of the testing process, described the tasks of each testing capability, and outlined roles and responsibilities related to the testing process.
• Coordinated working sessions for testing; communicated risks/issues to WS7 project leads.
• Reviewed the test environments that will support the various testing capabilities; highlighted the testing schedule across all testing capabilities.
• Analyzed the current Claim Authorization Process across all the FACETS markets. Identified possible constraints that pend claims related to Claim Authorization Mismatch.
• Configured the most suitable Facets CLUM settings in order to auto-adjudicate the maximum number of pended claims.
• After successful configuration of the CLUM settings, performed Volume, Parallel, and Post Validation claim testing to validate the new CLUM settings and to ensure nothing else in claim processing is affected by the new CLUM settings.
• Outlined approach for documenting, tracking, and resolving issues found during testing, outlined approach for developing acceptance criteria.
• Identified any other potential risks that exist in the current authorization and claim process related to claims overpayment.
• Recommended improvements to standardize authorization and claim process in order to increase auto-adjudication of pended claims related to Authorization Mismatch.
• Analyzed test results using reports and graphs generated in Quality Center.
• Wrote test cases and test scripts, executed test scripts, and analyzed outcomes.
• Created and maintained SQL scripts to perform back-end testing on the Oracle database.
• Prepared summary reports of configuration changes, test results and recommendations.
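The parallel and post-validation claim testing described above boils down to comparing claim outcomes between two runs. A minimal sketch in shell, where the file names, the pipe-delimited claim_id|status layout, and the sample data are all hypothetical:

```shell
# Hypothetical parallel-testing check: compare claim statuses produced
# under the old and new CLUM settings; any claim whose status changed
# needs manual review. File names and record layout are assumptions.
printf 'C001|PAID\nC002|PEND\nC003|PAID\n' > old_run.txt
printf 'C001|PAID\nC002|PAID\nC003|PAID\n' > new_run.txt

# Join on claim_id (both extracts sorted), keep rows where status differs.
join -t'|' -j1 old_run.txt new_run.txt | awk -F'|' '$2 != $3' > changed.txt
cat changed.txt    # C002 moved from PEND to PAID
```

Here the status change on C002 is the intended effect of the new settings; any other line in changed.txt would flag a regression in claim processing.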

Client: Emdeon Business Services, Nashville, TN November’07 – February’08
Project: BSC (Batch Switch Consolidation)
QA Lead

Emdeon Business Services provides revenue cycle management and clinical communication solutions that enable payers, providers and patients to improve healthcare business processes. Emdeon offers a full suite of products and services to automate key business functions for healthcare payers, providers, and vendors.

One of Emdeon's leading solutions is its Clearing House, which processes claims for different healthcare payers, providers, and vendors. In the past few years Emdeon acquired a number of claim-processing companies; however, the products offered by these individual companies ran on different platforms (switches). In order to cut maintenance and processing costs for all these individual switches, Emdeon recently started the Batch Switch Consolidation (BSC) project, which will consolidate all the individual switches (applications) into one switch.


Environment: Mercury ITG, UNIX, C, C++, Perl, Mainframe, SQL, Rational Clear Case, Oracle, MS Project, SQL Server, Manual Testing.

Responsibilities:

• Actively participated in different stages of project such as Design, Planning, Development and Debugging.
• Led the efforts in designing test strategies and creating test scopes and test standards for different phases of the BSC project.
• Wrote the test plan and test strategies for end-to-end testing of the BSC project.
• Worked with BAs and the Development team to design and create in-house automation tools for testing.
• Estimated work hours and milestones for different testing efforts.
• Designed System, Integration, Parallel, Performance, and Post Validation testing efforts for the BSC project.
• Designed assumptions, pass/fail strategy, and entry & exit criteria for end-to-end testing efforts.
• Wrote test plans and test cases, prepared input data for test scenarios, and executed test scripts manually or using the in-house tool.
• Participated in various meetings and discussed Enhancement and Modification Request issues.
• Involved in the review of the Requirements Specification with the functional manager and technical specialists of the application.
• Reviewed unit test cases developed by the Development team; performed System and System Integration testing.
• Performed Parallel Testing using data from the production environment in order to compare results of new enhancements against production.
• Designed and executed a Regression Test Bed at the end of every phase.
• Performed Performance and Post Validation Testing.
• Performed HIPAA validation of the Clearing House application.
• Validated different Electronic Data Interchange (EDI) formats such as X12, EMCDS, PCDS, and 1500.
• Validated different EDI formats and transactions for HIPAA compliance.
• Performed back-end testing by extensively using SQL commands to verify the database integrity.
• Created and maintained SQL Scripts and UNIX Shell scripts to perform back-end testing on the oracle database.
• Wrote and tracked defects using the Mercury ITG tool.
• Participated daily in Triage meetings to discuss defect issues.
• Worked with the BA Lead, Development Lead, QA Manager, and Release Manager on a daily basis, and provided weekly status reports for the QA team to the Product Manager.
• Led a team of 4 onsite and 5 offshore QA Analysts.
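One of the simplest EDI validations described above is checking the X12 envelope arithmetic, e.g. that the SE01 element equals the actual segment count from ST through SE. A sketch with a fabricated three-segment fragment (real 837 transactions carry many more segments):

```shell
# Hypothetical X12 envelope check. The sample transaction content is
# fabricated for illustration only.
cat > sample.x12 <<'EOF'
ST*837*0001~
BHT*0019*00*123*20080101*1200*CH~
SE*3*0001~
EOF

declared=$(grep -o 'SE\*[0-9]*' sample.x12 | cut -d'*' -f2)  # SE01 value
actual=$(grep -c '~' sample.x12)                             # one '~' per segment
[ "$declared" -eq "$actual" ] && echo "ST-SE segment count OK"
```

A mismatch between SE01 and the counted segments is exactly the kind of structural defect a clearinghouse must reject before claim adjudication.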


Client: Regence Blue Cross Blue Shield, OR April’06 – October’07
Project: CPSS (Common Process Single System)
Sr. QA/Implementation Analyst

The Regence Group is the largest affiliation of health-care Plans in the Pacific Northwest/Mountain State region. It includes Regence BlueShield of Idaho, Regence BlueCross BlueShield of Oregon, Regence BlueCross BlueShield of Utah and Regence BlueShield (in Washington).
Regence Group is implementing a set of projects under a common program called CP-SS (Common Process Single System). CP-SS will bring the four states' individual legacy systems affiliated with the Regence Group onto a common platform to deliver valuable experiences to its members.

Environment: Rational Test Manager, Rational Manual Tester V6.1, Rational Clear Quest, Rational Robot, Facets 4.31, UNIX, Java, XML, Oracle, SQL Server, Sybase, SQL Advantage, SQL*Plus, SQL Navigator, Mainframe, Manual Testing.

Responsibilities:

• Actively participated in all phases of testing lifecycle (Design, Planning, Development and Results).
• Coordinated with the Development and Business teams to develop high-level business and technical documents.
• Designed the Test Plan and test strategies based on high-level business and technical documents.
• Implemented a standardized and unified process throughout the Software Development Life Cycle (SDLC).
• Involved with other team members to set up testing tools, implementation and testing environments.
• Modified previously existing test cases that were driven by the manual testing.
• Involved in Designing and Analyzing of Test Scope strategies with other Test Analyst.
• Wrote Unit, Integration, and System test cases using Rational Test Manager from functional, technical, and other high-level documents.
• Performed both manual and automated testing.
• Manually Conducted Positive and Negative testing.
• Extensively involved in writing test scripts using Rational Manual Tester to perform manual and automation testing on the AUT under different programming environments.
• Performed Sanity and Smoke Testing of the application manually after each build.
• Involved in FACETS implementation testing, including end-to-end testing of the FACETS Billing, Claim Processing, and Subscriber/Member modules.
• Set up claim-processing data for different Facets modules.
• Designed, analyzed, and performed Integration and System testing on leading healthcare software such as FACETS, MedPlus, and Onyx to test all the different software components as one complete system.
• Wrote test cases using Rational Test Manager; used Rational Robot and Rational Manual Tester for Functional testing.
• Validated the EDI claim process for HIPAA compliance.
• Tested HIPAA regulations in the Facets HIPAA privacy module.
• Created data pools using Rational Robot to perform Data Driven Testing.
• Conducted System, Integration, and Regression testing on the application.
• Conducted Business, Functional, User Acceptance, and Usability testing.
• Involved in creation and maintenance of Test Matrix and Traceability Matrix.
• Using UNIX, manually tracked all log activities on the server.
• Wrote UNIX Shell scripts to perform back-end testing.
• Coordinated batch job scheduling for the SIT (System Integration Testing) team.
• Ran SQL queries using SQL Advantage and SQL Navigator.
• Produced reports from the Oracle database using SQL*Plus.
• Performed back-end testing by extensively using complex SQL queries to verify the integrity of the database.
• Used Rational ClearQuest for reporting and tracking of defects.
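Tracking log activity on the server, as above, often reduces to sweeping batch logs for failures. A minimal sketch; the log path, format, and entries are invented for illustration:

```shell
# Hypothetical batch-log sweep: pull ERROR entries out of the nightly
# batch log so failed jobs can be triaged. Log contents are fabricated.
mkdir -p logs
printf '%s\n' \
  '02:10 INFO  job CLM100 started' \
  '02:14 ERROR job CLM100 abend U4038' \
  '02:20 INFO  job BIL200 started' > logs/batch.log

grep 'ERROR' logs/batch.log > error_summary.txt || true
cat error_summary.txt    # the failed CLM100 job
```

In practice a sweep like this would run after the batch window and feed the summary into the defect-tracking workflow.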

Client: Motorola Inc, Seattle, WA July’05 – March’06
Database/Upgrade QA Engineer

Worked at Motorola on a web-based application called Kamet. This real-time application was developed for various wireless providers globally, such as Cingular Wireless (USA) and Telefonica (Spain). This server-based application enables customers to access RSS feeds (news, weather, sports, etc.) on their Motorola wireless phones.

Environment: Test Central, Rational ClearDDTS, Windows XP, Toad, DB Commander, UNIX, Java, Java Script, XML, Oracle, Manual Testing.

Responsibilities:

• Involved in the Review of Requirements Specification with functional manager and technical specialists of the application.
• Participated in setting up testing environment.
• Participated in Test Planning.
• Wrote and updated test cases and set up test scenarios.
• Implemented the SDLC and followed the Standardized process in the application.
• Performed Smoke and Sanity Testing of the application manually after each build.
• Performed rigorous manual testing before a release.
• Performed Functionality and GUI testing of the application.
• Performed Security, User Acceptance and Usability testing.
• Performed Regression Testing.
• Manually tested each and every module of the application and verified against expected results.
• Performed Back-End Testing of the application for Oracle databases.
• Performed Upgrade and Configuration testing on the application to check the effects of upgrade.
• Checked the schema of the Oracle database manually or using Toad after every upgrade.
• Compared database schemas between different builds and tracked defects generated due to schema changes.
• Performed back-end testing for reusability by writing UNIX Shell scripts.
• Created and maintained PL/SQL Scripts and UNIX Shell scripts to perform back-end testing on the oracle database.
• Extensively involved in Batch Testing.
• Using UNIX, manually tracked all defects in logs.
• Modified previously existing test cases that were driven by the manual testing.
• Proposed a cost-effective test strategy to provide comprehensive functional testing of the application and uncover any mission-critical deficiencies.
• Conducted result analysis and interacted with developers to resolve bugs.
• Used Rational DDTS as a defect-tracking tool.
• Used TestCentral for Project management, Test Case writing and planning.
• Participated in the weekly team and project meetings.
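The schema comparison between builds mentioned above can be approximated by diffing flattened schema dumps. A sketch where the dump files and their one-line-per-column format are assumptions:

```shell
# Hypothetical schema diff across an upgrade: dump each build's schema
# as one line per column, then diff. Contents are fabricated.
printf 'USERS.ID NUMBER\nUSERS.NAME VARCHAR2(50)\n' > schema_build1.txt
printf 'USERS.ID NUMBER\nUSERS.NAME VARCHAR2(100)\nUSERS.EMAIL VARCHAR2(80)\n' > schema_build2.txt

diff schema_build1.txt schema_build2.txt > schema_changes.txt || true
grep -c '^>' schema_changes.txt    # columns added or altered in the new build
```

Each `>` line in the diff is a candidate for upgrade-testing follow-up: a widened column or a new column whose effect on the application must be verified.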

Client: Keystone Mercy, Philadelphia, PA February’04 – June’05
Quality Analyst

Keystone Mercy Health Plan is Pennsylvania's largest Medical Assistance (Medicaid) managed care health plan, serving more than 250,000 Medical Assistance recipients in Southeastern Pennsylvania, including Bucks, Chester, Delaware, Montgomery, and Philadelphia counties. Keystone is implementing Facets to manage their healthcare needs.

Environment: FACETS, Quick Test Pro, Test Director, Windows XP, C++, PL/SQL, Oracle, UNIX, Manual Testing.

Responsibilities:

• Involved in the review of the Requirements Specification with the functional manager and technical specialists of the application.
• Designed and executed Test Plans and Test Cases; generated test scripts and test scenarios.
• Prepared Test cases, according to the business specification and wrote test scripts and maintained them.
• Prepared proper documentation in accordance with company policy.
• Implemented the whole QA methodology life cycle, from planning, capturing, creating, and executing tests to reporting and tracking defects using Test Director.
• Performed functionality testing using Quick Test Pro.
• Designed, Communicated, and enhanced QA testing plan for the application.
• Connected to SQL Plus in UNIX and created and executed complex SQL queries.
• Performed back-end testing by extensively using SQL commands to verify the database integrity.
• Performed back-end testing for reusability by writing UNIX Shell scripts and using Quick Test Pro.
• Created and maintained SQL Scripts and UNIX Shell scripts to perform back-end testing on the oracle database.
• Performed Manual and Automated testing on UNIX environment.
• Manually performed User Acceptance Testing of FACETS implementation.
• Worked as a data analyst to set up Facets data in Claims Processing, Members/Subscribers, Groups, and Billing for different testing efforts.
• Extensively involved in Batch testing of claims submission.
• Conducted System, Integrated and Regression testing to the application.
• Performed Sanity and Smoke testing on the application manually.
• Wrote and Enhanced test cases and test scripts to meet new functional requirements as per the new business requirements.
• Involved in creation and maintenance of Test Matrix and Traceability Matrix.
• Conducted Security Test on the application manually.
• Manually Conducted Positive and Negative testing.
• Analyzed test results using reports and graphs generated in Test Director.
• Worked on Test Director in setting up and Customizing Project entities for Defect Module Screens.
• Conducted walkthroughs of all test cases.
• Coordinated all testing resources.

Wachovia Mortgage Corporation, Charlotte, NC December’02 – January ’04
QA Engineer

Wachovia Mortgage Corporation is a mortgage provider on the East Coast. Wachovia was updating its website to include an online application for home loans and wanted to retain the data collected from applicants. It also wanted to maintain a database of applicants who were rejected, including the basis of rejection, so as to focus on those criteria upon re-application by the same applicants.

Environment: Win Runner, Load Runner, Test Director, UNIX, SQL, Windows 2000, DB2, Oracle, SQL Server 2000, JSP, C++, Manual Testing.

Responsibilities:

• Prepared test plans, test cases, and test procedures in accordance with the Business Requirements.
• Performed Regression, functionality, System, Performance, Interface, front end, back end, negative, positive and User Acceptance Testing using Mercury WinRunner/LoadRunner.
• Created extensive Data Driven Test scripts.
• Performed back end testing to expire and delete user registration.
• Conducted load testing, front end testing and security testing.
• Used WinRunner for GUI tests and tested GUI Standards of this application.
• Documented Test cases for Functionality, Back End Testing.
• Inserted Checkpoints in WinRunner for objects to verify the behavior of component as expected.
• Performed the Backend integration testing to ensure data consistency on front-end by writing and executing SQL statements on the Database.
• Analyzed the results from online monitors and generated reports using Load runner.
• Performed Backend testing using PL/SQL Queries.
• Conducted parameterization in LoadRunner.
• Conducted performance testing using LoadRunner.
• Configured Multiple IP addresses for Load testing in LoadRunner.
• Conducted load testing to generate the load on the server by creating Vuser scripts in LoadRunner.
• Performed auto-correlation testing for checking differences between two versions in LoadRunner.
• Conducted manual data integrity testing using SQL.
• Manually tested each and every module of the application and verified against expected results.
• Manually tested the application for IE and Netscape browsers.
• Manually conducted Positive and Negative Testing.
• Performed smoke testing of the application manually after each build.
• Studied the test plan to provide insight into the business transactions to monitor.
• Performed ad-hoc tests to uncover defects and log them.
• Modified previously existing test cases, which were driven by the manual testing effort so that they are more appropriate for future automation testing.
• Proposed a cost-effective test strategy to provide comprehensive functional testing of the application and uncover any mission-critical deficiencies.
• Executed test cases manually, following test procedures.
• Performed Data Driven Testing to test the application with different sets of data.
• Developed and executed test cases test Scenarios and followed-up defects using Test Director.
• Conducted timely walkthroughs for informational purposes.
• Participated in bi-weekly meetings between testers, developers, and the Project Manager to resolve issues arising after each build, assign priorities to tasks, and suggest solutions and methodologies to overcome problems and streamline everyday processes.
• Helped test engineers troubleshoot and solve any issues that arose.
• Assisted in the user documentation process.
• Prepared Load test Strategies for the application and set guidelines for best practices.
• Tested report functionality and results, applying business rules against expected results.
• Verified fixed defects.

Education:

• Associate in Data Processing, Springfield Technical Community College, Springfield, MA.
• Bachelor of Science (BS), Computer Information Systems; Minor: Business Administration, Westfield State College, Westfield, MA.