
ARKADIY ROMALIS

Jacksonville, FL ***** (C) 904-***-**** (E) adlahr@r.postjobfree.com

QUALITY ASSURANCE ANALYST

SUMMARY OF QUALIFICATIONS

Software QA tester with full system development lifecycle experience, including designing, developing and implementing test plans, test cases and test processes that drive swift corrective actions, significant cost savings and fault-free audits.

Highly experienced Quality Assurance Analyst with a record of developing successful QA projects and solutions in Waterfall, Agile, and CMM environments. Skilled at employing both manual and automated testing tools to conduct integration, user-acceptance, functionality, object, regression, load, performance, and stress testing. Experienced in developing use cases and user requirement specification documents. Experienced with enterprise-level automation and test management tools.

Well-organized, with an eye both for detail and the “big picture”

Extensive experience working both individually and as part of a team

Able to communicate clearly and concisely with members of Business, Technology, and Management

Significant exposure to various layers of functional and integration testing.

Consistently recognized and tasked by management to improve Quality Assurance effectiveness and efficiency, able to effect leadership that realizes accelerated performance and sustained strategic flexibility while creating a competitive corporate advantage. Expertise in all phases of the software life cycle, including requirements gathering, risk analysis, project planning, scheduling, testing, defect tracking, management, and reporting.

Ability to take responsibility for areas of a test plan and work with others to execute that plan. The flexibility, intelligence, and motivation to work in a variety of business domains.

Possess an extensive knowledge of the processes, procedures, and policies necessary to consistently produce a superior product and service. Technically savvy and a quick study in leading QA technology.

CORE COMPETENCIES

SQA Testing & Methodologies • Test Plans, Cases & Processes • Functional Requirements • Functionality Testing • Scripting & Documentation

System Integration Testing • Regression & Negative Testing • UI & Compatibility Testing • System Testing (Load, Stress, Performance) • Web and

Client / Server Applications Testing • Client Relationship Management • Process Improvement • Project Management • Manual and Automated

Testing • Quality Assurance Engineer • Training • Testing Automation • Defect / Bug Tracking • Test Strategies & Coverages • QA & QC Standards

TECHNOLOGY

WinRunner 6.0/8.0 • Test Director 6.0/7.6 • Mercury Quality Center 9.0/9.2/10.0 • HP ALM Explorer 12.53 • LoadRunner 6.5/7.5/8.2/9.0/9.5/10 • QTP 8.2/9.2/10.0 • e-Load 8.2 • e-Tester 8.2 • SmarteSoft/SmarteScript 5.4/5.4.6/5.5.2 • SmarteSoft/SmarteTime 5.0/5.0.8 • SmartBear/TestComplete 8/12.1/12.42/12.5 • JIRA • RALLY (CA Agile Central) • SQA ROBOT 6.1 • SQA MANAGER • APL Plus 2.1 • KEA! 420 v.5.1 • COBOL • Sun Solaris • UNIX • UNIX SHELL scripting • Java • C • C++ • HTML • DB2 • MySQL • ORACLE 8.0/9.0/10 • SQL • TOAD Data Point 4.0, 8.5 • VB Script • OS/MVS • MS Access • LOTUS • PVCS • MS Windows/DOS • MS Visual SourceSafe • .NET • allTRA • XML • SWIFT • Test Tracker • FogBugz • ReadyAPI 2.3 • NoSQLBooster for MongoDB • Beyond Compare 4.2.9 • Zeppelin 0.7.3 • MS SQL Server 2016 Management Studio • HADOOP 3.1 • HUE 3.11.0 – the HADOOP UI • Postman for API testing

TRAINING

UNIX Fundamentals and SHELL Programming - SAM Career Center New York, NY

Business Intelligence and RDBMS Technologies: Oracle 9i RDBMS • ERStudio • DB Artisan • MS Excel (PTS) / MS Access

Data Warehousing • MS SQL Services • MS Analysis Services • DTS • Business Objects 5i Reports • Business Objects 5i

Universe Design - New Age Training New York, NY

PROFESSIONAL EXPERIENCE

Florida Blue/Synergy Technologies, LLC – Jacksonville, FL May 2018 – October 2020

QA Analyst

Worked in an Agile/Scrum development environment with frequently changing requirements and feature sets. Reviewed product requirement documents and functional specifications, and was involved in developing test strategy, test plan and test case documents. Interacted with Business Analysts and Software Developers for bug reviews and participated in QA meetings.

Created, maintained and executed detailed regression testing. Developed test plans and executed test cases for manual functional testing in a large-scale server systems architecture environment. Maintained standard test templates for functional test plans, test scenarios and test cases.

Documented test procedures, reconciling test results from different tests. Helped the team deliver well-designed, well-written test cases. Oversaw the execution of test cases, reviewed test results and reported defects.

Created structured, clean, and cohesive test cases for all existing and new features and/or functional changes in the software. Clearly and concisely identified software defects and other issues, both in writing and verbally. Tracked the status of team testing activities accurately and in a timely fashion.

Provided UAT support by understanding business requirements and worked with other team members performing UAT.

Databases: MongoDB, MS SQL, HADOOP, DB2. Created SQL queries for testing data in the database by comparing data from source files against target tables, and by comparing data in the database against metrics reports. Tested the logic for creating data for metrics reports.
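As an illustration of this kind of source-to-target check, the sketch below shows a minimal reconciliation query; the table and column names (stg_member_survey, dw_member_survey, rpt_survey_metrics, member_id, survey_score) are hypothetical placeholders, not the actual Florida Blue schema.

-- Minimal sketch of a source-to-target reconciliation check (hypothetical tables).
-- Rows loaded from the source file that are missing or changed in the target table:
SELECT member_id, survey_date, survey_score
FROM stg_member_survey       -- staging table loaded from the source file
EXCEPT
SELECT member_id, survey_date, survey_score
FROM dw_member_survey;       -- target table being verified

-- Row-count comparison between the target table and the metrics-report feed:
SELECT
  (SELECT COUNT(*) FROM dw_member_survey)   AS target_rows,
  (SELECT COUNT(*) FROM rpt_survey_metrics) AS metrics_rows;

An empty EXCEPT result indicates the two sides match for the compared columns; any returned rows would be investigated as load or mapping defects.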

Used RALLY (CA Agile Central) and Mercury Quality Center for creating user stories, tasks and test cases, executing test cases, and tracking defects across different projects.

Online Provider Directory, Digital Health and Analysis Marketing (IT Web and Mobile)

CX MEASUREMENT (MSRT). The intent of this project was to design and deploy a pilot of the post-doctor-visit survey to assess the ability to understand the member's experience with their provider and early indicators/targeted areas of improvement for CAHPS/HOS performance.

CX Analytics (CX Insights Journey, Metrics Report – FCR Tracker, Member Timeline). The intent of this project was to send an automated survey to providers at Florida Blue and collect their feedback with regard to metrics based on the responses from providers.

Inspire/Highmark/Print/Mail: Verified that contract generation requests were fulfilled and composed at Inspire for each test condition set up in the test environment. Verified that the Control Page front contained the ID Card (external to Inspire), Welcome brochure, Master Policy (group packages only), Schedule of Benefits, Endorsements, and notices, and that the Control Page back contained the sections of the benefit booklet. Verified that an AFP file was generated at Inspire for printing and mailing at Highmark, and that a PDF file was generated at Inspire for ECMS data archival.

Reconciliation/Inspire web services: Verified that the attributes (record processing status) within Highmark acknowledgement files were processed in the Reconciliation database, that contract processing information was forwarded to ECMS and the image ID retrieved, and that the touchpoint extract file was created at Inspire services.

ECMS: Storing documents: ensured that Content Connect received, processed and stored PDF files from Inspire and provided an acknowledgement back with a unique ID for each document. ID retrieval: verified that contract documents were retrieved from ECMS by Inspire services using unique IDs. Metadata retrieval: verified that ECMS provided matching contract documents in response to an MWS request containing the Package ID, HCCID, or both via Quadient web services. Metadata search UI: ensured that the UI displayed matching documents for every combination of metadata search.

Touchpoint: Ensured that image IDs were created at ECMS for every contract generated at Inspire, and that touchpoint files were created at Inspire/web services with the printed timestamp and the image ID retrieved from ECMS. Verified that the acknowledgement file from Highmark for print completion was processed at Reconciliation and an update was provided to Touchpoint from Inspire. Verified both Highmark and O’Neil touchpoint integration files for group communication contracts.

MWS: Verified that the contract fulfillment package was viewable by the subscriber/group for which the HCC ID was set up and contracts were generated, and that all Benefit Booklet and Summary of Benefit Coverage documents were displayed as appropriate via the member website. Confirmed that the group could view the package at www.Floridablue.com, that the subscriber/member could view the package at the member website, and that the Group Master Policy was not displayed on the member website.

AVAILITY, LLC – Jacksonville, FL January 2017 – May 2018

QA Analyst

Availity is an industry-leading, HITRUST-certified health care information technology company that serves an extensive network of health plans, providers, and technology partners nationwide through a suite of dynamic products built on a powerful, intelligent platform. We integrate and manage the clinical, administrative, and financial data our customers need to fuel real-time coordination and collaboration amongst providers, health plans, and patients in a growing value-based care environment.

Worked in an Agile/Scrum development environment with frequently changing requirements and feature sets. Reviewed product requirement documents and functional specifications, and was involved in developing test strategy, test plan and test case documents. Interacted with Business Analysts and Software Developers for bug reviews and participated in QA meetings. Proactively came up with innovative methods to improve software quality, test coverage, efficiency and regression coverage.

Created a quality assurance program that allowed application development teams to provide higher quality deliverables in shorter turnaround times. Produced detailed test reports via Quality Center, leading testers in the execution of test cases, reviewing test results, and reporting defects.

Supervised all functionality testing on each milestone of the application. Advised on best practices for successful QA. Communicated with all levels of management on test status. Met daily with Business Analysts, Product Staff and Development on test issues and requirements.

Created, maintained and executed detailed regression testing of Web apps across multiple platforms.

Tested an Android-based application.

Provided UAT support by understanding business requirements and worked with other team members performing UAT.

Created a formal quality assurance structure, developing testing processes according to SDLC and use case standards and implementing test plans, matrices, scripts, and tools using gathered business requirements. Drafted policies and procedures, trained business analysts and developers in their use, and guided the team throughout the implementation of the program. Developed test plans and executed test cases for manual functional testing in a large-scale server systems architecture environment. Maintained standard test templates for functional test plans, test scenarios and test cases. Documented test procedures, reconciling test results from different tests. Helped the team deliver well-designed, well-written test cases. Oversaw the execution of test cases, reviewed test results and reported defects.

Conducted system integration, user acceptance, functionality, object, regression, load, performance, and stress testing. Experienced in the creation of performance tests and metrics. Acted as lead for all performance testing.

Developed automated testing scripts using LoadRunner.

Coordinated component, system and documentation testing with the appropriate technical groups and release management. Documented test procedures and findings. Reconciled test results from different tests and different groups. Assessed readiness and deviation of product/project performance based upon test results and product specifications. Performed functional and system tests on Web-based applications in different browsers: IE, Firefox, Safari, and Chrome.

Processed transactions from system entry to exit. Maintained metrics reports. Oversaw all aspects of automated test script creation, execution, and analysis. Developed test scripts based upon business requirements and processes, in line with defined workflows and use cases. Maintained a library of tests and scripts. Created and updated test databases, defect databases, plans, schedules, and scripts. Documented and prioritized defects using the JIRA defect tracking tool. Reviewed all reported bugs and assigned them to the appropriate individuals. Experienced in the preparation of automated tests within a keyword-driven automation framework. Extensively used the automated test tools HP ALM and ReadyAPI for regression testing. Worked with business and technology leads to identify the appropriate data for testing and prepared that data for test cases using SQL.

AIG/DIVERSANT LLC (contract) - Jersey City, NJ September 2013 – July 2016

Sr. QA Engineer/ Coordinator

AIG Property Casualty CCUW (Complex Commercial Underwriting) is a full policy-lifecycle system that covers all the Financial and Casualty lines of business for Global and Domestic clients, including different products.

Worked in an Agile/Scrum development environment with frequently changing requirements and feature sets. Reviewed product requirement documents and functional specifications, and was involved in developing test strategy, test plan and test case documents. Interacted with Business Analysts and Software Developers for bug reviews and participated in QA meetings. Proactively came up with innovative methods to improve software quality, test coverage, efficiency and regression coverage. Involved in occasional testing for Workers' Comp and BOP lines of business.

Created a quality assurance program that allowed application development teams to provide higher quality deliverables in shorter turnaround times. Produced detailed test reports via Quality Center and reported defects. Met daily with Business Analysts, Product Staff and Development on test issues and requirements.

Used Continuous Delivery to get changes of all types—including new features, configuration changes, bug fixes and experiments—into production, or into the hands of users, safely and quickly in a sustainable way.

Created a formal quality assurance structure, developing testing processes according to SDLC and use case standards and implementing test plans, matrices, scripts, and tools using gathered business requirements. Drafted policies and procedures, trained business analysts and developers in their use, and guided the team throughout the implementation of the program. Developed test plans and executed test cases for manual functional testing in a large-scale server systems architecture environment. Maintained standard test templates for functional test plans, test scenarios and test cases. Documented test procedures, reconciling test results from different tests. Helped the team deliver well-designed, well-written test cases. Oversaw the execution of test cases, reviewed test results and reported defects.

Provided UAT support by understanding business requirements and worked with other team members performing UAT.

Conducted system integration, user acceptance, functionality, object, regression, load, and performance testing. Coordinated component, system and documentation testing with the appropriate technical groups and release management. Documented test procedures and findings. Reconciled test results from different tests and different groups. Assessed readiness and deviation of product/project performance based upon test results and product specifications. Performed functional and system tests on Web- and mobile-based applications using QTP, HP ALM, and LoadRunner.

Tested in a mainframe environment.

Processed transactions from system entry to exit. Maintained metrics reports. Oversaw all aspects of manual test case creation, execution, and analysis. Developed test cases based upon business requirements and processes, in line with defined workflows and use cases. Maintained a library of test cases using Mercury Quality Center. Designed and executed performance test plans and strategies. Created and updated test databases, defect databases, plans, schedules, and test cases. Documented and prioritized defects using the QC defect tracking tool. Reviewed all reported bugs and assigned them to the appropriate individuals.

Surecomp – Hoboken, NJ August 2012 – July 2013

QA Team Lead

Surecomp is the leading provider of global trade solutions for banks and corporations. Surecomp provides an integrated portfolio of trade finance, supply chain and treasury confirmation matching solutions that anticipates constantly changing market requirements. allTRA is Surecomp’s latest back-office trade finance solution. Developed entirely in the advanced Java J2EE multi-platform environment, allTRA meets the trade finance requirements of banks of all sizes.

Worked with product designers, business analysts and developers at all stages to promote quality. Held primary responsibility for identifying problems with the software and its design, which involved close interaction with development and support team members, product designers and business analysts at all project and development stages.

Worked on multiple requirements/projects concurrently in a matrix-managed environment.

Planned and produced test plans and executed manual tests. Successfully tested the web service and integration of allTRA with a multibank platform using SoapUI and XML messages. Tested integrated messaging via SWIFT, email, fax, telex and printed output. Tested batch programs such as Auto Renewal, Update Participation Facility, and Memo. Used the UNIX command line and shell scripting to work with files and batch programs.

Assigned and managed work for QA testers. Collaborated within and outside the Test Team to achieve the desired end result. Took part in delivery scope meetings and initiated the testing process to deliver a quality product. Worked with developers to perform root-cause analysis and preliminary problem diagnosis. Accountable for timely and quality test deliverables throughout the project life cycle.

Used Continuous Delivery to get changes of all types—including new features, configuration changes, bug fixes and experiments—into production, or into the hands of users, safely and quickly in a sustainable way.

Provided UAT support by understanding business requirements and worked with other team members performing UAT.

Produced and prioritized bug reports and reproduced bugs reported by customers, using Test Tracker. Coordinated with the development/support teams to ensure required test platforms were available. Worked with developers to perform root-cause analysis and preliminary problem diagnosis; prepared deliverables such as test summary reports and test metrics documents based on coverage. Wrote SQL queries to access the data from the database tables and validated the results. The data was fed from MySQL.
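A minimal sketch of the kind of MySQL validation query this role involved is shown below; the table and column names (lc_transaction, swift_message, lc_reference) are hypothetical and do not reflect the actual allTRA schema.

-- Hypothetical tables: lc_transaction (letters of credit) and swift_message (outgoing SWIFT traffic).
-- Flags issued LCs whose amount disagrees with the generated MT700 message.
SELECT t.lc_reference, t.amount AS lc_amount, m.amount AS swift_amount, m.msg_type
FROM lc_transaction t
JOIN swift_message m ON m.lc_reference = t.lc_reference
WHERE t.status = 'ISSUED'
  AND m.msg_type = 'MT700'
  AND t.amount <> m.amount;

A non-empty result set from a check like this would be the kind of mismatch raised as a defect in Test Tracker.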

Avenue-e Health Strategies (contract) - New York, NY September 2011 - August 2012

A Sudler & Hennessey Company/ Solomon-Page Technology Partners

Senior Quality Assurance Engineer

Worked in an Agile development environment with frequently changing requirements and feature sets. Reviewed product requirement documents and functional specifications, and was involved in developing test strategy, test plan and test case documents. Interacted with Business Analysts and Software Developers for bug reviews and participated in QA meetings. Proactively came up with innovative methods to improve software quality, test coverage, efficiency and regression coverage.

Gathered requirements for the creation of test cases for functional testing, providing official test documentation. Developed and reviewed functional specifications with the product management and development teams. Managed manual testing and worked with Developers, Product Staff and Business Analysts to flesh out requirements and create test strategy and plans. Maintained standard test templates for functional test plans, test scenarios and test cases. Conducted system integration, user acceptance, functionality, object, and regression testing. Created, maintained and executed detailed regression testing of mobile and Web apps across multiple platforms.

Delivered well-designed, well-written test cases to the project. Oversaw the execution of test cases, reviewed test results and reported defects. Created detailed test status reports in a variety of formats. Advised on best practices for successful QA. Communicated with all levels of management on test status. Met daily with Business Analysts, Product Staff and Development on test issues and requirements. Worked with the offshore test and development teams on the division of responsibilities and test management.

Wrote documentation for test plans, test requirements and test case specifications. Executed functional, negative and regression tests for a Web-based application. Created SQL queries to monitor data transactions, conducting manual testing and defect tracking using FogBugz. The data was fed from MySQL.

Vault.com - New York, NY August 2008 – September 2009

Senior Quality Assurance Engineer

Worked in an Agile development environment with frequently changing requirements and feature sets. Reviewed product requirement documents and functional specifications, and was involved in developing test strategy, test plan and test case documents. Interacted with Business Analysts and Software Developers for bug reviews and participated in QA meetings. Proactively came up with innovative methods to improve software quality, test coverage, efficiency and regression coverage.

Created a quality assurance program that allowed application development teams to provide higher quality deliverables in shorter turnaround times. Produced detailed test reports via Quality Center, leading testers in the execution of test cases, reviewing test results, and reporting defects.

Supervised all functionality testing on each milestone of the application. Advised on best practices for successful QA. Communicated with all levels of management on test status. Met daily with Business Analysts, Product Staff and Development on test issues and requirements.

Created, maintained and executed detailed regression testing of Mobile and Web apps across multiple platforms.

Created a formal quality assurance structure, developing testing processes according to SDLC and use case standards and implementing test plans, matrices, scripts, and tools using gathered business requirements. Drafted policies and procedures, trained business analysts and developers in their use, and guided the team throughout the implementation of the program. Developed test plans and executed test cases for manual functional testing in a large-scale server systems architecture environment. Maintained standard test templates for functional test plans, test scenarios and test cases. Documented test procedures, reconciling test results from different tests. Helped the team deliver well-designed, well-written test cases. Oversaw the execution of test cases, reviewed test results and reported defects.

Tested the ad content, ad quality and other industry-expected standards on multiple devices (supported by all operating systems) and browsers. Captured any deviance found in the creative against the industry standard and reported it back to the customer.

Conducted system integration, user acceptance, functionality, object, regression, load, performance, and stress testing.

Coordinated component, system and documentation testing with the appropriate technical groups and release management. Documented test procedures and findings. Reconciled test results from different tests and different groups. Assessed readiness and deviation of product/project performance based upon test results and product specifications. Performed functional and system tests on Web-based applications in different browsers (IE, Firefox, Safari, and Chrome) using QTP (VBScript) and LoadRunner, in both the QA and back-up environments.

Processed transactions from system entry to exit. Maintained metrics reports. Oversaw all aspects of automated test script creation, execution, and analysis. Developed test scripts based upon business requirements and processes, in line with defined workflows and use cases. Maintained a library of tests and scripts using Mercury Quality Center. Designed and executed performance test plans and strategies. Created and updated test databases, defect databases, plans, schedules, and scripts. Documented and prioritized defects using the JIRA defect tracking tool. Reviewed all reported bugs and assigned them to the appropriate individuals. Experienced in the preparation of automated tests within a keyword-driven automation framework. Extensively used the automated test tool QTP (VBScript) for functional and regression testing and LoadRunner for load, stress, and performance testing. Tested Web-based applications built on IBM portal technology and written in Java to verify relevant existing system functionality. Worked with business and technology leads to identify the appropriate data for testing and prepared that data for test cases using SQL. The data was fed from DB2 and MySQL.

TIAA-CREF/KEANE (contract) - New York, NY February 2008 – July 2008

Senior Quality Assurance Engineer

Led performance testing for Web applications running on distributed environments: J2EE platform, BEA WebLogic, and SunGard applications, using HP LoadRunner and the Empirix e-Load test automation tool suite. The data was fed from Access, SQL, and Oracle.

Coordinated pre-test tasks. Ensured performance test requirements were received. Developed the performance test plan. Recorded scripts in the Virtual User Generator (VuGen) and e-Tester. Calculated the number of virtual users in Mercury/HP LoadRunner and/or Empirix e-Load. Understood monitoring requirements based on test objectives.

Coordinated test execution tasks. Coordinated the test window and secured technical support for monitoring of infrastructure and qualifying observations. Ensured correct environment configuration prior to execution. Executed performance and other non-functional tests. Monitored environment components during tests, including the test automation tool console (LoadRunner and/or Empirix) and application logs such as log4j, CPU and memory. Addressed all technical issues and facilitated resolution and necessary follow-up.

Coordinated post-test tasks. Gathered various reports and statistics from monitoring groups, packaging and reporting results to project teams. Assisted in the explanation of results and other post-test initiatives. Generated performance graphs, session reports and other related documentation required for validation and analysis. Published results and received appropriate signoffs. Prepared detailed status reports and monitored all defects and issues. Reported issues in the Mercury Quality Center defect tracking system. Used the UNIX command line to collect test results.

Facilitated administration and service; participated in unit metrics development, web site maintenance, and related process improvement initiatives.

Christie's Auction House - New York, NY February 2006 – October 2007

Lead Quality Assurance Analyst

Worked in an Agile development environment with frequently changing requirements and feature sets. Reviewed product requirement documents and functional specifications, and was involved in developing test strategy, test plan and test case documents. Interacted with Business Analysts and Software Developers for bug reviews and participated in QA meetings. Proactively came up with innovative methods to improve software quality, test coverage, efficiency and regression coverage.

Managed a functional and automation test team, creating test strategy and plans and producing detailed test reports via Quality Center. Maintained standard test templates for functional test plans, test scenarios and test cases. Worked with the offshore test management team on the division of responsibilities and test management. Worked with business and technology leads to identify the appropriate data for testing and prepared that data for test cases using SQL. The data was fed from SQL Server. Tested functionality delivered across applications and workflows. Tested Web-based applications in a .NET environment, written in Java and C++, to demonstrate relevant existing system functionality.
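As a sketch of how test data might be pulled with SQL Server syntax for this kind of work, the query below selects candidate records covering different workflow states; the table and column names (sale, lot, lot_status) are hypothetical, not the actual Christie's schema.

-- Hypothetical tables: sale and lot. Pulls recent lots in each workflow state
-- so test cases can cover the SOLD, PASSED, and WITHDRAWN branches.
SELECT TOP 20
    s.sale_number, l.lot_number, l.lot_status, l.estimate_low, l.estimate_high
FROM sale s
JOIN lot l ON l.sale_id = s.sale_id
WHERE s.sale_date >= '2007-01-01'
  AND l.lot_status IN ('SOLD', 'PASSED', 'WITHDRAWN')
ORDER BY s.sale_date DESC;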

Instituted a global quality assurance program that allowed application development teams to provide higher quality deliverables in shorter turnaround times. Management activities included recruiting, project assignments, scheduling, training, performance reviews, and mentoring, as well as participating in budget activities. Supervised all functionality testing on each milestone of the application. Supervised all additional internal quality assurance testers assigned to the product as well as any outside testers. Developed the testing process. Created a formal quality assurance structure for the team to follow. Developed a plan to implement the V-Model QA process. Drafted policies and procedures, trained business analysts and developers in their use, and guided the team throughout the implementation of the program.

Created a formal quality assurance structure, developing testing processes according to SDLC and use case standards and implementing test plans, matrices, scripts, and tools using gathered business requirements. Drafted policies and procedures, trained business analysts and developers in their use, and guided the team throughout the implementation of the program.


