Lead Qa Analyst - Test Automation

Location:
Milton Keynes, City of Milton Keynes, United Kingdom
Salary:
$130,000
Posted:
August 31, 2025

Resume:

Raja M. Ismail

678-***-**** (C) ***********@*****.***

SUMMARY

Accomplished QA professional with 19+ years of experience (14+ years in manual testing and 5+ years in test automation). Experienced in complex QA projects.

Worked with clients across industries such as Health Care, Telecommunications, Insurance, e-commerce, Banking, Retail, Metrology, Finance, and Audit (Assurance).

Held different roles, such as Tester, QA Analyst, Sr. QA Analyst, QA Lead, Test Architect, and QA Manager. Well versed in various phases of testing, such as Functional test, Regression test, Performance test, End-to-End test, User Acceptance Test, Smoke Test, etc.

Proficient in creating Test Strategies, Test Plans, Test Cases, and Test Manuals, including training of QA teams. Have also undertaken independent consulting for QA projects during my career.

Professional expertise in leading the testing efforts of:

Web applications, built in different technologies such as .Net and Java

Mobile Applications (Native and Hybrid) built for iPhone, Android, Windows Phone, etc., including rugged hand-held Motorola devices (at Home Depot)

API and Web Services testing, using SOAP UI, Postman and Swagger

MDM (Mobile Device Management) features, using AirWatch

Database Testing (setting up test data, and performing complex queries to verify reports, and data updates in the database)

Mainframe and AS/400 applications

Expertise includes creating Test Automation Strategies as well. Have worked with automation tools such as Selenium, QTP, TestComplete, Ranorex, eTester, eLoad, WinRunner, and SmartWare. Experienced in creating Java programs for automating tests with Selenium and TestComplete.

Worked with Project/Test Management tools such as JIRA, Azure DevOps, HP ALM, Version One, QA Complete, etc.

Worked with Defect Management tools such as JIRA, Bugzilla, Service Desk, TFS, Quality Center, etc.

Throughout my career, worked closely with Development teams, Business, Sales, Product Management, etc.

Worked in Agile/Scrum and Waterfall development methodologies.

Familiar with the Oracle CRM (Incentive Comp) module and the Oracle ERP (Accounts Payable) module, as well as Vertex and Snowflake.

Excellent written and verbal communication skills

Have completed 15 hours of project management training. Exposure to basics of AI and Machine Learning as well.

EDUCATION

Graduate of NIIT (US equivalent of BS Information Systems) Sep 1993 - Jan 1996

National Institute of Information Technology, India

Bachelor of Law, Chennai India June 1987 – June 1990

ACHIEVEMENTS

Standing Ovation Award for performance, recommended by the Project Manager at AT&T Jan 2000

“You Make the Difference” recognition from the client within just 5 months

EXPERIENCE

Client: Albertsons Nov 2020 – Present

Employer: Tata Consultancy Services (TCS)

QA Lead (Tax & VPOS)

Led the testing efforts for the Taxation Phase II feature, implemented by Tax Services. The Tax Phase II service provides taxes for ecommerce orders during checkout and after fulfillment of the order.

Scoped the End-to-End testing of the Tax Phase II feature. Planned, scheduled, and executed the entire testing effort for Tax Phase II. Ensured data was consistent between Tax Services and Vertex.

Coordinated the testing efforts with the upstream and downstream systems, end to end.

Designed and executed functional tests, regression tests, and supported release to production. Supported performance test as well, by providing performance data load. Designed regression test suite and worked closely with automation test engineer. Also developed automation scripts.

Worked closely with Vertex team in ensuring data flow between Vertex and Albertsons Tax system.

Also led the testing efforts in implementing the cloud architecture of VPOS, to support the Clean Receipt 2.0 feature.

Provide QA Signoff of new features and fixes to production, and verify the changes in the production environment.

Manage the testing efforts of complex VPOS changes, to support the launch of SNAP feature to Albertsons customers.

Train new QA resources in the QA processes, and procedures in VPOS.

Participate in Sprint Planning meetings, and other planned and ad hoc meetings.

Participate in Daily Stand-Up meetings and provide testing updates.

Participate in Project Retrospective meetings and contribute ideas and suggestions.

Create test plans and test cases for user stories in a sprint.

Automate new features developed in a sprint, using IntelliJ IDEA, Cucumber, and Java.

Work closely with Developers and the Product Owner to test and close user stories in a sprint.

Environment/Tools: Vertex, Oracle, MongoDB, Snowflake, Azure Cloud, Acupick Emulator, Postman, SQL Server, Swagger, Blob Storage, Cloud Log Analytics, SNOW-R, etc. Automation: IntelliJ IDEA, Cucumber, Java, Studio 3T, Robo 3T, MongoDB Compass, etc.
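The sprint automation described above used Cucumber with Java in IntelliJ IDEA. As a minimal illustration of the underlying BDD pattern, the Python sketch below shows how a runner maps Gherkin step text to step definitions via registered regex patterns; the step wording and the cart logic are hypothetical stand-ins, not the project's actual features.

```python
import re

# Registry of (compiled pattern, handler) pairs, analogous to Cucumber's
# step definitions. The cart scenario below is invented for illustration.
STEP_REGISTRY = []

def step(pattern):
    """Register a step definition under a Gherkin-style regex pattern."""
    def decorator(fn):
        STEP_REGISTRY.append((re.compile(pattern), fn))
        return fn
    return decorator

class CartContext:
    """Shared state passed between steps, like Cucumber's World/context."""
    def __init__(self):
        self.items = []

@step(r'^I add "(\w+)" to the cart$')
def add_item(ctx, name):
    ctx.items.append(name)

@step(r'^the cart should contain (\d+) items?$')
def assert_count(ctx, count):
    assert len(ctx.items) == int(count), f"expected {count}, got {len(ctx.items)}"

def run_scenario(lines):
    """Match each Gherkin line to a registered step and execute it."""
    ctx = CartContext()
    for line in lines:
        for pattern, fn in STEP_REGISTRY:
            m = pattern.match(line)
            if m:
                fn(ctx, *m.groups())
                break
        else:
            raise LookupError(f"undefined step: {line}")
    return ctx

scenario = [
    'I add "milk" to the cart',
    'I add "bread" to the cart',
    'the cart should contain 2 items',
]
ctx = run_scenario(scenario)
```

The capture groups in each pattern become the step function's arguments, which is the same mechanism Cucumber uses to pass parameters from feature files into Java step definitions.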

Client: Ernst & Young (EY) Apr 2018 – Oct 2020

Employer: Tata Consultancy Services (TCS)

Sr. QA Analyst/Tester (Canvas Mobile Applications)

Responsible for the testing efforts of 3 Mobile Applications (EY Canvas Engage, EY Canvas Pulse and EY Canvas Inventory).

Worked in Sprints, based on the Agile/Scrum software development methodology.

Participate in meetings: Sprint Planning, User Story Estimation, Design Review, etc.

Create Test Cases to cover all the Acceptance Criteria in the User Stories, including the happy path and alternate paths in the test scenarios.

Test Execution on iOS and Android phones, and iOS tablets.

Enter Defects in Azure DevOps

Provide Logs, Screenshots, and steps to recreate for bugs entered.

API testing using Postman, and Swagger

Provide updates during Daily Standups

Resolve User Stories

Work with Product Owner and complete the pre demo of the user stories and close the user stories.

Support UAT testing

Provide QA Signoff for release to production

Provide feedback and enhancements ideas in the Project Retrospective

Environment: MAM (Mobile Access Management), Azure Cloud, On Prem applications, MS Authenticator, Citrix SSO VPN, Company Portal, iOS, Android, iOS tablets, Postman, SQL Server, Swagger, etc.
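The API testing above was done with Postman against contracts documented in Swagger. As a hypothetical illustration of the typical checks scripted in such tests (status code, required fields, basic type validation), the sketch below validates a canned JSON response; the endpoint schema and field names are invented for the example.

```python
import json

# Invented contract for the example: required fields and their JSON types,
# of the kind a Swagger/OpenAPI document would specify.
REQUIRED_FIELDS = {"id": int, "status": str, "items": list}

def validate_response(status_code, body_text):
    """Return a list of contract violations (empty list = response passes)."""
    errors = []
    if status_code != 200:
        errors.append(f"expected HTTP 200, got {status_code}")
    body = json.loads(body_text)
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in body:
            errors.append(f"missing required field: {field}")
        elif not isinstance(body[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors

# Canned response bodies standing in for a live API call.
ok = validate_response(200, '{"id": 42, "status": "OPEN", "items": []}')
bad = validate_response(200, '{"id": "42", "status": "OPEN"}')
```

In Postman the same assertions would live in the request's test script; keeping them table-driven (a dict of field names to types) makes it easy to update the checks when the Swagger contract changes.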

Client: SunTrust, Atlanta, GA Jun 2017 – March 2018

Employer: Infosys

Test Team Lead (Channel Link Argo)

Responsible for all testing efforts, such as test planning, test deliverable reviews, and test execution, as well as supporting Customer Acceptance Testing (CAT).

Lead the offshore team on multiple projects to accomplish the testing efforts.

Work with the Business Analysts to review and confirm the requirements.

Work with Data Management Organization to prepare for data needs for complex testing efforts.

Work with the CAT team and support them during the start and execution of CAT.

Handled Core testing projects, as well as Cross Workstream projects.

Plan and execute the testing efforts in conjunction with SunTrust release SLAs and timelines.

Monitor automation testing efforts and ensure at least 50% of testing effort for each project is automated as per the SLA with the client.

Client: Securities Exchange America Inc Jan 2017 – Jun 2017

Sr. QA Automation Engineer

Contribute to the optimization of automation framework standards for NBS (New Business System)

Create TestComplete scripts using the keyword-driven methodology

Create JavaScript functions to support the keyword-driven framework

Establish documentation standards for the automation of the functional tests

Utilize TestComplete features such as Name Mapping, Data-Driven Tests, and Scripting (JavaScript) to accomplish the automation needs

Integrate TestComplete with TFS, and QA Complete
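The keyword-driven methodology above keeps test steps in data tables of (keyword, arguments) rows, with a small driver dispatching each row to a handler function. The Python sketch below illustrates that pattern; the keywords and the login flow are hypothetical, and the real handlers were JavaScript functions driving the application UI through TestComplete.

```python
# Handler functions: one per keyword. In the real framework these would
# interact with the application under test; here they mutate a plain dict
# so the dispatch mechanism itself can be shown and exercised.

def do_navigate(state, url):
    state["page"] = url

def do_enter_text(state, field, value):
    state.setdefault("fields", {})[field] = value

def do_verify_field(state, field, expected):
    actual = state.get("fields", {}).get(field)
    assert actual == expected, f"{field}: expected {expected!r}, got {actual!r}"

# The keyword vocabulary: the only place handlers are wired up.
KEYWORDS = {
    "Navigate": do_navigate,
    "EnterText": do_enter_text,
    "VerifyField": do_verify_field,
}

def run_keyword_test(rows):
    """Execute a keyword table row by row against a shared state object."""
    state = {}
    for keyword, *args in rows:
        KEYWORDS[keyword](state, *args)
    return state

# A test case is pure data -- editable by non-programmers, which is the
# main selling point of keyword-driven frameworks.
test_table = [
    ("Navigate", "https://example.test/login"),
    ("EnterText", "username", "qa_user"),
    ("VerifyField", "username", "qa_user"),
]
state = run_keyword_test(test_table)
```

Because test cases are data rather than code, adding a new scenario means writing a new table, not new scripts; only a genuinely new action requires a new keyword handler.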

Client: Sensus, Morrisville, NC Jan 2016 – Dec 2016

QA Lead/Automation Architect

Created automation strategy to automate testing of the Mobile version (Android phone), and Windows desktop version of the Field Logic Tools application

Implemented the automation framework

Created Automation Scripts

Identified the training needs for other testers

Trained the team in automation scripting

Maintained the scripts in a repository

Maintained Smoke Test Suite, Regression Test Suite to run before each release

Logged detailed defects including step-by-step instructions to recreate the defect, in TTP (Test Track Pro)

Helped to establish best practices in testing processes

Worked closely with IT, Hardware, Software, and Firmware Engineers, while testing and debugging the FieldLogic application in Android HHD

Provided inputs to enhance the proprietary software SmartWare

Environment: SmartWare automation tool (to automate firmware tests), .Net, Appium Mobile driver, Android LG mobile device, UI Automator, Inspect Tool, Test Track Pro, MS SQL

Client: McKesson (Health Care) Jan 2015 – Dec 2015

Lead QA/Sr. QA Analyst

Participated in User Story discussions and provided testing-effort estimates to determine the complexity of each user story, so that appropriate function points could be assigned to each user story

Held QA team meetings and articulated scope of testing for each iteration, and set timelines, and expectations for the testing efforts

Trained the QA team on testing the Web Services using SOAP UI

Updated and maintained the regression suite of tests for each iteration

Maintained test cases and defects in Version One (Agile Project Management Tool)

Provided daily updates of the testing efforts in the daily scrum meeting

Built and published the latest code to the QA environment

Identified known bugs and provided testing status to the management

Updated and setup test data in the SQL database

Create and execute Selenium automated scripts

Environment: Version One, HP ALM, Selenium, JavaScript, API Testing using SOAP UI and Postman

Client: Motorola Solutions Jan 2014 - Dec 2014

Lead QA/Sr. QA Analyst

Created test plan covering the Requirements specified by THD in the PRS (Product Requirement Specification)

Created test cases (positive and negative) for each requirement

Executed the test cases on every new build of the OS for the device, as provided by Motorola

Create and execute Selenium automated scripts

Entered issues in the Clear Quest defect management tool

Attended testing progress meeting every day

Visited Home Depot stores with the devices to test the WiFi coverage and strength within the store, making sure phone calls did not drop and that various device functions could be performed without affecting the current use of the phone

Provided training to other teams on the AirWatch MDM features and services

Worked with client (THD) personnel, and resolved many issues regarding the hardware and software installed on the device

Environment: ClearQuest, AirWatch (MDM), Motorola Handheld Rugged Devices (MC70, TC70), PhoneGap, Selenium, JavaScript, HP ALM

Client: McKesson (Health Care) Oct 2013 - Dec 2013

QA Lead

Participated in User Story discussion, estimation and define Acceptance Criteria for each User Story

Responsible for Test Planning, Scheduling, Execution and Reporting Test Status

Mentored and led the QA resources in the testing efforts, and accomplished the testing

Drafted Test Plan for each iteration

Created Test Cases in Version One and HP ALM

Executed functional and web service tests and updated the test results

Entered defects in HP ALM and tracked the defect status, and retested and updated promptly

Reported testing status in daily scrum meeting

Participated in iteration completion/release meetings

Demonstrated the product functionality developed in each iteration

Provided feedback during the project retrospective

Create and execute Selenium automated scripts

Environment: .Net, Web Services, Version One, HP ALM, Selenium, JavaScript, SOAP UI, SQL Server, Agile/Scrum Methodology

Ingenious Med Inc. (Health Care) Aug 2008 – Oct 2013

Sr. QA Analyst/Lead QA (Full Time)

Worked closely with Business Analysts, Development team, Database team, and handled the testing efforts of every release of IMBILLS

Verified and approved that new functionalities met the requirements

Performed QA testing on Web version, and Smart Phone versions of the application on devices such as PDA, BlackBerry, iPhone and Android

Performed back end verification of data during testing by creating and executing complex SQL queries

Performed Web Services Testing

Performed regression tests before every release

Performed production verification after release to production

Analyzed production issues to identify the root cause and identify steps to recreate the issue

Created and executed automated regression tests using Test Complete automation tool

Supported other teams such as, Tech Support, Sales, Marketing, and other departments with their testing needs/environments

Worked in an agile development environment

Led QA efforts in individual projects: Enterprise Billing, LDAP Authentication, Code Correct, and Patch Releases

Updated and setup test data in the SQL database

Verified reports and worked with the developer on the report generation from the various tables in the SQL server, and verified right presentation of data in the dashboard
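The back-end verification described above (setting up test data, then running complex SQL queries to check reports and data updates) follows a standard pattern: seed known data, run the aggregate query a report would use, and compare against independently computed totals. The sketch below illustrates it with SQLite and invented table and column names; the real work ran against the product's SQL Server database.

```python
import sqlite3

# Seed an in-memory database with known test data. The charges table and
# its columns are hypothetical, chosen only to demonstrate the technique.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE charges (provider TEXT, amount REAL)")
rows = [("smith", 120.0), ("smith", 80.0), ("jones", 50.0)]
conn.executemany("INSERT INTO charges VALUES (?, ?)", rows)

# Run the same kind of aggregate query a report would use.
report = dict(conn.execute(
    "SELECT provider, SUM(amount) FROM charges GROUP BY provider"
))

# Cross-check against totals computed outside the database, so a bug in
# the query and a bug in the report cannot mask each other.
expected = {}
for provider, amount in rows:
    expected[provider] = expected.get(provider, 0.0) + amount
assert report == expected
```

Seeding the data inside the test (rather than relying on whatever is already in the environment) is what makes the expected totals knowable in advance.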

Mobile Testing:

Tested the native application on mobile devices such as PDA, BlackBerry, iPhone and Android

Tested various user functions on mobile devices

Worked with Agile methodology for mobile projects

Security Testing - LDAP:

Performed testing of the authentication and authorization by LDAP Services

Verified different roles were authenticated properly and were able to access the authorized resources of the application.

Created entries for users in the active directory, specified the roles, resources and authentication credentials, and verified the same from the application
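The LDAP authorization testing above boils down to verifying a role-to-resource matrix: every role must reach exactly the resources it is authorized for, and nothing more. The sketch below shows that check in Python with invented roles and resources; the real verification was done against Active Directory entries through the application.

```python
# Hypothetical role-to-resource grants, standing in for what the LDAP
# directory would define for each role.
ROLE_PERMISSIONS = {
    "admin": {"billing", "reports", "user_admin"},
    "biller": {"billing", "reports"},
    "viewer": {"reports"},
}

def can_access(role, resource):
    """True if the role is authorized for the resource (unknown role: no access)."""
    return resource in ROLE_PERMISSIONS.get(role, set())

def verify_role_matrix(expectations):
    """expectations: (role, resource, allowed) triples; returns mismatches."""
    failures = []
    for role, resource, allowed in expectations:
        if can_access(role, resource) != allowed:
            failures.append((role, resource))
    return failures

# Both positive and negative cases: denying unauthorized access matters
# as much as granting authorized access.
checks = [
    ("admin", "user_admin", True),
    ("biller", "user_admin", False),
    ("viewer", "billing", False),
    ("viewer", "reports", True),
]
failures = verify_role_matrix(checks)
```

Expressing the expectations as a table keeps the negative cases explicit, which is where authorization bugs usually hide.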

Web Services Testing:

Performed web services testing using SOAP UI, sending requests to different applications and verifying the result sets for correct format and data validity

Environment: C#, AJAX, VB, .Net, Visual Studio 2008, SQL Database, Mobile Devices (PDA, BlackBerry, iPhone, iPad, and Android), Web Services, LDAP, TestComplete, etc. Agile/Scrum methodology
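The SOAP web services testing above checks two things about each response: structure (required elements present) and data validity. The sketch below illustrates those checks against a canned XML envelope; the element names and the charge payload are a hypothetical stand-in for the service's real contract, which was exercised through SOAP UI.

```python
import xml.etree.ElementTree as ET

# Canned response standing in for a live SOAP call; the envelope shape and
# fields are invented for the example.
CANNED_RESPONSE = """\
<Envelope>
  <Body>
    <GetChargeResponse>
      <ChargeId>1001</ChargeId>
      <Amount>120.00</Amount>
      <Status>POSTED</Status>
    </GetChargeResponse>
  </Body>
</Envelope>"""

def verify_soap_response(xml_text):
    """Check structure and data validity; return a list of problems found."""
    problems = []
    root = ET.fromstring(xml_text)
    resp = root.find("./Body/GetChargeResponse")
    if resp is None:
        return ["missing GetChargeResponse element"]
    for tag in ("ChargeId", "Amount", "Status"):
        if resp.find(tag) is None:
            problems.append(f"missing element: {tag}")
    amount = resp.findtext("Amount")
    if amount is not None and float(amount) < 0:
        problems.append("Amount must be non-negative")
    return problems

issues = verify_soap_response(CANNED_RESPONSE)
```

Returning a list of problems rather than failing on the first one mirrors how SOAP UI assertions report every violated check per response.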

Client: Macy’s Jan 2008 – Jul 2008

QA Manager (Level 5)/Test Advocate

Managed the QA efforts for the Purchase Order (PO) Number Reuse project. The objective of the project was to provide a solution to meet the growing demand for PO Numbers in the Order Management system, due to increased business in the FedFil department

Managed a team of 4 resources to accomplish the testing efforts for the project

Planned the testing effort right from the early stage of requirement consolidation, to the final stage of implementation

Established a QA team (Recruit and Retain)

Handled multiple release testing efforts

Prepared Master Test Plan, and Master Schedule

Identified training needs for the team and provided training for the team

Scheduled a review of deliverables with Product Manager, Development Manager, Project Manager and Operations Advocate

Coordinated with Configuration Management team, Database team, Architect team, etc. to ensure all components are in place in the test environment

Ensured sign-off obtained on deliverables from all key players and stakeholders

Monitored team progress by formal weekly meetings and informal daily meetings

Coordinated test planning and test execution with different Family of Business (FOB)

Planned and coordinated User Acceptance Testing (UAT)

Environment: VB, DB2, Mainframe, COBOL, BugTracker, MOSS (Microsoft SharePoint Server), Cube technology

Client: AT&T Oct 2006- Dec 2007

Lead QA/Automation Strategist

Developed a strategy document laying out various phases involved in automation of CARE functionalities

Obtained Signoff of the Strategy document from the project management

Implemented each phase in the strategy document. The phases for the Automation Infrastructure Development included: select Automation Tool, obtain Software License, work with Network Support team to setup server in the network, with Citrix access for Offshore and Onsite team

Built an automation team of 3 members in the offshore site, and 1 member onsite

Provided training to team members on the automation tool, and the application

Developed an Automation Plan using HP ALM and Automation Schedule

Handled the manual testing efforts of RPA (Rate Plan Analyzer) system; led a team of 2 onsite

Planned, scheduled and executed an urgent request from Project Management for QA Certification of RPA

Maintained Proof of Test documents for the testing efforts, and supported Production Verification

Effectively handled the challenges in working in an AS/400 environment for the first time

Environment: AS/400, Clarify, Oracle, QTP, HP ALM

Perfect Commerce July 2004 – Sep. 2006

Perfect Commerce is an e-commerce enabler that provides a suite of hosted applications to corporate Buyers and Suppliers. The hosted applications enabled Buyers and Suppliers to eliminate paperwork relating to orders, receipts, invoices, and payments. Vendors created purchase orders using Procurement Manager; Suppliers used the Supplier Invoice Manager to create invoices. Buyers used Receipt Manager to create receipts against goods received, and Payment Manager to keep track of payments. The applications also interfaced with the ERP systems of Buyers and Suppliers, to provide the information relating to Orders, Receipts, Payments, and Invoices that those ERP systems required for processing.

Sr. QA Analyst/QA Lead

Responsible for Automated and Manual Testing:

Reviewed Requirements with the business team, development team and architecture team

Determined requirements as GUI requirement, Functional requirement, Batch process requirement, System Interface requirement, Report requirement

Prepared Test Plan and Test Schedule and obtained signoff from project management

Led the offshore team of 3 and 1 onsite resource

Managed automation test efforts using eTester, and performance testing using eLoad

Performed Integration testing between the hosted applications and the ERP systems, and ensured data flow and data validation between the systems

Performed complex SQL queries against the database to verify data integrity

Led the testing efforts for data center migration from Las Vegas, NV to Kansas City, MO

Visited the Data Center and Development center at Las Vegas to understand the various components involved in the migration, and to determine the scope of testing required to test each component

Studied a .Net based application that was being migrated, while primarily working on a Unix based system

Prepared a Level of Effort document and obtained approval from project management

Scheduled and coordinated test execution with the offshore team of 4 to test the migration

Logged, tracked and resolved defects in Service Desk

Verified various reports and extracts generated by the hosted applications

Supported User Acceptance Test, and executed Production Verification Tests after release

Environment: .Net, Linux, Java, Apache Web Server, Windows NT, Perl, ServiceDesk, eTester, Citrix, Oracle, SQL, SiteMinder, Empirix products such as eTester, eLoad and eManager

US Central Credit Union Sep. 2003 – April 2004

US Central Credit Union hosts IT services for other credit unions. Several applications supporting the business needs of credit unions were developed, hosted, and maintained by US Central Credit Union, with customization of features for the clients, including applications such as Lockbox, Positive Pay, ACH, Wire Transfer, Alerts, and Book Transfer.

Senior QA Analyst/Tester

Responsible for Automated and Manual Testing

Performed Functional and Regression testing of the application

Automated tests using WinRunner

Created test cases to test SiteMinder functionalities, such as Basic Resource Protection and Basic Authentication

Integrated Test Plan, Test Cases and Test Scripts in Test Director

Built automated test scripts with WinRunner and Integrated automated scripts in Test Director

Performed regression test, to ensure that changes made did not affect other functionalities

Worked closely with System Engineers, System Analysts and Programmers during system test, and troubleshot and resolved non-conformities

Supported Load/Stress test using LoadRunner

Supported User Acceptance Test, and executed Production Verification Tests in Production

Environment: Windows NT, Windows 2000, Java, C++, Perl, Test Director, WinRunner, Citrix, MS-MQ, SQL database, Visual SourceSafe, SiteMinder, LDAP Server

Client: AT&T Jun 1998 – Jul 2003

Sr. QA Analyst/QA Lead

Performed Manual Testing: Black Box & Glass Box

Reviewed Feature Requirement Specifications (FRS) designed by the system engineer for every new/modified feature in SSD. The review team included System Engineers, System Analysts, and System Testers. (SSD (Single Source Data) is an application that serves as the database of record for various switches, trunks between switches, and route patterns that define the routing of calls generated at each switch in the network.)

Participated in review of Process Specifications (PS), developed by system analysts

Analyzed the FRS and PS extensively to understand the impact of the changes

Created processes and templates for effective testing of the application

Created test plan defining the scope, schedule and resource for testing

Generated simple and complex test cases to extensively cover the application functionality

Identified/Generated test data as required, and tested and validated conformity to the requirements

Reported test status to Project Manager on a weekly basis

Conducted team meetings on a regular basis

Performed regression test, to ensure changes in the application did not affect existing functionalities

Worked closely with System Engineers, System Analysts and Programmers during system test, and helped troubleshoot and resolve non-conformities

Logged, and tracked defects in Sablime (defect tracking and configuration management tool)

Supported User Acceptance Test before cut to production

Performed Integration Test and tested the interface between SSD and other systems such as Access Control Management System (ACMS), Route Code Component (RCC), Service Now - Routing (SNOW-R), and Traffic Capacity Management (TCM), to ensure the flow of data between systems is in accordance with the Application to Application Interface Specification (AAIS)

Supported Solution Test (End to End), working closely with System Testers, System Engineers and System Test Managers of other systems, during solution test

Environment: Unix, Sybase, C, C++, Sablime, Teamwork, TOAD
