
QA

Location:
Indian Trail, NC
Posted:
March 23, 2025

Resume:

PROFESSIONAL SUMMARY:

Results-driven IT professional with 17+ years of expertise in Data Engineering, AI-driven Testing, and Cloud Platforms across Banking, Healthcare, Retail, and Insurance sectors.

Specialized in Data Analytics, Data Lake Architecture, Cloud Migrations, AI/ML Validation, and Automation.

Experienced in Data Quality Engineering, ETL/ELT validation, MLOps, and CI/CD pipelines in Agile/DevOps environments.

Adept at leading cross-functional teams and utilizing tools like GitHub Copilot

Proficient in Python, SQL, Azure, Snowflake, Databricks, Apache Spark, and Generative AI tools. Skilled in leveraging Power BI and MicroStrategy for Business Intelligence reporting and building automation pipelines

Designed and built test cases; developed, maintained, and troubleshot test scripts; coordinated debugging efforts with team members; and assisted developers with root cause analysis, clarifying the results for them.

Used USQL and Python for Data Lake Analytics validation.

Performed data validation on Apache Spark and Azure Databricks.

Achieved proficiency in creating Requirement Traceability Matrix (RTM) and Bug Reports.

Executed tests using automation tools such as HP Quick Test Professional (QTP/UFT), the AXE tool, and Selenium.

Proficient in Java programming, Selenium WebDriver, and JUnit.

Experienced in writing test cases using element locators and TestNG annotations.

Good exposure to the UFT (formerly QTP) integrated environment (Step Generator, Synchronization, Actions, Recovery Scenarios, Methods, etc.).

Performed functional web services testing using SoapUI and POSTMAN.

Participated in the IBM Watson Generative AI (GenAI) challenge.

Expert in JIRA and HP's Quality Center for Test Designing, Requirement Mapping, Reports and Defect tracking.

Maintains close coordination with offshore teams, assigning work and providing technical and process guidance whenever needed.

Created testing metrics for various status reports, including Weekly Status Reports, Monthly Dashboards, and Test Summary Reports.

Experienced in writing test plans, defining test cases, and developing and maintaining test scripts according to business specifications.

Extensive Technical debugging skills to assist in root cause analysis and provide possible solutions.

Recognized for high productivity, attention to detail, problem-solving skills, and the ability to effectively engage with and meet business requirements.

SPECIALIZED IN:

Requirement Analysis, Use Cases, Planning and Monitoring the Traceability.

Software test planning, design, and execution for Functional Testing, Manual Testing, ETL Testing, and Automation Testing.

Performed Integration, Regression, Functional, and User Acceptance Testing.

Business and software requirements management.

TECHNICAL SKILLS:

Cloud & Big Data Technologies

Azure Data Lake, Azure Data Factory (ADF), Snowflake, Apache Spark, Azure Databricks, Hive, Hadoop, USQL, HQL

Programming & Scripting

Python (AI/ML pipelines), SQL, Java, UNIX Shell Scripting, U-SQL

AI / ML / Generative AI

GitHub Copilot, Databricks GenAI tools, Azure AI, IBM Watson (GenAI Challenge), LLM Validation

Data Integration & ETL Tools

Informatica, Ab Initio, Query Surge, Snowflake Data Vault

Automation & Testing

Selenium WebDriver, HP ALM/QC/UFT, IBM Ignite (OTFA/CTD), SOAP UI, POSTMAN

Business Intelligence & Reporting

Power BI, MicroStrategy, SSRS

Version Control / CI-CD

Azure DevOps, GitHub, SVN

RDBMS / Data Warehousing

Teradata, Oracle, SQL Server, Netezza, DB2, Mainframe JCL

Defect & Test Management Tools

JIRA, Quality Center (ALM), Test Director

CERTIFICATIONS:

Finance Foundation - Finance Foundation Training

Insurance Foundation - Insurance Foundation Training

Insurance Institutes - AINS21

DB2 Certification - DB2 700 Certification

QC 9.2 certification – HP

QTP 9.2 certification – HP

ASDA certification – MIT University (provided by Accenture)

Azure Fundamentals – Microsoft

Query Surge certification - IBM

ETL Testing certification - IBM

AWS Certified Cloud Practitioner

SAFe 6.0 Scaled Agile

Generative AI fundamentals by Databricks

OTHER ACHIEVEMENTS:

Awarded client appreciation certificates for contributions on various releases.

Received the Mentor of the Quarter award.

Received the Outstanding Onsite Support award.

EDUCATION:

Bachelor of Engineering Electronics and Communication, Visvesvaraya Technological University, Belgaum, India (2007)

PROFESSIONAL EXPERIENCE:

Partners – Gastonia, NC. January 2025 – Present

Testing Specialist

Client Name: Partners Health Management

Project Name: Partners

Project Description:

Partners Health Management serves as a Managed Care Organization (MCO), focusing on managing Medicaid services, behavioural health, and intellectual/developmental disabilities (IDD). They collaborate with North Carolina Medicaid and other community organizations to provide services to individuals with Medicaid and other complex healthcare needs.

Tools and Technologies Used:

Microsoft Azure: It is a cloud computing platform that provides a range of cloud services, including those for compute and analytics.

SQL: Structured Query Language

Python: It is an interpreted high-level general-purpose programming language. Python's design philosophy emphasizes code readability with its notable use of significant indentation. Its language constructs as well as its object-oriented approach aim to help programmers write clear, logical code for small and large-scale projects

Power BI: It is a BI reporting tool.

Snowflake Data Vault: Snowflake Data Vault refers to a methodology for designing and managing data warehouses using Data Vault principles in combination with Snowflake's cloud data platform.

Azure Data Vault: It refers to the implementation of the Data Vault methodology for data modelling on the Microsoft Azure cloud platform. The core idea of using Azure for Data Vault is to take advantage of Azure’s cloud services for scalable data storage, advanced analytics, and integration, while leveraging the flexibility and auditability of the Data Vault framework for building modern data warehouses.

GitHub Copilot: GitHub Copilot is an AI-powered code completion tool developed by GitHub in collaboration with OpenAI

Azure DevOps: Azure DevOps is a cloud-based set of development tools and services provided by Microsoft for managing the entire software development lifecycle (SDLC). It offers a comprehensive suite of features for planning, developing, testing, delivering, and monitoring software applications

JIRA: It is a test management tool used for issue and project tracking.

Agile: Agile approaches help teams respond to unpredictability through incremental, iterative work cadences and empirical feedback. Agile proposes alternatives to waterfall, or traditional sequential, development.

Roles and Responsibilities

Spearhead end-to-end QA strategy, test design, and automation for large-scale healthcare data platforms in an Azure and Snowflake environment.

Lead the Quality Assurance (QA) team in the delivery of Requirement Analysis, Estimation, Test Planning, Test Preparation, Test Automation Approach, Building Automation Script, and Test Execution activities with Automation scripts.

Collaborate with business stakeholders to refine user stories, acceptance criteria, and test coverage for Medicaid, Behavioural Health, and healthcare datasets.

Worked across different databases and platforms to validate and compare data for daily, weekly, and monthly transactions across different subject areas.

Used Azure and Azure Data Vault for data storage, Azure DevOps for running Continuous Integration and Continuous Deployment (CI/CD) pipeline jobs, and Snowflake Data Vault for performing data validation on stream and task data.

Used JIRA as the test management tool for the Agile project and performed data validations.

Use GitHub for code storage and AI tools for better test data management

Used Python and Continuous Integration and Continuous Deployment (CI/CD) to perform automated data validation

Design and execute automated data validation pipelines using Python, SQL, and Azure DevOps CI/CD to ensure data accuracy across Azure Data Vault layers.
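
To illustrate the kind of check such a pipeline can run, the following is a minimal Python sketch of a row-count reconciliation step that an Azure DevOps CI/CD stage could execute; the table names and the ODBC connection-string environment variable are hypothetical placeholders rather than the actual project objects.

import os
import sys
import pyodbc  # generic ODBC driver; the actual project may use other connectors

CHECKS = [
    # (description, source query, target query) -- hypothetical table names
    ("member rows, raw vault vs. business vault",
     "SELECT COUNT(*) FROM raw_vault.member_satellite",
     "SELECT COUNT(*) FROM business_vault.member_current"),
    ("claim rows, staging vs. raw vault",
     "SELECT COUNT(*) FROM staging.claims",
     "SELECT COUNT(*) FROM raw_vault.claim_satellite"),
]

def scalar(cursor, sql):
    cursor.execute(sql)
    return cursor.fetchone()[0]

def main() -> int:
    conn = pyodbc.connect(os.environ["DW_ODBC_CONN"])  # hypothetical variable
    cursor = conn.cursor()
    failures = 0
    for name, src_sql, tgt_sql in CHECKS:
        src, tgt = scalar(cursor, src_sql), scalar(cursor, tgt_sql)
        status = "PASS" if src == tgt else "FAIL"
        print(f"{status}: {name} (source={src}, target={tgt})")
        failures += status == "FAIL"
    conn.close()
    return 1 if failures else 0  # a non-zero exit code fails the pipeline stage

if __name__ == "__main__":
    sys.exit(main())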

Orchestrate ELT and ETL validations for structured and unstructured data flows from legacy systems (DB2, Mainframe) to Azure Data Lake.

Leverage GitHub Copilot and AI-assisted tools for intelligent test script generation and test data optimization.

Validate Power BI dashboards by writing complex SQL queries, ensuring correctness and alignment with backend datasets.

Drive continuous improvement by hosting defect triage sessions and collaborating with cross-functional teams to resolve issues.

Monitor pipelines, optimize Spark jobs, and troubleshoot data integration tasks using Azure, Snowflake, and Databricks.

Lead offshore coordination, knowledge transfer, and project reporting, ensuring timely delivery of QA milestones and KPIs.

Trigger and troubleshoot data pipeline jobs that load structured and unstructured data from DB2, mainframes, and other legacy systems into Azure Data Lake Store.

Perform and validate Extract, Load & Transform (ELT) and Extract, Transform & Load (ETL) operations; trigger and troubleshoot Spark jobs that load structured and unstructured data from sources such as DB2, mainframes, and other legacy systems into Azure Data Lake Store.

Prepared Test Plan, Test Strategy, Test Cases, Gap Analysis, Test Closure documents.

Design the project test plan and author epic stories using JIRA and Kanban tools, thereby preparing the artifact generator report and traceability matrix.

Hosted defect Triage calls with different stakeholders.

Prepared metrics like Daily Status reports

Prepared Timelines, Project Plan and Mitigation plans and shared with required Stakeholders.

Environment: JIRA, BRD/SRS, SQL, Python, ELT, ETL, OLAP, Microsoft Azure, Snowflake Vault, Azure Data Vault, Power BI.

Ahold Delhaize – SALISBURY, NC January 2019 – December 2024

QA Lead/Sr. QA/QE

Client Name: AHOLD DELHAIZE

Project Name: Nitro CIP, Nitro IRI, Personalization, PRISM, SDP SC Analytics, and Instock Omnichannel View, CPP Emerald POS

Project Description:

Nitro CIP – Worked as a Sr. QA. The objective of the project was to migrate the source from a legacy system to the Microsoft Azure cloud.

Nitro IRI – Worked as a lead. The objective of this project was to replace the reporting tool, Liquid Web Tool, with MicroStrategy.

Personalization – Worked as a lead. The objective of this project was to build the DataMart for the Azure applications (FiONA).

PRISM – Worked as a Sr. QA with multiple teams. The objective of this project was to migrate the existing applications to the Microsoft Azure cloud and build a different reporting system.

SDP SC Analytics – Worked as a QE. The objective of this project was to manage Delhaize distributions through RBS instead of third-party vendors.

Instock Omnichannel View – Worked as a lead. The objective of this project was to manage Ahold distributions through RBS and to create a common reporting platform for both Ahold and Delhaize.

CPP Emerald POS – Worked as a Sr. QA. The objective of this project was to manage pricing and coupon details.

Tools and Technologies Used:

Query Surge: It is used for comparing the data from different sources and environments.

HP ALM 12.53: It is a web-based tool to manage the application life cycle, from project planning and requirements gathering through testing and deployment.

Combinatorial Test Design (CTD): It is used to plan, prepare scenarios, scripts, steps and Expected Results for each release and Test Cycle.

Microsoft Azure: It is a cloud computing platform that provides a range of cloud services, including those for compute and analytics.

Hive Query Language (HQL): It is used to query distributed data storage including Hadoop data.

Big Data Technologies: Big data is a field that treats ways to analyse, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software

IBM Ignite Tool: IBM Ignite is a single platform for accessing test automation solutions; with the use of CTD (Combinatorial Test Design) and OTFA (Optimized Test Flow Automation), it became the primary framework for automation. The tools and methodologies are used for optimization, automation, and analysis, i.e., defect analytics.

SQL: Structured Query Language

USQL: It is used to fetch data and analyse data in Data Lake Store.

SQL Server Management Studio (SSMS): It is a Microsoft tool used for managing a SQL server infrastructure

Python: It is an interpreted high-level general-purpose programming language. Python's design philosophy emphasizes code readability with its notable use of significant indentation. Its language constructs as well as its object-oriented approach aim to help programmers write clear, logical code for small and large-scale projects

Azure Data Factory (ADF): It is used for running the pipeline jobs.

MicroStrategy/Power BI: It is a BI reporting tool.

Azure Databricks: It is a cloud-based data engineering tool used for processing and transforming massive quantities of data and exploring the data

Liquid Web Tool: It is a reporting tool

Roles and Responsibilities

Gathered Business and functional requirements from BA and performed the testing as per Business requirements.

Worked across different databases and platforms to validate and compare data for daily, weekly, and monthly transactions across different subject areas.

Used Databricks, Python and CI/CD to perform automated data validation

Performed API webservices validation using SOAP UI and Postman tool.

Analysed client-server response codes by comparing them with the DTD (Detailed Technical Design) document.
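
Alongside the SoapUI and Postman checks, this kind of response-code and schema validation can also be scripted; the Python sketch below uses the requests library, with a hypothetical endpoint URL, expected status code, and field list standing in for values taken from the DTD.

import requests

ENDPOINT = "https://api.example.com/v1/stores/1001/inventory"  # placeholder URL
EXPECTED_STATUS = 200
EXPECTED_FIELDS = {"storeId", "itemId", "onHandQty", "lastUpdated"}  # hypothetical DTD fields

def validate_inventory_endpoint() -> None:
    resp = requests.get(ENDPOINT, timeout=30)
    assert resp.status_code == EXPECTED_STATUS, (
        f"Expected HTTP {EXPECTED_STATUS}, got {resp.status_code}")
    payload = resp.json()
    records = payload if isinstance(payload, list) else [payload]
    for record in records:
        missing = EXPECTED_FIELDS - record.keys()
        assert not missing, f"Record is missing fields defined in the DTD: {missing}"

if __name__ == "__main__":
    validate_inventory_endpoint()
    print("API response matches the expected status code and schema")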

Worked on Ignite Test Automation platform

Developed U-SQL, T-SQL, Hive, Pig, and Python scripts using Databricks to validate data flows among the Raw Data Store (RDS), Structured Data Model (SDM), Curated Data Model (ADLS), and the Hadoop Distributed File System (HDFS).

Triggered and troubleshot data pipeline jobs that load structured and unstructured data from DB2, mainframes, and other legacy systems into Azure Data Lake Store.

Performed data reconciliation, transformation logic validation, row count checks, schema comparison, data type checks, duplicate checks, and null validations during all types of data loads, using tools such as Hive, PuTTY, SQL Server Management Studio, WinSQL, QuerySurge, and TOAD for Oracle.
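
As an illustration of those checks, the sketch below collects a few representative probe queries (duplicate business keys, mandatory-column nulls, and dropped rows) and runs them through an open DB-API cursor; the table, key, and column names are hypothetical placeholders, and a result of zero indicates a passing check.

RECONCILIATION_CHECKS = {
    "duplicate business keys in target": """
        SELECT COUNT(*) FROM (
            SELECT order_id, COUNT(*) AS cnt
            FROM dw.sales_fact
            GROUP BY order_id
            HAVING COUNT(*) > 1
        ) dup
    """,
    "nulls in mandatory columns": """
        SELECT COUNT(*) FROM dw.sales_fact
        WHERE order_id IS NULL OR store_id IS NULL OR sale_date IS NULL
    """,
    "rows dropped between stage and target": """
        SELECT (SELECT COUNT(*) FROM stage.sales_raw)
             - (SELECT COUNT(*) FROM dw.sales_fact)
    """,
}

def run_checks(cursor) -> dict:
    """Execute each probe through an open DB-API cursor and report results."""
    results = {}
    for name, sql in RECONCILIATION_CHECKS.items():
        cursor.execute(sql)
        value = cursor.fetchone()[0]
        results[name] = "PASS" if value == 0 else f"FAIL ({value})"
    return results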

Prepared Test Plan, Test Strategy, Test Cases, Gap Analysis, Test Closure documents.

Designed the project test plan and authored epic stories using HP ALM, JIRA, and Kanban tools, thereby preparing the artifact generator report and traceability matrix.

Validate Business Intelligence reports developed in MicroStrategy and Power BI using visualization techniques and by authoring SQL Scripts

Created JIRA and Azure DevOps dashboards for project reporting using Power BI.

Performed data validation using Query Surge.

Performed CTD and OTFA automation validation using IBM Ignite tool

Performed data validation using Apache Spark and Azure Databricks.
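
A minimal PySpark sketch of the kind of source-versus-target comparison run on Databricks follows; the table names, layer labels, and comparison columns are hypothetical placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sdm-vs-cdm-validation").getOrCreate()

source = spark.table("sdm.item_sales")          # structured data model layer (placeholder)
target = spark.table("cdm.item_sales_curated")  # curated data model layer (placeholder)

compare_cols = ["store_id", "item_id", "sale_date", "net_sales_amt"]

# Rows present in one layer but missing (or altered) in the other.
missing_in_target = source.select(compare_cols).exceptAll(target.select(compare_cols))
extra_in_target = target.select(compare_cols).exceptAll(source.select(compare_cols))

print("row counts:", source.count(), target.count())
print("missing in target:", missing_in_target.count())
print("unexpected in target:", extra_in_target.count())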

Performed data validation using Hive Query Language in Ambari tool.

Hosted defect Triage calls with different stakeholders.

Coordinate with offshore teams, Client and different stakeholders.

Prepared metrics like Daily Status and Weekly Status Reports, SLA/KPI

Prepared Timelines, Project Plan and Mitigation plans and shared with required Stakeholders.

Environment: HP ALM, JIRA, BRD/SRS, SQL, Python, UNIX, Teradata, ETL, OLAP, OLTP, TOAD, Oracle, SQL Server, Cloud Migration, USQL, Hive, Microsoft Azure, Query Surge, Azure Data Factory, MicroStrategy, SQL Developer, Power BI, API Web Services.

Wells Fargo – Winston-Salem, NC February 2017 – January 2019

System Quality Analyst

Client Name: WELLS FARGO

Project Name: WHOLESALE DQS, ISAM, OMDM, and Cloud Migration.

Project Description:

DQS is a part of the Wholesale Information Improvement Program. The goal of the DQS is to provide a set of standardized Business rules on common data elements which can be used across OBs and SORs. This will reduce the overall cost associated with maintaining multiple sets of Business rules.

ISAM is an independent sampling application. The purpose of this project is to establish validation of critical data elements used by Risk and regulatory reporting, reviewing a sample of new and renewed loans and existing loans where data has changed for the critical Business elements.

Officer Master Data Management (OMDM) is a centralized tool that allows authorized users to create unique officer IDs leveraging existing information from ORBT & HR non-sensitive databases.

Tools and Technologies Used:

Autosys: It is an automated job control system for scheduling monitoring and reporting.

Oracle SQL Developer 13.10: It is an integrated development environment tool for working with SQL in Oracle databases.

HP ALM 12.53: It is a web-based tool to manage the application life cycle, from project planning and requirements gathering through testing and deployment.

TOAD DataPoint 4.2: It is a multiplatform tool to access data from multiple sources and simplifies data access and writing SQL queries.

Informatica: ETL tool

IDQ: Informatica Analyst is a web-based tool used for analysing, profiling and cleansing the data.

Teradata 15.10: Teradata is a tool which provides a single, integrated view of data for smarter and faster decisions using real time data and insights.

JIRA: This tool helps in planning, tracking, release and report the tasks/work.

BMC Service Request Management: This is a web-based tool used for managing Service requests.

UFT 14.0: It provides functional and regression test automation for software applications and environments.

WARP System: It is used for report generation.

IAM: Identity Access Management manages access requests, approvals.

Roles and Responsibilities

Validated the Business rule specifications and the application functionality as per the Business requirements in BRD.

Validated the business rules against the IHUB data mart SOR tables by creating the Test data, SQL queries, Automated and manual test cases.

Created Test Cases using Element locators and Selenium WebDriver methods.

Enhanced Test cases using Java programming features and TestNG annotations.

Executed Selenium Test cases and provided reports.

Performed Regression Testing using Selenium.
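
The Selenium work here was done in Java with TestNG; purely as an illustration (and in Python, to keep the snippets in this write-up in one language), a locator-based WebDriver test of the same shape might look like the sketch below, with a hypothetical URL and element IDs.

from selenium import webdriver
from selenium.webdriver.common.by import By

def test_officer_search():
    driver = webdriver.Chrome()
    try:
        driver.get("https://omdm.example.com/search")   # placeholder URL
        driver.find_element(By.ID, "officerId").send_keys("AB1234")   # placeholder locator
        driver.find_element(By.ID, "searchBtn").click()               # placeholder locator
        results = driver.find_elements(By.CSS_SELECTOR, "table#results tbody tr")
        assert results, "Expected at least one matching officer record"
    finally:
        driver.quit()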

Performed validation of the payments in Android mobile emulator using SoapUI tool.

Updated SDLC tracker.

Validated the status of exceptions.

Created Test data for Exception’s validations.

Performed UI testing for the ISAM application and validated the UI data against DB data in SQL Server.

Validated SSRS reports in ISAM application and compared the data with DB data.

Supported and coordinated onsite/offshore teams.

Participated in all review sessions, planning meetings, and discussions.

Mapped processes, collected data, and analysed processes to uncover the root causes of problems.

Worked with business teams on process improvement.

Coordinated and maintained relationships with other teams.

Prepared Test Plan, Requirement Traceability Matrix (RTM).

Provided estimation for the testing effort to my manager.

Performed gap analysis.

Provided status reports to my manager.

Worked on Cloud Migration validation for OMDM and ISAM applications.

Worked on Exadata migration validation.

Environment: HP ALM, HP UFT, BRD/SRS, SQL, SQL Scripts, UNIX, Teradata, ETL, OLAP, OLTP, TOAD, VB Script, XML, PL/SQL, Oracle, SQL Server, Teradata, SSRS, Informatica, JIRA, SVN, JavaScript, Exadata, cloud Migration

Liberty Mutual – Dover, NH June 2016 – February 2017

Test Lead

Client Name: Liberty Mutual Insurance

Project Name: LM Personal Insurance

Project Description:

Liberty Mutual Insurance is the second-largest property and casualty insurer in the United States. It offers a wide range of insurance products and services, including personal automobile, homeowners, workers' compensation, commercial multiple peril, commercial automobile, general liability, global specialty, group disability, fire and surety.

Managed a team of 4 people.

Tools and Technologies Used:

Teradata Viewpoint 15.0: This advanced web-based systems management portal for Teradata performance and health management is easy to use, even for novices. It provides a consistent interface via configurable portlets, which allows users to customize their own systems management dashboard.

Teradata 13.10: Teradata is a tool which provides a single, integrated view of data for smarter and faster decisions using real time data and insights.

Splunk: It displays real-time information on the CPU, memory, I/O, and disk processes for all the hosts that the Splunk App for UNIX and Linux has collected data for.

RTC Tool: It provides a collaborative environment to manage all aspects of the work—such as plans, tasks, revision control, build management, and reports.

JIRA: This tool helps in planning, tracking, release and report the tasks/work.

BMC Service Request Management: This is a web-based tool used for managing Service requests.

DVR Tool: It is a Trianz-developed, Liberty Mutual proprietary tool used for comparing data within or across databases such as Hive, Teradata, and Oracle.

Eclipse Photon: It is a Java IDE.

Selenium WebDriver: It is an open-source tool for automating web application testing.

Roles and Responsibilities

Participated in all review sessions, planning meetings, and discussions.

Mapped processes, collected data, and analysed processes to uncover the root causes of problems.

Worked with business teams on process improvement.

Coordinated and maintained relationships with other teams.

Supported and coordinated onsite/offshore teams.

Used Splunk and Teradata viewpoint to generate metrics.

Used DVR tool for comparing the records between different environments like Bigdata, Teradata and Oracle.

Automated the DDL validation process using macros, which helped reduce manual effort by 85%.
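
The original automation used Excel macros; the same idea expressed in Python, shown only as an illustrative sketch with hypothetical file names for the exported DDL, is a normalised diff of the definitions between environments.

import difflib

def load_ddl(path: str) -> list[str]:
    """Read an exported DDL script, normalising case and whitespace."""
    with open(path) as f:
        return [" ".join(line.split()).upper() for line in f if line.strip()]

dev_ddl = load_ddl("dev_table_ddl.sql")    # placeholder export from DEV
prod_ddl = load_ddl("prod_table_ddl.sql")  # placeholder export from PROD

diff = list(difflib.unified_diff(dev_ddl, prod_ddl,
                                 fromfile="dev", tofile="prod", lineterm=""))
print("\n".join(diff) if diff else "DDL matches across environments")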

Used RTC and JIRA for logging and tracking the issues.

Designed Test cases Using Selenium WebDriver and TestNG.

Enhanced the Selenium Test cases for Cross browser testing.

Involved in Regression Testing using Selenium.

Validated ETL transformations.

Validated ETL-extracted, business-secured data using a hash method.
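
A minimal sketch of that hash-based approach is below: sensitive values are never compared in the clear, only their SHA-256 digests, keyed by a record identifier. The file names and column layout are hypothetical placeholders.

import csv
import hashlib

def column_digests(path: str, key_col: str, secure_col: str) -> dict:
    """Map each record key to the SHA-256 digest of its secured column."""
    digests = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            value = row[secure_col].strip().encode("utf-8")
            digests[row[key_col]] = hashlib.sha256(value).hexdigest()
    return digests

source = column_digests("source_extract.csv", "policy_id", "ssn")        # placeholder file
target = column_digests("warehouse_extract.csv", "policy_id", "ssn")     # placeholder file

mismatches = {key for key in source if target.get(key) != source[key]}
print(f"records compared: {len(source)}, hash mismatches: {len(mismatches)}")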

Extensively used knowledge of data warehousing concepts, such as slowly changing dimensions (SCD), Change Data Capture (CDC), and relational and dimensional data modelling techniques.

Experience in preparation of SMR (Senior Management Review), WSR (weekly status report) and Client MSR (Monthly Status Report) decks.

Used Data profiling tool like Dataflux for profiling the data.

BB&T bank (now Truist) – Greensboro, NC March 2016 – June 2016

Sr. QA

Client Name: BB&T Bank

Project Name: National Penn Deposits

Project Description:

BB&T (Branch Banking and Trust) is one of the largest financial services holding companies in the U.S. BB&T acquired National Penn Bancshares Inc., which helped BB&T greatly expand its footprint in Pennsylvania.

Tools and Technologies Used:

Quality Center 11.0: Quality Center was customized and is a part of Test Script Execution, Defect Tracking and as a Test Repository.

Roles and Responsibilities

Participated in all review sessions, planning meetings, and discussions.

Mapped processes, collected data, and analysed processes to uncover the root causes of problems.

Worked with business teams on process improvement.

Coordinated and maintained relationships with other teams.

Supported and coordinated onsite/offshore teams.

Validated the Nat Penn data in the Nat Penn system and mapped it to BB&T data for sweep, ZBA, and Choiceline accounts.

Travelers– Hartford, CT January 2015 – February 2016

System Analyst / QA Test Lead

Client Name: Travelers

Project Name: Business Insurance

Project Description:

Travelers is one of the nation’s largest property casualty companies. Travelers’ BI deals with business insurance lines such as Workers Compensation, Commercial Auto, and Buildings Insurance.

Tools and Technologies Used:

Quality Center 11.0: Quality Center was customized and is a part of Test Script Execution, Defect Tracking and as a Test Repository.

Teradata 13.10: Teradata is a tool which provides a single, integrated view of data for smarter and faster decisions using real time data and insights.

BRE: Business Rule Engine

Mainframe: Used for validation of Source files.

Ab Initio: ETL tool

Roles and Responsibilities:

Participated in all review sessions, planning meetings, and discussions.

Mapped processes, collected data, and analysed processes to uncover the root causes of problems.

Worked with business teams on process improvement.

Coordinated and maintained relationships with other teams.

Supported and coordinated onsite/offshore teams.

Performed quality audits on behalf of the project.

Provided test statistics and metrics to managers.

Analysed XML files in the Pipeline application for data verification during testing.

Analysed the requirements from clients and prepared test cases referenced by MIS number.

Performed load validation using SQL.

Planned, documented, and tested an extensive data integration process to a data warehouse and data mart.

Implemented effective data quality testing standards, verification procedures and tools.

Developed and presented executive-level data quality status and process improvement reports.

Used HP Quality Center: Test Center, Test Lab, Defect management.

Validated the extracted data from the Ab Initio tool.

Validated the Ab Initio graphs.

Developed & executed an advanced plan to improve data testing with tools and the QA process with industry best-practices.

Extensively used Teradata to verify source data and target data after the successful workflow runs using SQL

Involved in writing complex SQL queries by combining data from multiple tables with various filters.
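
As an illustration of such multi-table verification queries run after a workflow completes, the Python sketch below wraps two probes, a referential-integrity check and a source-versus-target aggregate comparison, to be executed through an open DB-API cursor; all table and column names are hypothetical placeholders.

ORPHAN_CLAIMS_SQL = """
    SELECT COUNT(*)
    FROM bi_dw.claim_fact cf
    LEFT JOIN bi_dw.policy_dim pd
           ON cf.policy_key = pd.policy_key
    WHERE pd.policy_key IS NULL
      AND cf.line_of_business = 'WC'        -- workers compensation only
      AND cf.load_date = CURRENT_DATE
"""

PREMIUM_VARIANCE_SQL = """
    SELECT ABS(src.total_premium - tgt.total_premium)
    FROM (SELECT SUM(written_premium) AS total_premium
            FROM stage.policy_transactions) src
    CROSS JOIN (SELECT SUM(written_premium) AS total_premium
                  FROM bi_dw.premium_fact) tgt
"""

def verify_load(cursor) -> bool:
    """Run both probes through an open DB-API cursor; zero results mean a clean load."""
    cursor.execute(ORPHAN_CLAIMS_SQL)
    orphans = cursor.fetchone()[0]
    cursor.execute(PREMIUM_VARIANCE_SQL)
    variance = cursor.fetchone()[0]
    print(f"orphan WC claims: {orphans}, premium variance: {variance}")
    return orphans == 0 and variance == 0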

Holds strong experience working across the complete Software Test Life Cycle (STLC) in traditional, moderated, and Agile testing projects.

Thorough understanding of various phases like Requirements, Analysis/Design, Development and Testing.

In depth technical understanding of Data Warehousing, Data Validations, OLAP.

Experience in creating Test Data, Analysing Defects and interacting with development team to resolve issues.

Experienced in testing requirements that were developed in ETL and BI tools and reviewed scripts for positive and negative test scenarios, and prepared test summary reports.

Experienced in Creating and maintaining SQL Scripts to perform back-end testing on databases

Excellent communication skills, ability to work as part of a team and on own. Versatile team player with interpersonal, technical documentation skills and efficiently handling multiple projects simultaneously.

Created a macro that automated the PDR process and helped save 90% of manual effort.

Environment: HP ALM, HP QTP, BRD, MIS, SQL, SQL Scripts, UNIX, Teradata, ETL, OLAP, TOAD, Macros using VB Script, XML, JCL, BRE, ACE (IT Express).

BOFA - Charlotte, NC May 2012 – January 2015

Sr. QA ETL

Client Name: Bank of America (BOA)

Project: ECM Testing Team

Project Description:

Enterprise Capital Management is accountable for delivering information regarding Bank of America’s capital position to regulators, shareholders, and the Bank’s internal management. The SABER project was initiated to meet the project objectives by upgrading the San Francisco calculator to match the results and controls of the Charlotte calculator.

Tools and Technologies Used:

Quality Center 11.0: Quality Center was customized and is a part of Test Script Execution, Defect Tracking and as a Test Repository.

Teradata 13.10: Teradata is a tool which provides a single, integrated view of data for smarter and faster decisions using real time data and insights.

Control Panel: BOA in-house tool for User Interface.

MAT: Manual Adjustment Tool

Roles and Responsibilities:

Participated in all review sessions, planning meetings, and discussions.

Performed quality audits on behalf of the project.

Provided test statistics and metrics to QA managers.

Mapped processes, collected data, and analysed processes to uncover the root causes of problems.

Worked with business teams on process improvement.

Coordinated and maintained relationships with other teams.

Supported and coordinated onsite/offshore teams.

Performed testing under both the Waterfall model and Agile methodology.

Prepared and implemented detailed test plan for the complex financial reports which included Quantitative (Data Testing) and Qualitative (Business Rules Implementation) testing.

Analysed the requirements from the client and developed test cases based on functional requirements, general requirements, and system specifications, provided as JIRA ticket numbers.

Worked as QA Lead, took overall responsibility for modules, and worked with other developers to ensure smooth delivery.

Analysed requirements and use case documents; worked with business analysts to address gaps in requirements.

Able to work independently, in a team environment, or in a fast-paced environment.

Possess strong interpersonal, communication, and presentation skills, with the ability to interact with people at all levels.

Experience in Bug Tracking and Reporting using Test Director/ HP Quality Center.

Worked closely with client architects to formulate designs for individual modules and prepared technical design documents (TSDs).

Used PDF comparator and Excel comparator tool for validation of the


