
Automation Test Engineer

Location:
Plainsboro, NJ
Posted:
November 21, 2024

Madhu Korapati

Test Automation Architect

Email: **************************@*****.***

Phone: 810-***-****

Professional Summary:

•15+ years of overall experience as an Automation Test Engineer designing and executing test strategies, developing test automation frameworks, and improving software quality. Proficient in delivering high-quality software, service/API testing, and web application testing, and in creating automation test scripts with Selenium WebDriver and the Playwright framework.

•Experienced in both Waterfall and Agile SDLC environments.

•6+ years as an Automation Test Architect designing frameworks using BDD, TDD, ATDD, mocking, and microservices testing.

•As a Test Architect, provided end-to-end integration of ALM, JIRA, and test execution platforms and environments for complex distributed applications using test automation tools such as TOSCA and Selenium.

•Identify Components: Break down the system into its individual components (microservices, databases, external APIs, front-end, back-end, etc.).

•Conducted IoT security testing on HSMs and ATMs using Selenium WebDriver and RestSharp API testing.

•Involved in developing comprehensive test plans and test cases based on functional requirements and specifications.

•Hands-on manual test case design; executed manual test cases and documented test results accurately.

•Interaction Flow: Understand how different components communicate (REST, gRPC, Kafka, etc.), and what dependencies exist between them (e.g., message queues, databases, external systems).

•Environment Requirements: Identify if the system is deployed in cloud infrastructure, on-premises, or hybrid, and whether containerization (e.g., Docker, Kubernetes) is used.

•Unit Tests: Test individual components of the system (e.g., single microservices or modules).

•Integration Tests: Tests interactions between components (e.g., service-to-service, API calls, database interactions).

•End-to-End (E2E) Tests: Full flow tests, simulating a user journey through the system.

•Performance/Load Tests: Test how the system behaves under load (e.g., number of requests per second).

•Fault Tolerance & Resilience Tests: Simulate failures to ensure the system can recover or degrade gracefully.

•Choose appropriate testing frameworks for each level of testing:

•Unit Testing: JUnit, NUnit, pytest, etc., depending on the language and framework.

•Mocking and Stubbing: Use tools like Mockito, WireMock, or nock to mock external dependencies for unit/integration tests (a C# sketch follows this summary).

•API Testing: Tools like Postman, REST Assured, or SuperTest.

•End-to-End (E2E) Testing: Selenium, Cypress, or TestCafe for browser-based testing, or tools like Cucumber for behavior-driven development (BDD).

•Performance Testing: JMeter, Gatling, or LoadRunner.

•Distributed Test Orchestration: TestNG, pytest with parallel execution, or dedicated tools like Kubernetes Testbed or Jenkins with distributed nodes.

•Version Control Integration: Connect your CI/CD tool (Jenkins, GitLab CI, CircleCI, etc.) with your version control system (e.g., GitHub, GitLab).

•Automate Test Execution: Integrate your test automation suite into the CI/CD pipeline to ensure tests run with every commit/push or on a schedule.

•Parallelization and Load Distribution: If tests are time-consuming, consider running them in parallel using tools like Selenium Grid or Kubernetes clusters for scalability.

•Reporting: Use tools like Allure, ReportPortal, or Jenkins for detailed test reports and dashboards.

•Containerization: Use Docker or Kubernetes to replicate the actual production environment for test execution.

•Service Virtualization: If the distributed components have external dependencies (e.g., external APIs, third-party services), use service virtualization tools like Hoverfly or WireMock.

•Test Environment Management: Use tools like Docker Compose or Helm charts for managing multi-container environments for testing purposes.

•Multi-Environment Support: Ensure the testing suite works across different environments (dev, staging, production) with appropriate configurations.

•Data Management: Design a strategy for preparing test data (e.g., using database snapshots, mocking services, or data generators).

•Test Isolation: Ensure tests are isolated by resetting or mocking databases, queues, or other shared state to avoid side effects.

•Data Cleanup: Ensure that test executions clean up any data they create, ensuring that subsequent tests run in a consistent state.

•Maintainability: Keeping test scripts modular, reusable, and independent to ensure they’re easy to update and maintain.

•Retry Logic: Implement retry mechanisms for flaky tests to ensure they pass consistently, especially in distributed systems where timing issues can occur.

•Monitoring and Logging: Set up centralized logging (e.g., ELK Stack, Prometheus) for your test execution to diagnose failures effectively. Each test should have sufficient logging for diagnostics.

•Error Handling: Implement intelligent error handling in your test scripts for unexpected failures, retries, or fallbacks.

•Parallel Test Execution: In distributed systems, tests can be time-consuming, so consider running tests in parallel. Tools like Selenium Grid, Docker Swarm, or Kubernetes help to distribute tests across multiple nodes.

•Scaling Test Execution: Ensure that your automation suite can scale depending on the load. For example, you can run a set of tests on multiple containers or even across multiple cloud regions to simulate traffic and failures.

•Asynchronous Communication: Distributed systems often use asynchronous communication, so tests need ways to wait for events or messages (e.g., polling, event listeners); a polling sketch follows this summary.

•State Consistency: Ensure tests account for eventual consistency when dealing with distributed databases or microservices that operate in an eventually consistent manner.

•Retry and Timeout Handling: Network latency, server crashes, or service downtime can cause tests to fail due to timeouts. Build intelligent retry logic in your tests.

•Test Failures and Recovery: Simulate service failures (e.g., using Chaos Monkey) and ensure your system can handle recoveries or graceful degradations.

•Test Coverage: Continuously track and improve test coverage for your system to ensure that critical paths are thoroughly tested.

•Refactor Test Code: Just like production code, test code needs to be refactored to improve readability, maintainability, and efficiency.
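
To illustrate the mocking approach referenced earlier, here is a minimal NUnit/Moq sketch. The IRateService interface, FeeCalculator class, and fee formula are hypothetical examples used only for illustration, not code from any of the projects below.

using Moq;
using NUnit.Framework;

// Hypothetical production types used only to illustrate mocking a dependency.
public interface IRateService { decimal GetRate(string currency); }

public class FeeCalculator
{
    private readonly IRateService _rates;
    public FeeCalculator(IRateService rates) => _rates = rates;

    // Fee is 1% of the amount converted with the dependency's rate.
    public decimal Fee(decimal amount, string currency) => amount * _rates.GetRate(currency) * 0.01m;
}

[TestFixture]
public class FeeCalculatorTests
{
    [Test]
    public void Fee_UsesRateFromMockedDependency()
    {
        // Stub the external dependency so the unit test stays isolated and deterministic.
        var rates = new Mock<IRateService>();
        rates.Setup(r => r.GetRate("USD")).Returns(1.0m);

        var fee = new FeeCalculator(rates.Object).Fee(200m, "USD");

        Assert.That(fee, Is.EqualTo(2.0m));
        rates.Verify(r => r.GetRate("USD"), Times.Once());
    }
}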

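For the asynchronous-communication, eventual-consistency, and retry points above, one common pattern is a poll-until-true helper with a timeout. The sketch below is a generic C# illustration under those assumptions, not tied to a specific project; the orderService in the usage comment is hypothetical.

using System;
using System.Threading.Tasks;

public static class Eventually
{
    // Polls an asynchronous condition until it returns true or the timeout expires.
    // Useful when asserting against eventually consistent stores or async messaging,
    // instead of hard-coded sleeps that make distributed tests flaky.
    public static async Task<bool> IsTrueAsync(
        Func<Task<bool>> condition,
        TimeSpan timeout,
        TimeSpan? pollInterval = null)
    {
        var interval = pollInterval ?? TimeSpan.FromMilliseconds(500);
        var deadline = DateTime.UtcNow + timeout;

        while (DateTime.UtcNow < deadline)
        {
            if (await condition())
                return true;
            await Task.Delay(interval);
        }
        return false;
    }
}

// Example usage in a test (hypothetical orderService):
// Assert.That(await Eventually.IsTrueAsync(
//     async () => await orderService.IsSettledAsync(orderId),
//     TimeSpan.FromSeconds(30)), Is.True);
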
Certifications:

Azure Fundamentals certified

MCCP C# Programming

MCCP Web Application Design & Development

Tools and Technologies for Distributed System Test Automation

Microservices Testing: Spring Boot Test, Pact (Consumer-Driven Contract Testing)

API Testing: Postman, REST Assured, SoapUI, or Applitools

Mocking Frameworks: WireMock, Mockito, Hoverfly (a stub sketch follows this list)

Load Testing: JMeter, Gatling, K6

Test Orchestration and CI/CD: Jenkins, GitLab CI, CircleCI, ArgoCD

Service Virtualization: Hoverfly, MockServer, Mountebank

Containerized Testing: Docker, Kubernetes, Docker Compose, Helm

Distributed Test Execution: Selenium Grid, Cypress Parallelization, TestNG (parallel execution)

Test Management: HP ALM, JIRA, TFS
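
As an illustration of the mocking and service-virtualization entries above, the sketch below stubs a third-party HTTP dependency with WireMock.Net; the /v1/fx-rates path and payload are hypothetical placeholders.

using System;
using WireMock.Server;
using WireMock.RequestBuilders;
using WireMock.ResponseBuilders;

// Start an in-process HTTP stub so integration tests do not depend on the real service.
var stub = WireMockServer.Start();

stub.Given(Request.Create().WithPath("/v1/fx-rates").UsingGet())
    .RespondWith(Response.Create()
        .WithStatusCode(200)
        .WithBodyAsJson(new { baseCurrency = "USD", rate = 1.08 }));

// Point the system under test at stub.Url instead of the real endpoint,
// then stop the stub when the test run completes.
Console.WriteLine(stub.Url);
stub.Stop();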

Technical Skills:

Testing Tools

Selenium IDE, WebDriver/Grid, Zephyr Enterprise, Postman, SoapUI

Languages

Java, C#, Python

Test Framework

JUnit, Cucumber, Karate, TestNG, Data-Driven Framework, Keyword-Driven, TDD, BDD, Playwright, Cypress, Ghost Inspector

Build Tools

Maven, TestRail

Continuous Integration Tools

Jenkins, Azure DevOps, TFS, Docker

Bug Tracking Tools

JIRA, HP ALM, TFS

Source Version Control Tool

Bitbucket, GIT

Performance Tools

JMeter, HP LoadRunner, Gatling, LoadView

Site Reliability Testing

RTN & RTO

Accessibility Testing

ADA & WCAG, ISO 9001, HIPAA, ISO bank mocks & ICD

Programming Languages

C#, Java, JavaScript, Python, pytest, PyTorch

Databases

SQL Server, PostgreSQL, NoSQL, Oracle

CRM

Salesforce, Guidewire

BI Tools

MS BI

API/SOAP Testing

ReadyAPI, Postman, Swagger Cloud

Monitoring Tools

Dynatrace, New Relic, SolarWinds, Splunk, downtime analytics

Database Test Tools

SQL Server, NoSQL, Data Factory, Siob, Selenium, TestComplete

Professional Experience:

MasterCard Feb 2024 to Oct 2024

Automation Test Architect

Project:

The MasterCard Egypt Program is not a single, fixed initiative; it refers to various programs and partnerships that MasterCard runs in Egypt, including financial inclusion efforts, digital payment solutions, and specialized initiatives aimed at promoting cashless transactions, enhancing financial literacy, and supporting small businesses and entrepreneurs in the region. It covers financial inclusion, digital payment solutions, e-commerce and fintech collaborations, foundation initiatives, Cashless Egypt, and consumer programs.

Responsibilities:

•Analyzed user requirements and defined functional specifications using Agile and Scrum methodologies.

•Supported the overall software development life cycle (SDLC) process, including continuous integration and continuous deployment (CI/CD) for a Playwright-based framework.

•Performed Windows and Web based application testing.

•Validated mock services and ISO parsing codes with payment legs and financial services using API automation.

•Validated agent-to-merchant, merchant-to-agent, and consumer-to-agent scenarios with UI automation using a BDD framework.

•Validated and generated scripts for batch jobs, APIs, web services, mock services, and microservices.

•Design and maintain database schemas using SQL Server or other database systems.

•Designed and created automated tests for web applications using the Playwright testing framework and implemented both UI and API tests (see the sketch at the end of this section).

•Designed and implemented a comprehensive automation framework using Selenium with C#, resulting in a 50% reduction in test execution time.

•Documented test cases, test scripts, and test results; created and maintained test documentation to facilitate knowledge sharing and collaboration within the QA team.

•Updated and refactored existing test scripts and frameworks in response to application or test requirements changes.

•Ensured that test automation was a seamless part of the build and deployment processes.

•Created and maintained test automation frameworks and libraries to ensure scalability and reusability.

•Modified and updated test scripts and scenarios in response to changes in API specifications or application requirements.

•Integrated the Playwright framework to generate HTML and Allure test reports and dashboards for stakeholders, summarizing test results and coverage.

•Continuously sought opportunities to improve automation processes and methodologies, incorporating feedback and adopting new best practices.

•Developed and maintained over 200 automated test cases, achieving 95% test coverage for key application functionalities.

•Incorporated GraphQL API tests into CI/CD pipelines using tools such as Azure DevOps to ensure continuous validation, enabling automated checks with each code change and deployment (see the sketch at the end of this section).

•Developed tests for various aspects of GraphQL APIs, including queries and mutations.

•Worked with Product Owners and stakeholders to gather requirements, refine user stories, and ensure that backlog items align with business goals.

•Leveraged extensive Agile experience to enhance testing processes, ensuring timely and efficient delivery of high-quality software.

•Designed and implemented robust CI/CD pipelines using Azure DevOps, resulting in a 40% reduction in deployment times and increased release frequency.

•Automated UI and backend (e.g., SQL, log) checks in a C#-based Selenium framework stored in a GitLab repository and integrated with Cucumber.

•Performed complex API testing using Postman and wrote SQL queries for database testing.

•Collaborated with data engineers, developers, and business analysts to identify test scenarios and requirements for data integration testing.

•Conducted IoT security testing on HSM, ATM, and other devices.
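
A minimal sketch of the Playwright UI automation described above, using the .NET (C#) bindings with NUnit for illustration; the URL, selectors, and credentials are placeholders rather than the actual application.

using System.Threading.Tasks;
using Microsoft.Playwright;
using NUnit.Framework;

[TestFixture]
public class LoginUiTests
{
    [Test]
    public async Task Login_NavigatesToDashboard()
    {
        using var playwright = await Playwright.CreateAsync();
        await using var browser = await playwright.Chromium.LaunchAsync(
            new BrowserTypeLaunchOptions { Headless = true });
        var page = await browser.NewPageAsync();

        // Placeholder URL and selectors for illustration only.
        await page.GotoAsync("https://portal.example.com/login");
        await page.FillAsync("#username", "qa.user");
        await page.FillAsync("#password", "secret");
        await page.ClickAsync("button[type='submit']");

        await page.WaitForURLAsync("**/dashboard");
        Assert.That(await page.TitleAsync(), Does.Contain("Dashboard"));
    }
}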

Technologies Used: Selenium, BDD, Cucumber, Playwright, Azure DevOps (ADO), Jira, SQL Server, SoapUI, Postman, Agile, TypeScript, C#, OOP, YAML, Gherkin, GraphQL, TFS, Xray, Zephyr, Java, Aras Innovator, Pac2000, ServiceNow.
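
For the GraphQL validation wired into the CI/CD pipeline, a minimal sketch using HttpClient and NUnit; the endpoint and query document are illustrative assumptions, not the actual API.

using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;
using NUnit.Framework;

[TestFixture]
public class GraphQLApiTests
{
    private static readonly HttpClient Client = new HttpClient
    {
        BaseAddress = new Uri("https://api.example.com/") // placeholder endpoint
    };

    [Test]
    public async Task AccountQuery_ReturnsDataWithoutErrors()
    {
        // GraphQL requests are POSTs with a JSON body containing the query document.
        var response = await Client.PostAsJsonAsync("graphql", new
        {
            query = "query { account(id: 1) { id status } }"
        });

        response.EnsureSuccessStatusCode();
        var body = await response.Content.ReadAsStringAsync();

        Assert.That(body, Does.Contain("\"account\""));
        Assert.That(body, Does.Not.Contain("\"errors\""));
    }
}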

Citi Bank Jan 2019 to Oct 2023

SDET

Responsibilities:

•Collaborated with developers to automate tests using Selenium WebDriver and Playwright with TypeScript, reducing manual testing effort.

•Collaborated with different stakeholders to deploy the application to production using Remedy, ServiceNow, and Pac2000 PLM tools.

•Facilitated defect triage calls and collaborated closely with stakeholders to resolve issues efficiently.

•Generated Allure and HTML reports from VS Code for the Playwright framework for one of the web applications.

•Implemented both UI and API tests in the Playwright framework using TypeScript.

•Conducted end-to-end testing of the new application and compared the results with the legacy system's results.

•Developed and executed SQL queries for performing database testing.

•Developed and maintained a BDD framework using SpecFlow, integrating it with .NET testing tools like NUnit.

•Wrote automation test scripts for both Desktop and web-based applications.

•Developed several features in the application to accommodate different business rules

•Developed and consumed applications using RESTful web services.

•Performed automation testing using Selenium and Playwright, writing and modifying test suites for smoke and regression testing.

•Created and maintained automation test suites for 3 web applications using Selenium C# (a minimal example appears at the end of this section).

•Developed step definitions in C# that map Gherkin steps to automation code, implementing the logic to interact with the application under test (see the SpecFlow sketch at the end of this section).

•Ensured that BDD tests ran as part of the build and deployment process, providing immediate feedback on application quality.

•Integrated Playwright tests with continuous integration/continuous deployment (CI/CD) pipelines for automated testing and deployment processes.

•Created and distributed LivingDoc test reports and dashboards that summarize test execution results, coverage, and any issues encountered.

•Participated in backlog grooming sessions and prioritized user stories in collaboration with the Product Owner, ensuring alignment with business objectives and customer needs.

•Created and managed feature files using Gherkin syntax, defining clear and concise acceptance criteria and scenarios.

•Collaborated with cross-functional teams to refine Gherkin scenarios and step definitions, aligning tests with evolving business requirements and application functionality.

•Migrated web applications from 2012 servers to 2016 servers.

•Worked on API Testing tools like Postman and SOAPUI.

•Expert in CI/CD pipelines and in enabling all the related scans used within the organization.

•Acted as a single point of contact for creating change requests and handled the complete deployment process from the staging environment to production independently.

•Designed and developed using .NET/C# and tested with proactive monitoring of smoke-test applications.

•Acted as offshore representative/lead and became the first point of contact in the team for technical issues within a short span.

•Excellent understanding and experience working in a mature agile setting.

•Responsible for production support and resolving issues.
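
A minimal SpecFlow sketch of the Gherkin-to-C# step bindings described above; the wallet-transfer scenario, step wording, and amounts are hypothetical examples.

using NUnit.Framework;
using TechTalk.SpecFlow;

// Matching Gherkin (in a .feature file):
//   Given the agent wallet holds 500
//   When the agent transfers 200 to the merchant
//   Then the merchant wallet should hold 200

[Binding]
public class WalletTransferSteps
{
    private decimal _agentBalance;
    private decimal _merchantBalance;

    [Given(@"the agent wallet holds (\d+)")]
    public void GivenTheAgentWalletHolds(decimal amount) => _agentBalance = amount;

    [When(@"the agent transfers (\d+) to the merchant")]
    public void WhenTheAgentTransfers(decimal amount)
    {
        // In a real suite this step would call the application under test.
        _agentBalance -= amount;
        _merchantBalance += amount;
    }

    [Then(@"the merchant wallet should hold (\d+)")]
    public void ThenTheMerchantWalletShouldHold(decimal expected) =>
        Assert.That(_merchantBalance, Is.EqualTo(expected));
}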

Technologies Used: Selenium, SpecFlow, NUnit, UCD, Jenkins, GitHub, Postman, Agile, Jira, SoapUI, SVN, Oracle, ServiceNow, Pac2000, C#, SQL, Azure DevOps (ADO), YAML, Gherkin, ASP.NET Core, Xray, Zephyr, API Management Service, Power Automate, TypeScript, Java
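
And a minimal Selenium C# smoke-test sketch in the style of the suites above; the URL and expected title are placeholders, not the actual applications.

using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestFixture]
public class SmokeTests
{
    private IWebDriver _driver;

    [SetUp]
    public void SetUp() => _driver = new ChromeDriver();

    [TearDown]
    public void TearDown() => _driver.Quit();

    [Test]
    public void HomePage_LoadsAndShowsExpectedTitle()
    {
        // Placeholder URL and title for illustration only.
        _driver.Navigate().GoToUrl("https://portal.example.com");
        Assert.That(_driver.Title, Does.Contain("Portal"));
    }
}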

Dell International July 2019 to Jan 2020

Automation Test Architect

Responsibilities:

•Actively participated in Agile Scrum ceremonies, contributing to sprint planning, reviews, and retrospectives.

•Identified and resolved defects, collaborating with development teams to ensure timely fixes. Utilized problem-solving skills to identify, analyze, and resolve complex issues within testing environments.

•Used SQL Server to validate stored procedures and data integrity (see the sketch at the end of this section).

•Developed and executed automated test scripts for .NET applications using tools like NUnit and SpecFlow.

•Experience in all stages of the SDLC (Software Development Life Cycle), including user requirements, analysis, design, coding, implementation, debugging, testing, deployment, and documentation, across diverse industries and work environments.

•Provided regular testing status updates throughout the Software Development Life Cycle (SDLC), including UAT phases.

•Involved in requirement analysis, design, and development of web-based applications in a fast-paced Agile methodology.

•Performed sanity, functional, and regression testing.

•Executed manual and automated test cases to verify software functionality and performance according to test plans and requirements.

•Assisted in generating test summary reports and dashboards to track progress and communicate findings to the QA team and stakeholders.

•Assisted with ad hoc testing activities as needed, including exploratory testing to uncover unexpected issues.

•Executed manual test cases for web and mobile applications, identifying and documenting over 50 defects, leading to a 20% improvement in software quality.

•Re-tested fixed issues and confirmed that bugs were resolved and that no new issues were introduced.

•Performed regression testing to ensure that recent code changes had not adversely affected existing functionality.

•Worked closely with project managers, developers, and other stakeholders to understand project requirements and ensure alignment between testing and development activities.

•Oversaw the defect management process, including defect identification, documentation, tracking, and resolution.

•Led a team of 10 QA engineers in designing and executing test plans, resulting in a 30% reduction in defect rates and improved software quality.

•Monitored sprint progress and team performance using Jira, providing regular status updates and reports to stakeholders on project health and delivery.
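
A minimal sketch of validating a stored-procedure result from a test, assuming Microsoft.Data.SqlClient and NUnit; the procedure name, parameter, connection string, and expected value are hypothetical.

using Microsoft.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class StoredProcedureTests
{
    // In practice the connection string would come from test configuration, not a constant.
    private const string ConnectionString =
        "Server=localhost;Database=TestDb;Integrated Security=true;TrustServerCertificate=true;";

    [Test]
    public void GetOrderTotal_ReturnsExpectedValue()
    {
        using var connection = new SqlConnection(ConnectionString);
        connection.Open();

        using var command = new SqlCommand("dbo.usp_GetOrderTotal", connection)
        {
            CommandType = System.Data.CommandType.StoredProcedure
        };
        command.Parameters.AddWithValue("@OrderId", 42);

        // Assumes the procedure returns a single decimal total for the order.
        var total = (decimal)command.ExecuteScalar();
        Assert.That(total, Is.EqualTo(199.99m));
    }
}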

Technologies Used: Bugzilla, ALM, Jira, Quality Center, SVN, Postman, Agile, ServiceNow, SQL Server, Oracle, C, ASP.NET, .NET Core 2.1, C#

Qentelli Services Oct 2018 to Jun 2019

QE Manager

Responsibilities:

•Actively participated in Agile Scrum ceremonies, contributing to sprint planning, reviews, and retrospectives.

•Identified and resolved defects, collaborating with development teams to ensure timely fixes. Utilized problem-solving skills to identify, analyze, and resolve complex issues within testing environments.

•Used SQL Server to validate the stored procedures and data integrity.

•Developed and executed automated test scripts for .NET applications using tools like NUnit and SpecFlow.

•Experience in all stages of the SDLC (Software Development Life Cycle), including user requirements, analysis, design, coding, implementation, debugging, testing, deployment, and documentation, across diverse industries and work environments.

•Provided regular testing status updates throughout the Software Development Life Cycle (SDLC), including UAT phases.

•Involved in requirement analysis, design, and development of web-based applications in a fast-paced Agile methodology.

•Performed sanity, functional, and regression testing (see the suite-tagging sketch at the end of this section).

•Executed manual and automated test cases to verify software functionality and performance according to test plans and requirements.

•Assisted in generating test summary reports and dashboards to track progress and communicate findings to the QA team and stakeholders.

•Assisted with ad hoc testing activities as needed, including exploratory testing to uncover unexpected issues.

•Executed manual test cases for web and mobile applications, identifying and documenting over 50 defects, leading to a 20% improvement in software quality.
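
One common way to partition the smoke and regression runs mentioned above is with NUnit categories; the sketch below uses hypothetical test names and is illustrative only.

using NUnit.Framework;

[TestFixture]
public class CheckoutTests
{
    [Test, Category("Smoke")]
    public void Cart_CanAddItem()
    {
        // Fast, critical-path check run on every build.
        Assert.Pass();
    }

    [Test, Category("Regression")]
    public void Discount_IsAppliedOnlyOnce()
    {
        // Broader check run in the nightly regression suite.
        Assert.Pass();
    }
}

// Run only the smoke subset from the pipeline:
//   dotnet test --filter TestCategory=Smoke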

Technologies Used: Bugzilla, ALM, Jira, Quality Center, SVN, Postman, Agile, ServiceNow, SQL Server, Oracle, C, ASP.NET, .NET Core 2.1, C#

DOL-WA Aug 2012 to Jun 2018

Automation Test Architect

Responsibilities:

•Assisted in generating test summary reports and dashboards to track progress and communicate findings to the QA team and stakeholders.

•Assisted with ad hoc testing activities as needed, including exploratory testing to uncover unexpected issues.

•Executed manual test cases for web and mobile applications, identifying and documenting over 50 defects, leading to a 20% improvement in software quality.

•Re-tested fixed issues and confirmed that bugs were resolved and that no new issues were introduced.

•Performed regression testing to ensure that recent code changes had not adversely affected existing functionality.

•Worked closely with project managers, developers, and other stakeholders to understand project requirements and ensure alignment between testing and development activities.

•Oversaw the defect management process, including defect identification, documentation, tracking, and resolution.

•Led a team of 10 QA engineers in designing and executing test plans, resulting in a 30% reduction in defect rates and improved software quality.

•Monitored sprint progress and team performance using Jira, providing regular status updates and reports to stakeholders on project health and delivery.

•Actively participated in Agile Scrum ceremonies, contributing to sprint planning, reviews, and retrospectives.

•Identified and resolved defects, collaborating with development teams to ensure timely fixes. Utilized problem-solving skills to identify, analyze, and resolve complex issues within testing environments.

•Used SQL Server to validate the stored procedures and data integrity.

•Developed and executed automated test scripts for .NET applications using tools like NUnit and SpecFlow (a data-driven sketch appears at the end of this section).

•Experience in all stages of the SDLC (Software Development Life Cycle), including user requirements, analysis, design, coding, implementation, debugging, testing, deployment, and documentation, across diverse industries and work environments.

•Provided regular testing status updates throughout the Software Development Life Cycle (SDLC), including UAT phases.
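
A minimal data-driven NUnit sketch in the style of the scripts above; the ClaimValidator rule, its boundary of 10,000, and the inputs are hypothetical examples.

using NUnit.Framework;

public static class ClaimValidator
{
    // Hypothetical rule: claim amounts must be positive and not exceed 10,000.
    public static bool IsValidAmount(decimal amount) => amount > 0 && amount <= 10_000m;
}

[TestFixture]
public class ClaimValidatorTests
{
    // Each TestCase row runs the same check with different data, keeping the suite data-driven.
    [TestCase(1, true)]
    [TestCase(10000, true)]
    [TestCase(0, false)]
    [TestCase(10001, false)]
    public void IsValidAmount_HandlesBoundaryValues(int amount, bool expected)
    {
        Assert.That(ClaimValidator.IsValidAmount(amount), Is.EqualTo(expected));
    }
}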

Technologies Used: Bugzilla, ALM, Jira, Quality Center, SVN, Postman, Agile, ServiceNow, SQL Server, Oracle, C, ASP.NET, .NET Core 2.1, C#, Java, HP ALM, JMeter, BDD, Selenium, TypeScript

AINS Inc. Jan 2010 to Jun 2012

Automation Tester

Responsibilities:

•As an automation test engineer, collaborated with developers and business users to convert requirements into test cases.

•Validated test cases with business users and product owners for automation suitability and coverage.

•Validated FOIA user flows using Microsoft workflows to authorize role-based work scenarios and coverage.

•Validated data migrations and development features with UI test automation scripts using Microsoft testing tools.

•Prepared test suites using Microsoft Test Manager and mapped test cases to TFS user stories.

•Validated build coverage and build pass/fail criteria using Azure CI/CD.

•Validated end-to-end system testing and the regression suite every quarter and published test results to stakeholders.

•As a senior tester, interacted with developers on production issues, reproducing them and providing logs.

•Worked with developers on peer-to-peer review of defects and bugs before build release.

Technologies Used: Bugzilla, ALM, Jira, Quality Center, SVN, Postman, Agile, ServiceNow, SQL Server, Oracle, C, ASP.NET, .NET Core 2.1, C#, Hummingbird, Documentum

Education Details:

•Master of Computer Applications (MCA), Chennai University, 2004

•Bachelor of Engineering, Electronics and Communication, JNTU.


