AMAR AVILELI
Sr. SDET | DevOps | QA Automation Tester | Performance Tester (JMeter)
(AWS Certified: Solutions Architect, SysOps Administrator)
(Salesforce: Platform Developer I & II)
(Oracle Certified Professional (OCP))
(ISTQB Certified Professional)
****.*****@*****.*** +1-201-***-****
SDET with 13+ years of IT experience, specializing in Java, JavaScript, and Python for diverse applications (Mobile, Web, Cloud/AWS, API, client-server). Proficient in SDLC and STLC, I build robust automation frameworks (Playwright, Selenium, Cypress) and execute comprehensive test strategies, including performance testing with JMeter and monitoring tools (Splunk, Grafana, Datadog, Prometheus). Expertise spans Healthcare, Retail/E-commerce, and Supply Chain, with strong CI/CD and cloud (AWS, Azure, GCP) skills.
CAREER OVERVIEW:
Utilized a combination of Agile (Scrum, Kanban) and Waterfall methodologies, participating in daily stand-ups, sprint planning, and retrospectives to adapt to project needs and ensure timely, customer-focused software delivery, while maintaining transparent communication with stakeholders. Expert in designing and implementing BDD and data-driven hybrid test automation frameworks using Selenium, Playwright, and Cypress, leveraging Java, TestNG, Cucumber, and Extent/Cucumber Reports.
Implemented Behavior-Driven Development (BDD) frameworks using Cucumber, defining feature files and Gherkin scenarios to translate user stories into executable acceptance tests.
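A Gherkin scenario of the kind described might look like the following (the feature, steps, and patient ID are illustrative examples, not taken from a specific project):

```gherkin
Feature: Patient search
  As a clinician I want to search patients by ID
  so that I can open the correct record quickly

  Scenario: Search by a valid patient ID
    Given I am logged in as a clinician
    When I search for patient ID "12345"
    Then the patient record page is displayed
```

Each step maps to a Java step-definition method in Cucumber, keeping the acceptance criteria readable by non-technical stakeholders.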
Designed and implemented cross-browser and cross-platform automation suites using Playwright with TypeScript/JavaScript, focusing on robust end-to-end and API testing.
Designed and implemented JMeter test scripts for various protocols (HTTP/HTTPS, REST API, SOAP), utilizing advanced features like correlation, parameterization, and scripting to simulate real-world user behavior.
Extensive experience with Performance Testing and Monitoring tools such as JMeter, Splunk, Datadog, and AWS CloudWatch; analyzed performance metrics such as response time, throughput, and error rates, and identified performance bottlenecks.
Developed and implemented automated test suites within CI/CD pipelines (Jenkins, Azure DevOps), validating infrastructure as code, build integrity, and deployment processes using tools like Playwright, REST Assured, and infrastructure testing frameworks.
Used Continuous Integration / Continuous Delivery (CI/CD) tools such as Docker, Jenkins, and Azure DevOps to deploy applications to AWS, and used Git for version control to support incremental development and maintain day-to-day code history.
Integrated SonarQube into Jenkins pipelines for automated code coverage and static analysis, ensuring high code quality and build stability.
Experience in Azure deployments and infrastructure provisioning using Azure DevOps pipelines, implementing Infrastructure as Code (IaC) with tools like ARM templates and Terraform.
Experienced in both manual and automated API testing, utilizing Swagger, SoapUI, and Postman for SOAP and RESTful web services, and leveraging REST Assured for robust automation.
Experience in automating Hybrid Mobile Applications using Selenium, Appium and Java.
Developed and executed comprehensive end-to-end tests using Cypress with JavaScript.
Implemented and managed version control workflows using Git, Stash, and Bitbucket, ensuring code integrity and facilitating seamless integration with CI/CD pipelines.
Extensively worked in Relational Database Management Systems including MySQL, SQL Server, and Oracle.
Expertise in using various defect-reporting and defect-tracking tools like JIRA Zephyr, Rally, Bugzilla, and HP ALM.
Exhibited strong leadership in managing global teams, specifically by streamlining communication and workflows between offshore and onshore resources.
TOOLS & TECHNICAL SKILLS:
Web Technologies
HTML, CSS, ES6, AngularJS, XML, XSLT, JSON
Programming/Scripting Languages
Java, Python, C#, JavaScript, TypeScript
UI & Functional Automation Tools/ Frameworks
Protractor, Selenium WebDriver, Appium, Rest-Assured, Frisby, Mocha, Chai, Jest, Jasmine, BDD Cucumber, JUnit, TestNG, Karma, LoadRunner, JMeter, SoapUI, Postman, Swagger, Servlets, Spring Core, JPA, Playwright, Cypress
Reporting and other Frameworks
Extent Reports, Jasmine Reports, JavaMail, Apache POI, iText PDF, Sikuli, AutoIt.
DevOps & CICD
SVN, Git, Stash, Jenkins, TeamCity, Docker, AWS Cloud, Sauce Labs, Azure, Terraform, Kubernetes
Databases
MySQL, SQL Server, Oracle, and MongoDB.
Build Tools
Ant, Maven, Gradle
Browsers
Selenium Grid, Internet Explorer, Firefox, Chrome & Safari.
Operating Systems
Windows 10/11, macOS, UNIX, Linux, Android.
Test Case Management Tools
ALM QC, JIRA Zephyr, TestRail, XRAY, Rally
Performance Testing and Monitoring tools
JMeter, LoadRunner, BlazeMeter, Locust
Project Management Tool
JIRA, Azure DevOps, Confluence
EDUCATION:
Bachelor of Technology in Electronics and Communications Engineering from Jawaharlal Nehru Technological University.
CERTIFICATION & TRAINING:
AWS Certified Solutions Architect - Associate
AWS Certified SysOps Administrator – Associate
Salesforce Platform Developer I and II
PROJECTS
GE Healthcare, Waukesha, WI Jul 2023 – Present
Software Development Engineer in Test (SDET) / Performance Tester (JMeter)
Responsibilities:
Followed Agile (Scrum) methodology, actively participating in sprint ceremonies and ensuring quality deliverables.
Collaborated extensively with BAs and developers, applying Test-Driven Development (TDD) principles.
Designed and implemented a Behavior-Driven Development (BDD) framework using Cucumber for clear specifications.
Architected a robust automation framework incorporating the Page Object Model (POM) for maintainable UI tests.
Executed the scripts on Firefox, Safari, and Chrome. Worked with TestNG annotations and identified web elements using locators such as ID, XPath, and CSS selectors available in Selenium WebDriver and Playwright.
Developed a Data-Driven Testing approach using TestNG and external data sources for comprehensive test coverage.
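The data-driven pattern described here can be sketched in plain Java (the discount calculator, data table, and expected values below are hypothetical stand-ins; in a real TestNG suite the rows would come from a @DataProvider or an external source such as a CSV or Excel sheet read via Apache POI):

```java
import java.util.List;

// Minimal sketch of data-driven testing: one test body, many
// externally supplied input/expected rows.
public class DataDrivenSketch {

    // Hypothetical system under test: a simple discount calculator.
    static int discountedPrice(int price, int discountPercent) {
        return price - (price * discountPercent / 100);
    }

    public static void main(String[] args) {
        // Each row: {price, discountPercent, expectedResult}.
        // In TestNG this table would be returned by a @DataProvider method.
        List<int[]> rows = List.of(
            new int[]{100, 10, 90},
            new int[]{200, 25, 150},
            new int[]{80, 0, 80}
        );
        for (int[] row : rows) {
            int actual = discountedPrice(row[0], row[1]);
            if (actual != row[2]) {
                throw new AssertionError("price=" + row[0] + " pct=" + row[1]
                        + " expected=" + row[2] + " got=" + actual);
            }
        }
        System.out.println("All " + rows.size() + " rows passed");
    }
}
```

Keeping the data outside the test logic is what lets one script cover many input combinations without duplication.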
Led the migration of Selenium (Java) automation to Playwright and Cypress (TypeScript), leveraging their advanced features.
Utilized Playwright and Cypress with TypeScript and JavaScript to build and execute end-to-end tests for modern web applications.
Performed rigorous cross-browser testing with Selenium Grid and cloud-based testing platforms.
Implemented unit and integration tests using JUnit framework to ensure code quality and reliability.
Conducted in-depth Performance Testing using JMeter, simulating various load scenarios on AWS, Azure, and GCP.
Monitored application performance and identified bottlenecks using Splunk, Grafana, and Datadog on AWS.
Executed API testing using Postman and Insomnia, validating contract compliance and functionality.
Integrated automation suites into the CI/CD pipeline with Jenkins, Docker, ArgoCD, Ansible, Terraform, Kubernetes and GitHub Actions for continuous feedback.
Configured Jenkins build jobs, incorporating Maven for dependency management and Surefire for test execution.
Managed test scripts and collaborated on automation efforts using Git version control within GitHub.
Utilized JIRA and TestRail for comprehensive test case management, linking them to user stories.
Documented and tracked defects meticulously using JIRA, participating in defect triage meetings.
Wrote and optimized SQL queries for thorough backend data validation and integrity checks.
Employed XPath and CSS selectors strategically within Selenium and Playwright for element identification.
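XPath-based element location of the kind used with Selenium and Playwright can be illustrated with the JDK's built-in XPath engine over a small markup fragment (the page snippet and locator below are hypothetical examples, not taken from the project; Selenium's By.xpath evaluates the same expressions against the live DOM):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class XPathLocatorSketch {

    // Evaluate an XPath expression against an XML/XHTML string and
    // return the text content of the first matching node.
    static String locate(String xml, String xpath) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        return XPathFactory.newInstance().newXPath().evaluate(xpath, doc);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical page fragment with an id-based and a class-based target.
        String page = "<form><input id='user' type='text'/>"
                + "<button class='btn primary'>Sign in</button></form>";
        // Class-predicate locator, the same shape as a Selenium By.xpath.
        System.out.println(locate(page, "//button[contains(@class,'primary')]"));
    }
}
```

The same contains(@class, …) predicate is a common way to match one class among several on an element, where a plain equality check would fail.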
Integrated JIRA Zephyr for managing test cases within JIRA and generating execution reports.
Environment: JavaScript, jQuery, Java, Python, Mockery, Rest Assured, Rest API, Postman, JSON, Visual Studio Code, Selenium WebDriver, TestNG, POM, Maven, JMeter, GitHub, Jenkins, AWS Cloud, Oracle.
GAP, San Francisco, CA Mar 2021 - Jun 2023
Software Development Engineer in Test (SDET)
Responsibilities:
Worked in the Agile Kanban environment with frequently changing requirements and features set.
Responsible for front-end code testing using TDD & unit testing techniques.
Automated and executed test cases and reported the defects in Jira.
Involved in implementing a Test Automation Framework built with Selenium WebDriver, TestNG, and Maven on the Java platform.
Responsible for implementing Page Object Model (POM) using Selenium WebDriver, TestNG and Java.
Implemented Selenium WebDriver to run regression tests on multiple platforms and browsers in parallel.
Responsible for documenting the Automated Test results using Rest Assured for web services.
Used build tools like Maven and Jenkins daily to build and integrate automated regression tests into CI/CD processes, and used Git for source code control.
Performed the Load testing using LoadRunner and JMeter.
Created POCs for an E2E UI & API hybrid framework using REST Assured.
Prepared test cases for Navigational testing, Functionality testing and User interface testing.
Involved in the Defect Review Meetings, build meetings and release meetings to resolve the outstanding issues.
Validated backend data by writing complex SQL queries to perform back-end testing for data integrity.
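Back-end validation queries of this kind typically reconcile source and target row counts or flag orphaned records; a hypothetical sketch (table and column names are illustrative, not from the project):

```sql
-- Row-count reconciliation between a staging load and its target table
SELECT (SELECT COUNT(*) FROM stg_orders) AS staged_rows,
       (SELECT COUNT(*) FROM orders)     AS loaded_rows;

-- Orphan check: order lines whose parent order is missing
SELECT ol.order_id
FROM order_lines ol
LEFT JOIN orders o ON o.order_id = ol.order_id
WHERE o.order_id IS NULL;
```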
Environment: Java Spring, Selenium WebDriver, TestNG, JSON, JIRA Zephyr, Jenkins, Maven, JMeter, UNIX, Agile Methodology, MySQL, MongoDB, GitHub, POM, SoapUI, Eclipse, IntelliJ.
GAP, San Francisco, CA Aug 2018 – Feb 2021
Senior Performance Tester
Responsibilities:
Led the performance testing lifecycle from requirements gathering to results analysis, identifying system bottlenecks and providing optimization recommendations within Agile environments.
Executed load, stress, endurance, and scalability tests using JMeter 4.0, LoadRunner, and Gatling to validate application efficiency.
Conducted root cause analysis of performance issues using profiling tools, focusing on CPU utilization, memory usage, and network throughput.
Developed complex test scenarios that accurately replicate real-world usage patterns, measuring critical KPIs and system behavior under various conditions.
Implemented containerization with Docker to create consistent, isolated testing environments across distributed microservices architectures.
Identified memory leaks through garbage collection analysis and heap monitoring, and optimized database queries, increasing system throughput.
Created performance test scripts for both UI and API testing using JMeter, BlazeMeter, and Postman to evaluate system response times.
Collaborated with cross-functional teams to integrate performance testing into CI/CD pipelines using Git, Jenkins, and JIRA for seamless workflow automation.
Utilized Splunk, ELK stack, and DataDog for comprehensive application monitoring and generating detailed performance testing reports with visualized metrics.
Executed performance tests on Azure Cloud and AWS environments, ensuring application scalability and reliability across different cloud infrastructures.
Developed custom Python and Bash scripts to automate test data generation and performance test result analysis processes.
Participated in architecture and design reviews for complex solutions, providing quality perspectives and performance considerations during early development stages.
Utilized JUnit, TestNG, and Cucumber for unit and functional testing integration with performance test scenarios to ensure complete quality coverage.
Generated comprehensive performance reports including transaction response times, throughput metrics, and resource utilization statistics for stakeholder review.
Analyzed garbage collection logs and memory dumps to identify potential memory leaks and optimize application performance under heavy load conditions.
Executed cloud-specific performance tests evaluating auto-scaling capabilities, load balancing effectiveness, and resource provisioning efficiency across environments.
Used Grafana dashboards integrated with various data sources to visualize real-time performance metrics and system behavior during test execution.
Environment: JMeter, LoadRunner, BlazeMeter, Gatling, K6, NeoLoad, Postman, SoapUI, Dynatrace, AppDynamics, Datadog, Splunk, ELK Stack, Grafana, Git, Jenkins, JIRA, Xray, Cucumber, Selenium, Playwright, JUnit, TestNG, Jasmine/Karma, Docker, Azure, AWS, Java, Python, Bash, Spring Boot, Microservices, CI/CD, RushHour.
Thomson Reuters, Carrollton, TX May 2015 - Jul 2018
Automation Test Engineer/DevOps
Responsibilities:
Reviewed requirements and attended meetings to understand the business functionality.
Performed manual testing of the application, creating test plans, test cases, and documentation; executed manual test cases for GUI, functionality, and system testing of web applications.
Involved in implementing a Test Automation Framework built with Selenium WebDriver, TestNG, and Maven on the Java platform.
Responsible for implementing Page Object Model (POM) using Selenium WebDriver, TestNG and Java.
Developed and executed application software testing such as functional, integration, and regression testing to ensure the quality of WSI websites and related systems.
Tested SOAP web services and XML using SoapUI against local WSDL files and URLs; created and ran test cases and performed load and security testing.
Reviewed Manual Testing methods and developed and executed automated scripts.
Developed and Maintained Function Libraries, Object Repositories for the Framework development
Performed manual testing and maintained documentation on different types of testing: positive, negative, regression, integration, system, user acceptance, performance, and black box.
Used Quality Center for reporting/tracking bugs and for document control.
Developed Custom Reporting with Ant, CSS, HTML.
Involved in the Defect Review Meetings, build meetings and release meetings to resolve the outstanding issues.
Environment: HTML, CSS, JavaScript, jQuery, JSP, Java, Selenium WebDriver, TestNG, Quality Center, Oracle, ALM, SOAP, MS SQL Server, XML, Windows 7.
ZenQ, Hyderabad, India. Jan 2013 – Apr 2015
QA Automation Engineer
Responsibilities:
Followed the waterfall methodology and analyzed the user requirements, functional specifications and Use Case documents and created the Test Plans, Test cases for Functional testing.
Coordinated with the business analysts and developers and discussed issues in interpreting the requirements and effectively managed the finalized requirements.
Demonstrated an understanding of Functional, Technical and UI requirements of the project.
Developed automated scripts for functional testing using selenium with java.
Optimized automation scripts for Regression testing of the application with various data sources and data types.
Developed inline view queries and complex SQL queries and improved query performance.
Identified defects in aggregate tables and report data, entered defects in JIRA, and coordinated with developers to resolve them based on defect severity and priority.
Performed performance testing using LoadRunner.
Successfully Completed User Acceptance Testing (UAT) on each release of the project with the help of end user requirements.
Ensured content and structure of all testing documents/artifacts is documented and maintained.
Supported the lead in terms of Review of Test Cases and Business scenarios.
Attended the weekly Project Meetings and discussed the issues raised according to their priority level.
Environment: Selenium, Java, TestNG, Quality Center, SQL, JIRA, Windows.
Prolifics, Hyderabad, India. Jun 2012 – Dec 2012
DevOps/Automation Tester
Responsibilities:
Designed and implemented robust CI/CD pipelines using Jenkins, GitHub Actions, and GitLab CI to automate performance test execution and result analysis.
Utilized Infrastructure as Code with Terraform and Ansible to provision consistent, scalable environments specifically designed for performance testing.
Configured and maintained Docker containers and Kubernetes clusters to simulate realistic load conditions for distributed system performance evaluation.
Created comprehensive automation frameworks using Python and Bash shell scripts to generate test data and analyze performance test results.
Implemented monitoring solutions with Grafana, Prometheus, and AppDynamics to capture real-time performance metrics during test execution phases.
Integrated Jenkins Scripted Pipelines with performance testing tools like JMeter and LoadRunner to automate regular performance regression testing.
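A scripted-pipeline integration of this kind can be sketched as a Jenkinsfile stage (the plan path, result file, and use of the Jenkins Performance plugin's perfReport step are illustrative assumptions, not taken from the actual project):

```groovy
// Hypothetical scripted pipeline stage running a JMeter plan in
// non-GUI mode (-n) against a test plan (-t), logging results (-l).
node {
    stage('Performance regression') {
        sh 'jmeter -n -t perf/checkout.jmx -l results/checkout.jtl'
        // Publish trend graphs via the Jenkins Performance plugin.
        perfReport 'results/checkout.jtl'
    }
}
```

Running this stage on a schedule or per merge is what turns one-off load tests into continuous performance regression testing.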
Configured log aggregation systems using Splunk and ELK Stack to centralize and analyze performance test logs for comprehensive reporting.
Optimized cloud infrastructure across AWS, Azure, and Google Cloud Platform to enable cost-effective performance testing at scale.
Automated database performance testing for SQL Server, MongoDB, and RDS to identify query optimization opportunities and bottlenecks.
Established continuous monitoring practices using Nagios and QRadar to identify performance degradation trends in production-like environments.
Configured ArgoCD to deploy applications consistently between development, testing, and production environments for accurate performance comparison.
Implemented VMware vRealize orchestration to provision virtualized environments automatically for performance testing of legacy applications.
Developed custom dashboards in Grafana and Splunk to visualize performance metrics and provide real-time visibility during test execution.
Environment: Jenkins, GitHub Actions, GitLab CI, Terraform, Ansible, Docker, Kubernetes, Python, Go, PowerShell, Grafana, Prometheus, AppDynamics, JMeter, LoadRunner, Splunk, ELK Stack, AWS, Azure, Google Cloud Platform, Git, GitLab, SQL Server, MongoDB, RDS, Nagios, QRadar, ArgoCD, OctopusDeploy, VMware vRealize.