SHIRAJNISHA SULAIMAN
Email: ********@*******.*** / Phone: +1-856-***-**** / LinkedIn: Shirajnisha LinkedIn
SUMMARY:
QA/Performance Engineer with 5 years of hands-on experience in performance testing and tuning of web applications and microservices across cloud and on-premises environments. Proven expertise in LoadRunner, containerized deployments, Kafka load testing, and CI/CD integration.
PROFESSIONAL PROFILE:
Extensive experience in performance engineering and tuning of microservices (REST APIs) and web applications using Micro Focus LoadRunner/Performance Center, AppDynamics, Dynatrace, Splunk, Kibana, Sysdig, and SiteScope.
Proficient in JVM tuning, thread management, SQL query optimization, and application configuration enhancements.
Strong background in Kubernetes (K8s) container environments, including container migration from VM servers.
Skilled in all phases of performance testing: estimation, strategy, planning, tool administration, data setup, scripting, execution, monitoring, and analysis.
Expertise in scripting using HTTP/HTML, Java Vuser, and DevWeb (gRPC) protocols for web, REST API, and mobile applications.
Hands-on experience with Kafka message load testing and optimization, including producer/broker/topic tuning.
Conducted various types of performance testing: load, stress, endurance, and capacity planning for peak traffic scenarios.
Experience in setting up and administering Performance Center, including LG server management, user administration, and license monitoring.
Integrated performance testing into Jenkins CI/CD pipelines for automated validation.
Monitored production usage and performance using Splunk and Grafana dashboards.
Automated infrastructure tasks using PowerShell and Unix shell scripting.
Developed Python-based automation frameworks and utilities to support functional and performance testing workflows.
SKILLS:
Performance Testing: OpenText LoadRunner Enterprise, JMeter, Locust, K6
Automation Testing: Python (PyTest), Postman, Jenkins CI/CD
LRE Protocols: HTTP/HTML, Tuxedo, DevWeb (gRPC), Java Vuser
Monitoring Tools: AppDynamics, Dynatrace, Conduktor, Kafdrop, SiteScope, YourKit
Test Management Tools: MicroFocus ALM, JIRA
Dev Tools: BitBucket, Jenkins
Operating Systems: Linux, Windows Server
Databases: PostgreSQL, Cassandra, Oracle, MySQL, MongoDB, MS SQL
Languages: C, Java, PowerShell, Unix Shell Scripting
ACADEMICS:
Master's in Biotechnology, Periyar University, Salem, India
WORK EXPERIENCE:
Duration | Organization | Designation
Feb 2022 – Present | Amdocs, NC, USA | NFT Specialist (Performance/SRE)
Mar 2018 – Dec 2020 | Austere Technologies, NJ, USA | Consultant – Performance/Automation Testing
RECENT PROJECTS:
Project 1: Performance Engineering – T-Mobile (Digital Uplift)
Migrated legacy systems to a modern microservices-based digital platform, focusing on benchmarking and optimizing API performance in a Kubernetes environment.
Key Contributions:
Gathered non-functional requirements and validated new API deployments using Postman.
Developed and enhanced VuGen scripts for load testing across HTTP/HTML, Java Vuser, and DevWeb protocols.
Executed load, stress, and endurance tests; monitored performance using AppDynamics, Dynatrace, Grafana, Splunk, and Kibana.
Identified and resolved performance bottlenecks in collaboration with development, infrastructure, and database teams.
Conducted Kafka load testing and optimized producer/broker/topic configurations.
Recommended improvements for Kubernetes pod capacity planning, HPA configurations, and JVM GC tuning (G1GC, heap sizing).
Tuned SQL queries and application configurations to support high concurrency and throughput.
Monitored Kafka consumer lag using Conduktor and Kafdrop; generated MTTR and performance reports.
Provided tool support and administration for LoadRunner and Performance Center.
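The load-execution and latency-reporting work described above can be sketched in miniature in Python. This is an illustrative harness only, not the project's actual tooling: fake_request is a hypothetical stub standing in for a real HTTP or Kafka call, and the nearest-rank percentile math mirrors the p90/p95/p99 figures typically pulled from LoadRunner or Grafana.

```python
import concurrent.futures
import random
import time

def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def fake_request():
    """Hypothetical stand-in for a real HTTP/Kafka call; sleeps a few ms."""
    started = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))
    return (time.perf_counter() - started) * 1000  # latency in ms

def run_load(workers=8, requests=200):
    """Fire `requests` calls across a thread pool, report latency percentiles."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        latencies = list(pool.map(lambda _: fake_request(), range(requests)))
    return {p: percentile(latencies, p) for p in (50, 90, 95, 99)}

if __name__ == "__main__":
    print(run_load())
```

In practice the thread pool would be replaced by VuGen Vusers or a Kafka producer fleet; the sketch only shows the shape of the measurement and reporting step.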
Project 2: Automation Testing Framework – Internal QA Initiative
Designed and implemented a Python-based automation framework to streamline functional and performance testing of microservices and web applications.
Key Contributions:
Developed reusable automation scripts using Python, PyTest, and Selenium for API and UI testing.
Automated regression and validation workflows, reducing manual effort and increasing test coverage.
Integrated automated test execution into Jenkins CI/CD pipelines for continuous testing.
Built custom Python utilities for log parsing, test data generation, and reporting.
Collaborated with QA and development teams to identify automation opportunities and improve reliability.
Maintained version control of automation assets using Git and GitHub.
Provided training and documentation on framework usage and best practices.
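A flavor of the custom log-parsing and reporting utilities mentioned above, as a minimal sketch: the access-log line format and endpoint paths here are assumptions for illustration, not the project's real schema.

```python
import re
from collections import defaultdict

# Hypothetical access-log line: "GET /api/users 200 120ms"
LINE = re.compile(r"(?P<method>\w+) (?P<path>\S+) (?P<status>\d{3}) (?P<ms>\d+)ms")

def summarize(lines):
    """Group response times by endpoint and report request count and average ms."""
    times = defaultdict(list)
    for line in lines:
        match = LINE.search(line)
        if match:
            times[match.group("path")].append(int(match.group("ms")))
    return {path: {"count": len(v), "avg_ms": sum(v) / len(v)}
            for path, v in times.items()}

sample = [
    "GET /api/users 200 120ms",
    "GET /api/users 200 80ms",
    "POST /api/orders 201 240ms",
]
print(summarize(sample))
```

Utilities like this feed the regression and reporting steps in the Jenkins pipeline; a real version would read log files and emit CSV or JSON for the dashboard layer.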
Project 3: LRE Implementation and Upgrade – Testing Center of Excellence (TCOE)
Led the enterprise-wide upgrade from Performance Center to MicroFocus LoadRunner Enterprise (LRE), enhancing scalability and modernizing performance testing infrastructure.
Key Contributions:
Defined upgrade strategy and assessed existing infrastructure for LRE deployment.
Installed and configured LRE components including Admin, Controller, Load Generators, and Analysis modules.
Migrated test assets, projects, and user data from legacy systems to LRE.
Integrated LRE with ALM, Jenkins, and monitoring tools (AppDynamics, Dynatrace).
Managed LRE domains, user roles, permissions, and project configurations.
Validated LRE environment post-upgrade for stability and scalability.
Delivered training and documentation to performance testing teams.
Ensured compliance and high availability through collaboration with infrastructure and security teams.
Monitored license usage and system health using LRE dashboards.