
Engineer Manager

Astoria, NY
December 22, 2019



Naser Samad Iqbal

Queens, NY ***** Phone: 347-***-**** Email:

Performance Test Engineer/Senior QA Analyst


US Citizen. Ten (10)+ years of experience as a Performance Engineer and Senior QA Analyst. Background includes gathering and interpreting test plans and analyzing use cases for functional, regression, performance, integration, compatibility, mobile, and real-time testing in SOA-based and API applications. Experience with Agile and Waterfall software development and with writing functional and performance test scripts. Extensive hands-on experience with e-commerce, secure B2B e-commerce, mobile/remote web services, ESB, and SRC.

Design and develop performance stability tests; measure, analyze, manage, and optimize performance using tools such as VSTS, LoadRunner, and JMeter, with specialization in performance analysis of J2EE/.NET applications and Apache/Tomcat or IIS web servers. End-to-end experience in performance testing activities, from gathering requirements to analyzing results and delivering findings and recommendations. Hands-on experience in performance engineering, scripting, and executing smoke/baseline, load, volume, and breakpoint tests. Working experience with multiple protocols, including HTTP/Windows Sockets, Web Services, SOAP, Citrix, JMS, SAP Web, Mobile, and JDBC. Experience using profiling tools such as JProfiler, JProbe, JConsole, VisualVM, and HP Diagnostics, including GC/JVM analysis tools and heap/thread dump analysis tools.

Experience with front-end web performance analysis using WebPageTest and PageSpeed Insights. Hands-on experience with service virtualization, profiling and tuning web applications, and analyzing thread, CPU, and memory utilization. Familiar with DevOps and cloud technologies and CI/CD concepts. Experience with application server configuration. Hands-on experience with tuning; monitoring Splunk events, logs, and Tivoli Monitoring; database tuning; and debugging and identifying the root cause of system bottlenecks with end-to-end performance measurement. Provide diagnostic support at the application and server level. Experience with Java and .NET memory utilization and Redis optimization. Experience with mobile app monitoring using New Relic. Experience with performance testing on the Azure cloud platform and with AWS CloudWatch and EC2 instances. Hands-on application monitoring in AppDynamics, and performance diagnostics, profiling, and analysis through Hotspot and PurePath technology, Node.js, and RUM using the Dynatrace Client. Experience with network configuration, setup, and network performance engineering. Highly motivated and able to work effectively across cultures. Excellent communication and leadership skills.


Testing Tools: Jmeter, PCF Cloud, LoadRunner, VSTS, BlazeMeter, HP Performance Center/ALM, Quick Test Pro/UFT, HP Quality Center/ ALM, HP Diagnostic, Postman, Fiddler, HttpWatch, SoapUI, Rational Suites, NeoLoad, VMWare, Oracle PeopleSoft, HCM/CRM, Putty, JIRA- Confluence, Bitbucket, Jenkins Pipeline, CA Rally.

Profiling/APM Tools: Dynatrace AppMon, Docker, IBM Tivoli, Splunk Cloud, JVM, AWS CloudWatch, Azure Application Insights, New Relic, CA Wily Introscope, MS Diagnostics, AppDynamics, Zabbix, nmon, Perfmon, YourKit.

Languages: VBScript, C++, UNIX Shell, XML, WSDL, SQL, PL/SQL, HTML, Ruby, PHP, CSS, JavaScript, SOAP/REST services, Python, JSON, and Java.

Databases: Oracle, SQL Server, MySQL, DB2, Rapid SQL, and COBOL SQL

Mobile/Web: Device Anywhere, Ranorex, SeeTest Automation, MobiReady, Google PageSpeed Insights, WebPageTest, and Android SDK.

Operating Systems: Windows, Linux, Mac OS X, AS/400, DOS, Fedora, Red Hat, Solaris.

Servers: WebSphere, DynaTrace Server, JBoss, Apache, VMware, Http server, WebLogic, J2EE, Ftp Pro, IIS & AWS Cloud, Microsoft Azure DevOps.

Others: Microsoft Office, Microsoft Project, TFS, Cloud Foundry, HP Sitescope, ClearQuest, IBM MQ, PerfAnalyzer, Firebug, SharePoint, JProbe, Jconsole, GITLab, BeanShell.


Certifications

AWS CloudWatch Associate

IBM TIVOLI Certification Training

Microsoft CRM

Professional Experience

MasterCard, NYC, NY

Performance Test/ Platform QA Engineer Dec 2018- Present


Responsible for executing performance tests and monitoring performance metrics. Understanding the architecture, design, performance platform, and load criteria. Collaborating with the leads, business, dev, Dynatrace admin, operations, and infrastructure teams to define load and performance requirements. Analyzing performance bottlenecks.

Monitoring performance metrics in dashboards. Running JMeter scripts for API and SOAP services, monitoring Dynatrace PurePath and response-time hotspots, and analyzing orchestration and downstream service calls. Auto-scaling using PCF. Responding to alerts and degradation, with ownership across all performance environments. Collaborating on performance engineering using JMeter/SoapUI/Jenkins/Postman/REST API.
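The JMeter API runs described above produce JTL results files; a minimal sketch of summarizing one is shown below. The column names follow JMeter's default CSV JTL header, the sample rows are hypothetical, and the percentile math is a plain nearest-rank estimate rather than any project-specific tooling:

```python
import csv
import io

def jtl_summary(jtl_csv_text):
    """Summarize a JMeter CSV JTL: average and p95 elapsed time
    per label, plus error count.

    Assumes JMeter's default CSV header, which includes the
    'label', 'elapsed' (ms), and 'success' columns.
    """
    by_label = {}
    for row in csv.DictReader(io.StringIO(jtl_csv_text)):
        rec = by_label.setdefault(row["label"], {"times": [], "errors": 0})
        rec["times"].append(int(row["elapsed"]))
        if row["success"].lower() != "true":
            rec["errors"] += 1
    summary = {}
    for label, rec in by_label.items():
        times = sorted(rec["times"])
        p95 = times[max(0, int(round(0.95 * len(times))) - 1)]  # nearest-rank
        summary[label] = {
            "avg": sum(times) / len(times),
            "p95": p95,
            "errors": rec["errors"],
        }
    return summary

# Hypothetical three-sample JTL excerpt for illustration:
sample = """timeStamp,elapsed,label,success
1000,120,GET /payments,true
1001,180,GET /payments,true
1002,900,GET /payments,false
"""

print(jtl_summary(sample))
```

In practice the JTL would come from a non-GUI JMeter run rather than an inline string; the per-label breakdown is what feeds a latency dashboard or a pass/fail gate in a Jenkins stage.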

Monitoring prod APIs/services, middleware, Redis, payment data, consumer calls, and other REST API service calls in Dynatrace, then analyzing methods from these service calls in Splunk Enterprise. Binding app services in PCF Cloud to Dynatrace, which calls services for monitoring in the Dynatrace service flow and Node.js. Modifying and deploying configurations in the Jenkins Pipeline.

Deploying applications and updating/executing scripts for continuous integration and continuous delivery. Updating, deploying, and managing configurations for services in Pivotal Cloud. Analyzing Splunk query search time for Synapse and XML gateway response time comparison.

Captured 6-hour and 24-hour avg(total_time) and avg(LOADBALANCER_EXECUTION_TIME) differences and shared them with the server admin. Analyzing hotspot methods and API responses in Dynatrace and Splunk for production issues and continuing to support the production team.

Performing optimization of white space and of endpoints for all API calls using Dynatrace PurePath. Running XMLGW Splunk queries and capturing XMLGW metrics. Pulling Redis data by bucket and analyzing Splunk logs. Suggesting POCs and flagging performance bottleneck issues. Creating dashboard reports and performance test results, and creating SRC Production Performance Metrics – Latency Breakdown performance results and reports.

Environment: JMeter, StormRunner, Dynatrace, Pivotal Cloud, Spring Framework, Agile, Postman, CA Rally, Splunk, Redis, Akamai Cloud, Linux, Bitbucket, Java, BeanShell, Jenkins Pipeline.

NY State Comptroller, Albany, NY

Performance Engineer/ SDET-QA 02/2016- Dec 2018


Responsible for handling multiple projects and gathering performance requirements. Collaborated with the infrastructure team, developers, functional test team, server admins, technical leads, and project managers. Developed, executed, and analyzed load tests for web/REST API applications; prepared results and shared them; created service requests. Ran smoke and load tests for API web services and Mobile API services with 15–20K user loads using VSTS, monitoring key indicators. Monitored performance test execution (client-side and service-side metrics) in Visual Studio Load Test. Analyzed performance KPIs, monitored systems under test, measured and recorded key performance indicators, and analyzed and interpreted results in NeoLoad Performance Monitoring. Monitored web service requests, total response time, and the QA panel in each load test using Splunk. Monitored app server, Solr server, and web server memory and CPU using Splunk. Captured DB CPU utilization and app CPU and memory metrics from IBM Tivoli.

Ran load tests in the SALTA cloud for tokenization services. Took performance measurements for Oracle, WebLogic, and IIS in the Controller. Monitored Mobile API service calls and app server metrics, and created and sent error alerts using New Relic. Created dashboards and monitored Solr and SALTA app server response time and the QA dashboard using Splunk Cloud Enterprise.

Monitored high CPU and memory percentage through IBM Tivoli, and analyzed test results with the server admin and WAS team. Created test results and comparison metrics. Executed performance tests using LoadRunner, analyzed results, and provided recommendations. Performed all phases of end-to-end testing, including user acceptance, functionality, regression, negative, system, and smoke testing. Provided diagnostic support and fine-tuning at the database, application, and server level. Supported the redesign team using LoadRunner and monitoring tools to measure, detect, isolate, and resolve performance issues found during application development performance testing, including measuring, monitoring, and capturing required infrastructure and application performance metrics, logs, and reports. Built a performance regression suite. Analyzed performance scenarios and created performance workflows. Ran SQL queries in SQL Server to verify data validation. Monitored and optimized SQL performance tuning and ran BP processes.

Monitored backend profile processes, trace logs, and slow SQL. Uploaded scripts, set up load scenarios, and added application and database servers for resource monitoring in the Controller. Monitored response time, TPS, and hits per second. Identified and debugged application/infrastructure performance problems. Analyzed performance test results and documented them. Managed web-service-related integration issues and logged problems. Participated in root cause analysis and complex system performance investigations. Analyzed metrics and trends to highlight performance improvement opportunities. Created a performance test closure report with detailed server metrics and shared the test report.

Environment: JMeter, Dynatrace, Pivotal Cloud, Spring Framework, VSTS, LoadRunner, NeoLoad, Postman, SoapUI, Azure DevOps/Cloud, GitLab, Splunk, IBM Tivoli, WebSphere MQ, RTC, New Relic, PeopleSoft, PageSpeed Insights, ECM, Oracle, UNIX, SQL Server, SharePoint/Office 365, HTTP Analyzer, Microsoft Project.

Project: DOB (Performance Test Engineer)


Worked with the project manager and leads to measure, analyze, and optimize the performance and scalability of a web application, and provided recommendations. Identified performance test cases per client requirements and created them in HP Quality Center. Set up the load test environment and created Performance Center server monitoring profiles for the web application, database, and web server.

Analyzed performance test results and shared them with business, development, and internal team members. Maintained performance test automation, investigating and troubleshooting performance issues. Created reports to document performance metrics, test results, analysis, and recommendations.

Created VuGen scripts. Uploaded multiple test scripts, created time slots and test schedules with ramp-up/ramp-down, set up server monitors, and maintained all scripts in ALM Performance Center; monitored response time, server, process, memory, and total transaction time, discussing optimization with the team. Executed stability, load, stress, smoke, and endurance tests and monitored Vuser logs in Performance Center.

Configured scenarios and set up monitors to capture the performance of application, web, and database servers using Performance Center. Extensively monitored response time and throughput, as well as CPU, memory percentage, processors, SQL, and other servers. Analyzed resources such as % Committed Bytes In Use, % Processor Time, and % Disk Time. Troubleshot and identified bottlenecks in web applications, reproduced performance issues with the integration and database teams, and analyzed them. Monitored trace logs, exceptions, and the event viewer using the HP Diagnostics tool. Identified performance issues across various applications.

Analyzed root causes of performance issues by profiling the application, logging into the portal server, finding the cause of failures, and recommending optimizations to the team. Tested REST web APIs, web services, and the Java/JSON sampler using JMeter. Performed load and stress tests with 300 concurrent users using JMeter, monitoring performance and measuring application response time.

Scripted Selenium with the JMeter WebDriver Sampler and uploaded scripts to BlazeMeter, monitoring and analyzing load and response time. Installed and configured the Dynatrace Client and patched/unpatched VuGen scripts. Used the Dynatrace Client for DOB performance reports, user activity, web requests, and server analysis using RUM, analyzing KPIs through the PurePath dashboard and sharing reports.

Monitored and analyzed throughput, response time, and the web server. Monitored .NET backend applications and drilled down into each web request with PurePath technology. Performed deep-dive JVM analysis, examining garbage collection logs, heap dumps, and thread dumps as needed. Monitored the JVM while running JMeter scripts, tracking the PID, total memory percentage, and live threads, and sharing thread dump scenarios. Analyzed graphs generated by LoadRunner Analysis, including database monitor, network monitor, user, error, transaction, and web server resource graphs. Developed complex SQL queries to perform back-end testing against the SQL Server RDBMS. Assisted in developing application standards (coding, logging, performance), reviewing and approving test design and test execution, and mentoring other performance test resources. Verified scripts and scenarios created by other performance testers before test execution. Created performance test reports of results and metrics. Provided daily and weekly updates to the client and application team based on test results and analysis.
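The JVM deep dives above (thread dumps captured while JMeter runs) can be triaged with a small script. This is a minimal sketch that tallies thread states from jstack-style dump text; the sample dump and thread names are hypothetical:

```python
import re
from collections import Counter

def thread_states(dump_text):
    """Count JVM thread states in a jstack-style thread dump.

    jstack prints a line like
        java.lang.Thread.State: BLOCKED (on object monitor)
    for each thread. Tallying these states quickly shows whether
    a stall is lock contention (many BLOCKED) or an idle pool
    (many WAITING / TIMED_WAITING).
    """
    states = re.findall(r"java\.lang\.Thread\.State: (\w+)", dump_text)
    return Counter(states)

# Hypothetical excerpt of a jstack dump for illustration:
sample = """
"http-worker-1" #12 prio=5
   java.lang.Thread.State: BLOCKED (on object monitor)
"http-worker-2" #13 prio=5
   java.lang.Thread.State: RUNNABLE
"pool-timer" #14 prio=5
   java.lang.Thread.State: TIMED_WAITING (sleeping)
"""

print(thread_states(sample))
```

Comparing these counts across dumps taken a few seconds apart during a load test is usually enough to tell a contended lock from an under-loaded worker pool before opening a full profiler.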

Environment: LoadRunner, Performance Center, HP Diagnostics, JMeter, BlazeMeter, WebDriver, JVM, TFS, Site24x7, Dynatrace Client, Agile, Spring Boot, CRM, Apache, IIS, SQL Server, .NET, ALM QC, Oracle Linux 6, Java 8

Pearson Education, New York, NY

Performance Test Engineer 09/2014 – 02/2016


Assisted the performance manager and other QA team members in developing estimates to support new and existing projects. Wrote load/performance test cases and scripts and executed them based on business requirements. Worked on multiple projects with DBAs, leads, and managers, identifying bottleneck issues. Enhanced mobile test scenarios and performed sanity and smoke tests using QC. Created multiple test scenario scripts and integrated them with LoadRunner for performance testing. Used LoadRunner for backend API tests with load volumes of 50K, 100K, 500K, and up to 1 million concurrent hits. Installed the Android SDK platform and manager to emulate mobile devices, tablets, and iPads, and tested physical devices using SeeTest Automation and MobiReady. Analyzed test results, device logs, and Device Vitals monitors; debugged properties and error statistics; and shared results. Exported code, created CVuser scripts in SeeTest Automation, and uploaded them to Performance Center for load testing. Tested network virtualization for mobile performance using SeeTest Network. Created and designed NeoLoad scripts for web applications, web services, and databases using JavaScript. Performed correlations and parameterizations and converted legacy scripts from HP LoadRunner to NeoLoad. Configured the .NET server and IIS in AppDynamics. Monitored the web server, optimized business transactions, and monitored TPS, load, and system health.

Monitored .NET backend and middle-tier applications using AppDynamics APM. Analyzed mobile app content response time on multiple device platforms. Performed mobile performance testing on iOS and Android platforms using NeoLoad and monitored end-user activity. Analyzed throughput, hits/second, transactions-per-second, and rendezvous graphs using LR Analysis.

Environment: LoadRunner, Quality Center, NeoLoad, AppDynamics, IBM WebSphere, IBM Mainframe, ASP.NET, CA LISA, SeeTest Automation, Ranorex, SeeTest Network Virtualization, Visual Studio Web Test, JMeter, DynaTrace Client, Java 8, BeanShell, WebLogic, VMWare, ActiveMQ, JIRA, Selenium (WebDriver), Jenkins and Taurus.

Project: JMeter Developer (Performance Engineer)

Created performance testing and volume testing strategies, performance test plans, and requirements for applications, middleware, and databases. Responsible for integrating automated scripts into the automated workflow framework built with DevOps tools. Analyzed throughput, hits/second, and transactions-per-second graphs; diagnosed the cause of performance problems; and analyzed application tiers and JDBC. Wrote and executed load, volume, capacity, and performance tests for a Java-based platform using JMeter. Built performance tests for web and cloud-based applications on Linux, and created load tests for the Profile web service and database application across multiple high-profile projects in a Java implementation. Created thread group requests and ran them for the Connection Service.

Ran data-driven tests, monitored the application and multiple web servers, and analyzed PerfMon metrics. Analyzed memory load, CPU, threads, response codes, and network I/O load to triage performance bottleneck issues. Ran commands such as top, perfmon, wget, sar, and vmstat. Built test automation for UI and WCF/REST services and created reusable, shareable components using JMeter on the Linux platform. Wrote BeanShell scripts for the Profile service. Installed and ran Taurus scripts in the Jenkins Pipeline. Identified hotspots, isolated performance problems, and monitored response time in the PurePath dashboard with RUM analysis using the Dynatrace Client. Created performance test reports of test results and metrics and shared them within the team.

PNC BANK, Pittsburgh, PA, Performance Tester/ QA Analyst 03/2011 – 08/2014


Performed performance testing in Oracle PeopleSoft and collaborated with the PINNACLE QA team on application performance. Worked with AWS CloudWatch for the EC2 dashboard and containers, created multiple instances, and monitored CloudWatch metrics. Responsible for script creation (VuGen), execution (Controller), and analysis/reporting. Performed baseline, load, and stress testing using LoadRunner. Created, correlated, and parameterized automated test scripts using the Web/HTTP, Oracle (2-tier), Ajax TruClient, Click and Script, and Web Services (SOA) protocols in LoadRunner.

Worked on performance monitoring and analyzed response time, memory leaks, hits/sec, and throughput graphs. Uploaded scripts to ALM Performance Center, created time slots and test schedules, and maintained scripts. Installed and configured the application through the Dynatrace profiling tool. Monitored the server and helped implement Dynatrace for real-time performance monitoring. Performed benchmark, capacity, and load tests using JMeter for the JDBC connection and proxy server. Worked closely with software engineering team members to tune and optimize product stability and performance. Wrote SQL queries, identified which queries to optimize through database performance tuning, and worked with the database administrator to index the database to improve application performance. Monitored the application server through Analysis and communicated bottleneck issues to the system administrator. Tested mobile web applications on multiple browsers, platforms, and versions (iOS, Android) using Device Anywhere.

Installed and configured the application through profiling tools such as VisualVM and JConsole, and monitored Linux resources during load tests, finding bottlenecks and resolving issues on Linux servers using various monitors. Monitored resource utilization such as CPU usage and memory percentage via vmstat, iostat, JVM metrics, threads, system processing time, and latency on Linux, using commands such as sar, top, and vmstat. Identified functionality and performance issues, including deadlock conditions, database connectivity problems, and system crashes under load. Analyzed online graphs and reports to determine where performance delays occurred: network or client delays, CPU performance, I/O delays, or database locking.
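The Linux resource checks above (vmstat, top, sar during a load test) can be scripted so that saturated samples are flagged automatically. A minimal sketch, parsing vmstat output; the 10% idle threshold and the sample capture are illustrative assumptions, and the column lookup follows the standard procps vmstat header:

```python
def cpu_idle_from_vmstat(vmstat_output, idle_threshold=10):
    """Parse `vmstat 1` output and flag samples where CPU idle
    drops below a threshold (i.e., the box is near saturation).

    Standard procps vmstat prints 'id' (idle %) in the CPU block;
    we locate the column from the header row instead of
    hard-coding its index, to be safer across versions.
    """
    lines = [l for l in vmstat_output.strip().splitlines() if l.strip()]
    header = lines[1].split()          # column names: r b swpd ... us sy id wa st
    idle_col = header.index("id")
    saturated = []
    for line in lines[2:]:             # data rows
        fields = line.split()
        idle = int(fields[idle_col])
        if idle < idle_threshold:
            saturated.append((fields, idle))
    return saturated

# Hypothetical vmstat capture from a load test:
sample = """procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 1  0      0 812344  10240 403120    0    0     1     2   50   80  5  2 93  0  0
12  0      0 640120  10240 401000    0    0     3     9  900 1500 88  9  3  0  0
"""

print(cpu_idle_from_vmstat(sample))
```

The second sample row (high run queue, 3% idle) is the kind of line that, correlated with response-time spikes in the load tool, points at CPU saturation rather than an application-level bottleneck.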

Fannie Mae, Herndon, VA, Quality Test Analyst 03/2009– 03/2011


Contributed as a manual, automation, and performance QA tester. Planned, designed, executed, and evaluated performance tests of web applications and services using LoadRunner. Maintained automated test environments and scripts. Worked with the development team to ensure testing issues were resolved. Monitored the application server and database server. Analyzed server access logs to debug application performance issues. Created VuGen scripts using the Web/HTTP and SAP GUI protocols for customer ID and online banking transactions; handled execution (Controller) and analysis/reporting.

Monitored perfmon counters and Windows resources such as CPU, memory, and threads through HP SiteScope. Performed manual regression using ALM/QC test cases and ran test scenarios. Performed UFT automation testing; used QuickTest Pro extensively to automate testing of application functionality, applying the scripts to functional and regression testing. Performed back-end testing by verifying data in the Oracle database. Managed requirements using Quality Center. Performed smoke, integration, functional, performance, regression, and backend testing.

Performed functional, integration, and regression testing using HP QuickTest Pro. Created user-defined functions and function libraries and maintained initialization scripts to set up the work environment using QTP. Performed database testing and wrote SQL queries for MS SQL Server 2000. Worked closely with the test lead throughout the Software Testing Life Cycle.

Designed and developed test scenarios, test cases, and test steps for various business processes. Used ClearQuest to track defects and generate reports.


Education

Bachelor of Science, Southern University

City University of New York (CUNY), New York, NY (MS in Computer Information Systems)
