ABM S HAIDER ***********@*****.***
US Citizen
Over 10 years of experience in software quality assurance and performance testing
Expertise in test documentation and performance test execution on client/server, intranet, UNIX, Linux, mainframe, and web applications.
Hands-on experience with automated tools such as LoadRunner, TestDirector, Quality Center, and SharePoint.
Extensive experience with ITKO LISA for testing mobile applications and web services.
Experience in performance testing of web and client/server applications using LoadRunner.
Performance testing experience with J2EE, PeopleSoft, and Oracle applications using the Web (HTTP/HTML), Web (Click & Script), and Citrix ICA protocols, including multi-protocol scripts.
Expertise in manual and automated correlation to handle dynamically changing parameter values.
Monitored system resources such as CPU usage, memory utilization, and I/O statistics.
Collected JVM heap and garbage collection metrics in WebSphere during test runs.
Hands-on experience in all phases of the software development life cycle (SDLC), from inception through design, development, implementation, and execution.
Performed black box, grey box, and white box testing of applications built in Java, JavaScript, ColdFusion, Spectra, and other web technologies.
10 years of experience with Microsoft tools: Word, FrontPage, Excel, PowerPoint, Publisher, MS Project, and Access.
Extensive experience using LoadRunner monitors such as the IIS and WebLogic 8.1 monitors.
Wrote LoadRunner scripts and enhanced them with C functions; parameterized inputs and stored dynamic content with LoadRunner functions; added text checks; created scenarios for concurrent (rendezvous) and sequential users; configured run-time settings for HTTP and iterations; simulated modem speeds to bring test scenarios closer to the real world; and monitored CPU, memory, ASP requests, network, web connections, and throughput while running baseline, performance, load, stress, and soak tests (a minimal VuGen sketch follows this summary).
Measured response times at the sub-transaction level across web, application, and database servers using Optimal Application Expert, with close attention to transactions per second during testing.
Experience monitoring Oracle database performance for indexes, sessions, connections, poorly written SQL queries, and deadlocks for each application component.
Experience monitoring servers using tools such as SiteScope, Wily, TeamQuest, and Tivoli Performance Viewer.
Analyzed business, technical, and functional requirements; developed, executed, and tested test plans, test cases, and test strategies.
Good understanding of object-oriented methodologies, the software life cycle, and software testing methods.
Developed test automation frameworks in HP QuickTest Professional (QTP) using extensive descriptive programming in VBScript.
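A minimal VuGen sketch of the correlation, parameterization, and text-check pattern described above; the URL, boundaries, and parameter names are illustrative assumptions, not taken from any client engagement:

Action()
{
    // Text check: fail this step if the landing page is missing "Welcome".
    web_reg_find("Text=Welcome", "Fail=NotFound", LAST);

    // Manual correlation: capture the dynamic form token returned by the
    // login page (left/right boundaries are hypothetical placeholders).
    web_reg_save_param("SessionToken",
        "LB=name=\"token\" value=\"",
        "RB=\"",
        LAST);

    lr_start_transaction("01_LoginPage");
    web_url("login_page", "URL=http://example.com/login", LAST);  // placeholder URL
    lr_end_transaction("01_LoginPage", LR_AUTO);

    lr_start_transaction("02_Login");
    // {Username}/{Password} come from a VuGen parameter file; the captured
    // {SessionToken} replaces the dynamically changing value.
    web_submit_data("login",
        "Action=http://example.com/login",
        "Method=POST",
        ITEMDATA,
        "Name=user",  "Value={Username}",     ENDITEM,
        "Name=pass",  "Value={Password}",     ENDITEM,
        "Name=token", "Value={SessionToken}", ENDITEM,
        LAST);
    lr_end_transaction("02_Login", LR_AUTO);

    return 0;
}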
TECHNICAL SKILLS
Testing Tools: LoadRunner, CA LISA, JMeter, QuickTest Pro, Quality Center
Monitoring Tools: Wily Introscope, J2EE Diagnostics, SiteScope, TeamQuest
Operating Systems: Windows XP/NT/2000, UNIX, Linux, DOS, Mainframe OS/390
Languages: Java, C, C++, Visual Basic, JSP, .NET
Databases: DB2, Oracle 8i/9i, SQL Server 2000, MS Access
Version Tools: PVCS, Visual SourceSafe, ClearCase, ClearQuest
Database Tools: TOAD, SQL Navigator
PROFESSIONAL EXPERIENCE
ADP, New Jersey. July 2019 — Present
Role: Performance Lead
Responsibilities:
Created Java code for data extraction for Apple retail applications across all countries.
Developed front-end code for an internal application using JavaScript, CSS, and HTML.
Implemented end-to-end load testing of various online retail applications with JMeter in an Agile environment.
Developed performance test plans defining the test approach by gathering each application's NFRs every release.
Planned load and stress tests by analyzing task distribution diagrams, transaction profiles, and user profiles, and executed performance tests using JMeter.
Created test scripts in JMeter across protocols including web services and Web (HTTP/HTML).
Designed and created JMeter scripts using Web (HTTP/HTML), web services, AJAX, REST-based services, and Java.
Performance tested, debugged, executed, and analyzed complex applications using JMeter and the AOS performance report generator.
Prepared workload models to simulate production-like load for validating the performance, scalability, and stability of applications.
Supported performance testing with JMeter, including developing test plans, test scripts, and reports.
Created and executed JMeter scripts for performance testing of the portal.
Supported the Test Automation Center of Excellence group in capacity testing for different projects.
Monitored system behavior using Hubble, AOS, and the Controller; analyzed the GC heap to understand application memory behavior, along with graphs such as transaction response time and Windows resources (memory utilization, CPU utilization, backend SQL query response time, and threads) while executing scripts from LoadRunner/Performance Center.
Monitored test environments using Splunk; collected and analyzed performance test results and application logs.
Worked with Apache JMeter HTTP requests and automated running JMeter performance tests on AWS.
Performed regression testing with JMeter by creating test scripts with assertions to validate that applications returned expected results.
Monitored real-time SQL in Oracle Enterprise Manager during test execution to identify the performance of SQL statements.
Used Dynatrace to diagnose and troubleshoot application performance issues.
Collected AWR reports from Oracle 10g/11g; analyzed the database and configured the number of processes and open cursors; recommended SGA/PGA tuning settings; validated index usage per transaction; and provided solutions for long-running SQL.
Executed performance tests of database queries using the open-source tool JMeter and reported results.
Created performance test data using SQL queries on Oracle and DB2.
Identified and eliminated performance bottlenecks during the development lifecycle.
Prepared the final test summary report at the end of the engagement with a go/no-go recommendation and suggestions for performance improvement.
Supported production performance validation monitoring with CA Wily Introscope during go-live of every release.
Environment: JMeter, LoadRunner, AOS performance report generator, Hubble, AOS Command Central, OEM Cloud Control 13c, Dynatrace, Splunk, Shell, Linux, AWS.
Union Bank, New Jersey Sept 2015 – July 2019
Role: Performance Engineer
Responsibilities:
Involved in project planning and coordination and implemented performance testing methodology.
Developed performance test plans and test strategies based on business requirements.
Conducted performance testing by creating virtual users and scenarios using LoadRunner.
Recorded and enhanced LoadRunner Web (HTTP/HTML) and web services scripts (a web services sketch follows this list).
Enhanced scripts with manual/automatic correlation, parameterization techniques, and LR-specific functions.
Performance tested, debugged, executed, and analyzed complex applications using HP LoadRunner and HP ALM Performance Center.
Developed and executed JMeter scripts.
Wrote Splunk queries and validated logs.
Created ERP test scripts for Siebel and PeopleSoft and performance tested them.
Debugged and validated test scripts.
Used the Scheduler in LoadRunner Controller to ramp users up and down within scenarios.
Monitored graphs such as transaction response time and analyzed server performance status, hits per second, throughput, etc.
Set pacing and think time according to SLAs for test executions.
Tracked and monitored defects using HP ALM.
Created dashlets and set up business process monitoring in Dynatrace.
Performance tested SAP GUI and SAP Web CRM applications across different SAP modules.
Monitored resources to identify performance bottlenecks and tune the JVM; planned and implemented server component-level testing and monitoring.
Analyzed JVM heap and GC logs in WebSphere during test execution.
Executed load, stress, and endurance tests by uploading VuGen scripts into Performance Center 11.5.
Designed and executed performance tests and analyzed application bottlenecks using LoadRunner Analysis.
Explored and recommended changes, then re-tested to validate the fixes.
Investigated backend logs created during execution of LoadRunner scripts.
Created LoadRunner scenarios and executed different tests per requirements.
Created test reports documenting results and listing any performance bottlenecks.
Documented summary reports and closure reports for each test execution.
Responsible for performance tuning during load, stress, and endurance test executions.
Worked with development members on bug reproduction and fixes.
Updated management on testing results, activities and planning.
Developed performance test plans, test scripts, and test scenarios based on business requirements.
Recorded and enhanced test scripts in protocols such as Web (HTTP/HTML), Oracle NCA, and Oracle Web Applications 11i, with parameterization, correlation, and added ANSI C and Oracle NCA functions.
Executed multi-user performance tests using online monitors, real-time output messages, and other features of the LoadRunner Controller.
Analyzed transaction and server resource monitors for meaningful results across the entire test run using LoadRunner Analysis.
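A hedged sketch of the kind of web services script recorded here, posting a SOAP envelope with web_custom_request; the endpoint, payload, and think-time value are illustrative assumptions tied to the SLA-based pacing noted above:

Action()
{
    // Think time aligned to the SLA-defined pacing for this business process.
    lr_think_time(5);

    // Basic validation on the SOAP reply.
    web_reg_find("Text=<balance>", "Fail=NotFound", LAST);

    lr_start_transaction("GetAccountBalance");

    // Post a SOAP envelope to a hypothetical service endpoint; {AccountId}
    // is parameterized from a data file.
    web_custom_request("GetAccountBalance",
        "URL=http://example.com/services/AccountService",
        "Method=POST",
        "EncType=text/xml; charset=utf-8",
        "Body=<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            "<soapenv:Body><GetBalance><accountId>{AccountId}</accountId>"
            "</GetBalance></soapenv:Body></soapenv:Envelope>",
        LAST);

    lr_end_transaction("GetAccountBalance", LR_AUTO);
    return 0;
}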
Environment: LoadRunner 12.5, Dynatrace, SAP, Siebel, PeopleSoft, Performance Center 11.5, HP ALM/Quality Center, SOA, HTML, XML, JavaScript, Web Services.
DirecTV July 2013 – Aug 2015
Role: Performance Tester
Location: El Segundo, CA
Responsibilities:
Assisted the project team in identifying and documenting performance test requirements
Worked with business and technology leads to identify the appropriate data for testing, and prepare that data for the test cases
Designed and developed performance test scripts, functions, scenarios, and processes for simple to complex testing situations using HP LoadRunner.
Generated and executed web service tests through LISA.
Tested iOS and Android test cases through CA LISA.
Executed tests through the CA LISA Test Runner.
Wrote Splunk queries and validated logs.
Created dashboards in Splunk.
Performance tested the Sterling OMS order supply chain application.
Tested RESTful services through CA LISA.
Tested DirecTV CRM applications with 250 concurrent users.
Recorded scripts for different business scenarios and enhanced them with heavy correlation and parameterization (a correlation sketch follows this list).
Performance tuned with SQL Profiler and JProfiler.
Worked on different protocols: Web (HTTP/HTML), Ajax Click & Script, AJAX TruClient, Web (Click & Script), web services, SAP, and Siebel.
Involved in test data preparation for parameterized values in scripts across multiple scenarios.
Designed performance test scenarios for smoke, baseline, scalability, and stress tests.
Worked extensively on the OMS system.
Took heap dumps to analyze memory exceptions and memory leaks.
Took thread dumps to analyze deadlock issues.
Used the Scheduler in LoadRunner Controller to ramp users up and down and assigned Vuser groups to different load generators.
Observed entire load test runs for failures/errors and monitored metrics such as transaction response times, running virtual users, hits per second, and Windows resources graphs.
Identified performance problems and bottlenecks and recommended remediation steps; using LoadRunner, executed multi-user performance tests with online monitors, real-time output messages, and other Controller features.
Used Dynatrace to measure web site performance in the test environment and capture performance metrics of key product features.
Involved in performance engineering to debug issues and find root causes.
Performed problem-solving and root cause analysis for system functionality and testing challenges using the LoadRunner Analysis tool.
Opened defects in Quality Center with the necessary information and assigned them to the development team.
Worked closely with the development team to ensure each defect was resolved and closed.
Involved in performance testing of online batch jobs, creating the necessary files for export and import batch jobs.
Created reports for online batch jobs by capturing processing time from batch server logs and ensured records were processed within SLA.
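A sketch of the heavy-correlation pattern used on scripts like these: capture every occurrence of a dynamic value with Ord=All, then pick one at random per iteration so Vusers spread across the data. Boundaries, URLs, and names are hypothetical:

Action()
{
    // Capture ALL order IDs on the page into a parameter array
    // (OrderIds_1, OrderIds_2, ..., plus OrderIds_count).
    web_reg_save_param("OrderIds",
        "LB=orderId=\"",
        "RB=\"",
        "Ord=All",
        LAST);

    web_url("order_list", "URL=http://example.com/orders", LAST);  // placeholder URL

    // Pick a random captured ID so concurrent Vusers hit different orders.
    lr_save_string(lr_paramarr_random("OrderIds"), "PickedOrder");
    lr_output_message("Selected order: %s", lr_eval_string("{PickedOrder}"));

    web_url("order_detail", "URL=http://example.com/orders/{PickedOrder}", LAST);

    return 0;
}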
Environment: CA LISA, LoadRunner, Performance Center, Salesforce, Quality Center, Unix.
Paychex June 2012 – June 2013
Role: Senior Performance Test Engineer
Location: Rochester, NY
Responsibilities:
Reviewed project requirements and designs for performance risks.
Gave feedback to the development team on possible improvements and discovered performance issues and benchmarks.
Tested IRS applications, including Form 1040, Pub 15, Pub 17, W-7, and W-9.
Involved in pre-testing, including reviewing and studying the functional and performance requirements documents to develop test plans and test cases.
Evaluated the Parasoft performance testing tool.
Performance tested a Microsoft Azure application.
Generated and executed web service tests through LISA.
Tested iOS and Android test cases through CA LISA.
Took heap dumps to analyze memory exceptions and memory leaks.
Took thread dumps to analyze deadlock issues.
Executed tests through the CA LISA Test Runner.
Tested RESTful services through CA LISA.
Participated in production release validation to ensure successful product deployments without performance issues.
Conducted performance testing using LoadRunner to determine the maximum load the application could handle as well as response times for read, write, and update transactions.
Created and executed scenarios with multiple monitors and generated analysis reports to identify performance bottlenecks.
Created group, real-world, and duration-based scenarios and monitored them for load, stress, and longevity tests.
Ensured the integrity of work performed by performance testers.
Planned the entire load test process based on the performance requirements document.
Conducted walkthroughs with the team, reviewing test plans and test cases for team input and baselining the test plan using Agile test methodology.
Formulated test plans and test cases in Quality Center per project milestones.
Performed manual testing, executing all test cases in Quality Center before switching to automated testing.
Performed both black box and white box testing.
Prepared a detailed test schedule and test metrics weekly so project members knew the status of the QA process.
Environment: CA LISA, LoadRunner, Performance Center, Performance Manager, Quality Center, Unix, Windows, Java, Azure, JBoss, WebLogic, Oracle, XML, SQL Server, MS Access, MS Visio, MS Project, VB, J2EE, HTML, .NET.
Client: Nationwide Insurance Jul 2009 – May 2012
Role: Performance Engineer
Location: Columbus, OH
Responsibilities:
Participated in all SDLC phases, from requirements and design through development, testing, and implementation.
Gathered and analyzed business requirements and procedures.
Responsible for developing performance test strategies, plans, and cases for the PeopleSoft application.
Deployed and managed the appropriate testing framework to meet the testing mandate.
Executed performance tests on the existing hardware to confirm the scalability of the application.
Planned the load by analyzing task distribution diagrams, transaction profiles, and user profiles.
Developed web services Vuser scripts for web service calls using SoapUI.
Conducted load testing with thin and thick clients simultaneously; scripted thick clients in LoadRunner and thin clients in the Web and Citrix protocols.
Developed LoadRunner VuGen scripts using correlation to parameterize dynamic values.
Used rendezvous points, load balancing, and IP spoofing to load test specific transactions (a rendezvous sketch follows this list).
Set up performance monitors using SecureCRT and SiteScope.
Analyzed graphs generated by LoadRunner Analysis, including database monitors, network monitor graphs, user graphs, error graphs, transaction graphs, and web server resource graphs.
Collected JVM heap and garbage collection frequency in WebLogic during tests.
Provided weekly updates to the client and application team based on test results and analysis.
Implemented Controller scenarios, run-time settings, correlation, parameterization, and other functions in LoadRunner.
Managed all performance testing (load, stress, volume, endurance, and failover) using LoadRunner (Controller, Virtual User Generator, and Analysis).
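A sketch of how rendezvous points were used to hit a specific transaction with simultaneous Vusers; the transaction name and page are illustrative, and IP spoofing itself is a Controller/load generator setting rather than script code:

Action()
{
    // Hold each Vuser here until the Controller releases them together,
    // so the submit step below is executed concurrently.
    lr_rendezvous("submit_claim");

    lr_start_transaction("SubmitClaim");
    web_submit_data("submit",
        "Action=http://example.com/claims/submit",    // placeholder URL
        "Method=POST",
        ITEMDATA,
        "Name=policyId", "Value={PolicyId}", ENDITEM, // parameterized data
        LAST);
    lr_end_transaction("SubmitClaim", LR_AUTO);

    return 0;
}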
Environment: JMeter, LoadRunner, PeopleSoft 9.1/Portal V9, ASP, VBScript, Toad, Oracle, Mainframe, MQ Series, Unix, HTML, DHTML, XML, QTP, IIS, Apache, Quality Center, Agile.
Client: ADP Oct 2008 – June 2009
Role: Performance Engineer
Location: Roseland, NJ
Responsibilities:
Managed test planning for LoadRunner performance testing, integration testing, systems testing, acceptance testing, regression testing, and cross-browser testing.
Used LoadRunner (with C programming) for load testing (a helper-function sketch follows this list).
Performed manual and automated testing (using WinRunner, Astra QuickTest, and QuickTest Pro).
Performed load, stress, sizing, scalability, and capacity planning for various Witness Systems client/server products, which involved substantial server testing.
Evaluated performance testing tools from the IBM Rational and HP Mercury suites.
Implemented LoadRunner and obtained the licenses from Mercury Interactive.
Designed and implemented the performance testing plan for QM, eReporting, OEMs (BT, Avaya), Balance, and WFO.
Developed test harnesses using Virtual User Generator in single and multi-protocol modes (HTTP/HTML, RMI, Citrix, dual Web/WinSock, and Windows Sockets).
Correlated and parameterized scripts and configured run-time settings in Virtual User Generator.
Created scenarios with different schedules, such as ramp-up and duration.
Monitored performance using Windows performance monitors and LoadRunner monitors.
Analyzed tests using summary analysis, average transaction response time, throughput, Windows resources, network delay, and HTTP codes.
Defined, estimated and assigned tasks to other Team members.
Met with project managers to define and estimate tasks and risks.
Assisted with planning, tracking and reporting the team’s progress against schedule and reported status to upper management.
Tested and verified data mapping to appropriate tables and columns.
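A sketch of the kind of plain-C helper used to enhance these scripts: building a unique reference number per Vuser and exposing it as a LoadRunner parameter. The helper name and format are hypothetical:

// Build a unique reference number and store it in the LoadRunner
// parameter {RefNum}; requests can then use {RefNum} directly.
void make_ref_num()
{
    int  vuser_id, scid;
    char *group;
    char buf[32];

    lr_whoami(&vuser_id, &group, &scid);                   // current Vuser's ID
    sprintf(buf, "REF%d%05d", vuser_id, rand() % 100000);  // pseudo-random suffix
    lr_save_string(buf, "RefNum");
}

Action()
{
    make_ref_num();
    lr_output_message("Using reference: %s", lr_eval_string("{RefNum}"));
    return 0;
}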
Environment: LoadRunner, Performance Center, SiteScope, Quality Center, Unix, Windows, Java, JBoss, WebLogic, Oracle, XML, SQL Server, MS Access, MS Visio, MS Project, VB, J2EE, HTML.
EDUCATION:
Bachelor of Science in Electronic Engineering
Vaughn College of Aeronautics and Technology, NY, USA