Sohel Ahmed
Address: **** ***** ******, ****** *******, NY 11428
E-mail: *********@*****.*** | Cell: 917-***-****
US CITIZEN
OBJECTIVE:
To obtain a challenging position as a Performance Test Engineer/Analyst: creative in identifying
problems, expert in performance analysis and testing, and effective in implementing solutions that
meet the current and future expectations of the business.
Professional Summary:
● 6+ years of professional experience as a Performance Engineer with extensive experience in performing Software Quality Assurance and Software testing.
● Extensively experienced in testing applications in the Banking, Financial Services, Logistics, Health Care and Telecommunication business domains.
● Extensively experienced in Software testing, including creation of Test scripts/cases, execution of Tests, Defect tracking and Reporting, working with the development team to ensure QA testing requirements are appropriately identified and included in the requirements.
● Experienced in Software test automation, including creation of Test scripts/cases, Test data design, development of Automation frameworks, Regression runs, script maintenance and updating of automation suites for new functionality.
● Experienced with the full Project/QA Life Cycle (SDLC), from Requirement analysis to User Acceptance Testing (UAT).
● Expertise in SQA practice and QA methodologies, Test strategy, the Software Development Life Cycle (SDLC) and the Defect life cycle.
● Good exposure to implementing Agile/Scrum and Waterfall methodologies in testing projects.
● Experienced in both Front-end and Back-end testing, including GUI testing, Functional testing,
System testing, Integration testing, User Acceptance testing (UAT), Regression testing,
Smoke testing, Sanity testing, Database testing and Automated testing, using both Manual
Testing and automation tools (LoadRunner, QTP).
● Experienced in creating Virtual User Scripts, defined User Behavior, ran Load Test
Scenario, monitored the Performance and Analyzed Result using Load Runner.
● Experienced in performance measurement for UNIX, Oracle, WebLogic and WebSphere servers in the LoadRunner Controller; monitored online transaction response times, Web hits, TCP/IP connections, throughput, CPU, memory, various HTTP requests, etc.
● Experienced in performance testing using JMeter.
● Experienced in using JMeter for database back-end testing over JDBC & ODBC connections.
● Experienced in creating load test scenarios for various load levels in LoadRunner; executed load, performance and stress tests using LoadRunner; generated and analyzed reports after performance test execution.
● Experienced in identifying and isolating performance bottlenecks, defects and problems, and in providing recommendations/assistance to rectify issues.
● Experienced in using HP SiteScope and MS SCOM to monitor UAT and Production infrastructure, and CA Wily Introscope to monitor Java applications during load testing.
● Experienced in using RDP, PuTTY, SQL Developer, Process Explorer, FileMon, ProcessMon, PoolMon and Fiddler.
● Experienced in virtualization technologies such as Cloud Computing, Microsoft Hyper-V, Oracle VirtualBox and VMware Workstation.
● Experienced in preparing Requirements Traceability Matrices (RTM) to ensure requirements coverage using HP Quality Center/Test Director.
● Used Quality Center and Test Director for defect tracking and reporting; involved in defect management/problem solving and attended bug triage meetings.
● Experienced in testing RDBMS database applications and proficient in writing SQL queries for back-end testing.
● Knowledgeable in generating VBScripts to automate tests using QTP, and used descriptive programming to reduce time spent maintaining object repositories.
● Knowledgeable in developing Data-Driven, Keyword-Driven and Hybrid frameworks using VBScript in QTP.
● Experienced in working both in teams and on solo-responsibility projects.
● Expertise in managing test beds and setting up test environments to perform QA activities.
● Excellent communication skills enabling effective interaction with Business Managers, QA Managers, the Testing Team and Developers.
Technical Skills:
Testing Tools: LoadRunner 9.50/11.51, Performance Center, JMeter, Quality Center and QuickTest Pro
Operating Systems: Windows, UNIX and MS-DOS
Databases: MS Access, Oracle, SQL Server
Front-End GUI: Visual Basic, Developer, PowerBuilder, PowerPoint, MS Project
Web Technologies: JDBC, EJB, MTS, JSP, WebLogic, Apache HTTP, WebSphere, IIS and JWS
Bug Reporting Tools: Test Director and Quality Center
Professional Experience:
Department of Energy, Washington, DC
Nov, 2011 – April, 2013
Performance Engineer
Job responsibilities:
● Analyzed Functional Requirements and Business Requirements documents and developed effective Test plans.
● Created Test cases; interacted with various departments to formulate and report test results.
● Created Test Scenarios from use cases and requirements to perform Verification and Validation testing.
● Performed GUI, Smoke, Functional, Regression, System, User Acceptance, Performance, Load and Stress testing.
● Created a Traceability Matrix in Quality Center to view the effects of requirement changes and to measure task completion progress across different functionalities.
● Used Quality Center to plan tests, manage test assets, and create and run manual and external scripts to check GUI and functional features.
● Used Quality Center to track bugs and report them to the developers.
● Performed database integrity tests; created DB scripts with SQL commands to extract the actual results.
● Worked with Users and Business Analysts to define and design load test scenarios and test data.
● Performed testing at subject matter expert (SME) level.
● Created test scripts for performance testing using LoadRunner.
● Created scenarios using Ramp-Up and Ramp-Down in LoadRunner.
● Created Virtual User scripts, defined user behavior, ran load test scenarios, monitored performance, and analyzed results using LoadRunner.
● Used LoadRunner 9.50 for performance and stress testing of the application to improve its efficiency and scalability; measured hits per second and response time.
● Performed Sanity tests on the daily build; integrated the build with other releases/builds.
● Used JMeter for database back-end testing over JDBC & ODBC connections.
● Extensively used monitoring tools: monitored the whole infrastructure under load using HP SiteScope; used Performance Monitor, Resource Monitor, Task Manager, Process Explorer and Data Warehouse Monitor on Windows systems, System Monitor and Topas on UNIX systems, and JConsole to monitor Java-based applications.
● Built procedures and handled multiple, frequent releases as part of the testing life cycle to visualize risks and issues and proactively plan for mitigation as part of release management.
● Handled end-to-end system testing.
● Participated in various meetings and discussed enhancement and modification request issues.
Environment: C, XML, VBScript, HTML, SOAP, Web services, Windows, Quality Center,
MS Word, MS Excel, MS Project, LoadRunner, JMeter
T Mobile, Atlanta, GA
Aug, 2009 – Oct, 2011
Performance Analyst
Job responsibilities:
● Involved in analyzing requirements; organized and supervised formal reviews of development documentation such as requirements and design documents and system test plans.
● Involved in developing test plans, test strategies and test cases.
● Responsible for identifying problems, risk rating, problem reporting and referral to the appropriate person or team.
● Performed Manual, Black Box and Grey Box testing.
● Involved in performing GUI testing, Ad hoc testing, Smoke testing, Functional testing, Regression testing, System testing, Performance testing, Load/Stress testing and Back-end testing.
● Involved in managing workflows, user setup, roles, project creation, and schema creation and implementation with Quality Center.
● Tracked and reported the errors discovered using HP Quality Center.
● Wrote SQL queries for performing back-end testing on Oracle databases.
● Involved in Localization and Performance testing of web-based modules; handled load testing using LoadRunner.
● Created a performance benchmark for the product and compared test results of new builds with the benchmark results using LoadRunner.
● Monitored resources to identify performance bottlenecks; analyzed test results with the development and database teams and reported the findings to the clients using LoadRunner.
● Used the Windows Typeperf & Perfmon utilities to create custom config files, collect Windows resource statistics remotely and generate reports with PAL.
● Extensively used the Topas utility, vmstat, sar & System Monitor on UNIX systems to measure UNIX system performance under load.
● Provided daily and weekly status reports to the Team Lead, Test Groups and Test Managers to coordinate test cycles.
● Documented summary and closure reports for each test execution.
● Participated in walkthroughs, presentations & status tracking calls.
Environment: QTP, LoadRunner, TOAD, Quality Center, Java, HTML, UNIX, Oracle, SQL,
PL/SQL, SQL*Plus, UNIX shell scripts, Windows, VBScript, Microsoft SQL Server, Visual
Basic, JMeter.
American Express, Salt Lake City, UT
Jan, 2007 – Jul, 2009
Test Automation Analyst
Job responsibilities:
● Involved in reviewing the requirements specification with the functional manager and technical specialists of the application.
● Involved in gathering system and functional requirements from different sources such as the development team, users and vendors.
● Developed Test cases and Test scripts based on the requirement documents.
● Maintained test scripts for different builds and releases.
● Performed various black box testing methodologies such as System Testing and Regression Testing.
● Wrote positive and negative test cases.
● Used Test Director for defect reporting, tracking, storing client requirements and managing test plans.
● Wrote SQL queries to perform back-end testing on Oracle and MySQL databases.
● Performed functional and regression testing using QTP: recorded scripts; enhanced tests with standard, image, table, text and data checkpoints; parameterized tests; executed tests and analyzed the results.
● Participated in designing the automation architecture using QTP as the automation tool.
● Developed Data-Driven tests with QTP to test the application with different data values.
● Worked with concepts such as Synchronization timeouts and the Recovery Scenario Manager in QTP.
● Designed scripts using descriptive programming in QTP.
● Prepared the data for initial setup and performance acceptance tests.
● Installed and configured LoadRunner; performed stress testing of the application for various scenarios and analyzed the results to improve its efficiency and scalability.
● Utilized performance/load testing methodologies to identify potential bottlenecks and system performance problems.
● Created Virtual User scripts, defined user behavior, ran load test scenarios, monitored performance and analyzed results using LoadRunner.
● Analyzed test result reports and test case results, determined causes, and created detailed, comprehensive defect reports.
● Generated and automated various daily, weekly, monthly and quarterly status reports, as well as defect status reports, QA status reports, risk analysis documents, requirements traceability reports, and test execution and test results summary reports.
Environment: Windows, UNIX, Oracle, J2EE, JUnit, JBuilder, QTP, Test Director, PL/SQL,
ASP, and XML/DHTML
Education:
Bachelor's in Social Science
National University, Bangladesh