KHURRUM KAISER
Contact # 347-***-**** (Mobile)
Email: *********@*****.***
Status: United States Permanent Resident
OBJECTIVE:
Over seven and a half years of hands-on experience in Quality Assurance, with a Bachelor's degree in Computer Information Systems and diversified experience in automated, manual, functional, and performance testing of Web and Client/Server applications in UNIX/Windows environments; seeking a position as Software Test Engineer / Quality Assurance Analyst.
SUMMARY:
Experience in defining detailed application software test plans, including organization, participants, schedule, and test and application coverage scope.
Experience in gathering and defining functional and user interface requirements for software applications.
Strong analytical experience in developing detailed business-level test scenarios and individual test events and scripts based on functional and business requirements.
Experience in developing detailed design-level test scenarios and individual test events and scripts based on multiple user groups and detailed user interface design requirements.
Experience in developing, reviewing, and managing requirements traceability (requirements, application components, test cases, test case execution results).
Experience in defining and executing test data acquisition, including production data set selection and extraction, test data generation, and manual data preparation.
Experience in a CMMI Level 3 software development environment.
Performed accessibility testing/Section 508 compliance testing using JAWS.
Strong experience with automated software testing tools, including regression/functional testing, user interface design testing, and performance, stress, and load testing.
Experience in developing and executing SQL queries in an Oracle environment (Oracle 9i, 10g, 11g) to confirm database operations performed by software applications.
Used TOAD to check SQL queries for optimal performance and scalability under real-time workloads.
Ability to triage software defects, providing development staff with adequate information for remediation.
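The kind of back-end validation query described above can be sketched as follows; this is an illustrative example only, using Python's built-in sqlite3 in place of Oracle/TOAD, and the table and column names are hypothetical, not from any of the actual applications.

```python
import sqlite3

# Hypothetical schema standing in for an application's Oracle table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, status TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (order_id, status, amount) VALUES (?, ?, ?)",
    [(1, "SUBMITTED", 100.0), (2, "APPROVED", 250.0), (3, "SUBMITTED", 75.5)],
)

def count_orders_with_status(status):
    """Confirm how many rows the application actually wrote with a given status."""
    row = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE status = ?", (status,)
    ).fetchone()
    return row[0]

# The check compares what the UI reports against what is really in the database.
assert count_orders_with_status("SUBMITTED") == 2
```

The same pattern applies regardless of the database engine: run an independent query against the store the application writes to, then compare the result with the application's own output.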
Ability to adjust priorities in response to multiple demands and unanticipated events, as well as to adjust decisions and actions to new information.
Ability to be flexible in response to changing circumstances or conditions.
Experience with planning, implementing, and testing mission-critical information systems employing Web Services and built on Internet architecture principles.
Possess strong listening, verbal, and written communication skills and attention to detail.
SKILLS:
Testing Tools: HP LoadRunner, HP Quality Center, QuickTest Professional (QTP), Test Director, TOAD, PVCS Tracker, JIRA, IBM Rational Robot, ClearQuest, BMC Remedy, Rational Manual Tester (RMT)
Languages: Visual Basic, C++, C#, Java, J2EE, PL/SQL, HTML, Macromedia Dreamweaver, VBScript, XML, Dojo
Operating Systems: UNIX, MS-DOS, Windows 95, 2000, XP, Vista, 7, and 8
Databases: MS SQL Server 2005/2008, Oracle 9i, 10g, 11g, MS Access
Microsoft Tools: MS Word, MS Excel, MS PowerPoint, MS Works, MS Outlook, MS Visio, Microsoft Lync, Microsoft Outlook Express, IBM Lotus Notes
Application Servers: WebLogic, IIS, Apache, JBoss
JOB EXPERIENCE:
Goldman Sachs
New York, New York
Job Title: QA Analyst
June 2013 to Present
Responsibilities:
Analyzed requirements and use cases, certified application-specific software, and performed ambiguity reviews of business requirements and functional specification documents.
Attended weekly tech review meetings.
Updated the BRD according to the business users' demands.
Wrote test cases in Quality Center for enhancements of the Vendor Management Portal (VMP).
Executed test cases and logged defects using JIRA.
Developed and updated UAT scripts for business users with different roles.
Wrote test plans for different releases.
Responsibilities included manual GUI testing, functional testing, integration testing, regression testing, interface testing, end-to-end testing, and user acceptance testing.
Tested the application in the Production, Development, and UAT environments.
Worked on system integration with other systems.
Developed test data for regression testing.
Updated existing scripts according to each new release to production.
Designed, developed, and maintained an automation framework (Hybrid Framework).
Analyzed the requirements and prepared automation script scenarios.
Developed a functional library for the VMP application.
Automated the functionality and interface testing of the application using QuickTest Professional (QTP).
Performed database testing through ODBC from automation test scripts.
Created automation scripts for regression testing using QTP.
Created a driver script in VBScript to launch QTP automatically and run a number of automated scripts simultaneously.
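The driver-script pattern described above (one entry point that launches a suite of automated scripts and collects their results) can be sketched as follows. This is a hedged illustration only: the real driver was VBScript invoking QTP, and the test names and registry here are hypothetical.

```python
# Minimal sketch of a test driver: run each registered script, record
# PASS/FAIL, and survive individual script crashes.

def test_login():
    return True  # placeholder for a real automated check

def test_vendor_search():
    return True  # placeholder for a real automated check

# The suite registry: (name, callable) pairs the driver iterates over.
REGRESSION_SUITE = [("login", test_login), ("vendor_search", test_vendor_search)]

def run_suite(suite):
    """Run each registered script and return a name -> 'PASS'/'FAIL' report."""
    results = {}
    for name, script in suite:
        try:
            results[name] = "PASS" if script() else "FAIL"
        except Exception:
            results[name] = "FAIL"  # a crashing script is logged, not fatal
    return results

report = run_suite(REGRESSION_SUITE)
```

The value of the pattern is that adding a new regression check is just one more entry in the registry; the driver's execution and reporting logic never changes.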
Environment: Java, HTML, SQL, Oracle 10g, TOAD, Apache, JBOSS, QuickTest Professional, Quality Center, JIRA.
Guardian Life Insurance
New York, New York
Job Title: QA Analyst
Dec 2012 to April 2013
Responsibilities:
Analyzed requirements and use cases, certified application-specific software, and performed ambiguity reviews of business requirements and functional specification documents.
Executed test cases manually and logged defects using ClearQuest.
Wrote test cases in IBM Rational Manual Tester.
Conducted validation testing using the LBS console by extracting the data into Excel.
Responsibilities included manual GUI testing, functional testing, integration testing, regression testing, interface testing, end-to-end testing, database testing, and user acceptance testing.
Wrote test cases for financial application enhancements and implemented changes.
Tested the financial application for equities products.
Conducted functional and regression testing on the financial calculators.
Conducted UAT testing.
Tested the application in the QA, Production, and UAT environments.
Performed client application testing and Web-based application performance, stress, volume, and load testing of the system.
Analyzed performance of the application program itself under various test loads of many simultaneous Vusers.
Analyzed the impact on server performance (CPU usage, server memory usage) for the applications under varied numbers of simultaneous users.
Inserted transactions and rendezvous points into Web Vusers.
Created Vuser scripts using VuGen and used the Controller to generate and execute LoadRunner scenarios.
Connected multiple load generators to the Controller to support additional Vusers.
Created scripts to enable the Controller to measure the performance of the Web server under various load conditions.
Designed, developed, and maintained an automation framework (Hybrid Framework).
Analyzed the requirements and prepared automation script scenarios.
Developed a functional library for the application.
Automated the functionality and interface testing of the application using QuickTest Professional (QTP).
Performed database testing through ODBC from automation test scripts.
Created automation scripts for regression testing using QTP.
Environment: Java, HTML, UNIX, SQL, Oracle 10g, TOAD, Apache Tomcat, HP LoadRunner, IBM Rational Robot, ClearQuest, HP QuickTest Professional, IBM Rational Manual Tester.
Discovery Communications Inc.
New York, New York
Job Title: QA Analyst
Jun 2012 to Dec 2012
Responsibilities:
Developed the test plan for DORS (Distribution Online Request System).
Executed test cases manually and logged defects using BMC Remedy.
Responsibilities included manual GUI testing, functional testing, integration testing, regression testing, interface testing, end-to-end testing, database testing, and user acceptance testing.
Performed client application testing and Web-based application performance, stress, volume, and load testing of the system.
Implemented an automated-testing methodology, which resulted in the identification of problems within the system.
Analyzed performance of the application program itself under various test loads of many simultaneous Vusers.
Analyzed the impact on server performance (CPU usage, server memory usage) for the applications under varied numbers of simultaneous users.
Inserted transactions and rendezvous points into Web Vusers.
Created Vuser scripts using VuGen and used the Controller to generate and execute LoadRunner scenarios.
Connected multiple load generators to the Controller to support additional Vusers.
Created scripts to enable the Controller to measure the performance of the Web server under various load conditions.
Designed, developed, and maintained an automation framework (Hybrid Framework).
Analyzed the requirements and prepared automation script scenarios.
Developed a functional library for the application.
Automated the functionality and interface testing of the application using QuickTest Professional (QTP).
Performed database testing through ODBC from automation test scripts.
Created automation scripts for regression testing using QTP.
Environment: Windows Server 2005, Java, JavaScript, HTML, UNIX, SQL, Oracle 10g, TOAD, JBOSS, HP LoadRunner, HP Quality Center, HP QuickTest Professional, BMC Remedy.
Geico
Chevy Chase, Maryland
Job Title: Software Automation Test Engineer
January 2010 to Jun 2012
Responsibilities:
Executed test cases manually and logged defects using HP Quality Center.
Responsibilities included manual GUI testing, functional testing, integration testing, regression testing, interface testing, end-to-end testing, database testing, and user acceptance testing.
Performed client application testing and Web-based application performance, stress, volume, and load testing of the system.
Analyzed performance of the application program itself under various test loads of many simultaneous Vusers.
Analyzed the impact on server performance (CPU usage, server memory usage) for the applications under varied numbers of simultaneous users.
Inserted transactions and rendezvous points into Web Vusers.
Created Vuser scripts using VuGen and used the Controller to generate and execute LoadRunner scenarios.
Connected multiple load generators to the Controller to support additional Vusers.
Created scripts to enable the Controller to measure the performance of the Web server under various load conditions.
Automated the functionality and interface testing of the application using QuickTest Professional (QTP).
Inserted object data verification checkpoints in QuickTest Professional (QTP).
Verified back-end data using ODBC after interacting with front-end automation test scripts.
Used QTP for shared object repository creation and maintenance; used regular expressions, reusable actions, data tables, checkpoints, and recovery scenarios.
Imported data from the database into the data table and performed data-driven testing with different data sets for the reports generation module in QTP.
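The data-driven approach described above (one parameterized check executed once per data-table row, as QTP does with its Data Table) can be sketched as follows. The rows and the function under test are illustrative stand-ins, not the actual reports module.

```python
# Sketch of data-driven testing: each row supplies inputs plus the expected
# result, and the same check runs once per row.

def generate_report_row(dataset):
    """Hypothetical stand-in for the reports-generation module under test."""
    return {"customer": dataset["customer"],
            "total": dataset["qty"] * dataset["price"]}

# The "data table": inputs and expected outputs, one row per test iteration.
DATA_TABLE = [
    {"customer": "A", "qty": 2, "price": 10.0, "expected_total": 20.0},
    {"customer": "B", "qty": 3, "price": 5.0, "expected_total": 15.0},
    {"customer": "C", "qty": 0, "price": 9.9, "expected_total": 0.0},
]

def run_data_driven(rows):
    """Return how many rows produced a total matching the expected total."""
    passed = 0
    for row in rows:
        actual = generate_report_row(row)
        if actual["total"] == row["expected_total"]:
            passed += 1
    return passed
```

Adding coverage then means adding rows, not writing new test logic, which is what makes the technique pay off for regression suites.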
Verified the test approach, validated the stability of the product, and tested and evaluated to achieve acceptable results.
Environment: Windows Server 2003/2005, Java, JavaScript, HTML, UNIX, SQL, Oracle 10g, TOAD, IIS, HP LoadRunner, HP Quality Center, QTP.
United Parcel Services, Inc. (UPS)
Lutherville-Timonium, Maryland
Job Title: Software Test Analyst
November 2007 to December 2009
Responsibilities:
Involved in preparing test plans and test cases.
Developed test cases for the automation team for regression testing.
Formulated methods to perform positive and negative testing against requirements.
Performed back-end testing using SQL queries.
Reported bugs found during testing using Quality Center.
Conducted functional, regression, black-box, and system testing.
Reviewed functional designs for internal product documentation.
Used Quality Center for requirements management, planning, scheduling, running tests, defect tracking, and managing defects.
Analyzed, tested, and certified application-specific software and performed ambiguity reviews of business requirements and functional specification documents.
Developed manual test cases and test scripts to test the functionality of the application.
Assigned, tracked, and reported status on problem reports, change requests, and release packages in Dimensions.
Generated various graphs, such as transaction summary and Vuser graphs, in LoadRunner Analysis and reported the results.
Provided test results, graphs, and analysis of application performance data by email or phone during testing to the application developers and manager.
Implemented automated-testing methodologies such as data-driven and keyword-driven testing.
Created and executed regression scripts using QuickTest Professional.
Inserted various checkpoints, parameterized the test scripts, and applied regular expressions in the scripts.
Documented bugs in Quality Center.
Also tested modules of the application by manual testing and data validation using SQL queries.
Created VuGen scripts using the Virtual User Generator for component, stress, and volume tests.
Parameterized unique IDs, stored dynamic content in variables, and paired the values to web submits under HTTP protocols.
Performed manual and automatic correlation to handle errors due to dynamic content.
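The correlation step described above amounts to capturing a dynamic value (such as a session ID) between known left and right boundaries in one response and substituting it into the next request, much as LoadRunner's boundary-based parameter capture does. The sketch below illustrates the idea in Python; the response body, parameter name, and URL are made up for the example.

```python
import re

# A hypothetical server response carrying a dynamic value the next request needs.
response_body = '<input type="hidden" name="sessionId" value="AbC123xyz">'

def correlate(body, left='name="sessionId" value="', right='"'):
    """Extract the dynamic text between the given boundaries, or None."""
    match = re.search(re.escape(left) + "(.*?)" + re.escape(right), body)
    return match.group(1) if match else None

# Capture the value from one response, then inject it into the next request,
# instead of replaying the stale recorded value (which causes replay errors).
session_id = correlate(response_body)
next_request = f"/checkout?sessionId={session_id}"
```

Manual correlation means choosing those boundaries yourself after inspecting the response; automatic correlation has the tool propose them by diffing recorded and replayed traffic.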
Created LoadRunner scenarios in the Controller and executed them to verify application performance under various loads.
Measured response times at sub-transaction levels at the web, application, and database server tiers.
Analyzed load test results, generated reports, and provided them to my technical lead.
Environment: Windows Server, Java, JavaScript, HTML, UNIX, SQL, TOAD, Oracle, WebLogic, QuickTest Professional, LoadRunner, Quality Center.
Morgan Stanley
New York, New York
Job Title: QA Analyst
January 2006 to October 2007
Responsibilities:
Analyzed the business requirements and was involved in the review discussions.
Participated in high-level design sessions.
Participated in the QA activities for various releases of the project.
Performed system and integration testing.
Drafted test cases based on functional specifications and system specifications.
Prepared the test plan and analyzed integration system impacts.
Involved in manual testing of the application for negative and positive scenarios.
Trained team members on the new business functionality in the BRD.
Performed regression testing to ensure that bugs had been fixed and the application was running properly.
Extensively involved in executing, analyzing, and verifying test results, and worked with developers to resolve issues.
Communicated project business issues to the appropriate business leadership groups.
Responsible for the object repository; maintained it in the central repository and updated it as new changes were developed.
Wrote SQL statements to extract data and verified the output data of the reports.
Prepared a Requirements Traceability Matrix (RTM) to establish traceability between requirements and test cases.
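An RTM of the kind described above is, at its core, a mapping from each requirement to the test cases that cover it, plus a coverage check over that mapping. The sketch below illustrates the structure; the requirement and test-case IDs are invented for the example.

```python
# Minimal requirements traceability matrix: requirement -> covering test cases.
RTM = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # a requirement no test case traces to: a coverage gap
}

def uncovered_requirements(rtm):
    """Return, sorted, the requirement IDs that no test case traces to."""
    return sorted(req for req, cases in rtm.items() if not cases)

gaps = uncovered_requirements(RTM)
```

In practice the matrix is usually extended with execution results per test case, so the same structure answers both "is every requirement tested?" and "which requirements are affected by this failing test?".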
Modified and maintained test cases due to changes in the requirements.
Detected, reported, and classified bugs in Test Director.
Used Test Director for managing test execution and defect tracking of all issues.
Conducted internal and external reviews as well as formal walkthroughs, and
participated in status meetings.
Environment: Windows, SQL Server, Oracle, TOAD, Visual Basic, WinRunner, and Test Director.
EDUCATION:
Bachelor of Business Administration
Minor: Network System Administration
Lehman College, City University of New York, Bronx, NY
PROFESSIONAL REFERENCES:
Available upon request.