YUBA DHAKAL
**** ****** **** *******, ** *****
Phone 703-***-**** Email: ****.******@*****.***
PROFESSIONAL SUMMARY
Over seven years of professional experience in the IT industry, with extensive experience performing
manual, database, and automated testing of web-based, desktop, and client/server applications in
UNIX and Windows environments. Demonstrated experience working with clients, stakeholders, and
business owners on requirements gathering, analysis, and definition; providing guidance in
translating complex program requirements into testable objectives; developing requirements
traceability matrices; writing and executing test plans, test cases, test strategies, test scenarios, test
data, test logs, and test summary reports; and performing defect logging, tracking, status reporting,
and documentation. Hands-on experience in system analysis, QA processes, the Software Testing
Life Cycle (STLC), and software test methodologies.
• Experience in analyzing system requirement specifications, functional specifications, and design documents.
• Thorough understanding of the phases of the Software Development Life Cycle (SDLC) and of the Agile, Waterfall, and V-Model development models.
• Experience in understanding business processes from requirements and converting them into test scenarios.
• Solid experience in smoke, GUI, system, integration, functional, performance, stress, back-end, sanity, and regression testing.
• Hands-on experience tracking and reporting defects using defect tracking tools such as HP Mercury Quality Center and TestDirector.
• Working knowledge of Capability Maturity Model Integration (CMMI) levels, Section 508 compliance regulations, and SOX regulations.
• Experienced in maintaining Requirements Traceability Matrices (RTM).
• Experienced in tracking and logging defects with a high level of detail, accuracy, and informative reproduction steps.
• Hands-on experience writing advanced SQL queries to extract data from SQL Server, DB2, and Oracle.
• Extensive experience with version management tools such as VSS; advanced skills in Visio and Microsoft Office applications.
• Expert-level knowledge of mortgage, finance, banking, and accounting.
• Strong analytical background with in-depth experience in financial services, e-commerce, and banking.
• Excellent knowledge of and working experience in test execution and test result analysis.
• Self-starter and team player, working in fast-paced development environments and committed to deliverables.
• Flexible and versatile; able to adapt to any new environment and work on any project.
• Strong communication, presentation, problem-solving, and interpersonal skills.
• Organized and detail-oriented, with the ability to handle multiple priorities and meet project milestones and deliverable dates.
• Experience in building, troubleshooting, and supporting personal computers and networks.
• Strong team player with excellent interpersonal and teamwork skills.
EDUCATION
Master’s Degree (Major: Economics), Tribhuvan University, Nepal
Bachelor’s Degree (Major: Mathematics), Tribhuvan University, Nepal
TECHNICAL SKILLS
Testing Tools: JIRA, HP-Mercury Quality Center, QuickTest Professional, TestDirector, LoadRunner, VersionOne
Programming Languages: C, C++, Java, VB, .NET
Scripting Languages: HTML, XML, VBScript
Databases: Oracle, SQL Server
Operating Systems: Windows 95/98/NT/XP/2000, Unix
Application Tools: MS Office, Visio, Great Plains 10.0, Splunk, SortSite
WORK EXPERIENCE
AAMC, Washington, DC 10/2011 – Present
Sr. QA Test Analyst
Responsibilities:
• Analyze technical specifications and functional specifications to determine testing needs.
• Write high-level test plans and detailed test cases from requirements and developed
documentation.
• Develop and document test cases based on Business Requirement Documents and Use Case documents.
• Create test sets in HP Quality Center and accurately maintain the execution status of test cases.
• Document defects, update defect status, and track defects to closure using HP Quality Center.
• Develop and maintain functional and regression test cases using QTP.
• Modify existing QTP scripts based on approved requirement changes.
• Track, monitor, and prioritize testing issues (bugs), and interact with developers to understand and resolve them.
• Participate in daily/weekly Agile development meetings and provide QA feedback as needed.
• Regularly follow up with the development team to discuss discrepancies identified during testing.
• Perform back-end testing using SQL queries to validate data in the back-end Oracle database using SQL Developer.
• Actively participate in system, regression, integration, and UAT (User Acceptance Testing) testing.
• Attend weekly project status meetings with the development team, project manager, and business analyst, working closely with them to define test scope, gap analysis, and risk mitigation.
• Analyze and identify problems with the existing QA process and continuously make process improvements to increase test coverage and reduce test cycles.
• Work closely with business and technology stakeholders to analyze complex business processes.
• Review and provide feedback on test deliverables, providing oversight and guidance for testing activities.
• Prioritize test cases based on business requirements and manage them accordingly in Quality Center.
• Review QA checklists to ensure that all requirements are met for each release.
• Work with the development and QA teams to perform root cause analysis for defects found during testing.
• Mentor junior team members and coordinate their activities.
• Participate in production readiness review meetings and help management improve SDLC and QA processes in the organization.
Environment: J2EE, Quality Center, SPLUNK, SCRUM, VersionOne, SortSite, Oracle, Unix, Quick Test
Pro, Microsoft Office, Java, XML, Multibrowser, JAWS
CGI, Fairfax, VA 08/2010 – 10/2011
QA Test Analyst
Responsibilities:
• Analyzed the Business Requirement Specification (BRS), System Requirement Specification (SRS), and User Requirement Document (URD) and provided feedback to business owners.
• Designed and developed test plans, test scenarios, and test cases based on business requirements, technical specifications, and use cases, covering both positive and negative testing requirements.
• Mentored junior test engineers.
• Prepared the test data required for executing test cases.
• Performed UI testing on the application’s web interface.
• Performed functional and regression testing using QuickTest Professional (QTP).
• Modified existing QTP scripts based on approved requirement changes.
• Created standard, bitmap, and text checkpoints in QTP to compare the application’s current behavior against its behavior in previous versions.
• Enhanced existing and default test scripts with additional programming for custom testing and debugged the scripts.
• Developed baseline scripts for performance, load, and stress testing of the application using LoadRunner.
• Developed LoadRunner VuGen scripts and created LoadRunner Controller scenarios to run them.
• Created new bug reports and tracked bug status using JIRA.
• Coordinated UAT activities and reported progress.
• Performed UAT (User Acceptance Testing) to verify requirements and the look and feel of the application.
• Tracked defects and facilitated defect review meetings with team leads.
• Wrote SQL queries to perform back-end testing.
• Performed regression testing to verify the validity of functionality across different versions of the application.
Environment: QTP, LoadRunner, Quality Center, JIRA, ASP.NET, XML, HTML, Oracle, Microsoft Office,
Unix, JAWS, WAVE
ING DIRECT, Wilmington, DE 05/2008 – 08/2010
Software Test Analyst
Responsibilities:
• Reviewed business requirements and functional and design specification documents; identified test conditions, ambiguities, conflicts, and risks; and developed test scenarios and test cases.
• Participated in defect review meetings with team members and monitored defect status.
• Performed functional, negative, regression, and smoke tests on new enhancements added to the web application.
• Responsible for generating various test scenarios and test cases, reviewing test cases, and executing tests.
• Used MS Project to track task status and prepared test status reports during test execution.
• Performed back-end testing using SQL queries to validate data in the back-end Oracle database.
• Monitored resources to identify performance bottlenecks, analyzed test results, reported findings to clients, and provided recommendations for performance improvements as needed.
• Trained external broker companies on newly developed ING systems to help create a more efficient workflow.
• Designed training documents and video tutorials for the interactive web application.
• Worked with internal stakeholders to define business requirements through the interactive creation of process models.
• Administered system testing to ensure all errors were identified and corrected before the project launch date.
Environment: VB, LoadRunner, Quick Test Pro, Quality Center, Oracle, SQL Server, Windows
NT/2000/XP
DIGENE Corporation, Gaithersburg, MD 09/2005 – 05/2008
Quality Assurance Tester
Responsibilities:
• Responsible for developing test cases based on Business Requirement Documents and Use Cases.
• Designed and executed test cases in TestDirector using testing techniques such as positive and negative testing.
• Used TestDirector for bug tracking and reporting, and followed up with the development team to verify bug fixes and update bug status.
• Investigated software bugs and interfaced with developers to resolve technical issues.
• Defined the test strategy for customer requirements, including designing, writing, and executing test cases and procedures.
• Created and filed various office documents using MS Office applications.
• Executed test cases manually to verify expected results.
• Met with developers and technical content writers regularly to update test documents.
• Tracked, reviewed, analyzed, and compared defects using TestDirector.
Environment: Windows, Visual Basic, ASP, SQL Server, XML, TestDirector, MS Office.
REFERENCES:
Will be furnished upon request