IMELDA OLISA
*** ******** **** 334-***-****
Montgomery, AL 36117 *.*****@*****.***
CAREER OBJECTIVE
To obtain a challenging position as a Quality Assurance Lead where I can apply my analytical, detail-oriented programming/testing background, education, and computer skills in automation and manual testing.
SUMMARY OF QUALIFICATIONS
10+ years of experience in software QA testing and 3 years in a programming environment.
Excellent knowledge of the full project development life cycle, CMM, analysis, design, development,
testing, implementation, and documentation of various client/server applications, as well as project management.
Possess strong analytical and interpersonal communication skills. Other qualities include being detail-oriented,
organized, innovative, and conscientious; a problem solver; able to supervise and to work independently.
CORE PROFESSIONAL STRENGTHS
Software: Visual Studio .NET 2008, MS SQL Server 6.5/7.0/2000, Crystal Reports, Dreamweaver MX, Fireworks MX, Flash MX, MS Office/XP, Visual SourceSafe
Languages: ASP.NET, C#, XML/XSLT 1.0, DHTML, JavaScript, VBScript, Visual Basic 5.0/6.0, SQL; UNIX platform and mainframes
Databases: SQL Server 2008, DB2 SQL, Oracle 8.0, MS Access, DB2, CICS, PL/SQL, Oracle Developer 10g environment, Oracle 10g
Testing Tools: TestDirector/HP ALM Quality Center 11.0, HP QuickTest Professional (UFT), WinRunner, LoadRunner, Rational Suite, TFS, Bugzilla, JIRA, SharePoint
PROFESSIONAL EXPERIENCE
Alfa Insurance, Montgomery, AL. May 2015 – Aug 2015
Quality Assurance/Test Team Lead
Alfa is implementing a new policy administration application called Guidewire; its 17 existing policy
administration applications will be consolidated into one with this solution. Utilized software such as JIRA,
SharePoint, HP ALM Quality Center, and UFT.
I. Role:
Was responsible for defining the QA strategy and tactical processes and leading testing efforts
in support of company products (hardware and software) and services.
Drove the execution of and reporting on validation plans and the creation and maintenance of systems and other
relevant documentation.
Proactively managed and developed testing team members and ensured that the team adhered to
industry best practices.
Ensured that internally developed or purchased IT systems, applications, infrastructure, and other elements were developed, tested, and implemented consistently to meet established quality targets.
Liaised with project managers and IT management to ensure integration of testing processes throughout the
project management, software development, and system procurement lifecycles.
Oversaw execution of formal, centralized testing processes for assigned IT systems.
Oversaw the creation of system test plans and their successful execution.
Implemented automated testing wherever feasible, driving to automate as much testing as possible.
Continually assessed customer requirements and the IT roadblocks to meeting them; presented recommendations to QA management.
Completed reportable scorecards on code defects and other quality metrics; with QA leadership, established plans for improving these metrics.
Managed the defect logging and reporting process.
Managed appropriate deliverables, including but not limited to test plans and supporting reference documentation, design reviews, release reviews, QA investigations, and performance and load assessments on various IT systems.
Fostered teamwork and a spirit of collaboration among internal, offshore, and business-partner team members while conveying a sense of urgency and responsiveness to business needs.
II. Know-How
A. Technical/Specialized:
• Expert knowledge of manual and automated testing methodologies, including selection, configuration, and maintenance of modern automation tools.
• Expert hands-on technical knowledge managing system testing in new computing architectures/environments. Knowledge of relevant tools, databases, and middleware.
• Working knowledge of trend analysis and other quality metrics techniques.
B. Managerial:
• Process thinker; implemented and improved multiple new processes.
• Comfortable leading dispersed/offshore teams.
• Able to convey a calm but focused sense of operational urgency to team members.
• Strong project management and organizational skills.
C. Human Relations Skills:
• Strong customer service focus.
• Ability to work independently.
• Strong verbal and written communication skills. Regularly explained complex issues to diverse stakeholders, including IT leadership. Trained non-technical business users in QA testing processes and methodology.
III. Problem Solving:
• Responsible for implementing new-to-Alfa processes and methodologies; creatively scaled and adapted new, challenging concepts to existing frameworks, and trained the testing team and other IT groups on these concepts.
• Strong analytical, problem-solving, and conceptual skills; assisted in developing clear solutions to complex problems, often where no current answer existed.
• Assisted QA leaders in developing quality improvement strategies and techniques, using trend and metrics analysis to draw conclusions on quality-related issues.
State of Alabama Department of Finance (SBS), Oct 2014 – May 2015
Quality Assurance/Test Coordinator
CGI Advantage is a web-based accounting application used in many parts of the country; in Alabama it
handles the accounting needs of several state agencies. QA testing was performed manually, using
SharePoint and ServiceNow to log and track defects.
Role:
Hired to establish the UAT team; established all QA and UAT processes and procedures.
Created the core UAT activities list, the testing schedule, and the UAT test plan and procedures.
Outlined the UAT preparation and work plans, test data, and scenarios.
Developed the UAT test design process, test case documentation, and test case outline.
Created, modified, and validated the test cases in SharePoint.
Formed the UAT participant orientation; provided help guides and training support.
Developed tracking for test facility reservations so end users could perform testing.
Designed the test execution process and planned the UAT cycle.
Prepared the UAT readiness review, including prerequisites and entry criteria.
Designed and developed the UAT defect management process.
Developed overall STAARS software maturity charts and test metrics.
Organized the test reporting for the post-test phase, which included test event log analysis, test case execution reporting, and a formal test report assessment.
Reported on post-test activities, including retrieving, reviewing, and analyzing application logs.
Conducted test out-briefings and wrote articles on the status of testing.
Coordinated with the vendor to keep migration packages to a minimum and prevent delays during UAT.
Organized and coordinated with the vendor on the CM plan and procedures.
Created the issue tracking lifecycle and conducted orientations on the issue management process.
Detected, reviewed, and analyzed requirement and design issues.
Checked and analyzed the performance test plan.
Logged and tracked all UAT technical issues in the test log.
Sent mass emails to end users with notifications on the UAT checklist, assigned testers, COA, workflows, test plan, procedures, etc.
Organized and created the UAT workflow structure in SharePoint for end users.
Provided functional test support to end users through the STAARS website.
Performed functional and regression testing.
PRINCIPAL ACCOUNTABILITIES:
1. Managed assigned IT testing. 60%
Developed and oversaw a comprehensive testing program and processes for IT applications, hardware, and other systems.
2. Managed IT quality metrics reporting for assigned testing. 25%
Assisted the QA Director in developing trend analysis and metrics reports to assist IT leadership with continuous quality improvement; administered and regularly validated these metrics.
3. Supervised and developed the members of the IT Testing Team. 15%
CACI Inc., Montgomery, AL. Apr 2013 – Feb 2014
Quality Assurance Analyst
Multiple web-based applications developed with Visual Studio .NET and SQL Server
were tested manually. Used HP ALM Quality Center 11.00 for the Air Force client and Bugzilla
in-house to track software defects.
Role: QA Analyst
Designed and tracked quality assurance metrics such as defect densities, open defect counts, and code coverage across the project life cycle.
Provided scope, resource, and time estimates for projects presented to the QA team for testing.
Designed, developed, and executed test plans, test scripts, and test cases.
Assisted in the architecture, implementation, and administration of testing systems.
Participated during the design and development stages of the product life cycle to provide
feedback on usability, performance, data accuracy, and risk factors.
Developed a quality monitoring system (automated or manual) for post-launch products.
Assisted in developing the quality strategy, including process, metrics, and controls,
utilizing TFS as the primary platform and assuring appropriate test coverage in accordance
with standardized QA process methodologies.
Installed and configured the Bugzilla defect tracking system.
Administered the Bugzilla application for various projects.
Created weekly bug reports/charts for upper management.
Maintained the back-end MySQL database software.
Developed cron jobs scheduled nightly using MySQL.
Created test plans and test cases for various projects.
Uploaded test plans, test cases, and requirements into HP ALM Quality Center 11.00.
Logged application bugs in the Bugzilla defect tracking system.
State of Alabama DHR, Apr 2005 – Jan 2012
Quality Assurance Lead
This was a web-based application using ASP.NET and C#, connecting to an Oracle database converted from
the mainframe. QA testing was performed using manual and Rational tools. Used TestDirector to track
defects.
Role:
Hired to establish the QA team; established all QA processes and procedures.
Managed a team of 6 QA testers.
Provided scope, resource, and time estimates for projects presented to the QA team for testing.
Designed, developed, and executed test plans, test cases, and test scripts.
Assisted in the architecture, implementation, and administration of testing systems.
Developed metrics and worked with others to optimize application/system performance through Quality Center.
Provided functional, system, performance, and load testing support to all teams within DHR.
Performed regression testing using WinRunner for the DHR Web/PERS ASSIST and the Family, Adult and
Child Tracking System (FACTS) applications in a web environment.
Created and executed various test procedures and test scripts using QuickTest Professional, LoadRunner,
Rational Suite, and SpecFlow.
Provided an accurate assessment of risk in our software products and assisted the QA Manager
in identifying and implementing appropriate testing strategies, policies, and procedures to
ensure the success of our released products.
Provided application upgrade, migration, configuration, management, and total disaster recovery
testing.
Worked effectively with business analysts to ensure QA testing requirements were appropriately
identified and included in the business requirements.
Responded to patch requests, performed enhancements, and assisted with user problem
determination/resolution using TFS.
Conducted version verification on new releases of GIS software.
Effectively coordinated a testing lab and led new analysts in the execution of test scripts.
Used and maintained Watchfire Robot and ClearQuest to detect broken links, and Rational Rose to
publish workflows on the website.
Worked closely with developers to ensure all relevant risk factors of the product were tested and
incorporated into the automated testing tools.
United Toll Systems, Wetumpka, AL. June 2003 – Apr 2005
Quality Assurance Lead
This was a web-based application using Java servlets connecting to an Oracle database on Sun Solaris.
QA testing was performed using manual and Mercury tools. Used TestDirector to track defects.
Role: QA Lead
Managed a team of 4 QA testers.
Provided scope, resource, and time estimates for projects presented to the QA team for testing.
Designed, developed, and executed test plans, test scripts, and test cases.
Assisted in the architecture, implementation, and administration of testing systems.
Participated during the design and development stages of the product life cycle to provide
feedback on usability, performance, and data accuracy.
Developed a quality monitoring system (automated or manual) for post-launch products.
Provided functional, system, performance, and load testing support to all teams within UTS.
When necessary, participated in the manual monitoring of the system during live events.
Created and executed various test procedures and test scripts using WinRunner, LoadRunner, and
TestDirector.
Provided an accurate assessment of risk in our software products and assisted the QA Manager
in identifying and implementing appropriate testing strategies, policies, and procedures to
ensure the success of our released products.
Acted as the main contact for verification of bug fixes and problem resolution prior to closing open
items.
Worked effectively with business analysts to ensure QA testing requirements were appropriately
identified and included in the business requirements.
Used and maintained Watchfire Linkbot and ClearQuest to detect broken links, and TeamSite to
publish workflows on the website.
Worked with developers to ensure issues and risk factors of the product were resolved, tested, and
incorporated into the automated testing tools.
Worked directly with product development and engineering groups to develop test cases, test scripts,
and quality requirements for certification of products prior to launch.
Participated in test case execution, bug tracking, and issue resolution during the development life cycle to
ensure products met all requirements of the functional specification prior to launch.
Used SQL to validate the data generated by various products as necessary.
Analyzed and identified the root cause of anomalies encountered by reviewing log files, database tables, etc.
Interfaced heavily with product management, business analysts, other QA team members, developers, and
development managers to coordinate all testing efforts.
ILD Telecommunications, Atlanta, GA. Jan 2000 – Jan 2003
Programmer/Quality Assurance Lead
This was a web-based application using ASP pages connecting to a SQL Server database.
QA testing was performed using manual and Mercury tools. Used TestDirector to track defects.
Role: QA Lead
Hired to establish the QA team; established all QA processes and procedures.
Managed a team of 6 QA testers.
Provided scope, resource, and time estimates for projects presented to the QA team for testing.
Designed, developed, and executed test plans, test scripts, and test cases.
Assisted in the architecture, implementation, and administration of testing systems.
Developed metrics and worked with others to optimize application/system performance.
Provided functional, system, performance, and load testing support to all teams within ILD.
Performed regression testing using WinRunner.
Created and executed various test procedures and test scripts using WinRunner, LoadRunner, and
TestDirector.
Provided an accurate assessment of risk in our software products and assisted the QA Manager
in identifying and implementing appropriate testing strategies, policies, and procedures to
ensure the success of our released products.
Worked effectively with business analysts to ensure QA testing requirements were appropriately
identified and included in the business requirements.
Used and maintained Watchfire Linkbot and ClearQuest to detect broken links, and TeamSite to
publish workflows on the website.
Worked closely with developers to ensure all relevant risk factors of the product were tested and
incorporated into the automated testing tools.
Role: Programmer
Project: Prepaid Phone Cards system.
Tools: Active Server Pages (ASP), VBScript, JavaScript, Visual Basic 6.0, MS SQL Server 7.0, Windows NT, SQL, MS Visual InterDev, MS FrontPage 98, Visual SourceSafe.
Responsibilities: Involved in the design and development of intranet pages for the Prepaid system. The web
pages were developed using Visual InterDev and MS FrontPage 98. The Active Server Pages (ASP) were
coded in VBScript; the back-end database was MS SQL Server 7.0 on a Windows NT platform, and the
internet server was IIS 4.0. ActiveX Data Objects (ADO) were used to communicate with the database. The
pages enabled users to browse the list of long distance rates offered, as well as information about other
departments and consulting.
EDUCATION
Master of Commerce, Osmania University, Hyderabad, India. 1994 – 1996
AUM, Alabama: ASP.NET, C#, Programming, Web Development – 04/2009
Web Info Tech, AP, India: Visual Basic 5.0/Oracle 5.0 with project – 09/1997
Aptech Computer Education: Information and Systems Management – C, C++ – 10/1994