
SR QA Professional

Location:
Norcross, GA, 30093
Salary:
70-80K
Posted:
April 05, 2009

JOHN YAVELAK

Senior QA Analyst

**** ******* *****

Norcross, Georgia 30093

770-***-****

aucjxm@r.postjobfree.com

KEY SKILLS & QUALIFICATIONS SUMMARY

• 15 years QA (6 as lead), 10 years Unix/shell scripting, 3 years CMM/PMT, 13+ years manual testing

• “Heads-down” QA who digs into logs and files, from client-server to mainframe, web-based 3-tier XML, and Java/GUI/NetX environments; full SDLC at 2 major telecoms, plus 1 year in the financial industry

• Mercury experience: TestDirector/Quality Center 4 years, LoadRunner 2+ years, QTP 2+ years, WinRunner 1 year

• Automation Tester: Unix/shell scripting, machine-to-machine and XML file build/back-end, QTP (regression), WinRunner (data verification/loads)

• Automation Architect: created XML automation and load-test data, scripts, and libraries

• Load/Performance Tester: LoadRunner Web, Siebel, Business Objects, and Winsock protocols

• Team Lead, Manual/Load: provided direction for the team; trained manual and load testers

• Script/Programmer: 10+ years of UNIX/shell scripting for data builds and automated/load tests

• Innovator: uses tools (QTP/WinRunner) to verify data and tests, and shell scripts to automate XML processing

EMPLOYMENT HISTORY

AT&T (contracted from ChemTech) 10/07 – Present

QA tester on “BSOMS2”, a 3-tier XML/Vitria-driven Business Process Management (BPM) application that allocates trouble tickets to call-center technicians for AT&T U-verse. Manual test case creation, execution, and defect reporting; creation of XML test data, automation, and load testing. Also QA-tested a Java/GUI/NetX provisioning application, including back-end testing to the EMBus and XML test case automation.

The position demanded heavy Unix/shell scripting.

Accomplishments:

• The XML/Unix data-build, automation, and load-test libraries I built for one system (BSOMS2) are being ported and modified to serve as the base libraries for another application (BBCATS)

• Wrote 73 defects during initial manual testing of a single release, BSOMS Release 1

• Built automated Unix/shell scripting libraries to generate the XML input files for automated functional tests (and created the QTP scripts that verify their GUI presentation)

• Automated the XML file build and updating of all load-test data for performance tests

• Automated the “back-end” processing of XML orders, sending interval-timed and spaced orders into the EMBus to run performance tests, paired with LoadRunner GUI platform script operations

• Automated tests (QTP) have caught multiple data errors (especially “nillable” fields) in the XML file processing; I am working to increase the automation percentage within our test cases

• Sole builder of XML records via UNIX/shell scripting and creator of SQL queries; trained a QA co-worker in Unix/SQL. Complimented by TechMahindra replacement staff on the ordered and precise scripting

• In an interim assignment on the Java/GUI/NetX application (AT&T U-verse provisioning), for back-end “low level” testing:

Caught requirement issues (invalid fields) in unit testing

Assisted a developer in setting up the simulator with WebLogic queues

Built Unix shell scripts to break out the FTP interface file

Technical Environment:

• Tier 1: Weblogic 9.2

• Tier 2: Vitria 4.3+ / Business Objects (Web Intelligence + Crystal Reports)

• Tier 3: ORACLE 10G

• Servers: SUN servers, Unix, SunOS 5.10

• Test Tools: Altova XMLSpy Professional Edition 2008 SP1, Quality Center 9.0, QTP 9.0, LoadRunner 8.1

CompuCredit (QA Analyst III, employee) 09/06 – 09/07

Began as a contractor; converted to employee 03/2007. Responsible for integration/functional testing in a data warehouse credit card (TSYS) environment and load testing against various Web and Business Objects applications. Reason for leaving: CompuCredit offshored I.T. functions and laid off over 60% of I.T. staff.

Accomplishments:

• Reduced Web Platform regression test time from 8 hours to 30 minutes with QTP automation

• Caught Omissions in the Include/Exclude Business rules (wrong column status fields)

• Caught server crashes and database query failures with load testing (ibiz)

• Caught field omissions in table processing (emboss)

• Automated (1) XML gateway testing and (2) test account data loading using QuickTestPro

• Automated analysis of XML response codes using QuickTestPro and Unix/shell scripts

• Recorded 24 defects against the TIDAL/ET/Python data load and discovered requirement errors

Technical Environment:

• Operation: TIDAL 2.5, running PYTHON/SQL

• OLAP: Business Objects 6.5

• ETL: Informatica 7.1.4

• Test Tools: Quality Center 8.2, LoadRunner 8.1, QuickTestPro 9.0

AutoTrader/COX Enterprises (Contracted from Act 1 Technical) 06/06 – 08/06

QA Engineer, back-end and manual tester in an Epiphany 6.5 environment.

Wrote over 40 defects in a release of Internet CARS sales. Created and shared SQL scripts (ORACLE) for CARS inventory querying.

Developed the data verification strategy and researched QTP usage on the automation side.

BellSouth (contracted from Accenture to Customer Mkts Assignment) 03/04 – 06/06

QA Load/Performance Tester (Senior Systems Analyst)

Load/performance tester and team lead for Siebel and web-based broadband and customer billing applications. Reviewed requirements, created and executed test cases. Wrote BAC/TOPAZ performance monitoring scripts for the BellSouth Siebel RTM ticketing application. Performed functional and load testing (using LoadRunner) on BellSouth billing applications (EBPP, CCT, BSLD) and on Sales and Broadband applications: manual functional, automated, and load testing. WinRunner scripting for test data creation and verification. Automated regression cases via QTP in a web testing environment. SOX (Sarbanes-Oxley) compliance auditor for a short period.

Accomplishments:

• Tested two releases, one a major rewrite, in the first 3 months EBPP

• Mentored three test teams in the set-up of their LoadRunner environments EBPP

• Created and shared parameterized SQL scripts with the team for ease of querying EBPP

• Load testing performed [4/28] uncovered the need for SOAK tests EBPP

• Load testing uncovered maxing out of the n-web server and CPU EBPP

• Tested as many cases as two other testers and wrote as many defects as three other testers, 5-25 through 6-20 in the first cycle (over 60) EBPP

• Wrote 215 cases for the CCT cycle; wrote over 30 defects in the first cycle, 9-13 – 10-15 CCT

• Did (a) test data research [wrote WinRunner scripts to build test accounts], (b) requirements research, and (c) mentoring of new QAs in this environment CCT

• Assisted with load test environment and data set-up for their load test person CCT

• Assembled and presented a lessons list for the QA lead (test environment needs, etc.) CCT

• Created automated test data loads for about 2000 accounts, saving many hours of work CCT

• Load testing discovered (a) a crashed server and (b) an unindexed table EBPP

• Automated over 60 regression cases via QuickTest Pro (halted for LR work) BSLD

• Created, then ran, WinRunner scripts overnight to load 1,000 ticket orders for the QA performance test bed BSLD

• Mentored the CPR load test person while also running EBPP load testing BSLD

• Performance/load testing revealed: BSLD

* Breakpoints/bottlenecks

* BEA setup and tuning issues

* Web server out of memory

• Performance/load testing showed (1) inadequate threads and (2) server exhaustion SALES

• SOX auditor in the department’s start-up effort; discovered CP problems SOX

• Planned/researched the 22-application database server/ORACLE upgrade migration, which includes regression test planning and load testing BigDB

• Learned and scripted LoadRunner 8.0 against Siebel 7.8, including correlation and debugging BBT

• Discovered and documented functional defects, Created Test Accounts BBT

• Created naming convention to report transactions in a very efficient manner BBT

• Cut hours out of data verification in Integration Testing using QuickTestPro iPRT

• Discovered and alerted teams to performance testing deficiencies very early in development CRM

• Trained 2 brand new LR Testers to be ready to script within a week CRM

(Brief BellSouth TOPAZ/BAC assignment Spring 2006)

Technical Environment:

• Applications: SIEBEL 7.8, Yantra

• Systems: PC: Windows 2000 Professional; SUN: OS 5.8

• System Env.: HTML, Unix, ORACLE 8.1.7, Java/J2EE.

• Web Environment: iPlanet Webserver, Weblogic App Server; Orbix 2000; Connect Direct

• Office Tools: Excel, Internet Explorer 6, Netscape 7.2, Outlook, Word, Exceed 7.1

• Test Tools: LoadRunner 8.0, 7.6; WinRunner 8.0, 7.6; Quality Center 8.0; QuickTest Pro 8.2, 6.5

Student and Part-Time Store Clerk 12/02 – 03/04

Took a voluntary separation package from BellSouth, then completed 5 months of advanced software test training at the Malix Institute in Norcross, GA. Trained in the Mercury suite of automated test tools (WinRunner, LoadRunner, TestDirector), worked part-time as a store clerk at Brock’s Army Surplus in Decatur, then re-entered the job market.

BellSouth Science and Technology, Atlanta, GA 12/00 – 12/02

Quality Assurance Test (Employee, Associate Member Technical Staff)

Software tester for SMS, a core revenue-generating system ($1 billion/month) that provisions BellSouth SS7/AIN local services. Performed network element download-of-data and subscription testing, and network capacity tests… Reviewed requirements; created and executed test and regression cases. Supported the simulator environment for up to 6 testers. Responsible for SOAC (Service Order Analysis and Control) to provision AIN service activation and PSIMS files (a machine-to-machine interface). Performed load/stress analysis of SOAC processing. Performed process review of System Test Procedures (PMT) for the CMM Level 3 certification process. Responsible for 300+ test cases, 108 of which were automated regression (ATF interface to Xrunner). Responsible for the system test assignment web page.

Accomplishments:

• Led team to create and display exhibit for BellSouth Innovation Showcase 2002 (Application Performance Analysis), including tools and outputs for scalability testing.

• Led a cross-functional team to streamline the intricate Network Simulation environment from fragmented individual scripts into a single easily managed standardized system.

• Turned the web page into a team information resource, placing tester instructions in one easily accessible place and bypassing laborious library searches.

• Standardized SOAC/SMS machine-to-machine interface checking via ksh scripts, shortening setup times from 30 minutes to under 5 minutes.

• Created SQL scripts that testers use to see the status of multiple network simulators on one page, compressing the connection status of 5 network associations into one line instead of multiple screens per network element.

• Documented 90+ defects, using functional, regression, and machine-machine interface testing.

Technical Environment:

• Systems: PC: Windows NT/2000; SUN: Sparc OS 5.5; Server: HP 9000/800, HPUX 11.

• System Tool/Env.: KSH script, SQL, HTML, Unix, ORACLE, MOTIF, Client/Server, C/C++.

• Office Tools: Microsoft Excel, Internet Explorer, Outlook, Word, Adobe Framemaker 6.0, Netscape Navigator, AFT interface to Xrunner 6.0 (3rd-generation automation).

• Test Tools: PVCS Dimensions, TCMS, NE USLI.

• Network Protocols/Interfaces: TCP/IP, Topcom, Hyperchannel, Exceed.

• Network Equipment: Lucent Advantage SCP, SN, CSN, eMRS.

AT&T (Operations Technology Center), Alpharetta, GA (1990 – 2000)

QA Test Lead (Senior Technical Staff Member) (1996 – 2000)

QA Analyst, SG4 (1990 – 1993); MTS-I (1993 – 1996)

Reviewed documentation, created/reviewed/executed test cases, and assisted with the overall Test Plan for Rapid/FASTAR, a mission-critical core network AT&T system. Held a lead role and acted as a subject-matter expert for other testers. Performed process review work (PMT) as part of the ongoing CMM process. Created test plans and test data; used system logs and the application for verification. Inspected requirements documentation (i.e., Interface Agreements and Design Specifications).

Accomplishments:

• Led System Test on Rapid/FASTAR. Acted as an information source and mentor for 3 new testers. Assisted in work assignments for the group. HP admin work was delegated to another tester.

• Built and led an unofficial cross-functional team to resolve software issues, from requirements and customer needs through development all the way to final installation.

• Promoted teamwork (in a District with 40+ members) both inside and outside our group, working with remote labs, equipment vendors, developers, and the final customers.

• Led a small project team to prove the feasibility of a remote disaster-recovery site in White Plains, N.Y. Received a Salute award from my peers.

• Led the setup of testing scenarios, usually for the test team and occasionally for development, too.

• Led the Test Team in resolving resource problems: setup issues, installation procedures, performance issues, and server machine lockups.

• Built strong relationships with developers by assisting them in recreating problems, and with the remote NJ DSL equipment lab through problem-solving on defects.

• Led the test effort on NGLN, a new feature within FASTAR, which included overseeing others’ work.

• Led multi-feature testing efforts (FSU, NGLN, FlexNet) on one release of Rapid/FASTAR.

• Improved overall test team performance by coordinating resources.

• Automated manual test runs into AARTS script runs.

• Coordinated work between customers, developers and the DSL lab to resolve problems, received a commendation award from senior developer on diligence and thoroughness.

• Documented over 110 system errors (severity 2 through 4) in the 1st year; their resolution helped elevate Rapid/FASTAR from a prototype system into a patented AT&T Core Network application.

• Documented over 1260 defects in 10 years. Informally known as a “Defect Generating Machine” by the developers.

• Served as a member of the O.T.C. System Test PMT group; assisted in clarifying installation procedures for Rapid/FASTAR.

• Evaluated automated tools such as MYNAH, Centerline, and Purify, saving the project wasted effort by avoiding tools unsupported in the environment.

• Caught numerous installation/environment errors through low-level machine-to-machine testing

• Expanded the personal web page for the Rapid group to include many major corporate resource reference links, cutting down search time for status and information.

Technical Environment:

• Systems: SUN: Sparc/ULTRA Server, Solaris 2.7; HP: 9000/800, HPUX10, GUI/CORBA 00.

• System Tool/Env.: Ksh, Lib-E, HTML, Unix, VERSANT, Client/Server, Openwindows, C++.

• Office Tools: Nroff/troff, postscript documentation, Adobe Framemaker, SUN openview.

• Test Tools: Sublime, Sable, AWK.

• Network Equipment: DACS-III, and RNC (Restoration Node Controller).

AT&T Supervisor SG4/Business Analyst, Finance and Network Divisions (1983 – 1990)

Analyzed and documented software solutions for 3 different software environments in the Accounting & Finance/Treasury and Network organizations. Produced analysis and specifications for Rapid/FASTAR, SCAMIS (circuit trouble ticketing), PromissSS, and MEO (equipment). Performed software analysis/design for Payroll and A/P check reconciliation for a company of about 400,000 employees.

Accomplishments:

• Negotiated/coordinated with N.J. staff on the design and implementation of the ACCUNET T45 service; also performed integration testing.

• Discovered 20 defects in pre-integration testing of the ACCUNET T45 service.

• Discovered a DS3-level cross-connect command flaw that could have broken customer service for hundreds of phone calls (a DS3 carries 672 voice circuits).

• Trained two new analysts in the MEO (Mechanized Equipment Ordering) District.

• Improved the strained relationship with our Kansas City developer group for MEO through personal cooperation, even working with them down to the “code level”.

• Supported others not only by being an information source, but also by highlighting their accomplishments to management.

• Managed status presentations on the SCAMIS project to the group.

• Managed the reconciliation of the ISD Vendor account in support of the Coopers and Lybrand audit, resolving hundreds of millions of dollars. Assisted in the reconciliation of the MAPS account.

• Discovered over $20 million in unreconciled AT&T funds in the 1983 reconciliation wrap-up; these funds had remained unreconciled as late as June 1984.

EDUCATION

B.A., Theology, Ambassador College, Pasadena, CA

M.B.A., Management & Industrial Relations, Seton Hall University, South Orange, NJ

B.S., Management & Industrial Relations, Seton Hall University, South Orange, NJ

ADDITIONAL TRAINING

Malix Institute of Technology, Norcross, GA, January – April/May 2003, trained in:

• Mercury Test Suite/Automated Tools: WinRunner, LoadRunner, TestDirector

• Programming: TSL, SQL, and PL/SQL

• Advanced software test concepts, i.e., web load testing

AT&T, IT and Systems classes, New Jersey and Georgia:

• Data Processing AT&T, Alpharetta, GA, 1990 – 1995

• Initial Analyst Training (I.A.T.) AT&T, Alpharetta, Ga. 1990

• AT&T C.E.T., Piscataway, NJ and Atlanta, GA 1984 – 2000:

Introductory System Testing, Communications Network Architecture Quality Improvement System Testing, Communications Network Architecture Process Project, Managed Structured Test Plans, AWK, Shell, JAD, Use Cases, Function Decomposition

AFFILIATIONS

Accepted into the “National Register in Who’s Who in Executives and Professionals” (2004 Ed.)

Atlanta Quality Assurance Association (AQAA)


