
Federal Aviation Administration

Location:
Hamilton Township, NJ
Posted:
October 04, 2012


Resume:

Khalee Ward

Email: aboqgs@r.postjobfree.com

Address: **** ******* ***.

City: Mays Landing

State: NJ

Zip: 08330

Country: USA

Phone: 856-***-****

Skill Level: Experienced

Salary Range: 60

Willing to Relocate

Primary Skills/Experience:

See Resume

Educational Background:

See Resume

Job History / Details:

KHALEE R. WARD

WORK EXPERIENCE

August 2009 – Present ECS, assigned to the FAA

Federal Aviation Administration (FAA) William J. Hughes Technical Center

Atlantic City International Airport, New Jersey

Computer Scientist

• ECS won the new seven-year contract with the FAA over APPTIS, prompting a transition between companies; simultaneously, the National Airspace Data Interchange Network (NADIN), Weather Message Switching Center Replacement (WMSCR), and AWOS Data Acquisition System (ADAS) FAA programs were combined onto one common platform, adding to my duties. Current responsibilities include:

• Scrum Master of the FAA Common Platform Test Tool Action Plan, leading the initiative to develop the Unified Test Suite (UTS), a tool built over a series of consecutive sprints to enable a common test tool across the three aforementioned FAA AJW-177 systems. Facilitated the Scrum process for a team of 7–9 individuals, depending on which project phase was in effect.

• Designated Point of Contact (POC) for system testing on new software builds. The POC estimates total hours billed on a project, coordinates task time, allocates tasks to assigned resources, and ensures milestones are met on time.

• Test POC on the NADIN Graphical Network Monitor & Control (GNMC) project. The GNMC is a network application that controls the manipulation of all data, messages, and flight plans traveling through the FAA.

• Developing the System Test Plan and procedures, populating the Verification Requirements Traceability Matrix (VRTM), dry-running and redlining procedures, reviewing the Test Plan, conducting tests, developing the Key Site Test Plan and procedures, and documenting a Quick Look report, a System Test report, and a Key Site Test report.

• Test Co-POC on the WMSCR SWIM (System Wide Information Management) project. Used the draft use cases and the SWIM requirements specification to develop both test cases and test procedures.

October 2008 – July 2009 APPTIS, assigned to the FAA

Federal Aviation Administration William J. Hughes Technical Center

Atlantic City International Airport, New Jersey

Engineering Specialist

• Member of the National Airspace System (NAS) – National Airspace Data Interchange Network (NADIN) test team. Responsibilities included:

• Identifying defects and related requirements, then designing and baselining test plans – manually creating test objectives, setups, and overviews – and procedures for resolving both new and existing software problems and for testing new software builds. Ensuring those requirements were met through software testing: System Support Directive (SSD) testing, functionality testing, regression testing, key site testing, and integration testing.

• Maintaining and supporting the NADIN National Message Switch Rehost (NMR) system, which stores and forwards incoming air traffic messages to and from the U.S.'s two National Network Control Centers in Salt Lake City and Atlanta.

• Supporting user testing (ARINC, ECG, CAATS/NavCanada, DTC DUAT, CSC DUATS, ERAM, and others) with the NMR system by establishing and maintaining IP or X.25 protocol connections for users to send messages throughout the NAS. User testing support also requires routing messages to the correct addresses and troubleshooting in-test problems.

• Directing and supporting both external and internal testing and NMR field issues when the leads are absent, reproducing field issues – sometimes multiple at once – in the lab.

• Troubleshooting by analyzing symptoms and errors reported by network packet analyzers and the NMR system, then gathering information, analyzing, evaluating, and finally planning a course of action for problem resolution. Actions include, for example, performing data insertions, message retrievals, and system modifications.

December 2007 – September 2008 Aerotek, assigned to Lockheed Martin

FAA William J. Hughes Technical Center

Atlantic City International Airport, New Jersey

Systems Test Engineer

• Served as a software engineer for the Display System Replacement (DSR)/User Request Evaluation Tool (URET) system. After pre-planning, organizing, and coordinating activities with teammates for each test, additional responsibilities included:

• Working cohesively within a team to test the URET system (a simulated program displaying real-time flight plans, radar track data, aircraft performance characteristics, weather and wind analysis, and flight trajectories for pre-departure and active flights) for bugs and inaccuracies before new software was released to federal air traffic control centers throughout the U.S. Any bugs found were documented in Problem Trouble Reports (PTRs); their corrections, delta fixes, and system enhancements were subsequently tested for accuracy.

• Bringing up the system for each specific test and then executing the test's procedures. During each test, all unexpected anomalies were documented in the test lab log. Any addressable software or hardware failures were documented and resolved in the lab – meaning we developed our own solutions – and the test continued. Failures that could not be resolved in the lab were submitted to the appropriate hardware or software department for analysis and correction.

• Attempting to break the system by looking for any errors that could cause a problem in the field. Testing included rigorously searching for any incorrect functionality, knowing that thousands of lives are at stake with each new software release.

• Developing UNIX aliases and writing scripts that executed series of longer commands, easing the testing process and helping the team work faster, smarter, and more efficiently.

• Running lab reports and completing Data Reduction and Analysis (DR&A) after test completion. After running reports, a summary detailing the run's results was disseminated to the team. Analysis included studying the System Health Reports (SHR), determining why unexpected software and hardware errors occurred during the run, and confirming that all expected failures occurred – and, if not, finding the reasons and documenting the results.

August 2005 – November 2007 Gaming Laboratories International (GLI)

Lakewood, New Jersey

Test Engineer I

• Testing electronic gaming equipment (e.g., video slot machines, poker tables, roulette tables, lottery programs) for use in over 400 lottery and gaming jurisdictions throughout the world. Testing, most importantly, included reviewing and compiling C++ code to verify the correctness of reel strip combinations and associated pay mapping.

• Understanding, identifying, and using proper test scripts to determine the specific rules and regulations applicable in a given lottery or gaming jurisdiction. Building knowledge of known cheating techniques and ensuring proper game operation.

• Using system interfaces (e.g., on-line systems, progressive systems, EFT systems) to ensure jurisdictional requirements were met for submitted games. Analyzing and testing protocols, such as SAS, using data line analyzers and other software testing tools. Reading and dissecting message packets sent from both systems and site controllers to determine accuracy and correctness.

• Using company math tools to determine probabilities and to identify win combinations with a high chance of producing failures.

• Developing an understanding of both the hardware and the software used in testing, including product knowledge of device parts, hardware, and the software modules associated with each device, covering the CPU, RAM, ROM, logic board, and all other hardware components.

• Developing an understanding of the client's standards and intent, necessary for product submission. Documenting and discussing new device/protocol features, ideas, and functionality not specifically allowed by the rules of the requested jurisdictions. Documentation included creating a requirements specification and completing a step-by-step project paperwork file, which was subsequently submitted for technical review.

December 2004 – July 2005 Walt Disney World College Program, EPCOT

Orlando, Florida

Intern – EPCOT/Disney Interactive Studios (DIS)

• Tested stubs for an educational game demo.

• Worked with various departments, distributing game copies and other materials.

• Compiled comments from different departments into spreadsheets.

• Reviewed game builds for debugging and design critique.

• Created a presentation for a game demo.

• Performed various coordination tasks, including organizing assets and content.

TECHNICAL SKILLS

Software Development Methods: Traditional Waterfall, Scrum

Languages: C, C++, Java, HTML, XML, WSDL, SysML

Operating Systems: DOS, Linux, UNIX, Solaris, SUN, Windows XP and Vista

Applications: Microsoft Visual Studio, MS Office, Exceed On Demand, Hummingbird, Bugzilla, IBM Rational ClearQuest, IBM Rational Manual Test, IBM Rational Rhapsody Designer, TCPView, General Weather Simulator (GWS), NADIN Test Applications (InHouse, CommandEntry, InterceptEdit), iTKO LISA Server Edition Test Tool

Network Packet Analyzers: nGenius InfiniStream, Ethereal, Wireshark, CommView

EDUCATION

Aug. 2001-Dec. 2004

Temple University, College of Science and Technology

Philadelphia, PA

B.S. Computer & Informational Sciences

Graduation: January 2005

Cumulative GPA: 3.0 (official transcript reads 2.9)

Degree Program GPA: 3.27

Honor Roll: Spring 2003


