The purpose of operational testing is to ensure the Military Services field weapons that work in combat. This purpose has been codified in both Title 10 of the U.S. Code and in the Department of Defense’s (DOD) 5000-series regulations for many years without substantive alteration.
Operational testing is intended to occur under “realistic combat conditions” that include operational scenarios typical of a system’s employment in combat, realistic threat forces, and employment of the systems under test by typical users (Soldiers) rather than by hand-picked or contractor crews.
Thorough operational testing should be conducted prior to a system’s Full-Rate Production decision or deployment to combat in order to inform acquisition decision makers and operators in an objective way about how the system will perform in its combat missions. Under current law, the Director of Operational Test and Evaluation (DOT&E) is required to state whether the operational testing conducted prior to the Beyond Low-Rate Initial Production decision was adequate.
The Director must consider all the operational facets of a system’s employment in combat when he determines what constitutes adequate operational testing, including the performance envelope the system must be able to achieve, the various operating conditions anticipated in a time of war, and the range of realistic operational threats.
In 2014, I investigated many examples of recent programs across all Services to identify six common themes in operational testing. These themes illustrate the value that operational testing provides to the Defense community. Additionally, they highlight the continuing improvements we have made in the credibility and efficiency of OT&E during my tenure.
A briefing covering these six themes and dozens of examples across all Services is posted on the DOT&E website.1
These themes reveal a common conclusion: OT&E provides value to the Department by identifying key problems and clearly informing warfighters and the acquisition community about the capabilities our combat systems do and do not have. Furthermore, we are now obtaining this information more efficiently and cost-effectively than ever by employing rigorous scientific methods in test planning, execution, and evaluation.
Since my first report to you in 2009, we have made progress increasing the scientific and statistical rigor of operational T&E; however, much work remains to be done, since the Department’s test design and analysis capabilities lag behind the state of the practice.
Additionally, we have focused attention on reliability design and growth testing and on improving cybersecurity operational testing. Operational testing continues to be essential to characterize system effectiveness in combat so that well-informed acquisition and development decisions can be made, and so that men and women in combat understand what their equipment and weapons systems can and cannot do. I submit this report, as required by law, summarizing the operational and live fire T&E activities of the DOD during Fiscal Year 2014.
J. Michael Gilmore
Operational Test & Evaluation