This article is published in Aviation Week & Space Technology.
While the U.S. Air Force wants to move rapidly toward fielding the first increment of the Collaborative Combat Aircraft—a first step to get iron on the ramp before worrying about more advanced capabilities—a small unit with specially modified jets is thinking long term.
The Air Force announced the Viper Experimentation and Next-Gen Operations Model (VENOM) program at Eglin AFB, Florida, last year as part of its push for autonomous capabilities in the early stages of the Collaborative Combat Aircraft (CCA) effort.
- Six F-16s are slated to fly autonomy software
- The service hopes to increase AI use in the long term
As the program has progressed and aircraft have arrived this year, the purpose of the unique experimental operations unit has come into sharper focus. VENOM is using six Lockheed Martin F-16s—complete with sensors, radars, pods and training weapons—to develop autonomous capabilities beyond initial CCA increments in a way that is also designed for safety.
Other early Air Force autonomy tests have focused on the Kratos XQ-58 or the Lockheed Martin X-62A Variable In-Flight Simulation Test Aircraft. The latter is a modified F-16 that helped develop the Automatic Ground Collision Avoidance System (Auto GCAS) but lacks the sensors and other systems needed for combat. By using six F-16s straight from an Air Force flight line, VENOM is setting its sights on readying the service’s autonomous capabilities for combat.
“They’re going to be focused more on the beyond-visual-range kinds of engagements, and I think the main thing is going to be human-machine interfaces and pilot-aircraft teaming,” Air Force Secretary Frank Kendall tells Aviation Week. “So we’ll get a chance to explore some of those things I think we’ve been doing in simulation. Now we’ll get to do it on an aircraft.”
The first F-16s arrived at Eglin in the spring from the U.S. Air Force Weapons School at Nellis AFB, Nevada, for initial modifications. Col. Tucker Hamilton, who was then commander of the 96th Operations Group at Eglin, told Aviation Week that these F-16s will be the stars of autonomy testing because, unlike the XQ-58, the type is a well-known aircraft that does not need initial flight sciences testing to be ready. Experienced pilots and the known airframe will be able to push autonomy systems to their limits in a safe manner.
“VENOM is officially a risk-reduction effort for the Collaborative Combat Aircraft program office,” he told Aviation Week in July before leaving the command role. “It’s not just the capability that we’re testing with VENOM; it’s how you test the capability, how you have the right data architecture, infrastructure, how you share data—all those things.”
Hamilton said he was involved in bringing on Auto GCAS, and autonomous operations are a next step. VENOM also aims to build on foundational work both within and outside the service, such as the Air Combat Command U-2 Federal Laboratory’s integration of Kubernetes open software to enable inflight updates. The initiative overlaps somewhat with DARPA programs, such as Air Combat Evolution—which pitted an artificial intelligence (AI) pilot against a human pilot—and Artificial Intelligence Reinforcements, which has contracted with five companies to develop AI software.
VENOM, however, is focusing on autonomy ahead of potential AI integration. An F-16 pilot will watch over the autonomy software piloting the jet, testing, evaluating and turning it off if needed.
“I think of it as test infrastructure—when we do have an autonomy solution, we can take it here and test it before we put it into a purely autonomous platform,” says Col. Timothy Helfrich, senior materiel leader for the advanced aircraft division and director of the Agile Development Office at the Air Force Life Cycle Management Center.
The basic approach for the early stages of VENOM autonomy testing focuses on software working through if-then statements: If a mission requires A, then the aircraft will do B. At a basic level, it resembles how a General Atomics MQ-9 Reaper automatically flies to a designated orbit if it loses connection to the ground station. This type of approach is intended to develop into more robust processes using deterministic software code, whereby based on a specific input, the aircraft would choose an outcome or notify the pilot to do so, Hamilton explained.
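The deterministic if-then style described above can be illustrated with a minimal sketch. All function and state names here are hypothetical, invented for illustration; the point is that each input condition maps to one fixed, auditable outcome, as in the MQ-9 lost-link example, rather than the software weighing options on its own.

```python
# Hypothetical sketch of deterministic if-then autonomy logic.
# Every name below is illustrative, not drawn from any actual VENOM software.

def select_action(link_ok: bool, fuel_fraction: float, threat_detected: bool) -> str:
    """Map a specific input state to a fixed action (deterministic rules)."""
    if not link_ok:
        # Analogous to an MQ-9 losing its ground-station connection:
        # fly to a preset orbit rather than improvise.
        return "FLY_TO_DESIGNATED_ORBIT"
    if fuel_fraction < 0.2:
        return "RETURN_TO_BASE"
    if threat_detected:
        # Instead of deciding autonomously, hand the decision to the pilot.
        return "NOTIFY_PILOT"
    return "CONTINUE_MISSION"

# Identical inputs always yield identical outcomes, which is what makes
# this style testable and predictable compared with learned AI behavior.
print(select_action(False, 0.8, False))
print(select_action(True, 0.1, False))
print(select_action(True, 0.8, True))
```

Because the rule set is exhaustive and branch-by-branch inspectable, a test pilot can predict exactly what the aircraft will do for any input, which is the safety property the buildup approach relies on.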
This method contrasts with some broader AI evaluations that will be done with the XQ-58, wherein the software will take the inputs and decide on its own what the outcome should be by evaluating its parameters inside the software code—albeit with an option to turn off the software to ensure human governance.
“So VENOM is meant to give us a platform in order to be able, when the time is right, to evaluate AI,” he said. “A lot of it is focused on autonomy. What does autonomy look like for an uncrewed system that we’re imagining will be flying with support from another entity?”
What that teaming looks like is a question VENOM hopes to help answer. The Air Force initially envisioned the CCA as being controlled by a Next-Generation Air Dominance platform leading a team of uncrewed systems, although that concept has evolved to include potential operators on other platforms.
“The solution isn’t necessarily a sixth-generation fighter aircraft that has a bunch of uncrewed systems on its wing controlling them,” Hamilton said. “I think the solution is, we’re going to have some uncrewed systems that are being controlled by a human and that have certain autonomous functions. Where that human sits I don’t think we fully know.”
The Eglin efforts are a buildup approach to increase confidence in autonomy. It is too early to identify VENOM's exact best use case—suppression of air defenses, dogfighting, air-to-air targeting, air-to-ground target recognition or something else altogether.
Figuring that out is “going to be a very deliberate step-by-step approach,” Hamilton said. “We are having conversations, by the way, of what is appropriate for an autonomous system and an AI-enabled autonomous system to make decisions regarding. That’s why I think . . . we are not going to unleash on the battlespace autonomous capability that has no oversight from a human. A human will be responsible for, and in the loop of, these types of autonomous capabilities.
“AI is not magic,” he continued. “It’s math. And once you understand some of the aspects of the math, you understand some of the pitfalls of the science. Then you can mitigate for those pitfalls, and you can ensure that our approach mitigates the risk of undesirable activity.”
Kendall himself took to the skies of Southern California in May in an X-62A piloted by a series of autonomous systems to demonstrate his confidence in the technology. That confidence is spreading, he argues, including among those within the service who have been more resistant to change.
“Initially, I thought that our fighter pilots might resist the idea of CCA,” he says. “Now the opposite has happened. In the words of one of the leaders in that community, 'we understand that if we’re going to survive, we’re going to need this.'”