Skunk Works Boss Details AI Approach For Air Combat

X-62 test aircraft in flight

Air Force Secretary Frank Kendall observed AI pilots in action during a May 3 ride on the X-62.

Credit: Richard Gonzales/U.S. Air Force

DARPA has not yet announced its handpicked participants for the next round of artificial intelligence-piloted air combat experiments, but the head of Lockheed Martin’s Skunk Works is not losing any sleep over the decision.

As the follow-on to the pioneering Air Combat Evolution (ACE) program, the agency’s Artificial Intelligence Reinforcements (AIR) represents a final, 12-18-month burst of experimentation by the U.S. Air Force’s fighter pilot community to determine the roles and limits of a new class of Collaborative Combat Aircraft (CCA) set to be fielded in about four years.

  • Skunk Works’ ECHOS demonstrations continue this year
  • ISR community experience ignored by Air Force fighter pilots
  • Startups’ “God AI” approach is criticized

Lockheed performed well as one of several industry players in the ACE program’s AlphaDogfight Trials, losing in the final round to a rival that employed tactics that the Skunk Works still disputes. But John Clark, the Skunk Works general manager, has plotted the company’s own development path for combat aircraft piloted by artificial intelligence (AI)—with or without a starring role in the follow-on AIR program.

“I’ll certainly take the government’s money and go do experimentation with them,” Clark tells Aviation Week. “But I’m testing independently anyway because then I’m not trapped with their [concept of operations,] . . . so we’re able to move faster and do broader experimentation.”

As the head of the company’s experimental arm for military aviation, Clark seems focused on establishing the Skunk Works as a leader in AI-piloted systems. But he has also been frustrated by the pace and scope of government-led experimentation and the claims of rising Silicon Valley-backed competitors.

The internally funded Enhanced Collaborative High-Frequency Orientation System (ECHOS) program—a name that invokes the Skunk Works’ pioneering Echo software code, which led to the design of the stealthy F-117 Nighthawk in the 1970s—is Lockheed’s preferred path to introducing AI agents in air combat.

Last year, Skunk Works teamed with the University of Iowa’s Operator Performance Laboratory, using two piloted Aero Vodochody L-29 aircraft acting as surrogates for uncrewed aircraft systems. AI agents directed the L-29s to provide jamming support for a simulated air-to-ground mission.

The ECHOS demonstration will evolve this year, switching from air-to-ground to air-to-air missions—but with a twist. “It’ll be AI versus AI as opposed . . . AI versus man,” Clark says. “And we think that’s going to be very instructive about learning how the algorithms [fight] against each other.

“And then we’re going to up the game, and we’re going to go two-versus-two,” Clark adds. “And so we’ll have four L-29s—two virtual, two live—that we’re going to add more complexity to in terms of the engagement.”

Finally, the 2024 series of ECHOS demonstrations will conclude by the end of the year with the addition of a new capability: a human battle manager in the multiaircraft engagement scenario, Clark says. The battle manager’s input will then be fed into reinforcement learning software models, ultimately leading to the creation of an autonomous battle manager.
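
Neither Clark nor DARPA has described how that training loop would be built, but the general pattern is familiar: the human battle manager’s calls become labeled examples that a learning system can imitate or use as a reward signal. The toy sketch below is purely illustrative, with every name invented for this article.

```python
import random

# Toy illustration (all names invented): record a human battle manager's
# decisions during simulated engagements, then use them as training data
# for an autonomous battle-manager policy. A real system would use
# imitation or reinforcement learning; here a lookup table stands in.

def record_engagement(human_decide, scenarios):
    """Collect (scenario, decision) pairs from the human battle manager."""
    return [(scenario, human_decide(scenario)) for scenario in scenarios]

def train_policy(dataset):
    """Stand-in for model training: memorize the majority decision
    the human made for each scenario key."""
    tally = {}
    for scenario, decision in dataset:
        key = scenario["threat_axis"]
        tally.setdefault(key, {}).setdefault(decision, 0)
        tally[key][decision] += 1
    return {key: max(votes, key=votes.get) for key, votes in tally.items()}

# Example with a trivial stand-in for the human battle manager.
scenarios = [{"threat_axis": random.choice(["north", "east"])} for _ in range(20)]
human = lambda s: "commit_ccas" if s["threat_axis"] == "north" else "hold"
policy = train_policy(record_engagement(human, scenarios))
print(policy)
```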

Skunk Works technology also provides the infrastructure behind the Air Force Test Pilot School’s X-62 Variable In-flight Simulation Test Aircraft (VISTA), a highly modified, ex-Israeli Air Force F-16 Block 30. The X-62 is essentially a flying simulator, with the ability to emulate the performance of dozens of other aircraft during a single flight. With the rare ability to be commanded by AI pilots, the X-62 VISTA also has served as an experimental platform for DARPA’s ACE program, culminating in an AI-flown dogfight on May 3 with Air Force Secretary Frank Kendall in the cockpit as an observer.

These feats of AI technology, however, depend on a digital infrastructure installed by the Skunk Works. A supplemental flight computer installed by Lockheed on the X-62 interfaces with the flight controls during AI-controlled flights. AI software companies design their code to operate safely, but the algorithms can still make mistakes. Lockheed’s supplementary flight computer establishes a baseline set of flight safety parameters, preventing any such mistake from causing the aircraft to damage itself.

Anduril Fury aircraft concept
Anduril’s Fury aircraft will be one of the first Collaborative Combat Aircraft deployed by the Air Force in 2028. Credit: Anduril

“We’ve got a safety boundary around it,” Clark says. “You wouldn’t ever see a bad decision by one of these other companies’ AI because we would have protected them from it.”
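
In broad strokes, that safety boundary resembles a software envelope-protection layer sitting between the AI agent and the flight controls. The sketch below is purely illustrative: the limits and field names are invented for this article rather than drawn from the X-62 software, but it shows the general idea of clamping AI-issued commands to a predefined envelope before they reach the aircraft.

```python
from dataclasses import dataclass

# Hypothetical illustration of an envelope-protection layer between an AI
# agent's commands and the flight control system. The limits and fields
# are invented for this sketch, not taken from the X-62 software.

@dataclass
class Command:
    pitch_deg: float    # commanded pitch attitude
    bank_deg: float     # commanded bank angle
    airspeed_kt: float  # commanded airspeed

@dataclass
class Envelope:
    max_pitch_deg: float = 25.0
    max_bank_deg: float = 60.0
    min_airspeed_kt: float = 180.0
    max_airspeed_kt: float = 550.0

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

def enforce_envelope(cmd: Command, env: Envelope) -> Command:
    """Return a command guaranteed to stay inside the safety boundary,
    no matter what the upstream AI agent requested."""
    return Command(
        pitch_deg=clamp(cmd.pitch_deg, -env.max_pitch_deg, env.max_pitch_deg),
        bank_deg=clamp(cmd.bank_deg, -env.max_bank_deg, env.max_bank_deg),
        airspeed_kt=clamp(cmd.airspeed_kt, env.min_airspeed_kt, env.max_airspeed_kt),
    )

# Example: an overly aggressive AI command is trimmed back before it
# reaches the flight controls.
safe = enforce_envelope(Command(pitch_deg=80.0, bank_deg=120.0, airspeed_kt=90.0), Envelope())
print(safe)
```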

Lockheed’s supplementary flight control computer also will be installed on a handful of F-16 Block 42s now enrolled in Project Venom, the test fleet for DARPA’s selected competitors for the AIR program.

Air Force officials describe Project Venom as a laboratory for the autonomy software that will be introduced on the CCA fleet. In May, the Air Force selected Anduril and General Atomics Aeronautical Systems Inc. to build the aircraft for Increment 1 of the CCA program, with fielding scheduled for 2028. As development of those aircraft begins, Project Venom will continue experimentation with various software-based AI pilots over the next 12-18 months, helping the Air Force’s fighter pilot community understand how to blend autonomous technology with human-piloted aircraft in future operations.

Based on Skunk Works’ experience, however, Clark doubts that Project Venom’s experiments will uncover any breakthroughs. He points to Lockheed’s experience applying AI pilots to the Air Force’s U-2 fleet as an example. Those high-altitude intelligence, surveillance and reconnaissance (ISR) demonstrations included a series of publicized flight tests four years ago with an AI pilot onboard.

“The Air Force is run by fighter pilots,” Clark says. “Regardless of the amount of autonomy that is in a system that’s ISR-based, if it wasn’t a fighter pilot that was a part of the situation, it’s not something they will have learned. So now the fighter pilot community is going to get to go learn something and internalize it. Most of the other parts of the Air Force community are going to be like: ‘Yeah, we knew that.’”

The Air Force has no shortage of potential suppliers for the CCA fleet’s future AI pilots. In addition to autonomy core systems offered by traditional prime contractors, including Lockheed, Boeing, Raytheon and BAE Systems, several Silicon Valley-backed startups have participated in DARPA’s ACE and the Air Force Research Laboratory’s Skyborg program. These include Anduril’s Lattice system, Shield AI’s Hivemind, EpiSci’s Tactical AI solution and PhysicsAI. It is not clear whether the Air Force will select only one AI system for the CCA fleet or employ autonomy algorithms from multiple suppliers.

Clark, however, draws a sharp line between the philosophy of AI software developed by Lockheed and that of some Silicon Valley-backed startups. The Skunk Works approach to the AI pilot essentially mimics the role of a human fighter pilot. The human does not directly manipulate the flight computer or the mission systems computer during a mission, but instead commands them to perform certain functions and supervises how those tasks are executed. In a similar way, Lockheed’s AI pilot is overlaid on top of the CCA’s onboard processing systems.

“We’re introducing just the behaviors that are necessary to allow the pilot to interact with [the CCAs] in a collaborative system and get the expected behavior out,” Clark says. “We’re using the traditional process of the flight control system. Those pieces of software already exist. So let me just put the AI behaviors on the top. Those are pretty simple behaviors: ‘Your objective is to go that way, engage at this rate that weapon from this direction.’”
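
Clark’s description suggests a thin command layer that hands high-level objectives to software that already knows how to fly the aircraft. The sketch below is a hypothetical illustration of that division of labor; the objective fields and interfaces are invented for this article, not Lockheed’s.

```python
from dataclasses import dataclass
from typing import Optional, Protocol

# Hypothetical sketch: the AI layer issues only high-level objectives,
# while pre-existing flight-control and mission-system software (the
# Protocol below) does the actual flying. All names are illustrative.

@dataclass
class Objective:
    heading_deg: float                          # "go that way"
    weapon: Optional[str] = None                # "engage ... that weapon"
    engage_bearing_deg: Optional[float] = None  # "... from this direction"

class VehicleSystems(Protocol):
    def fly_heading(self, heading_deg: float) -> None: ...
    def employ_weapon(self, weapon: str, bearing_deg: float) -> None: ...

def execute(objective: Objective, vehicle: VehicleSystems) -> None:
    """Translate an objective into calls on the existing onboard
    software; the AI layer never reaches below this interface."""
    vehicle.fly_heading(objective.heading_deg)
    if objective.weapon is not None and objective.engage_bearing_deg is not None:
        vehicle.employ_weapon(objective.weapon, objective.engage_bearing_deg)

# Example with a stand-in vehicle that just logs the calls.
class LoggingVehicle:
    def fly_heading(self, heading_deg: float) -> None:
        print(f"flight controls: steer {heading_deg:.0f} deg")

    def employ_weapon(self, weapon: str, bearing_deg: float) -> None:
        print(f"mission system: employ {weapon} from bearing {bearing_deg:.0f} deg")

execute(Objective(heading_deg=270, weapon="missile-1", engage_bearing_deg=90), LoggingVehicle())
```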

By simplifying the AI pilot’s interaction with the onboard systems, the Skunk Works has already developed the technology for a single F-35 or F-22 pilot to control up to eight CCAs, Clark says.

In more nuanced terms, the Skunk Works approach calls for operating CCAs not with a single AI pilot aboard, but rather with multiple sets of algorithms that are each specially trained for discrete behaviors, such as deciding whether to engage a threat or run away from it. In Clark’s view, this approach is more effective than attempting to develop a single algorithm capable of performing each of these missions interchangeably.

“That’s where the monolithic or the ‘God AIs’ are going to run into a problem,” Clark says.

He describes a counter-air scenario in which one CCA faces two enemy aircraft. Multiple decisions must be made: Should the CCA engage? If “yes,” then should the CCA first aim at the enemy on the left or the right? If the answer is “no,” then a different algorithmic behavior takes over, but this behavior must decide on the best approach to evade the threat. If all of these behaviors are contained within the same algorithm, Clark says, there is a potential for “behavioral crosstalk” that confuses the AI software.
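
The contrast Clark draws can be pictured as an explicit top-level selector that routes each situation to a separately trained behavior, rather than one network weighing every consideration at once. The sketch below is illustrative only, with invented names and stand-in logic in place of trained models.

```python
# Illustrative only: discrete, separately developed behaviors chosen by an
# explicit top-level decision, as opposed to one monolithic "God AI" policy.
# The rules below are stand-ins for what would be separately trained models.

def should_engage(own_state, threats):
    # Stand-in for a trained engage-or-evade decision model.
    return own_state["weapons"] > 0 and len(threats) <= own_state["max_targets"]

def engage_behavior(own_state, threats):
    # Separate behavior: decide which threat to target first.
    target = min(threats, key=lambda t: t["range_nm"])
    return {"action": "engage", "target": target["id"]}

def evade_behavior(own_state, threats):
    # Separate behavior: pick an escape heading away from the fastest-closing threat.
    worst = max(threats, key=lambda t: t["closure_kt"])
    return {"action": "evade", "heading_deg": (worst["bearing_deg"] + 180) % 360}

def decide(own_state, threats):
    """The top-level selector keeps the behaviors isolated from one another,
    avoiding the 'behavioral crosstalk' a single combined policy risks."""
    if should_engage(own_state, threats):
        return engage_behavior(own_state, threats)
    return evade_behavior(own_state, threats)

# Example: one CCA facing two hostile aircraft.
threats = [
    {"id": "bandit-1", "range_nm": 22, "bearing_deg": 350, "closure_kt": 600},
    {"id": "bandit-2", "range_nm": 30, "bearing_deg": 10, "closure_kt": 450},
]
print(decide({"weapons": 2, "max_targets": 2}, threats))
```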

“You’re going to have unexpected behaviors because of all these competing variables that aren’t . . .  coupling toward the same decision,” Clark says.

Steve Trimble

Steve covers military aviation, missiles and space for the Aviation Week Network, based in Washington DC.
