Ask the Editors: The Aviation Week Network invites our readers to submit questions to our editors and analysts. We’ll answer them, and if we can’t, we’ll reach out to our wide network of experts for advice.
Has software development reached a point where fighter aircraft can operate autonomously?
Aviation Week Defense Editor Steve Trimble responds:
The Cambridge Dictionary defines “autonomous” as “independent and having the power to make your own decisions.” Cambridge University Press has also published an aerospace dictionary, which defines “fighter” as an “aircraft designed primarily to intercept and destroy other aircraft”—although many aircraft called fighters are optimized for striking targets on the ground.
One of the functions of a fighter pilot is navigating and avoiding collision threats. In this area, software has proven quite useful and may even exceed the skills of most human pilots. For example, the U.S. Air Force has installed a software mode called the Automatic Ground Collision Avoidance System (Auto-GCAS) in the F-16 and F-35. This system operates by comparing flight navigation data with a digital terrain database. Once the software detects that the pilot may be inadvertently approaching a point of no return, Auto-GCAS sends warning signals to the pilot to pull up. If the pilot fails to respond, the system assumes the pilot is at least temporarily incapacitated. Auto-GCAS then takes control of the aircraft and maneuvers it to a safe flight condition before restoring control to the pilot. Auto-GCAS is touted by the Air Force as a first step on the path to greater autonomy in combat aircraft.
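To make that decision flow concrete, here is a minimal sketch of the kind of logic described above, written in Python. The function names, thresholds and terrain lookup are illustrative assumptions, not the actual F-16/F-35 Auto-GCAS implementation.

```python
# Hypothetical, highly simplified sketch of the Auto-GCAS decision flow
# described above. Names, thresholds and the terrain lookup are illustrative
# assumptions, not the real system.
from dataclasses import dataclass

@dataclass
class AircraftState:
    lat: float            # degrees
    lon: float            # degrees
    altitude_ft: float    # above mean sea level
    sink_rate_fps: float  # positive = descending

def terrain_elevation_ft(lat: float, lon: float) -> float:
    """Placeholder lookup into a digital terrain database."""
    return 1200.0  # illustrative constant

def time_to_impact_s(state: AircraftState) -> float:
    """Rough time until predicted terrain impact on the current trajectory."""
    height_above_terrain = state.altitude_ft - terrain_elevation_ft(state.lat, state.lon)
    if state.sink_rate_fps <= 0:
        return float("inf")  # climbing or level flight: no impact predicted
    return height_above_terrain / state.sink_rate_fps

def auto_gcas_step(state: AircraftState, pilot_responded: bool) -> str:
    """Return the action for this update cycle: monitor, warn, or recover."""
    tti = time_to_impact_s(state)
    if tti > 15.0:               # illustrative "safe" threshold
        return "MONITOR"
    if tti > 5.0 or pilot_responded:
        return "WARN_PULL_UP"    # cue the pilot before intervening
    # Pilot assumed incapacitated: take control, fly a recovery maneuver,
    # then hand the aircraft back once it is in a safe flight condition.
    return "AUTO_RECOVER"
```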
But other functions of a fighter remain beyond the capabilities of software algorithms. The AlphaDogfight trials run by DARPA showed that an artificial intelligence (AI) agent “flying” a simulated fighter could easily beat a human pilot in a dogfight contained within a synthetic environment. But the AlphaDogfight trials did not seek to replicate a realistic dogfight scenario. The AI agent was given perfect knowledge of the position of the adversary aircraft throughout each engagement. In a realistic combat scenario, an AI agent would need spherical sensor coverage and tremendous onboard processing power to transform imperfect state data into reliable, real-time situational awareness of the enemy fighter’s position and intent.
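To illustrate the gap, the sketch below shows the simplest possible version of that estimation problem: a one-dimensional Kalman filter smoothing noisy range measurements to a target. It is purely illustrative; an operational fighter fuses radar, infrared and datalink tracks in far higher-dimensional filters, and the variances used here are arbitrary assumptions.

```python
# Minimal illustration of turning imperfect sensor data into a filtered
# estimate: a textbook 1-D Kalman filter over noisy range measurements.
def kalman_1d(measurements, process_var=1.0, meas_var=25.0):
    estimate, variance = measurements[0], meas_var
    history = [estimate]
    for z in measurements[1:]:
        # Predict: uncertainty grows as the target maneuvers between updates
        variance += process_var
        # Update: weight the new measurement by how much we trust it
        gain = variance / (variance + meas_var)
        estimate += gain * (z - estimate)
        variance *= (1.0 - gain)
        history.append(estimate)
    return history

# Example: noisy range readings (nautical miles) to a closing target
noisy_ranges = [40.2, 38.9, 38.4, 36.1, 35.7, 33.8]
print(kalman_1d(noisy_ranges))
```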
While DARPA says the AlphaDogfight trials showed the potential for AI to fly an aircraft in combat, freeing up the pilot to act as battle manager, the trials represented one of the most straightforward missions in a fighter’s repertoire: a close-range visual engagement with another hostile aircraft. A human fighter pilot is routinely expected to handle far more mentally challenging scenarios, such as a beyond-visual-range engagement, in which sensor data must be balanced against the rules of engagement.
An ambiguous scenario may be even more challenging for an AI agent. Fighters are often launched to intercept aircraft that are knowingly or unknowingly violating restricted airspace. Creating an algorithm that can discern the intent of the other aircraft and respond appropriately may rank among the more challenging assignments for the next generation of software developers.
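To see why, consider a deliberately naive, rule-based sketch of an intent classifier for an airspace intercept. Every input and threshold here is an assumption made for illustration; the point is how quickly hand-written rules run out of road.

```python
# Illustrative only: a naive rule-based "intent" classifier for an airspace
# intercept. All inputs and categories are assumptions for this example.
def classify_intent(squawking_transponder: bool,
                    responding_to_radio: bool,
                    heading_toward_protected_area: bool,
                    erratic_flight_path: bool) -> str:
    if squawking_transponder and responding_to_radio:
        return "LIKELY_LOST_OR_STRAYED"   # probable navigation error
    if not responding_to_radio and heading_toward_protected_area:
        return "POSSIBLE_HOSTILE"         # escalate per rules of engagement
    if erratic_flight_path:
        return "POSSIBLE_DISTRESS"        # hijacking, hypoxia, malfunction?
    return "UNDETERMINED"                 # the common, hard case

# A silent transponder on an otherwise compliant flight path lands here:
print(classify_intent(squawking_transponder=False,
                      responding_to_radio=False,
                      heading_toward_protected_area=False,
                      erratic_flight_path=False))  # -> "UNDETERMINED"
```

The ambiguous combinations fall straight into the undetermined bucket, which is exactly where a human pilot on an intercept currently earns their keep.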