Airbus Advances Autonomy Project As Part Of Future-Cockpit Concept
- Takeoff and landing tests successfully used image recognition
- Fifth-generation cockpit nearing the drawing board

The autonomous taxi, takeoff-and-landing (ATTOL) project is the first visible part of Airbus’ work on a so-called fifth-generation flight deck, which is aimed at improving safety by giving the crew more time for decision-making. As part of the project, a modified A350-1000 performed eight automated takeoffs in December in Toulouse using an image-recognition system. Landing trials followed in mid-January.

Airbus flight safety experts see the main goal of a fifth-generation cockpit design as enabling the pilot to become a mission manager. Information should be presented in a more condensed, synthesized form, says Pascal Traverse, Airbus’ general manager of autonomy technology. For example, a primary flight display with parameters such as a speed scale could disappear for most of the flight.
The electronic centralized aircraft monitor (ECAM, Airbus’ equivalent of an engine-indicating and crew-alerting system) usually recommends remedial actions. “Instead of telling the crew to shut down a pump, the ECAM would shut it down itself,” Traverse says.
In a fifth-generation cockpit, the autopilot would be key. It would have more capabilities, such as coping with wind gusts, and therefore could be engaged throughout the flight. It would also become more reliable thanks to increased computer redundancy, Traverse explains.
Enhancing the autopilot would continue an existing trend. The A350 includes the latest advances in autopilot technology: it remains engaged even when flight-envelope protections become active. Moreover, the speed brakes extend automatically when the aircraft exceeds maximum operating speed by 5 kt.
In the fifth-generation cockpit, crew size would no longer be a safety factor, according to Traverse. A long-haul flight would need two pilots instead of three, but “reducing the size of the crew is not an objective,” he says.
The ATTOL system is intended to give the crew more time and bandwidth to analyze a situation, regardless of the airport’s landing aids. It would enable the aircraft to land automatically while the crew monitors the situation as a whole.
“Airbus commercial aircraft use 4,000 airports, 1,000 of which have an instrument landing system [ILS], and only a few hundred runways are compatible with our autoland capacity,” says Sebastien Giuliano, ATTOL project manager.
In 2013, a UPS-operated Airbus A300 freighter crashed in Birmingham, Alabama, killing both crewmembers. The ILS was unavailable, and the crew failed to properly configure the flight management computer and monitor the aircraft’s altitude, the NTSB investigation found.
Satellite-based guidance can be seen as an alternative to ILS, but it does not allow autolanding. Moreover, loss of the signal from a global navigation satellite system is reported regularly.
With computer vision, the aircraft would no longer depend on an external system, Giuliano emphasizes.
The ATTOL project was launched in June 2018 for a duration of two years by Airbus UpNext, an organization that also manages the E-Fan X hybrid-electric demonstrator and the Fello’fly project for fuel-efficient formation flight. One aim of the ATTOL project is to exploit the possibilities of image recognition when the aircraft is close to the ground.
At the heart of ATTOL are a camera (mounted on top of the instrument panel and looking forward), image-processing algorithms and a control law. The system detects converging vanishing lines and deduces the runway’s centerline.
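As a rough illustration of the geometry involved (not Airbus’ actual algorithm), once the two runway edges have been detected in the image, for instance via a Hough transform, their intersection gives the vanishing point, and the centerline can be estimated from that point and the midpoint between the edges at the bottom of the frame. All names and numbers below are invented for the sketch:

```python
# Hypothetical sketch of vanishing-line centerline estimation.
# Each line is given as two image points ((x1, y1), (x2, y2)).

def line_intersection(l1, l2):
    """Intersection of two lines in the image plane."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-9:
        raise ValueError("edge lines are parallel in the image")
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / d
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / d
    return px, py

def centerline_from_edges(left_edge, right_edge, image_height):
    """Estimate the runway centerline from the two detected edge lines.

    Returns (bottom_point, vanishing_point): the centerline joins the
    midpoint between the edges at the bottom row of the image to the
    vanishing point where the edges converge.
    """
    vp = line_intersection(left_edge, right_edge)

    def x_at(line, y):
        # x-coordinate of the line at a given image row y
        (x1, y1), (x2, y2) = line
        t = (y - y1) / (y2 - y1)
        return x1 + t * (x2 - x1)

    x_mid = 0.5 * (x_at(left_edge, image_height) + x_at(right_edge, image_height))
    return (x_mid, image_height), vp
```

Feeding the estimated centerline to a steering loop would then reduce the guidance problem to nulling the lateral offset and heading error relative to that line.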
The December flight test began with a deliberate 4-m (13-ft.) offset from the centerline before the brakes were released. The aircraft autonomously reached the centerline while accelerating. In a video released by Airbus, the copilot can be heard saying, “It is converging . . . overshooting . . . coming back on it.”
At the preset speed, the control law took care of the rotation, and the autopilot took over with an existing mode, says Giuliano.
For the landing tests, the crew aligned the aircraft with the centerline on approach and then switched off the GPS and ILS receivers. The aircraft landed autonomously, using visual cues. Two runways were used for the five landings, meaning the system adapted to different visual environments.
During deceleration, the system realigned the aircraft when required. The fifth landing involved the brake-to-vacate system, which regulates deceleration so the aircraft reaches the chosen exit at the correct speed. Coupling the two systems was deemed successful.
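The kinematics behind brake-to-vacate can be sketched with the constant-deceleration relation v_exit² = v² − 2ad. The figures below are illustrative only, not Airbus data:

```python
# Back-of-the-envelope sketch of the brake-to-vacate idea: from the
# current groundspeed, the target speed at the chosen exit, and the
# remaining distance, solve v_exit^2 = v^2 - 2*a*d for the required
# constant deceleration a.

def required_deceleration(v_now, v_exit, distance):
    """Speeds in m/s, distance in m; returns deceleration in m/s^2 (positive)."""
    if distance <= 0:
        raise ValueError("distance to the exit must be positive")
    return (v_now**2 - v_exit**2) / (2.0 * distance)
```

For example, decelerating from 70 m/s at touchdown to 15 m/s at an exit 1,500 m ahead requires roughly 1.56 m/s², well within normal braking; the real system continuously re-solves this as conditions change.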
Could fog be a limitation? The ATTOL system could use sensors outside the visible spectrum, Giuliano answers. The resulting “image” would not have to be seen by a human eye and therefore could be analyzed directly by a software program.
Sunlight dazzling the camera is another limitation. It has been overcome by using the camera that monitors the main landing gear: mounted under the nose section and oriented rearward, it is shielded from the Sun and can still follow the runway’s centerline, Giuliano explains. It can be used on every takeoff after rotation, once the cockpit camera is pointing at the sky rather than the runway.
Although no product development has begun, a certification challenge can be foreseen because some algorithms use machine learning (ML). The European Union Aviation Safety Agency recently released the first edition of its “Artificial Intelligence Roadmap,” which begins to answer OEMs’ questions on how to certify an ML-based system.