BAE Systems details the autonomous navigation and guidance system it is developing in Australia
It might be just a dark rock that no one has ever noticed, let alone put on a map. But it can tell a passing aircraft where it is.
How? Because the aircraft has worked out its position relative to the rock and the rock's position relative to a patch of bright green grass that it spotted previously. The aircraft already knew where the grass was relative to a tree farther back, and the position of the tree relative to a parked car that is now out of sight. These mundane objects are not on a map loaded into the aircraft; rather, the aircraft is making a map with them as it flies along, and at the same time working out where it is on that map.
This is simultaneous localization and mapping (SLAM), a navigation technology widely under development for ground robotics and also being worked on by various research teams for aircraft, especially pilotless aircraft. For military purposes, its chief attraction is that it offers a passive method of fixing a vehicle's position without satellite navigation signals, which an adversary may try to jam or spoof. It is, therefore, applicable to manned aircraft, too.
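The chain of relative fixes described above can be sketched in a few lines. This is purely illustrative, not BAE's implementation: each observation is an offset from one landmark to the next, and summing the offsets places every landmark, and the aircraft, in one common frame anchored at the first landmark.

```python
# Illustrative sketch of chaining relative fixes into a map (not BAE's
# algorithm). Each entry is a (dx, dy) offset from the previous landmark
# to the next; the first landmark anchors the map frame at the origin.

def build_map(relative_offsets):
    """Chain relative (dx, dy) offsets into absolute landmark positions."""
    positions = [(0.0, 0.0)]          # first landmark anchors the frame
    for dx, dy in relative_offsets:
        x, y = positions[-1]
        positions.append((x + dx, y + dy))
    return positions

# Hypothetical offsets: car -> tree -> grass -> rock, as in the example above
landmarks = build_map([(120.0, 40.0), (200.0, -15.0), (310.0, 60.0)])
# The rock sits at the sum of all offsets in the map frame: (630.0, 85.0)
```

A real SLAM system does this probabilistically, refining every position as new observations arrive, but the core idea of building the frame from relative measurements is the same.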
SLAM is part of what BAE Systems believes to be a world-leading navigation, guidance and control system it is developing in Australia. Since the same Australian unit is working on navigation and guidance for the BAE Systems Taranis, there is strong reason to suspect that the advanced autonomous capabilities, or at least most of them, are used in that aircraft, a demonstrator for a stealth combat drone. BAE and its customer for the Taranis, the British Ministry of Defense, have revealed few details of the aircraft. The U.S. Navy is also looking at BAE's system as a way of achieving what must be one of its ideals: automated carrier landing without emissions.
Whether or not BAE's guidance and navigation is the most advanced under development—and there is no shortage of research teams working on such systems—its description of its system at least gives insight into the autonomy that can be expected from pilotless aircraft in a decade or so.
Work on it began in 2003 and is now 80% complete, assuming that the requirement does not change, says Brad Yelland, head of strategy for the aerospace business of BAE Systems Australia. The Australian and British parts of the company have funded the system, which is being flight-tested on a BAE Systems Kingfisher drone. Yelland declines to comment on Taranis work or even say whether it is related.
The advanced features that BAE describes are in navigation (that is, the part of the system that works out where the aircraft is) and guidance (determining where it should go). The separate mission management element works with the guidance section in autonomously determining how to execute the assigned tasks, which could be reconnaissance of various points, or launching weapons.
The core of the navigation package is an inertial navigation system (INS), a standard sensor that works out where an aircraft is by measuring its accelerations. If left alone, an INS will gradually accumulate errors, “drift,” and so a GPS receiver is commonly added to update it, bringing it back on track. BAE has added SLAM as another source for updating the INS, one that will work if GPS is unavailable.
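The way drift accumulates can be shown with a toy one-dimensional model; the constants here are assumptions for illustration, not BAE figures. A small uncorrected accelerometer bias integrates twice, so position error grows roughly quadratically until an external fix from GPS, or in this case SLAM, resets it.

```python
# Illustrative 1-D INS drift model (constants are assumptions, not BAE
# figures). A constant accelerometer bias integrates into a velocity
# error, which integrates into a growing position error.

def ins_position_error(bias=0.001, dt=1.0, steps=60):
    """Accumulate position error from a constant accelerometer bias (m/s^2)."""
    vel_err, pos_err = 0.0, 0.0
    for _ in range(steps):
        vel_err += bias * dt        # bias -> velocity error
        pos_err += vel_err * dt     # velocity error -> position error
    return pos_err

drift_60s = ins_position_error()             # error after 60 s unaided
drift_120s = ins_position_error(steps=120)   # roughly 4x worse, not 2x
```

The quadratic growth is why even a modest external update rate keeps an INS honest: the error is reset before it has time to compound.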
As the aircraft flies, sensors detect fixed features on the ground and the software works out their relative positions from their changing bearing, building up a map of them and determining the aircraft's position on the map. The sensors can be passive such as electro-optical, infrared or radio direction-finding, or active—millimetric radar has been tried—and BAE says synthetic aperture radar should work. For submarine navigation, sound has been tried with success.
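Working out a fixed feature's position from its changing bearing comes down to triangulation: two bearings to the same feature, taken from two known aircraft positions, intersect at the feature. The sketch below is a minimal two-ray version, not BAE's estimator, which would fuse many noisy bearings.

```python
import math

# Illustrative bearing-only fix (not BAE's algorithm): intersect two
# bearing rays from two known observer positions to locate a feature.

def triangulate(p1, brg1, p2, brg2):
    """Intersect rays from p1 and p2; bearings in radians from the +x axis."""
    x1, y1 = p1
    x2, y2 = p2
    d1 = (math.cos(brg1), math.sin(brg1))
    d2 = (math.cos(brg2), math.sin(brg2))
    # Solve p1 + t*d1 = p2 + s*d2 for t using a 2x2 cross product
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])

# Hypothetical case: feature seen at 45 deg from the origin and at
# 135 deg from a point 200 m farther along track
feature = triangulate((0.0, 0.0), math.radians(45),
                      (200.0, 0.0), math.radians(135))
```

Note that nearly parallel bearings make the intersection ill-conditioned, which is one reason aircraft motion relative to the feature matters as much as sensor quality.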
Data from the INS and SLAM are combined to create a best estimate of position. While SLAM data is available, drift is limited to 1-5 meters (3-16 ft.), comparable with common GPS systems, but not the best military-grade GPS. In building its map, SLAM should have three objects to watch at a time, in case one turns out to be moving relative to the others. With many objects and enough sensor fidelity, SLAM can achieve positional precision of less than 1 meter, says BAE.
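Combining the two sources into a best estimate is classically done by weighting each by its confidence. The textbook inverse-variance form below stands in for whatever filter BAE actually uses; the figures are illustrative.

```python
# Textbook inverse-variance fusion sketch (a stand-in for BAE's actual
# filter): two position estimates with different uncertainties combine
# into an estimate weighted toward the more precise source.

def fuse(x_ins, var_ins, x_slam, var_slam):
    """Minimum-variance combination of two scalar position estimates."""
    w = var_slam / (var_ins + var_slam)   # weight on the INS estimate
    x = w * x_ins + (1.0 - w) * x_slam
    var = (var_ins * var_slam) / (var_ins + var_slam)
    return x, var

# Hypothetical fixes: a drifted INS estimate (variance 25 m^2) fused
# with a SLAM fix (variance 4 m^2)
best, best_var = fuse(1005.0, 25.0, 1001.0, 4.0)
```

The fused variance is always smaller than either input's, which is why adding SLAM helps even when GPS is still available.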
If an object is lost—say, because of cloud cover—it can be reacquired. If SLAM cannot find any features, the INS simply carries on alone for as long as necessary. There are no fixed visible features at sea, and mainly for that reason BAE incorporated radio direction-finding into its SLAM system. So, instead of watching a tree visually, the system might focus on a transmitter on a cell phone tower or the radar of a distant airfield. BAE has previously referred to its work on such techniques, known as navigation with signals of opportunity.
One expert in autonomous navigation questions whether suitable radio signals would be available far out to sea. On the other hand, SLAM may not be needed far from land, since an enemy's distance would increase its challenge in interfering with GPS.
SLAM is distinct from georegistration, in which the navigation system looks for objects of known location to fix its position, and from Tercom, a method of working out the position of the aircraft from radar measurements of the shape of the ground.
BAE says its SLAM system can also be used for tracking targets, noting that it was a considerable challenge to develop a system that could do the three necessary tasks at once: making a map of fixed features from a moving position (the aircraft), simultaneously locating that moving position on the map and at the same time tracking another moving object on the map.
Notably, the SLAM map can be shared between several vehicles, which then navigate and localize themselves and targets on a common grid. “This is extremely difficult and can only be achieved by combining our decentralized data fusion technology with our SLAM technology,” says Yelland.
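One simple way to see how maps from separate vehicles can land on a common grid: if both vehicles observe at least one common landmark, the offset between their map frames follows directly. The alignment method below is an assumption for illustration; BAE's decentralized data fusion is far more involved.

```python
# Illustrative map-sharing sketch (the alignment method is an assumed
# simplification, not BAE's decentralized data fusion): one shared
# landmark fixes the offset between two vehicles' map frames.

def merge_maps(map_a, map_b, shared_id):
    """Shift map_b's (x, y) entries into map_a's frame via a shared landmark."""
    ax, ay = map_a[shared_id]
    bx, by = map_b[shared_id]
    dx, dy = ax - bx, ay - by
    merged = dict(map_a)
    for lid, (x, y) in map_b.items():
        merged.setdefault(lid, (x + dx, y + dy))
    return merged

# Hypothetical maps from two vehicles that both spotted the same rock
map_a = {"rock": (100.0, 50.0), "tree": (300.0, 80.0)}
map_b = {"rock": (0.0, 0.0), "tower": (40.0, -10.0)}
combined = merge_maps(map_a, map_b, "rock")
```

A real shared map must also reconcile rotation between frames and disagreeing estimates of the same landmark, which is where the fusion technology Yelland describes comes in.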
The guidance package need not use the traditional waypoint approach, directing the aircraft from one designated point to another. Instead it can be given a moving 3-D block of space which it must stay in but within which it can freely fly the aircraft toward the set target. “The benefits of this approach are reduction in mission-planning workload, optimization of mission execution and the ability to manage airspace deconfliction for integrating UAVs into shared airspace,” says Yelland. In designing the blocks, mission planners incorporate terrain avoidance and masking. The guidance system, working with the mission system, can choose to execute a low-priority mission before a high-priority mission if its track happens to suit changes in the sequence.
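The block-of-space idea can be reduced to a containment test: guidance may steer anywhere inside a moving box, and only a breach constrains it. The geometry and names below are assumptions, not BAE's interface.

```python
# Hypothetical sketch of the moving 3-D block concept (geometry and
# names are assumptions, not BAE's interface): the aircraft is free
# anywhere inside an axis-aligned box that advances along track.

def inside_block(pos, block_min, block_max):
    """True if pos (x, y, z) lies within the axis-aligned block."""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, block_min, block_max))

def block_at(t, speed=100.0):
    """Bounds of a 5 km-long block advancing along the x axis at `speed` m/s."""
    lo = (speed * t, -2000.0, 1000.0)
    hi = (speed * t + 5000.0, 2000.0, 4000.0)
    return lo, hi

lo, hi = block_at(10.0)
ok = inside_block((3000.0, 500.0, 2500.0), lo, hi)   # free to maneuver here
```

From an airspace-deconfliction standpoint, the attraction is that controllers need only guarantee that blocks assigned to different vehicles never overlap, rather than tracking every waypoint.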
Building on their earlier work with the Evolved Sea Sparrow Missile, the Australian engineers are also giving the guidance system the ability to measure changes in performance parameters, which might occur because of such events as surface damage, icing and engine degradation. The aircraft would then know, for example, that its fuel will not last as long as expected.
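The fuel example reduces to a simple proportionality; the figures and the linear model here are illustrative assumptions only. If monitoring shows the burn rate has degraded, remaining endurance shrinks accordingly and the mission system can replan early.

```python
# Illustrative endurance recalculation (figures and linear model are
# assumptions): measured burn-rate degradation cuts remaining endurance
# in direct proportion.

def endurance_hours(fuel_kg, nominal_burn_kg_h, degradation=0.0):
    """Remaining endurance given a measured burn-rate degradation fraction."""
    actual_burn = nominal_burn_kg_h * (1.0 + degradation)
    return fuel_kg / actual_burn

planned = endurance_hours(400.0, 80.0)             # as planned: 5.0 h
after_icing = endurance_hours(400.0, 80.0, 0.25)   # 25% higher burn: 4.0 h
```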
Finally, the system can bring an aircraft in to an autonomous landing without signals from the field, remote pilots or GPS. It can find a runway, navigate there, on arrival recognize the runway from its features (primarily, two straight lines, tens of meters apart), direct the aircraft on a pass to survey the field for obstacles, work out its landing maneuvers, and then land the aircraft.
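The runway-recognition criterion the article describes—two straight lines, tens of meters apart—can be sketched as a simple geometric test on candidate edges. This is a toy filter, not BAE's vision pipeline, and the thresholds are assumptions.

```python
# Toy runway-edge check (not BAE's vision pipeline; thresholds are
# assumptions): accept a pair of detected line segments as runway edges
# if they are nearly parallel and tens of meters apart.

def looks_like_runway(heading1, heading2, separation_m,
                      max_angle_deg=2.0, min_sep=20.0, max_sep=80.0):
    """Headings in degrees; separation is perpendicular distance in meters."""
    angle = abs((heading1 - heading2 + 180.0) % 360.0 - 180.0)
    return angle <= max_angle_deg and min_sep <= separation_m <= max_sep

ok = looks_like_runway(87.5, 88.3, 45.0)    # near-parallel, 45 m apart
bad = looks_like_runway(87.5, 60.0, 45.0)   # edges far from parallel
```

A fielded system would confirm the candidate across multiple frames and fuse it with the navigation solution before committing to the survey pass the article describes.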
In a laboratory simulation the engineers have tried the system with an oscillating runway, simulating a flight deck at sea. It worked, but more development is needed. Unsurprisingly, then, BAE says it is in talks with the U.S. Naval Research Laboratory on the system. Although the company will not comment on the reason for the U.S. Navy's interest, the key point must be that, if the system were developed for carrier use, it would not need a signal from the ship to guide the aircraft aboard. The advantages would go beyond recovering combat drones. The cost of pilot training in deck landings could be eliminated if the system, perhaps with a radiating backup, were completely reliable.