With increasing numbers and types of sensors for airborne intelligence, surveillance and reconnaissance comes a dramatic increase in data collected. While attention is being paid to the means of collection, arguably the bigger challenge is how to cope with the mass of information once it has been acquired.

Key among the problems is plucking nuggets of needed, actionable intelligence from the river of raw data, and getting that intelligence quickly and securely into the hands of tactical operators.

For the U.S. Army, the problem is taking on a new shape as it begins operations in Afghanistan with the first General Atomics Aeronautical Systems Inc. MQ-1C Gray Eagle unmanned aerial system (UAS) unit to be an integral part of a combat aviation brigade. The Army already operates the medium-altitude, long-endurance UAS in theater as a quick-reaction capability, but those aircraft support special operations.

For troops accustomed to managing video feeds from short-duration, hand-launched UAS such as the AeroVironment RQ-11B Raven, or the tactical RQ-7B Shadow, the first Gray Eagles introduce the problem of processing, exploiting and disseminating (PED) greater quantities of information gathered on flights lasting up to 20 hr. “The unit has decided to provide its own PED. We will see if they understand what the demand is,” says Col. Grant Webb, UAS capabilities manager at Army Training and Doctrine Command.

The Army is also close to deploying the first full-spectrum combat aviation brigade that includes two Shadow UAS units in addition to Bell OH-58D Kiowa Warrior and Boeing AH-64D Apache helicopters. “Division and below has not had organic UAS before. There will be a lot of learning,” says Webb. “The Army plans on doing PED forward, using intelligence NCOs. We're waiting to see how well they do,” he says. “With the aircraft up for 20 hr., it takes a lot of manpower, and there is a big difference in size between brigade and division level—they may require support.”

The U.K. has been working to resolve these issues for years. Part of its solution has been the fielding to Afghanistan of the DataMan system (DTI July/August 2011, p. 14). The system allows connected users to input information to a central command-and-control server, where data is tagged against the geographic locations to which it relates, whether human intelligence on insurgent activity, video from aerial platforms, signals intelligence on the dated known location of a high-value target, or medical information.

The server stores the information in more than 350 layers. When a commander planning a mission requests a map of an operational area, he selects the information layers he would like the map to display, in a Google Maps-like interface called GeoViewer. The process ensures the user is presented with intelligence relevant to his task, without being overwhelmed by unnecessary data. By projecting stored intelligence onto the blank canvas of the GeoViewer map, the server minimizes the bandwidth required, moving only the requested information rather than requiring the user to access an entire data cache and instruct his terminal to ignore most of it.
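In outline, that layer-selection process can be captured in a few lines of code. The sketch below is purely illustrative, assuming hypothetical names such as LayerStore and Record rather than the actual DataMan or GeoViewer interfaces; the point is that the server filters its store by requested layers and map area, so only that subset crosses the link.

```python
# Minimal sketch of layer-based retrieval. All names are hypothetical
# illustrations, not the actual DataMan or GeoViewer API.
from dataclasses import dataclass

@dataclass
class Record:
    layer: str    # e.g. "humint", "sigint", "fmv", "medical"
    lat: float
    lon: float
    payload: str  # the intelligence item itself

class LayerStore:
    def __init__(self) -> None:
        self.records: list[Record] = []

    def add(self, record: Record) -> None:
        self.records.append(record)

    def query(self, layers: set[str],
              bbox: tuple[float, float, float, float]) -> list[Record]:
        """Return only records in the requested layers and map area,
        so the terminal never downloads the full data cache."""
        lat_min, lon_min, lat_max, lon_max = bbox
        return [r for r in self.records
                if r.layer in layers
                and lat_min <= r.lat <= lat_max
                and lon_min <= r.lon <= lon_max]

# A commander planning a mission asks only for the layers he needs:
store = LayerStore()
store.add(Record("humint", 34.52, 69.17, "insurgent meeting reported"))
store.add(Record("medical", 34.55, 69.20, "aid station"))
hits = store.query({"humint"}, bbox=(34.0, 69.0, 35.0, 70.0))
```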

“Not only have you got the geoint tacked on to the C2 system, but we discovered that we can connect to other databases,” says Col. Mark Burrows of the Defense Ministry's Joint Aeronautical and Geospatial Organization, which operates DataMan, built by Esri in the U.K. “It's a database manager, so it reaches into other people's databases, and we can link in as well.

“It's about getting the best out of all the information out there, bringing it together to let you do the information-exploitation activity.”

Other information-sharing efforts are more low-tech. At Camp Bastion, Afghanistan, the ground station for the RAF's Sentinel R1 Airborne Stand-Off Radar system is located next to the data exploitation center for the British Army's Hermes 450 UAS, and officers work on the two systems in close proximity. Although the data produced is different (full-motion video from the UAS; ground-moving-target indicator data and synthetic aperture radar imagery from Sentinel), the teams interoperate, often use the same software and share interpretation techniques. Their information is also integrated into DataMan; the process is seamless, and users may not realize that their product ends up in the intelligence hub.

The advent of multispectral and wide-area sensor capabilities, meanwhile, will require an exponential increase in data analysis to separate informational wheat from digital chaff. As UAS operators have found, savings made because the platforms cost less than manned aircraft are offset by the manpower required to monitor the volume of data the persistent platforms generate. Efforts to automate elements of the analytical process are therefore among the most pressing in intelligence gathering.

“The biggest challenge is to quickly give the analysts, in real time, the ability to extract value—to use more machine-based algorithms and decision-making to assess whether activity they're looking at is abnormal,” says Jon Armstrong, senior manager for FMV (full-motion video) solutions with Lockheed Martin's Information Systems and Global Solutions division. A Lockheed-led team developed NVS—National System for Geo-Intelligence Video Services—for U.S. Joint Forces Command, though the project has migrated to the National Geospatial-Intelligence Agency. NVS attempts to solve the analysis overload problem through software automation.

“Am I able to look at, and have the computer tell me, whether that individual I see walking across the ground is carrying an IED on his shoulder or a rolled-up rug?” Armstrong continues. “Can I set a watch box around a location and say, 'These three vehicles that I've seen come to this compound in the last five days, alert me to go look at the video if a vehicle shows up today?' I don't want to stare at the video hoping to see the vehicle if it doesn't return. I want to put my attention on something else.

“It's the ability to do activity-based intelligence, where you use concepts of heat maps, watch boxes or tripwires to define multiple elements of what you want to bring together,” Armstrong says. “I only want to look at this box if the conditions are met. And those conditions are becoming more definable, as is the ability to recognize the activity to match with it.”
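Stripped to its skeleton, the watch-box idea Armstrong describes is a geographic filter combined with a watch list. The sketch below is a hypothetical illustration of that logic, not Lockheed Martin's NVS software: an alert fires only when a previously seen track enters the defined box.

```python
# Hypothetical watch-box sketch: alert only when a watched track
# enters the box, so the analyst can attend to something else.
from dataclasses import dataclass

@dataclass
class Detection:
    track_id: str  # identity assigned by an upstream video tracker
    lat: float
    lon: float

@dataclass
class WatchBox:
    lat_min: float
    lon_min: float
    lat_max: float
    lon_max: float
    watched: set[str]  # tracks seen at the compound on earlier days

    def contains(self, d: Detection) -> bool:
        return (self.lat_min <= d.lat <= self.lat_max
                and self.lon_min <= d.lon <= self.lon_max)

    def check(self, detections: list[Detection]) -> list[Detection]:
        """Return only watched tracks inside the box; everything
        else is suppressed rather than shown to the analyst."""
        return [d for d in detections
                if d.track_id in self.watched and self.contains(d)]

box = WatchBox(34.50, 69.10, 34.52, 69.12, watched={"veh-031", "veh-047"})
alerts = box.check([Detection("veh-047", 34.51, 69.11),   # alerts
                    Detection("veh-099", 34.51, 69.11)])  # ignored
```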

Monitoring the activity of insurgent networks is a key objective of the Army's plan to deploy three Boeing A160T Hummingbirds to Afghanistan this year. The unmanned helicopters will carry the Argus-IS wide-area electro-optical (EO) sensor developed by BAE Systems for the U.S. Defense Advanced Research Projects Agency (Darpa). Argus provides 65 simultaneous, independent video feeds that can be programmed to watch a location or track an individual or vehicle. A Darpa-developed ground station will fuse the EO imagery with signals intelligence to monitor insurgents.

Argus-IS is daylight-only, and Lockheed is developing an infrared (IR) version, Argus-IR, for Darpa. This uses a new mid-wave IR detector technology called nBn, which requires less cooling and allows for larger focal-plane arrays with more pixels per array. Argus-IR will provide at least 130 independently steerable video feeds within its field of view, each with resolution comparable to the EO/IR sensor on a Predator-class UAS, says Keith Flail, business development manager with Lockheed Martin Missiles and Fire Control.
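How one sensor serves so many independent feeds can be illustrated simply: each feed is a steerable sub-window, or chip, cut from the full wide-area frame. The sketch below is a scaled-down, hypothetical example; the array size, chip dimensions and function names are illustrative, not Argus specifications.

```python
# Hypothetical sketch of steerable sub-windows from one wide-area array.
import numpy as np

def chip(frame: np.ndarray, center_row: int, center_col: int,
         height: int = 480, width: int = 640) -> np.ndarray:
    """Cut one steerable sub-window from the full frame, clamped
    so the window stays inside the array."""
    r0 = max(0, min(center_row - height // 2, frame.shape[0] - height))
    c0 = max(0, min(center_col - width // 2, frame.shape[1] - width))
    return frame[r0:r0 + height, c0:c0 + width]

# A scaled-down stand-in for a wide-area frame; real arrays run to
# billions of pixels per frame.
frame = np.zeros((4_000, 4_000), dtype=np.uint16)

# Each user steers his own chip; every feed comes from the same sensor.
feeds = [chip(frame, 1200, 3400),  # watch a compound
         chip(frame, 3800, 500)]   # follow a vehicle elsewhere
```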

Adding to the challenge is the increasing use of multispectral and hyperspectral sensors that not only cover large areas, but do so across and beyond the visible spectrum.

Another development is the use of airborne sensing to detect trace explosives. Modified Bombardier Dash 8 aircraft, operated by Dynamic Aviation and equipped with multiple IR turrets and other sensors, possibly including explosives detectors, have been operating for the Army in the Middle East. At the U.S. Navy League show in April, SAIC showed models of these aircraft and disclosed that it was the prime contractor in the program.

As automation tackles ever-bigger data sets, Armstrong points out that the nature of the work changes from image analysis to network analysis. “Gorgon Stare, Blue Devil, Argus—these are capabilities that allow you to look at a 10 × 10-km (6.2 × 6.2-mi.) space at extremely high resolution,” he says. “So you have the ability to look at human activity networks, track vehicles over large distances—the ability to look at activity, not just pixels. How do you consume those huge data sets, and the ability to look at an entire city in one view? That's the emerging space we're using a lot of our energy and focus on.”
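The shift from pixels to networks can likewise be sketched in miniature. Once a tracker has turned wide-area video into vehicle tracks, locations visited by the same vehicle can be linked into an activity network; the example below is hypothetical, with invented track and location names.

```python
# Hypothetical sketch of activity-network analysis: link locations
# visited by the same vehicle and weight edges by distinct vehicles.
from collections import defaultdict
from itertools import combinations

# track_id -> locations where the tracked vehicle stopped
tracks = {
    "veh-031": ["compound-A", "market", "compound-B"],
    "veh-047": ["compound-A", "compound-B"],
    "veh-099": ["market"],
}

edges: dict[tuple[str, str], int] = defaultdict(int)
for stops in tracks.values():
    for a, b in combinations(sorted(set(stops)), 2):
        edges[(a, b)] += 1

# compound-A and compound-B are linked by two vehicles: the pair an
# analyst examining the network might look at first.
top = max(edges, key=edges.get)
print(top, edges[top])
```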