While most airline pilots probably have the “right stuff” to handle abnormal situations, loss-of-control accident statistics make clear that in the highly complex and automated cockpit of the 21st century, some percentage does not. 

Rather than blaming pilots for these rare but often deadly failures to perform, the joint FAA/industry Commercial Aviation Safety Team (CAST) has launched a multiyear, multipronged research effort with NASA and others to help find better ways to design cockpits and training to avoid the common human/machine-interface mistakes revealed by post-crash or post-incident analysis.

Behind the action is a 2010 analysis by CAST’s Airplane State Awareness (ASA) analysis group, which identified 12 major themes common to 18 commercial aviation accidents and incidents worldwide in 2001-10. While CAST is preparing to publish a series of mitigations based on its findings, officials realized more research is needed in certain key areas, including loss of attitude awareness (also known as spatial disorientation) and loss of energy-state awareness (LESA), both of which fall into the broader category of crew attention management. Spatial disorientation was a factor in the loss-of-control crash of an Aeroflot-Nord Boeing 737-500 in Russia in 2008, and LESA will likely be cited as the probable cause of the “too low and too slow” crash of an Asiana Airlines Boeing 777-200ER short of the runway at San Francisco International Airport in July 2013.

Complicating crew attention issues are the effects of unexpected events that startle or surprise pilots and can lead to inappropriate responses; to “channelized attention,” in which the pilot focuses solely on a single, often incorrect, solution; or to “confirmation bias,” which favors only the information that supports the pilot’s preconceived notion of what is causing the problem, despite more compelling data to the contrary.

“CAST is looking to the aviation community to develop and validate prototype technologies to detect and mitigate attention issues for use in the design evaluation process,” says Randy Bailey, project scientist for crew decision-making in NASA Langley Research Center’s Aviation Safety Program. Langley is coordinating the crew attention research project, with partners that include NASA’s Ames and Glenn research centers, Honeywell, Rockwell Collins, the University of Iowa and Georgia Tech. 

“They want details on channelized attention, confirmation bias, startle and surprise, and how can we quantify and manage those,” Bailey says. With that information, he says avionics designers will be able to “explicitly” evaluate their designs in terms of pilot attention. For training, the findings could lead to tools that instructors can use in the simulator “to know what’s going on and to get crews to notice, to know they’ve been startled or surprised, and how to combat it,” he says.

To obtain the information, Bailey’s NASA and contractor team will integrate a suite of measurement technologies to create “converging metrics” from physiological sensors attached to, or focused on, instrumented pilots in live aircraft and simulators as they use different types of flight displays. Included are eye tracking, brain monitoring through a 20-node electroencephalography (EEG) cap, heart monitoring through an electrocardiogram (EKG), respiration, body temperature and skin conductance (caused by sweating). 
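In rough code terms, the “converging metrics” idea can be pictured as a fusion of normalized sensor channels into a single crew-state estimate. The minimal Python sketch below is purely illustrative; the channel names, baseline normalization and equal weighting are assumptions, not the NASA team’s actual algorithm.

```python
import numpy as np

# Illustrative only: the channel list, z-score normalization and equal
# weighting are assumptions, not the NASA/Langley fusion algorithm.
CHANNELS = ["eeg_engagement", "heart_rate", "respiration_rate",
            "skin_conductance", "body_temp"]

def workload_index(samples, baselines):
    """Combine z-scored physiological channels into one converging index.

    samples   -- dict of per-channel arrays, all the same length
    baselines -- dict of per-channel (mean, std) from a resting calibration
    """
    scores = []
    for name in CHANNELS:
        mean, std = baselines[name]
        scores.append((np.asarray(samples[name]) - mean) / std)  # z vs. rest
    # Equal weighting is a placeholder; a fielded system would learn weights.
    return np.mean(scores, axis=0)
```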

“EEG is quite good at detecting underload,” says Chad Stephens, an aerospace research scientist in the Flight Deck Interface Technologies group at Langley. “Heart rate, skin conductance and respiration help us to pick up on stress and overload.”

The first crew-state measurements will come from a group of 12 volunteer regional airline pilots who will fly in the back seat of a specially equipped Aero Vodochody L-29 Delfin in Iowa City, Iowa, this summer as part of a loss-of-attitude-awareness study. The University of Iowa’s Operator Performance Lab (OPL) is under contract to NASA to perform the work. Tom Schnell, OPL director, is specifically choosing relatively low-time regional airline turboprop or jet first officers with no military training or experience with aerobatic flight.

“The idea is that if you do have aerobatic experience, you’re not too worried with the aircraft upside down. You roll to the nearest horizon,” says Schnell, an assistant professor and pilot who flies the L-29 and a variety of other aircraft and helicopters for OPL. Schnell does not plan to invert the pilots but will put them into attitudes and scenarios with bank and pitch angles that a typical regional pilot does not see in daily practice. 

The subject pilot will fly in the rear seat behind two stacked 15.1-in. displays, the lower one representing a primary flight display and the upper showing a conformal 33-deg. lateral field of view of a virtual world outside that can be set to any location—a helpful option for scene texture, given the relatively flat landscape in eastern Iowa. The upper display can also be set to show head-up display symbology. The canopy is covered by a removable curtain on the inside, putting pilots in a simulated instrument-flight-rules environment and forcing them to rely on the instruments or misleading internal sensory information for situational awareness.

The pilots will fly scenarios with three versions of attitude information on the primary flight display: classic “blue-over-brown” attitude graphics; a synthetic vision attitude display with a 12.8-deg. field of view (representing an Arinc size D display, as found on the 8-in. primary flight displays of commercial aircraft such as the Boeing 737NG); or a synthetic vision display with a 30-deg. field of view, similar to what is available on the 15.1-in. primary flight displays in the Boeing 787 or coming in the 737 MAX.

One theory is that synthetic vision, which creates a daytime visual scene outside the aircraft regardless of the actual weather, will help pilots avoid losing attitude awareness. “It always happens in low visibility, at night or in fog; it never happens in daytime [visual meteorological conditions],” says Schnell of accidents caused by loss of spatial orientation.

The test will look at whether synthetic vision enhances attitude awareness and whether the width of the scene and optical flow cues (a terrain mesh, checkerboard pattern or speckling) boost that awareness. Schnell says the wider scene should help by triggering the ganglion cell layers in the retina of the human eye, sensors that help orient humans in the horizontal direction based on peripheral vision.

Flights will last 90 min. and include multiple examples of “sub-threshold” roll departures, post-roll illusions and other maneuvers. Pilots will be told to close their eyes and put their heads down while a maneuver is being set up, then take control and recover when cued by the front-seat pilot. During a sub-threshold maneuver, turn rates are too low for the human vestibular system to pick up, which can lead to a very steep spiral if pilots follow their inner sense of motion rather than the aircraft’s instruments.
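As a rough worked example of why sub-threshold maneuvers are so insidious, assume a vestibular roll-rate detection threshold of about 2 deg. per second, a figure commonly cited in spatial-disorientation literature (the study’s actual profiles are not published here). A roll held just below that rate still accumulates a large bank angle:

```python
# Toy illustration: the 2-deg./sec. detection threshold is an assumed,
# commonly cited approximation, not a figure from the OPL study.
THRESHOLD_DEG_PER_SEC = 2.0

def bank_after_subthreshold_roll(roll_rate_dps, seconds):
    """Bank angle accumulated by a roll the inner ear cannot feel."""
    if roll_rate_dps >= THRESHOLD_DEG_PER_SEC:
        raise ValueError("roll rate would be perceptible")
    return roll_rate_dps * seconds

# 1.5 deg./sec. held for 40 sec. -> 60 deg. of bank, unnoticed
# without reference to the instruments.
print(bank_after_subthreshold_roll(1.5, 40))  # 60.0
```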

In a roll-reversal illusion, the front-seat pilot puts the aircraft into a turn in one direction and then rapidly reverses the roll, after which the subject pilot takes control and must maintain the final bank angle, despite an innate desire to reverse the roll. Steep spirals in instrument conditions are very dangerous, as the pilot’s natural reaction is to pull back on the control stick to exit the condition, an action that only steepens and tightens the corkscrew dive—hence the expression “graveyard spiral.”
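The aerodynamics behind that warning follow from two textbook turn relations: the lift, and thus the g-load, needed to hold the nose up grows as 1/cos(bank), so at steep bank angles pulling back mostly tightens the turn rather than raising the nose. The sketch below uses those standard formulas, not numbers from the study:

```python
import math

# Standard level-turn relations (textbook physics, not study data):
# load factor n = 1/cos(bank); turn radius R = V^2 / (g * tan(bank)).
def turn_numbers(bank_deg, tas_kt):
    phi = math.radians(bank_deg)
    g = 9.81
    v = tas_kt * 0.5144                     # knots -> m/s
    load_factor = 1 / math.cos(phi)         # "g" felt by the pilot
    radius_m = v**2 / (g * math.tan(phi))   # radius needed to hold altitude
    return load_factor, radius_m

# At 60 deg. of bank the pilot feels 2g; at 75 deg., nearly 3.9g --
# pulling harder at steep bank only winds the spiral tighter.
```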

Schnell will monitor the pilots in the rear cockpit with a helmet-mounted eye-tracking system and chest-worn EKG, as well as through flight-control inputs and aircraft state for post-flight analysis. Pilots’ performance will be measured in part by the time it takes them to respond and recover from a maneuver, as well as altitude loss, maximum vertical speed and pitch-and-roll deviations from those commanded. Schnell says he can overlay the physiological data on the performance metrics to determine “how hard your brain is working second by second as you perform the recovery.” 
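A minimal sketch of how such performance metrics might be computed from a logged time history follows; the field names, units and stick-deadband convention are assumptions for illustration, not the OPL data format.

```python
import numpy as np

def recovery_metrics(t, altitude_ft, vs_fpm, roll_deg, stick, cue_time,
                     stick_deadband=0.05):
    """Response time, altitude loss, peak sink rate and max roll deviation
    after the recovery cue. All field names and units are assumed."""
    t = np.asarray(t)
    after = t >= cue_time
    stick = np.asarray(stick)
    moved = after & (np.abs(stick) > stick_deadband)  # first control input
    alt = np.asarray(altitude_ft)[after]
    return {
        "response_time_s": float(t[moved][0] - cue_time) if moved.any() else None,
        "altitude_loss_ft": float(alt[0] - alt.min()),
        "max_sink_fpm": float(np.asarray(vs_fpm)[after].min()),
        "max_roll_dev_deg": float(np.abs(np.asarray(roll_deg)[after]).max()),
    }
```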

He notes that the OPL technology developed for the project “using the EKG waveform of the heart . . . does not use pulse-rate or heart-rate variability but samples the EKG at very high frequencies, looking for very subtle changes in waveforms using algorithms based on chaos theory. Changes in those waveforms are proportional to cognitive workload, and we have the ability to correlate that with 95 to 98 percent accuracy.” Schnell previously used the process in a study of fighter pilots performing “certain tasks.”
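OPL’s chaos-theory waveform algorithm is proprietary and not described beyond this. Purely as a generic example of the kind of nonlinear feature such an approach might extract from a high-rate EKG, the sketch below computes sample entropy, a standard regularity measure, and is emphatically not OPL’s method:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D signal; lower = more regular waveform.
    A generic nonlinear feature, shown only as a stand-in for OPL's
    unpublished chaos-theory algorithm."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()                  # tolerance scaled to the signal

    def matches(length):
        # Embed the signal into overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        hits = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            hits += int(np.sum(dist <= r))
        return hits

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a and b else float("inf")
```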

While loss of attitude awareness is linked to lack of visibility and requires a dynamic environment to simulate, loss of energy-state awareness can happen in perfect weather with the aircraft in a 1g environment, as occurred in the Asiana 214 crash at San Francisco. Therefore, NASA and contractors will initially study the issue in Langley’s research flight deck full-motion simulator in Hampton, Va., using pilot crews for realistic interaction.

The simulator is modeled on the Boeing 787 flight deck, with four 15.1-in. displays in landscape layout and Rockwell Collins head-up guidance systems for both pilots. The flight deck is equipped with five cameras per side that track the eyes, mouth and nostrils to determine where a pilot is looking inside and outside the cockpit. 

Kyle Ellis, an aerospace research engineer in Langley’s Flight Deck Interface Technologies group, says it takes about 30 min. to build a head profile for a new user before tracking can start. “We can quantitatively assess how much time a pilot spends on each display,” says Ellis, a doctoral student of Schnell’s. “If they have less workload, there should be a less-defined scan. But if the activity is more intense, they should look in certain locations more of the time; if that is not so, then the probability for error increases.” 
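In code terms, the dwell-time tally Ellis describes could be as simple as counting labeled gaze samples. The 60-Hz sample rate and area-of-interest labels below are assumptions for illustration:

```python
from collections import Counter

SAMPLE_PERIOD_S = 1 / 60.0   # assume a 60-Hz gaze stream

def dwell_times(aoi_stream):
    """Seconds spent on each area of interest ('PFD', 'ND', 'OTW', ...)."""
    return {aoi: n * SAMPLE_PERIOD_S for aoi, n in Counter(aoi_stream).items()}

# e.g. dwell_times(["PFD", "PFD", "ND", "OTW", "PFD"])
```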

Ellis says improper visual patterns can be flagged to trigger alerts for the instructor in the simulator, which can stop negative training “before it starts.”

Ultimately, researchers want to couple all of the physiological sensors for converging metrics. “Eye tracking is telling us where the pilots are allocating their attention, and we can infer things from that,” says Ellis. “But bring in EEG, heart rate and other measures . . . and you can combine where you are looking with [your] psycho-physiological state when you’re looking at those things. That brings a lot more insight into the actual state of the pilot and the ability to handle situations.” Researchers will begin collecting data this summer, Ellis says, most likely using volunteer general aviation pilots with commercial licenses to develop the initial system.
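Combining the two streams Ellis describes might, in its simplest form, amount to averaging the physiological index over the moments the pilot was looking at each region; again, a sketch with assumed names, not the Langley implementation.

```python
from collections import defaultdict

def workload_by_aoi(samples):
    """samples: iterable of (aoi_label, workload_score) pairs, one per
    time-aligned gaze/physiology sample. Names are hypothetical."""
    total = defaultdict(float)
    count = defaultdict(int)
    for aoi, score in samples:
        total[aoi] += score
        count[aoi] += 1
    # Mean crew-state index for each region the pilot looked at.
    return {aoi: total[aoi] / count[aoi] for aoi in total}
```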

Part of the key to success for Langley will be quantifying the impact of channelized attention on a pilot’s response to spatial disorientation and loss of energy-state awareness, says researcher Stephens. “When pilots are doing their job and then something unexpected happens—for example, if the automation disconnects—nine out of 10 times the pilots will turn automation back on and move on,” says Stephens. “But sometimes they get hyper-focused on what was the cause of that, and they’ll lose track of responsibilities, specifically monitoring.”

The goal will be to quantify the behavior in simulator and in-aircraft testing, then transfer that knowledge into specific best practices for designing or evaluating existing or future avionics, systems or training to steer pilots away from channelized attention.

“We want to identify it in research and supply data that the designers can reference when they’re coming up with a new display or even one part of a display,” Stephens says. “If it’s causing too much attention to be focused on that one moving part of the display, it’s probably a bad design.” 

Tap the icon in the digital edition of AW&ST to watch a video about the work NASA and key contractors are performing to help builders of future cockpits and avionics weed out designs that can lead to spatial disorientation and loss of energy-state awareness, or go to AviationWeek.com/video