Cognitive biases are convenient intellectual shortcuts we use to streamline our decision-making processes. They help us process information expeditiously when we’re under pressure.

Psychologists have identified 100 or more cognitive biases. They influence or control a range of behaviors, including eating and drinking, along with social, economic, religious and political actions. A few help us make good decisions with virtually no conscious thought. Most are relatively benign as long as you stay on the ground and steer clear of heated discussions. But about a dozen of them can kill you in an aircraft.

Such biases are formed through formal learning, personal experiences and hereditary factors. We use them to conserve our limited memory processing time and capacity.

How finite is human memory? While the human brain is capable of 10 quadrillion processes per second, far more than any computer yet designed, our accessible memory actually is far less capable. So, we tend to make choices within “bounded rationality” . . . “within the limits imposed by given conditions and constraints,” according to Nobel Laureate and cognitive psychologist Herbert A. Simon. The limits include the time available to make the decision, cognitive limitations and available information, among other variables.

Bounded rationality can lead to “satisficing,” another term coined by Simon. In essence, it means settling for a problem-solving shortcut that’s “good enough for government work.” As such, satisficing can keep us from making the most rational, logical and optimal decisions based upon all available information and resources. Bounded rationality is, by nature, intuitive rather than analytical. In severe instances, satisficing can lead to grave mistakes and fatal errors when pilots fail to recognize and act upon overt and latent threats.

Analytical, logical decision-making, in contrast, is based upon conscious deliberation using rule-based thinking. Humans operate with both analytical and intuitive process models. Under time pressure, though, our cognitive processes often shift to intuitive cognition as we unconsciously retreat from analytical decision-making, and our decisions become prone to cognitive biases.

The first step in eliminating or mitigating cognitive biases is to identify them and determine how they can deter us from making the best decisions. For pilots, the key is to focus on the cognitive biases most likely to affect aeronautical decision-making.

Shem Malmquist, a senior MD-11 captain for a major U.S. freight carrier and a Fellow of the Royal Aeronautical Society, has identified several cognitive biases that he believes are most likely to affect safety of flight.

Let’s look at how a dozen such biases, named by Malmquist, can impact safety.

Ambiguity Effect

The old saw “Better the devil you know than the devil you don’t” may help explain this bias. It arises when “a person is more likely to select something that has an intuitively clear risk as opposed to one that seems relatively less certain,” Malmquist writes in an April 2014 blog post.

Ambiguity effect is reinforced by structured experiences, such as flying between regular origin and destination points. Crews become comfortable and accustomed to landing at a familiar destination airport, where there’s seemingly never a need to divert to an alternate. When conditions deteriorate, pressing on to that known destination can feel safer than diverting to an unfamiliar, and therefore more ambiguous, alternate.

Anchoring Bias

Also known as focalism, this cognitive bias causes pilots to rely excessively on the first piece or set of information provided to them, forming an anchor for making decisions. They may use an adjustment bias to make minor changes to the base assumption of the anchor, but there is a reluctance to deviate sufficiently from the anchor to assure adequate safety margins.

If a computer-generated flight plan, for instance, predicts a certain required fuel burn based upon forecast winds, weather and temperatures, plus assumed reserves, that prediction can become an anchor that hampers the crew from making dynamic, logical decisions about fuel planning as conditions change. Intuitively, they’re willing to accept minor adjustments to the original plan. But large-scale changes require a more difficult, analytical approach.
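A simple numerical sketch shows how quickly an anchored fuel figure can fall behind reality. The distances, winds and burn rate below are hypothetical round numbers chosen only to illustrate the arithmetic, not data for any particular aircraft:

```python
# Hypothetical illustration: how an updated wind forecast changes required
# trip fuel versus the flight-plan figure that becomes the crew's anchor.

def trip_fuel_kg(distance_nm, tas_kt, headwind_kt, burn_kg_per_hr):
    """Cruise fuel for a constant-speed segment; headwind_kt > 0 slows groundspeed."""
    groundspeed_kt = tas_kt - headwind_kt
    hours = distance_nm / groundspeed_kt
    return hours * burn_kg_per_hr

# Flight-plan assumption: 30-kt average headwind
planned = trip_fuel_kg(distance_nm=2000, tas_kt=450, headwind_kt=30, burn_kg_per_hr=2500)

# Updated forecast received en route: 80-kt average headwind
updated = trip_fuel_kg(distance_nm=2000, tas_kt=450, headwind_kt=80, burn_kg_per_hr=2500)

print(f"Flight-plan anchor     : {planned:8.0f} kg")   # about 11,905 kg
print(f"With stronger headwinds: {updated:8.0f} kg")   # about 13,514 kg
print(f"Additional fuel needed : {updated - planned:8.0f} kg")
```

In this made-up scenario the shortfall is roughly 1,600 kg, hardly a “minor adjustment,” yet an anchored crew may treat it as one.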

Attentional Bias

“Once bitten, twice shy,” goes the saying. Attentional bias can result from a previous scare that left its mark, causing us to focus or fixate on that earlier perceived threat while overlooking other threats that should be considered in the decision-making process. If, for instance, a pilot has had previous anxious experiences flying through rough weather, a subsequent encounter with a weather threat might cause that pilot to pay close attention to that threat while overlooking or ignoring threats related to low fuel quantity, systems malfunctions or navigation errors. Intuitive fixation can blind us to the big picture.

Attentional Tunneling

Also known as “tunnel vision,” “attentional fixation” and “cognitive tunneling,” this bias is related to Attentional Bias but doesn’t necessarily result from memories of a previous threat encounter. Rather, it occurs when the crew focuses excessive attention or time on one task or threat to the detriment of being aware of other threats. It’s the “one-track mind” mentality, when pilots become blind to what’s going on outside of a narrow channel of attention.

Attentional tunneling is a leading cause of controlled flight into terrain (CFIT) accidents. “Consider, for example, a crew that becomes so fixated on troubleshooting a burned-out warning light that they fail to monitor their altimeter and end up flying into the ground,” says FAA Aerospace Medicine Technical Report DOT/FAA/AM-03/4.

That’s precisely what happened aboard Eastern Air Lines Flight 401, a Lockheed L-1011 that crashed into the Florida Everglades in December 1972 after the flight crew became preoccupied with troubleshooting a burned-out landing gear annunciator bulb, says Malmquist.

Automaticity

“Going through the motions” is an apt descriptor for this tendency. Strictly speaking, automaticity is not a cognitive bias; it is the ability to perform tasks by rote, without enough mental concentration to detect problems. While automaticity can streamline task management when following checklists, performing preflight inspections or programming cockpit automation, it also puts crews at risk of going through the motions without being alert for indications of anomalies or abnormalities. In an aircraft, what you don’t know (or inadvertently overlook) indeed can hurt or even kill you.

Preflight inspections often fall prey to Automaticity. Eyes and hands gloss over every part of the airplane, but, intuitively, pilots may fail to look actively for anomalies, including under-inflated tires, worn brakes, missing or damaged static wicks and open latches.

Availability Cascade

Repeated exposure to information (or misinformation) can cause people to accept mere perceptions as facts. Call this aviation folklore. It’s a self-reinforcing process: the more a particular perception is discussed publicly, the more it becomes accepted as fact, and the stronger the resulting cognitive bias.

“A lie told often enough becomes the truth,” noted no less an authority than Vladimir Lenin. In “Availability Cascades and Risk Regulation,” a 2007 research paper by Timur Kuran and Cass Sunstein, the authors note that an Availability Heuristic can morph into an Availability Cascade as a result of the spread of exorbitant rumors. They cite the prolonged public hysteria surrounding the 1996 explosion and crash of TWA Flight 800 as an example. Even though an extensive investigation found no evidence of terrorist involvement or foul play, the public outcry over the event led to the White House’s proposing “extensive additional safeguards against terrorism” within 45 days of the crash.

The cost of the new safeguards ran to billions of dollars, even though the likely culprit of the accident was the ignition of fuel vapors in a warm, nearly empty center wing fuel tank.

Hangar flying can be a prime source of unscientific aviation folklore that reinforces misinformation. If a particular fiction gets repeated enough during hangar flying sessions, it can become accepted as fact.

Base Rate Bias (or Base Rate Fallacy)

A little knowledge is dangerous. Without the perspective of large-scale data, intuition fails to account for underlying base rates. Here’s an example. Let’s assume that a wheel brake anti-skid system activates 5% of the time when the wheels have not lost traction. Let’s also assume that it activates 100% of the time when the wheels actually have lost traction. And let’s further assume that one landing in 1,000 will result in the wheels skidding.

If the anti-skid system activates on a particular landing, what’s the probability that the wheels actually are skidding? Some people might conclude it’s as high as 95%. The correct answer is only about 2%.

Bayes’ theorem shows how to arrive at that 2% figure. The anti-skid system is 100% reliable when the aircraft is skidding, but the wheels don’t skid in 999 out of 1,000 landings. The 5% false-trigger rate on those 999 landings produces 49.95 spurious activations. Add in the one genuine skid per 1,000 landings that correctly triggers the system and there are 50.95 activations per 1,000 landings. Dividing the single genuine activation by the 50.95 total activations yields 1.96%, roughly a 2% probability that any given activation reflects an actual skid.
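For readers who want to check the arithmetic, here is a minimal sketch of the same Bayes’ theorem calculation. The 5%, 100% and 1-in-1,000 figures are the hypothetical rates from the example above, not data from any real anti-skid system:

```python
# Probability that an anti-skid activation reflects a real skid (Bayes' theorem),
# using the hypothetical rates from the example above.

p_skid = 1 / 1000             # base rate: one landing in 1,000 actually skids
p_alert_given_skid = 1.0      # the system always activates during a real skid
p_alert_given_no_skid = 0.05  # it falsely activates on 5% of non-skid landings

# Total probability of an activation on any given landing
p_alert = p_alert_given_skid * p_skid + p_alert_given_no_skid * (1 - p_skid)

# Bayes' theorem: P(real skid | activation)
p_skid_given_alert = p_alert_given_skid * p_skid / p_alert

print(f"Activations per 1,000 landings: {1000 * p_alert:.2f}")   # 50.95
print(f"P(real skid | activation): {p_skid_given_alert:.2%}")    # about 1.96%
```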

Confirmation Bias

Many people arrive at their political beliefs on scant information, then defend them against all evidence. The same holds true for pilots in airplanes. Once they’ve established a particular mental model, they intuitively become blind to clear and convincing evidence to the contrary, and they look only for additional information that confirms the original model.

Critical thinking and a willingness to search for information that is contrary to one’s mental model can be lifesaving. It’s all too easy to slip into a comfort zone in which small, but vital, bits of information are ignored.

Confirmation bias is suspected as a contributing factor in several loss-of-control accidents and incidents involving pitot/static system malfunctions. When pilots earn their instrument ratings, they’re taught to believe the instruments rather than their own perceptions. The instruments seldom lie, but when they do, confirmation bias makes it difficult for pilots to accept that they’re reading erroneous indications.

Pitot tube icing, for instance, initially causes the airspeed indication to drop. But as the aircraft climbs, the falling static pressure working against the trapped pitot pressure causes the indicated airspeed to increase erroneously.

In contrast, if the pitot system is functioning properly but a static port becomes blocked, the airspeed indication will read increasingly high as the aircraft descends below the altitude at which the blockage occurred.
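A rough numerical sketch makes the blocked-pitot case concrete. It assumes a standard atmosphere, a constant true airspeed and the simple incompressible approximation that the airspeed indicator displays the square root of 2(pt - ps)/rho0; the 3,000-m blockage altitude and 120-m/s airspeed are hypothetical:

```python
import math

# ISA troposphere constants
P0, RHO0, T0 = 101325.0, 1.225, 288.15   # sea-level pressure (Pa), density (kg/m^3), temperature (K)
LAPSE, G, R = 0.0065, 9.80665, 287.053   # lapse rate (K/m), gravity (m/s^2), gas constant (J/kg/K)

def static_pressure(alt_m):
    """ISA static pressure in the troposphere."""
    return P0 * (1 - LAPSE * alt_m / T0) ** (G / (R * LAPSE))

def density(alt_m):
    return static_pressure(alt_m) / (R * (T0 - LAPSE * alt_m))

def indicated_airspeed(total_p, static_p):
    """Incompressible approximation of what the airspeed indicator displays."""
    return math.sqrt(max(0.0, 2 * (total_p - static_p) / RHO0))

# Climb at a constant 120 m/s true airspeed; the pitot line ices over at 3,000 m,
# trapping the total pressure sensed at that altitude.
tas, block_alt = 120.0, 3000.0
trapped_total = static_pressure(block_alt) + 0.5 * density(block_alt) * tas**2

for alt in (3000, 4000, 5000, 6000):
    ps = static_pressure(alt)
    healthy_total = ps + 0.5 * density(alt) * tas**2   # pressure a clear pitot would sense
    healthy = indicated_airspeed(healthy_total, ps)
    blocked = indicated_airspeed(trapped_total, ps)    # trapped pitot pressure vs. falling static
    print(f"{alt:5d} m   healthy ASI {healthy:5.1f} m/s   blocked-pitot ASI {blocked:5.1f} m/s")
```

Real airspeed indicators carry compressibility corrections, but the trend is the same: with the pitot pressure frozen, the instrument behaves like an altimeter, and the indicated airspeed climbs with the airplane.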

Confirmation bias can cause pilots to believe erroneous instruments are providing correct indications, thus leading to incorrect power and pitch inputs that can cause the aircraft to exceed stalling angle of attack or maximum airspeed limits.

Optimism Bias

Pilots can be prone to flying with rose-colored glasses, becoming unrealistically optimistic about the probable outcome of challenges. They have worked through difficult situations to successful conclusions so many times that they begin to believe they can work through any abnormality or emergency. Such an Optimism Bias (also known as Comparative Bias) makes them believe they’re less exposed to risk than other pilots.

“It can’t happen to me” becomes a prevailing attitude. This can lead to bending or breaking SOPs, such as failure to stabilize a landing approach at or before the aircraft reaches standardized continue or go-around decision points.

Optimism Bias is closely related to three types of Overconfidence Bias: (1) overestimation of one’s own piloting abilities and aeronautical decision-making expertise, (2) over-placement of one’s abilities compared to those of other pilots and (3) over-precision, which is the unfounded certainty of the validity of one’s abilities and judgment.

Such bravado can make pilots believe that nothing can happen that they can’t handle, leaving them blind to the consequences of Optimism Bias.

Plan Continuation Bias

This is the strong unconscious tendency to forge ahead with the original plan in spite of changing conditions. This bias grows stronger near the end of the mission as the crew anticipates landing the aircraft and completing the flight. Plan Continuation Bias “may have the effect of obscuring subtle cues which indicate that original conditions and assumptions have changed,” according to Eurocontrol’s Skybrary online aeronautical reference system. Skybrary cites several airline incidents and accidents in which Plan Continuation Bias has been a factor.

Aboard business aircraft, crews also may be biased toward continuing missions because of external factors, notably their passengers’ expectations. Such biases can be especially risky on positioning flights with no passengers aboard, as evidenced by the December 2005 crash of a Learjet 35A attempting to land at Truckee-Tahoe Airport in low-visibility conditions and the November 2004 crash of a Gulfstream GIII during an instrument approach to Houston-Hobby Airport, among several others.

Prospective Memory

This is a form of memory that enables people to remember to do something or pay attention to something at a future point in time, such as mentally noting a maintenance snag during a flight and later remembering to record it in the discrepancy log after landing.

Prospective Memory is susceptible to interruption by distractions, such as the immediate need to deal with an abnormality or emergency, or even to fly a challenging instrument approach in low-visibility conditions. Preoccupation with an immediate problem can result in forgetting high-priority tasks later in the flight, such as extending the landing gear on approach or initiating the final landing checklist.

Selective Perception

One’s frame of reference very much influences the kinds of information one is willing to accept as valid. We may not notice, and we readily forget, information that conflicts with our belief systems or stimuli that cause emotional discomfort or stress. This filtering process is known as Selective Perception, and it prejudices and perverts our objective fact-finding and decision-making processes. Closely related are Selective Attention, the bias that causes us to pay attention to certain information and ignore other stimuli, and Selective Retention, the bias that causes people to remember information that aligns more closely with their belief systems.

Selective Perception may result from an Availability Cascade of misinformation as noted earlier, causing us to discount or discard information relevant to objective decision-making. Our belief systems become tainted. We’re not aware of such bias, but we’re acutely sensitive to the Selective Perceptions of others who do not share our belief systems.

Countering Cognitive Biases

The first step in countering cognitive biases is to identify them, and accident reports provide plenty of events that can be analyzed. Malmquist notes that one or more cognitive biases are primary or secondary factors in most aircraft accidents. Among the biases typically implicated in flight crew errors are Anchoring Bias, Attentional Tunneling, Confirmation Bias and Plan Continuation Bias.

Startle effect compounds the adverse effects of cognitive biases. The initial emotional shock resulting from an unexpected anomaly can cause crews to fall back on old habits and perceptions as noted by France’s Bureau d’Enquêtes et d’Analyses (BEA) in its investigation of the Air France Flight 447 loss of control crash off the coast of Brazil in June 2009. Some 44 sec. after the aircraft’s pitot tubes iced up, causing the autopilot and autothrottles to disconnect, the pilot flying increased pitch to 11 deg. nose up in 10 sec. Then, when the aircraft initially entered the stall, the flight crew didn’t recognize that the aircraft had exceeded critical angle of attack. Less than 4 min. later, the aircraft hit the Atlantic Ocean with a vertical descent speed of nearly 11,000 fpm.

The BEA report notes that prior to the accident, the aircraft captain “appeared very unresponsive” to the concerns expressed by the pilot flying (PF) regarding weather hazards in the intertropical convergence zone. The PF noted that the OAT was considerably warmer than standard day, thus limiting available reserve thrust to climb higher than 35,000 ft. to top the weather.

The captain said he’d flown the route between South America and Paris several times and he preferred to wait and respond to turbulence if it were encountered. Could his decision have been influenced by Anchoring Bias, Plan Continuation Bias and/or Optimism Bias?

When all the pitot tubes became jammed with tiny ice crystals, causing the autopilot and autothrottles to disconnect, was the pilot flying’s decision-making influenced by Attentional Tunneling, Confirmation Bias and/or Selective Perception?

Dozens of other turbine aircraft accidents provide ample fodder for discussion of cognitive biases as causal factors. Once the relevant biases have been identified in such accidents, training scenarios can be developed that include anomalies designed to trigger that intuitive decision-making.

Classroom instruction about cognitive biases in the cockpit helps pilots acquire essential knowledge about the risks. Comprehensive line-oriented flight training (LOFT) simulator sessions help pilots acquire the skills to suppress intuitive cognition, build resistance to cognitive biases and replace intuitive shortcuts with logical, analytical cognition.

Incorporating startle factors when they’re least expected in LOFT sim sessions is essential. Without such “wild cards” in the sim to trigger cognitive biases, the training leaves pilots vulnerable to unexpected events, especially those involving subtle cues.

The struggle between intuitive and analytical cognitive processes is ongoing. Learning how to recognize and suppress cognitive biases in order to gain the complete mental image of what is occurring and take full advantage of all the data can be critical to safety of flight.

A problem-solving shortcut that’s “good enough for government work” may be easy and comfortable. But pilots need full-range analytical cognition, based upon the depth and breadth of available information, to ensure everyone aboard arrives safely back on the ground.

This article appears in the April 2016 issue of Business & Commercial Aviation with the title "Cockpit Cognitive Biases."