If you are like most pilots, you probably think you have taken more than enough check rides to last a lifetime, certainly enough for a career. For many of us, a check ride means engine failures, cabin fires, rapid depressurizations and other catastrophes, followed by a landing in the crummiest weather. We endure all of this while someone with the power to threaten our paychecks watches our every move, waiting for the slightest mistake. And some of us do this every six months. But this scrutiny of our “performance” follows a tightly controlled script, checking that we can accomplish a specific list of tasks that have little in common with what we do day to day in our real, non-simulated airplanes. These check rides often miss things not on the script, things the screenwriter never thought to put on the page.
If you are in a small flight department, say, fewer than 10 pilots, I think you are at much higher risk for three hazards that you may not recognize until it’s too late:
(1) Lacking required skills and knowledge.
(2) Becoming complacent by taking shortcuts around standard operating procedures (SOPs).
(3) Becoming intentionally noncompliant with SOPs.
Higher risk than who? Well, certainly at higher risk than pilots flying for a large airline. The airlines and other large cadres of professional pilots have several advantages over those of us in small flight departments. (I am including myself; we have four pilots, two mechanics, a dispatcher and a line technician.) These larger organizations have dedicated training and standardization departments that devote themselves to making sure you are an SOP maven. The pilots in those standardization departments do not normally fly the line with you, so they have no vested interest in staying on good terms with you. You either follow SOPs, or your longevity with the company is in peril.
If you think you are not at risk, it may be because you’ve already succumbed to complacency, one of the three hazards. It goes like this: One pilot finds a shortcut that violates SOPs but is convinced his or her way is better. Nobody else objects because they want to get along. Pretty soon, everyone is taking the same shortcut. After a while, SOPs start to mean very little. You have normalized deviance, perhaps without even realizing it.
This sounds bad, but there is an easy solution. As much as we tend to hate check rides, orals and written exams, there is something else we hate: looking bad in front of our professional pilot peers. And that is the key to avoiding the three hazards: Submit yourself to regular peer reviews.
What Is a ‘Peer Review’?
What is a peer review? First, what it isn't: It isn't a check ride. The observer has no legal authority and your license is not at risk.
Another thing it isn’t is a Line Oriented Observation. Some management companies conduct these observations, emphasizing they are merely "how are you doing" rides and certainly not check rides. I did many of these for a few management companies and can tell you they were always perceived to be check rides because every now and then pilots were fired--for good reason, of course, but fired nonetheless. A peer review does not carry that risk, since it is being done by a peer with no standing in the company.
A peer review is nothing more than the opportunity to have a peer--another professional pilot--observe your crew in action. The reviewer is a guest of yours whom you've asked to watch and provide feedback that will help you and your team to become better. You may think you already have this opportunity during regular simulator check rides. But those are artificial environments where the observer has a tightly structured "to-do list" and no time for anything else. Besides, the simulator observer is probably not a true peer, not someone who does what you do for a living.
I've already mentioned that small flight departments have a special vulnerability to three hazards because there is so little oversight. I think we can mitigate those hazards with peer reviews. Let's take a look at these hazards, an example accident case study for each, and how a peer review could have saved the day.
The Hazards of Being ‘Peerless’
Hazard 1: Lacking required skills and knowledge.
We've all seen pilots out of their depth at recurrents and wondered how they got to where they are without some kind of quality control step along the way. They didn't know what they needed to know and didn't have the skills required.
On May 15, 2017, a couple of Learjet pilots with a long history of performance deficiencies scared their passengers so badly during a landing at Philadelphia that the passengers decided to rent a car and drive to their next stop, Teterboro, rather than fly. The pilots repositioned empty and crashed a mile short of their intended runway. It was a challenging day at Teterboro, with gusty winds and the need to break out from an ILS and circle to an adjacent runway. These pilots didn't know the difference between an IFR circling approach, flown within protected circling radii at minimum descent altitude, and a visual maneuver from one runway's approach course to align with another. This misunderstanding is just the tip of the iceberg of the knowledge and skills these pilots lacked, and yet they somehow became qualified to fly their jet with paying passengers.
Of course, these pilots didn't set out in their Learjet with the objective of doing something they were ill-prepared to do. I think that had they undergone a peer review before finding themselves negotiating gusty winds at an airport that is demanding even on a calm day, they might have gotten a wake-up call. At the very least, the peer could have sat down with them, told them they were at risk and talked some sense into them.
Keep in mind that a lack of knowledge and skills isn’t limited to marginally qualified pilots; it also happens to good pilots who fail to keep up with changes to their aircraft or the industry. Think of how lost an international procedures expert from just 10 years ago would be in today’s North Atlantic High Level Airspace. If you don't keep up, you fall behind.
I've seen this happen many times over the years. I once gave a line observation to a pair of Falcon pilots who didn't understand that their takeoff minimums were different at most Canadian airports. I politely pulled out the appropriate minimums chart before they fired up their aircraft's APU, hours before the visibility would be sufficient for takeoff. I might have saved them a violation, or perhaps just some embarrassment. But I hope I gave them reason to better prepare themselves for flights outside their home country.
Hazard 2: Becoming complacent by taking shortcuts around SOPs. Even very good pilots can be tempted by SOP shortcuts that seem to get the job done more quickly, with less fuss and less effort. The problem with taking that shortcut "just this one time" is that success with the shortcut encourages repeat performances. Pretty soon the shortcut becomes the new SOP.
On Nov. 11, 2007, a pair of highly experienced Bombardier Challenger 604 pilots flew their brand-new Global 5000, C-GXPR, into a small airport with a short, 4,885-ft. runway. On paper the runway was more than long enough, and the captain had flown there 75 times in a Challenger 604. He had apparently gotten into the habit of aiming to touch down in the first 500 ft. He did this once in the Global, a month before the accident flight, and his technique resulted in a touchdown just 200 ft. beyond the threshold. On the day of the accident, the wheels touched down 7 ft. short of the runway, striking the runway lip, collapsing the landing gear and damaging the aircraft beyond repair.
These pilots apparently did not understand that the geometry of their new airplane meant the main landing gear would touch down hundreds of feet short of their aim point. A peer review from a more experienced Global or Gulfstream pilot could have saved the day. Their peer might have picked up on their tendency to "duck under" and then explained how much harder that is to get away with in a larger jet.
I saw this very tendency in another Global a few years before this accident. Like the accident crew, these pilots had upgraded from the Challenger 604 and simply assumed that, even without a flare, the wheels hit only 50 ft. behind the cockpit. After I drew them a few stick-figure diagrams, they realized just how important a 1,000-ft. aim point is.
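To put rough numbers on those stick figures: on a constant glidepath with no flare, the main wheels touch down short of the point the pilot's eyes are aimed at by roughly the eye-to-wheel height divided by the tangent of the glidepath angle. The sketch below only illustrates that geometry; the eye-to-wheel heights are hypothetical round numbers, not published figures for the Challenger 604 or the Global 5000.

```python
import math

# Illustration only: the eye-to-wheel heights used here are hypothetical
# round numbers, not published figures for any particular aircraft.
def gear_shortfall_ft(eye_to_wheel_height_ft: float, glidepath_deg: float = 3.0) -> float:
    """Distance the main gear touches down short of the pilot's visual aim
    point when the airplane is flown down a constant glidepath with no flare."""
    return eye_to_wheel_height_ft / math.tan(math.radians(glidepath_deg))

# A mid-size jet whose wheels track roughly 10 ft. below the pilot's eye path:
print(round(gear_shortfall_ft(10)))   # ~191 ft. short of the aim point

# A longer-bodied jet whose wheels track roughly 20 ft. below the eye path:
print(round(gear_shortfall_ft(20)))   # ~382 ft. short of the aim point
```

With numbers anything like these, a 500-ft. aim point leaves little or no margin for the wheels in the bigger airplane, while a 1,000-ft. aim point keeps them comfortably on the pavement.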
Hazard 3: Becoming intentionally noncompliant with SOPs. The natural progression of complacency, if left unchecked, is willful and intentional noncompliance with SOPs. While simple complacency is insidious and its victims can be thought of as unfortunate innocents, I think intentionally noncompliant pilots are different. They are willing perpetrators.
On May 31, 2014, two highly experienced Gulfstream GIV pilots killed themselves and five others by attempting to take off with their gust lock engaged and then trying to disengage it at a speed that made doing so impossible. Subsequent investigations into the crash revealed that these pilots had a pattern of skipping the checklists, callouts and flight control checks that would have caught the forgotten gust lock. They managed to fool their evaluators at regular recurrents by flying one way for those charged with checking them and another way during actual operations. Contract pilots who flew with them noticed many of their transgressions but, as contract pilots, had a vested interest in keeping quiet.
A peer review could have saved the day had someone they knew and respected noticed their procedural noncompliance and let them know just how reckless they were. "You guys are courting disaster," the talk could have begun. "You might be good, but nobody is immune from making mistakes. Your behavior might kill someone." And, of course, it did.
Behavior this flagrant is more common than we might think. Since that accident, I've watched several Gulfstream operators with similar gust-lock systems start up and take off without a flight control check. Over the years of giving check rides, line observations and peer reviews, the most frequent examples of willful noncompliance I've noticed involve oxygen use, followed by sterile cockpit rules, headphone usage and inadvisable automation procedures. Of course, all of these are fairly common occurrences, and we usually get away with them. As the reviewer, I try to diplomatically point out the advantages of compliance and hope my words are taken to heart.
Editor's Note: The next two parts of this article are "How to Conduct a Peer Review" and "Peer Reviews: Real-World Results."