A well-run business aviation operation has data collection programs such as flight data monitoring, confidential incident reporting systems and irregularity reports. However, just having the programs is not enough. They must be used properly to accomplish their intended purpose of helping to identify problems before they become bigger problems. With that premise, consider this: Is information that could predict and prevent your next accident hiding in your data? Are you sure?
The International Civil Aviation Organization defines safety data as “a defined set of facts or set of safety values collected from various aviation-related sources which is used to maintain or improve safety.”
In late January, the NTSB concluded its investigation of last year’s midair collision near Washington. As part of that investigation, the board found that in the 39 months leading up to the accident, the airspace surrounding Ronald Reagan Washington National Airport (DCA) had 15,214 cases where commercial aircraft and helicopters had lateral separation of 1 nm or less and vertical separation of 400 ft. or less. Of those, 85 events involved lateral separation less than 1,500 ft. and vertical separation less than 200 ft.
“The FAA lacked effective strategies to identify, assess and reduce recurring midair collision hazards in the skies around Reagan National,” the NTSB found. “Despite available safety data showing repeated close encounters between helicopters and airplanes near the airport, the FAA did not conduct sufficient safety analysis or take timely corrective action.”
The data was there, but warning signs were somehow missed. Part of the problem was that the data was contained in disparate sources. One source of FAA data was the Aviation Risk Identification and Assessment (ARIA). In its retrospective review, the NTSB found that ARIA contained an average of 390 encounters per month where helicopters and aircraft were within 1 nm laterally and 400 ft. vertically of each other. Also within the FAA, the Performance Data and Analysis Reporting System revealed 65.6 encounters per month where aircraft and helicopters were too close to each other. Other sources included NASA’s Aviation Safety Reporting System and the Aviation Safety Information Analysis and Sharing (ASIAS) program, which at the time was maintained by the Mitre Corp.
Granted, it is easier to find something once you know what you are looking for, and for this analysis, the NTSB had a tragedy on which to base its various data searches. The holy grail, of course, is to use data to predict and prevent your next accident or serious incident proactively. Think of safety data as the lifeblood of an effective Safety Management System (SMS). You take operational data—incidents, hazard reports, accidents, near collisions—and feed it back into your SMS to identify hazards and perform risk assessments, and to determine the effectiveness of previously implemented risk controls.
A National Business Aviation Association (NBAA) one-pager titled “NBAA Safety Data Collection, Analysis and Sharing Quick Facts” lists some benefits of an effective safety data program, which include proactively identifying risks and safety measures and allowing for continuous safety improvements. The document also notes the possibility of reduced insurance premiums.
I have heard of organizations taking the ostrich approach—believing that what they do not know will not hurt them. The NBAA solidly refutes that attitude: “Not having [a safety data] program may result in increased liability after an accident or incident, because failing to participate in these programs can be used as evidence that the operator is not participating in efforts to enhance safety.”
There are challenges with collecting the right kinds of data and then turning it into meaningful and actionable information. Just because you collect data does not mean it will yield useful information. Be on guard for “feel-good” metrics like on-time performance and dispatch reliability—measures that may not necessarily indicate true problems. In fact, they can give a false sense of confidence.
I remember one business aviation flight department proudly touting that one of its safety metrics was tallying the number of safety meetings held. Being the skeptic that I am, I questioned how the number of safety meetings correlated with how safe they were. They gave me a blank stare.
Do not get me wrong—I am all in favor of safety meetings. However, is there a correlation between the number of safety meetings and actual safety of the operation? I suspect it depends on what is discussed in those meetings. If the time is spent poring over recent events and analyzing trends, it is likely a useful indicator. However, just tallying the number of safety meetings without context may be a fool’s errand.
Choosing the right metrics is critical. When I was at the NTSB, we saw cases where large transportation organizations collected what they believed were solid safety metrics. One was a large subway operator. It collected information on crime in subway stations and parking lots, escalator injuries and slips, trips and falls. Yes, all of these are important to keep the riding public safe, but they had nothing to do with predicting and preventing the type of rail flaw that led to two trains colliding and claiming multiple lives.
In another case, the nation’s second largest commuter railroad used on-time performance as a primary indicator of how well the operation was performing. It did not work—we investigated five accidents with that railroad that occurred within an 11-month period.
If you feel data-rich and information-poor, do not feel left out. Some organizations go to great lengths to collect all kinds of data, thinking they can make something useful out of it. I hope they can, but many organizations are inundated with terabytes of data and end up overwhelmed. I had a recent conversation with a safety manager who sheepishly admitted, “We just don’t know what to do with it.” I assured him that he was not alone.
Another challenge is analyzing data and turning it into useful information—information that you can act on to prevent your next serious incident or accident. I suspect this may be the most difficult component of any safety data program. Consider this NTSB finding from the DCA midair: “The [FAA] was made aware of and had multiple opportunities to identify the risk of a midair collision between airplanes and helicopters at Ronald Reagan Washington National Airport; however, their data analysis, safety assurance and risk assessment processes failed to recognize and mitigate that risk.”
I will admit—I am sick of hearing about AI. But there truly is promise for how data science, including AI, can help with data analysis. “Fundamentally, data science is about extracting meaning from data,” says my colleague, Dr. Kristy Kiernan, associate professor of aeronautical science at Embry-Riddle. Natural language processing, for example, can go through safety reports to spot issues, and it can do so more easily and quickly than a human. It also can compile textual narratives, such as those contained in a written safety report, and turn them into data for statistical analyses. “AI-enabled data fusion will allow us to get a more complete picture of hazard and risk,” Kiernan explains. “Right now, integrating those sources is laborious and time-consuming. But if we had the right tools to combine disparate datasets, we could get safety insights back to the enterprise quickly and at scale.”
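To make the text-mining idea concrete, here is a minimal sketch in Python using only the standard library. The report narratives are invented for illustration, and a production system would use far more sophisticated natural language processing than simple term counting—but even this crude approach shows how recurring terms across reports can surface a cluster of similar events worth deeper analysis:

```python
from collections import Counter
import re

# Hypothetical safety report narratives, invented for illustration
reports = [
    "TCAS RA received on final; helicopter traffic crossing below",
    "Close encounter with helicopter traffic during visual approach",
    "Runway incursion by vehicle; go-around performed",
    "Helicopter crossed final approach path, late traffic call",
]

STOPWORDS = {"on", "by", "with", "during", "the", "a", "an", "of"}

def hazard_term_counts(narratives):
    """Count how many reports mention each term.

    Counting each term once per report (via set) keeps a single
    verbose narrative from dominating the tally.
    """
    counts = Counter()
    for text in narratives:
        words = set(re.findall(r"[a-z]+", text.lower()))
        counts.update(w for w in words if w not in STOPWORDS)
    return counts

counts = hazard_term_counts(reports)
# Terms appearing in three or more reports hint at a recurring hazard
recurring = sorted(term for term, n in counts.items() if n >= 3)
print(recurring)  # ['helicopter', 'traffic']
```

In this toy example, “helicopter” and “traffic” recur across reports—exactly the kind of repeated signal that, at scale and with real NLP, could flag an emerging hazard before it produces an accident.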
More development of these tools is needed, but there is promise. Kiernan told me that future applications of AI tools should have the ability to identify patterns of hazard and risk, which can then be used to monitor events in real time to alert when circumstances are veering toward the edge of the safety envelope.
Meanwhile, listen to what your data is trying to tell you. And, as much as this column is about having solid data, do not overlook the importance of a strong hunch or gut instinct. After the Columbia space shuttle accident, it was noted that NASA was such an engineering-oriented organization that it discounted information unsupported by data. The fleeting voice in the back of someone’s head may be just what it takes to prevent an accident.
Safety data has the potential to reveal issues we have not even begun to think about. Are you using data to make sure you are detecting and preventing your next accident?
Robert L. Sumwalt is executive director for the Boeing Center for Aviation and Aerospace Safety at Embry-Riddle Aeronautical University. He was a member of the NTSB from 2006-21, including serving as chairman from 2017-21. Before that, he managed a corporate flight department for a Fortune 500 company, and previously was a pilot for US Airways.




