Wiener’s Laws
As far back as 1980, renowned aviation human factors guru Earl Wiener (pictured sitting in the captain's seat of a Northwest Airlines Boeing 757 circa 1992) was asking the question on everyone's mind after the tragic crash of Asiana Flight 214 in San Francisco earlier this month: Has automation gone too far?
Had he been around and of sound mind, Wiener would surely have weighed in. However, he passed away June 14 at the age of 80 after a long bout with Parkinson's disease.
Along with his family, his books and scholarly papers, and a new generation of human factors professionals, Wiener left us Wiener's Laws: 15 jewels of wisdom that will keep giving for decades to come, because human nature, and hence human error, is not changing all that rapidly.
The laws were sent to me by former Wiener student and co-worker Asaf Degani, Ph.D., now a technical fellow at General Motors. “Some are funny and some are dead serious,” says Degani.
I have no explanation for why Laws 1-16 are "intentionally left blank"...
Which one is your favorite?
WIENER’S LAWS
(Note: Nos. 1-16 intentionally left blank)
17. Every device creates its own opportunity for human error.
18. Exotic devices create exotic problems.
19. Digital devices tune out small errors while creating opportunities for large errors.
20. Complacency? Don’t worry about it.
21. In aviation, there is no problem so great or so complex that it cannot be blamed on the pilot.
22. There is no simple solution out there waiting to be discovered, so don’t waste your time searching for it.
23. Invention is the mother of necessity.
24. If at first you don’t succeed… try a new system or a different approach.
25. Some problems have no solution. If you encounter one of these, you can always convene a committee to revise some checklist.
26. In God we trust. Everything else must be brought into your scan.
27. It takes an airplane to bring out the worst in a pilot.
28. Any pilot who can be replaced by a computer should be.
29. Whenever you solve a problem you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
30. You can never be too rich or too thin (Duchess of Windsor) or too careful what you put into a digital flight guidance system (Wiener).
31. Today’s nifty, voluntary system is tomorrow’s F.A.R.
Degani, sitting in the co-pilot's seat in the picture, and several other acquaintances of Wiener reached out to me when they learned that I would be writing an obituary for Earl. A short version ran in the July 22, 2013, issue of Aviation Week & Space Technology, which was fitting, as Earl often contributed to our stories about human factors and automation.
“I was Earl's student for several years and afterwards we collaborated on several NASA research projects, primarily on aviation safety and checklist design,” says Degani. “Earl was a wonderful mentor and a forward-looking researcher, not to mention just being a very special and generous human being. His research work on vigilance and automation pioneered the area of human-automation interaction.”

Then there was the human side of Wiener. “He had a great fondness for unicycles, and he made sure all his graduate students were unicycle qualified,” says Degani. “It was a kind of unofficial requirement for graduation.” Wiener's family says that on his 40th birthday he rode his unicycle across the Golden Gate Bridge.
Deborah A. Boehm-Davis, now Dean of the College of Humanities and Social Sciences at George Mason University, had Earl as an office mate for one year as part of a NASA project after she finished graduate school. It left an impression. “He was a great mentor to me in the human factors profession,” she says.
Alan Price, a retired Delta Airlines captain, says he is especially fond of Law #29. “It’s the law of unintended consequences which I refer to constantly when speaking,” he says. “Every time you solve a problem you create a new problem. The best you can hope for is that the problem you solved is of greater magnitude than the one you created."
Price managed Delta's captains' leadership program, called In Command, from 1996 to 2001, which drew heavily on Wiener's work. “In March of 2001 we hosted the first-ever Captain's Leadership Symposium at Delta, a three-day program on left-seat leadership with participants from airlines, academia, and government from around the world. Earl was one of our keynote speakers, and we took the occasion to present him with the first-ever Dr. Earl Wiener Lifetime Achievement Award for his contributions to aerospace safety. We have presented the award only twice since, to Capt. Al Haynes and to Dr. Bob Helmreich.”
Wiener began researching in the early 1980s what happens when humans and computers attempt to coexist on a flight deck. Though his “day job” was professor of management science at the University of Miami, Wiener is widely known for embedding in the jump seats of his airline pilot subjects as part of research projects funded by the NASA Ames Research Center. Wiener would continue performing NASA human factors work for more than two decades. “Earl was an ongoing grantee,” says a NASA co-worker from that time. “He would publish a paper and 25 people would write their master's theses or doctoral dissertations on the topic.”
In a 1980 paper he co-wrote with NASA's Renwick Curry, “Flight-deck automation: promises and problems,” Wiener wrote, “It is highly questionable whether total system safety is always enhanced by allocating functions to automatic devices rather than human operators, and there is some reason to believe that flight-deck automation may have already passed its optimum point.” Compilations of scholarly papers by Wiener and his colleagues resulted in two key human factors books, one of which – Human Factors in Aviation – remains in print today, albeit as a new edition with new editors.