You Don’t Understand Murphy’s Law: The Importance of Defensive Design

CALLBACK is the monthly newsletter of NASA’s Aviation Safety Reporting System (ASRS)[1]. Each edition features excerpts from real, first-person safety reports submitted to the system. Most of the reports come from pilots, many from air traffic controllers, and the occasional one from a maintainer, ground crew member, or flight attendant. Human factors concerns feature heavily, and the newsletters provide insight into current safety concerns[2]. ASRS gets five to nine thousand reports each month, so there’s plenty of content for the CALLBACK team to mine.

The February 2022 issue contained this report about swapped buttons:

A Confusing Communication Interface.
An Aviation Maintenance Technician (AMT) described this incorrect interface configuration noted by a B777 Captain. It had already generated multiple operational errors. 

The Captain reported that the Controller Pilot Data Link Communications (CPDLC) ACCEPT and REJECT buttons were switched.... This caused 2 occasions of erroneous reject responses being sent to ATC. On arrival, the switches were confirmed [to be] in the wrong place (Illustrated Parts Catalog (IPC) 31-10-51-02), and [they were] switched back (Standard Wiring Practices Manual (SWPM) 20-84-13) [to their correct locations].... These switches can be inadvertently transposed.

This reminded me of the story of Capt. Edward Aloysius Murphy Jr., the very individual for whom Murphy’s Law is named. It’s a great story, uncovered by documentarian Nick Spark, whose work resulted in the key players receiving the 2003 Ig Nobel Prize[3] in Engineering.

Murphy’s Law

You’ve probably heard Murphy’s Law stated as:

Anything that can go wrong, will go wrong.

That’s not incorrect, per se; in fact, it’s a useful generalization. The problem is that it is often misinterpreted. When something goes wrong, Murphy will be invoked with an air of inevitability: of course [whatever improbable event] would happen, it’s Murphy’s Law!

You, dear reader and astute systems thinker, may have already spotted the issue. If anything that can go wrong will, why not take steps to preclude that possibility, or at least mitigate its impact, or at the very least consciously accept the risk?

The story of Murphy’s Law starts with some of the most important, foundational research in airplane and automotive crash safety. I will summarize the program, but there is no way I can do it justice. I’d highly recommend this article by Nick Spark, or the video below by YouTube sensation The History Guy.

Rocket sleds

Physician and US Air Force officer John Paul Stapp was a pioneer in researching the effects of acceleration and deceleration forces on humans. This work was done using a rocket-powered sled called the Gee Whiz[4] at Edwards Air Force Base. A later version called Sonic Wind was even faster, capable of going Mach 1.7[5].

There’s a long history of doctors and scientists experimenting on themselves. Doctor Barry Marshall drank a culture of the bacterium Helicobacter pylori in order to prove that ulcers were not caused by stress and could be treated with antibiotics, winning a Nobel Prize for the work. Doctors Nicholas Senn and Jean-Louis-Marc Alibert each showed that cancer was not contagious by injecting or implanting it into themselves. Of course, self-experimentation is not without risk. Dr. William Stark died from scurvy while deliberately malnourishing himself to research the disease. Stapp was cut from the same cloth, subjecting himself to 29 rocket sled tests, including some of the most severe and uncertain of the setups.

[Image: a human subject strapped into a chair atop a rocket sled on railroad tracks]
The Gee Whiz with human subject.
NASA/Edwards AFB History Office, and pilfered from the Annals of Improbable Research

And they were quite severe. Test subjects routinely experienced forces up to 40g. The peak force experienced by a human during these tests was a momentary 82.6g, which is insane. By comparison, manned spacecraft experience 3-4g and fighter pilots about 9g. People lose consciousness under sustained forces of 4-14g, depending on training, fitness, and whether they’re wearing a g-suit.

Stapp and his fellow subjects suffered tunnel vision, disorientation, loss of consciousness, “red outs” due to burst capillaries in the eyes, and black eyes due to burst capillaries around the eyes. They lost tooth fillings, were bruised and concussed, cracked ribs, and broke collarbones. Stapp broke his wrist twice during tests; one of those times he simply set it himself before heading back to the office. The team, particularly project manager George Nichols, was legitimately worried about killing test subjects as they were accelerated faster than bullets and close to the speed of sound[6].

[Image: six frames of the subject’s face, clearly in discomfort and experiencing high winds]
Stapp during a test on Sonic Wind.
NASA/Edwards AFB History Office, and pilfered from someone posting it on Reddit

All of this effort was designed to understand the forces humans could withstand. It had been thought that humans were not capable of surviving more than 18g, so airplane seats weren’t designed to withstand any more than that. Stapp thought, correctly, that aviators were dying in crashes not because of the forces experienced but because their planes didn’t have the structural integrity to protect them. The work led to major advances in survivability in military aviation.

Stapp then applied his expertise to automotive research, using the techniques he’d developed to create the first crash tests and crash test dummies. He also advocated for, tested, and helped to perfect automotive seatbelts, saving millions of lives. A non-profit association honors his legacy with an annual conference on car crash survivability. We all really owe a debt of gratitude to Dr. Stapp and this program.

Murphy

So, where does Murphy fit into all of this? Well, during the program there was some question about the accuracy of the accelerometers being used to measure g-forces. Another Air Force officer, Edward Aloysius Murphy Jr., had developed strain transducers to provide this instrumentation for his own work with centrifuges. He was happy to provide these devices to the rocket sled program and sent them down to Edwards Air Force Base with instructions for installing and using them.

[Image: black and white yearbook photo of a man in military uniform]
Murphy as a college student.
U.S. Military Academy, West Point

The gauges were installed, Stapp was strapped in, and the test conducted. The engineers eagerly pulled the data from the devices and found… nothing. Confused, the team called Murphy to ask for help, and he flew out to Edwards AFB to see for himself. Upon investigation, he found that each of the transducers had been meticulously installed backwards, resulting in no data being recorded. He blamed himself for not considering that possibility when writing the instructions and in frustration said:

If there’s more than one way to do a job, and one of those ways will end in disaster, then somebody will do it that way.

Stapp then popularized the phrase, stating in a press conference that “We do all of our work in consideration of Murphy’s Law… If anything can go wrong, it will. We force ourselves to think through all possible things that could go wrong before doing a test and act to counter them.” With human subjects in highly risky experiments, the team put the utmost care into the design of the system and the preparation of each test event. By assuming that anything that can go wrong will indeed eventually go wrong, the team would put in the effort to minimize the number of things that could go wrong and thus maximize safety.

Conclusion

Murphy’s Law isn’t about putting your fate in the hands of the universe; it’s about defensive[7] and robust design. Reliability engineering and the technique of Failure Modes, Effects, and Criticality Analysis (FMECA) have their roots in this concept.
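
To make that concrete, here is a minimal, hypothetical sketch of the FMECA idea in Python: enumerate the ways each component could fail, score severity and likelihood, and rank by a simple criticality score so design effort goes to the worst offenders first. The components, failure modes, and 1-10 scales below are invented for illustration, not taken from any real analysis.

    # Toy FMECA-style criticality ranking. Entries and scales are illustrative only.
    failure_modes = [
        # (component, failure mode, severity 1-10, likelihood 1-10)
        ("strain transducer", "installed backwards, records no data", 6, 7),
        ("sled brakes", "fail to engage at end of track", 10, 2),
        ("restraint harness", "buckle releases under load", 10, 3),
    ]

    # Rank by severity times likelihood, worst first, to prioritize mitigations.
    ranked = sorted(failure_modes, key=lambda fm: fm[2] * fm[3], reverse=True)

    for component, mode, severity, likelihood in ranked:
        print(f"{severity * likelihood:3d}  {component}: {mode}")

A real FMECA uses agreed-upon rating scales and traces each failure mode to its system-level effect, but the core move is the same: assume each of these things will eventually go wrong, and decide in advance what you’ll do about it.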

It’s why we have polarized outlets and safety interlocks. It’s why any critical component should be installable in only one way, the correct way: the ASRS database is filled with hundreds, if not thousands, of reports like the one at the top of this story, of parts that fit into place but are installed incorrectly[8].
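
The same principle shows up in software as defensive API design: make the wrong “installation” impossible to express. Here is a hypothetical Python sketch (the names and messages are invented for this example, not any real avionics interface): by giving accept and reject messages distinct types, a reject can’t be wired in where an accept is expected, and a static type checker such as mypy flags the mistake before the code ever runs.

    from dataclasses import dataclass

    # Hypothetical message types, invented for illustration; not a real avionics API.
    @dataclass(frozen=True)
    class Accept:
        flight_id: str

    @dataclass(frozen=True)
    class Reject:
        flight_id: str
        reason: str

    def send_acceptance(msg: Accept) -> None:
        # Only an Accept belongs here; passing a Reject is a type error that a
        # static checker reports long before anything reaches the radio.
        print(f"ACCEPT sent for flight {msg.flight_id}")

    send_acceptance(Accept(flight_id="ABC123"))
    # send_acceptance(Reject(flight_id="ABC123", reason="unable"))  # flagged by the type checker

It’s the software equivalent of a keyed connector: the wrong part simply doesn’t fit.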

I relied heavily on the work of Nick Spark for this article. He tells the story much better than I do in A History of Murphy’s Law, which one reviewer compares favorably to The Right Stuff.

How have you seen defensive design practiced? Has Murphy’s law impacted your engineering approach? Share your thoughts in the comments.


Footnotes:

  1. ASRS is an FAA-funded service independently administered by NASA. Its purpose is to collect voluntary reports from the aviation community regarding potential safety concerns. A key feature is that submitting a report demonstrates a “constructive attitude” and so reporters are given immunity from FAA penalties in certain circumstances if they proactively file a report, encouraging users to own up to potential mistakes and strengthening the aviation community as a whole. I love this concept and would like to see it applied to other high-risk industries. There’s a free idea for you… though you can always hit me up if you need an implementation consultant.
  2. There’s also a fantastic database search function if you’re interested in performing your own analysis.
  3. “For achievements that first make people LAUGH then make them THINK.”
  4. Because it goes fast and pulls high g-forces.
  5. Though human subjects were not used at those speeds; see footnote #6 later.
  6. The fastest human trial was Mach 0.9, by Stapp himself on December 10, 1954. The braking force was equivalent to hitting a brick wall at 120 MPH, and it left Stapp temporarily blinded, his eyeballs filled with blood. In an interview 50 years later you can hear the distress and emotion in Nichols’s voice as he retells the story. And Stapp wanted to push it even further to try to break the sound barrier! Air Force leadership rejected any further pushing of the limits with human subjects.
  7. The term “defensive design” came into use in the 1970s to describe civil engineering and architectural safety features. If you search for “defensive design” you’ll get many results about hostile architecture; it seems to me like someone is trying to rebrand hostile architecture with a more positive name, but I assure you that hostile architecture has nothing to do with defensive design.
  8. Not to mention the NTSB reports of fatal crashes caused by these issues.