
Meltdown by Chris Clearfield

Lars Christensen

I finished this book in February 2025. I recommend this book 8/10.


Why you should read this book:

This book will show you how we keep adding more and more complexity to our systems and processes, and how our safety systems fail. The book teaches what we can learn from plane crashes, oil spills, and dumb business decisions. It teaches leaders to pause, run pre-mortems, and harness the power of a diverse team.


Get your copy here.


🚀 The book in three sentences

  1. We are adding more complexity but not the systems to monitor it.

  2. We ignore the small warning signs, rush forward, and act dumb when we have power.

  3. Pause, seek out diverse perspectives, run a pre-mortem, play devil's advocate, and understand everyone's role.


📝 My notes and thoughts

  • P85. Charles Perrow once wrote that "safety systems are the biggest source of catastrophic failures in complex, tightly coupled systems." He was referring to nuclear power plants, chemical refineries, and airplanes. But he could have been analyzing the Oscars. Without the extra envelopes, the Oscars fiasco would never have happened.

  • P102. The trouble is that we're very bad at these forecasts. We draw the ranges too narrowly. As psychologists Don Moore and Uriel Haran put it, "Research on the types of forecasts finds that 90% confidence intervals, which, by definition, should hit the mark 9 out of 10 times, tend to include the correct answer less than 50% of the time." When we are 90 percent sure about a forecast, we are right less than half of the time. We feel very confident even though it's a toss-up. Likewise, when we are 99 percent confident, we end up being wrong much more often than 1 percent of the time. If you are 99 percent sure that the highest waves will be between 7 and 10 meters, you may be in for a nasty surprise.
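
A quick way to test this on your own forecasts, sketched below in Python; the scenarios, numbers, and variable names are my own illustration, not from the book. Log the 90% intervals you give, record what actually happened, and check how often the truth landed inside your range.

```python
# Minimal calibration check (illustrative data): if far fewer than 9 out of 10
# outcomes fall inside your "90% confident" intervals, your ranges are too narrow.

past_forecasts = [
    # (low, high, actual) -- hypothetical records of past 90% interval forecasts
    (7.0, 10.0, 14.0),   # highest wave height in meters
    (40.0, 60.0, 45.0),  # days until the project ships
    (2.0, 4.0, 5.5),     # hours the migration will take
]

hits = sum(low <= actual <= high for low, high, actual in past_forecasts)
hit_rate = hits / len(past_forecasts)

print(f"Claimed confidence: 90%  |  Actual hit rate: {hit_rate:.0%}")
```

If the hit rate comes in far below 90 percent, that's the overconfidence described above; the note at P243 below mentions the SPIES method as one structured fix.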

  • P120. Use a pre-mortem, as described in Leaders Read #72.

  • P137. But the crew of Flight 514 misunderstood the approach. Their mental model didn't match reality. In the years since the accident, researchers have learned a lot about how the brain deals with ambiguous situations. When there isn't enough information to resolve a question, we feel discordance, and the brain works quickly to fill in the gaps so that it can replace dissonance with harmony. In other words, it makes stuff up. It wasn't clear what the crew should do. They weren't flying along a normal course. The "dumb sheet" told them not to descend. But air traffic control had cleared them for the approach. To deal with this ambiguity, they invented a rule: "When he clears you, that means you can go to your initial approach altitude."

  • P143. We treat a toilet that occasionally clogs as a minor inconvenience rather than a warning sign—until it overflows. Or we ignore subtle warning signs about our cars—like rough gear shifting or a tire that slowly loses air—rather than taking the car into the shop. To manage complexity, we need to learn from the information our systems throw at us in the form of small errors, close calls, and other warning signs:

    • Gather information

    • Fix the issue

    • Find the root cause

    • Share the information

    • Audit and repeat

  • P159. In fact, having power is a bit like having brain damage. As Keltner put it, "People with power tend to behave like patients who have damaged their brain's orbitofrontal lobes," a condition that can cause insensitive and overly impulsive behavior. When we are in charge, we ignore the perspectives of others. This is a dangerous tendency because more authority does not necessarily equal better insights. A complex system might reveal clues that a failure looms, but those warning signs don't respect hierarchy. They often reveal themselves to folks on the ground rather than to higher-ups in the corner office.

  • P164. In his free time, Dr. Speers is a pilot and aviation enthusiast, and he's been on a mission to teach dentists safety lessons from the airline industry. The biggest lesson he learned from pilots was to get people lower down in the hierarchy to speak up and to get higher-ups to listen.

  • P173. You need to realize that most people are really worried—consciously or not—about offending authority and ruining social relationships. So, as a boss, it's not enough for you to create a generally pleasant environment and have an open-door policy. You need to be much more active than that. Don't wait for people to come to your office to speak up—go to theirs. If no one speaks up in a meeting, don't assume they all agree—actively ask for divergent viewpoints. And schedule frequent conversations where people can share ideas with you. That way, speaking up isn't extraordinary; it is a casual, routine thing.

  • P174. How to get people to speak up:

    • Charm school: Start by admitting you don't know everything.

    • Soften power cues: Pull up a chair or place yourself in the center of the cubicles.

    • Leaders speak last: Get everyone's opinion and ideas.

  • P194. Diversity works because it makes us question the consensus. What the heck is that? Why are we doing it? Can you run that by me one more time? Diversity is like a speed bump. It's a nuisance, but it snaps us out of our comfort zone and makes it hard to barrel ahead without thinking. It saves us from ourselves.

  • P215. A modern-day example of this approach is the Devil's Advocate Office at Aman, Israel's military intelligence agency. This special unit is made up of respected officers whose job is to criticize other departments' assessments and consider totally different assumptions. They entertain the possibility of worst-case scenarios and question the conventional wisdom of the defense establishment. Their memos go directly to all major decision-makers, sidestepping the agency's chain of command. "Creative" is usually not the first word that comes to mind when describing military intelligence analyses, but as a former division head at the agency put it, "The Devil's Advocate Office ensures that Aman's intelligence assessments are creative and do not fall prey to groupthink." The sportswriter Bill Simmons proposed something similar for sports teams. "I'm becoming more and more convinced that every professional sports team needs to hire a Vice President of Common Sense," Simmons wrote. "One catch: the VP of CS doesn't attend meetings, scout prospects, watch any film, or listen to any inside information or opinions; he lives the life of a common fan. They just bring him in when they're ready to make a big decision, lay everything out, and wait for his unbiased reaction."

  • P222. But Brian stopped and took his time to revise his plan. In a complex system, that's often the right thing to do. Pausing gives us a chance to understand what's going on and decide how to change course. But we often fail to pause even when we can. We press on even when our original plan no longer makes sense. Pilots call this get-there-itis. The formal name is plan continuation bias, and it's a common factor in airline accidents. And the closer we are to our goal, the stronger this bias becomes. Pilots might notice signs that they should abandon their plan and divert to another airport—the weather is getting worse, and there isn't much fuel left—but it's hard to stop when the destination airport is only fifteen minutes away. Get-there-itis affects all of us, not just pilots. We become so fixated on getting there—whether "there" is an airport or the end of a big project—that we can't stop even when the circumstances change.

  • P227. In contrast, the best teams found a balance. "They focused not only on coordinating the tasks but would also say things like, 'You know, can we step back for a second? Do you think there's something else going on? Let's check in on where we are.'"

    • Perform

    • Monitor

    • Suggest diagnosis

  • P231. They started having family meetings every Sunday night. And that changed everything. Each meeting starts with three questions:

    • What went well this week?

    • What things should be improved next week?

    • What will we commit to changing next week?

  • P238. Traders blamed Nasdaq for hundreds of millions of dollars of losses. Nasdaq itself, though legally prohibited from trading stock, ended up accidentally selling $125 million worth of Facebook shares. The mistake exposed Nasdaq to litigation, fines, and ridicule. SWAT officers trained with the sniper rifle to understand what snipers could see. And their trainer told them that they had to know something about everybody else's job. Nasdaq managers needed the same kind of training. They didn't need to be programmers, and they didn't need to be able to write the computer code for the validation check. But they did need to understand what it was—and why they shouldn't bypass it. SWAT: You expect a hallway, and there's a wall. Nasdaq: You expect trading, and there's a validation check. The SWAT team figured out how to go around the wall. Nasdaq managers tried to run through it.

  • P243. We can all run a pre-mortem, use predetermined criteria, and make predictions using the SPIES method. We can all use Perrow's matrix to figure out which part of our organization or project is most prone to a nasty surprise—and what we might do about it. And we can all do a better job of listening to skeptics and speaking up when something doesn't feel right. You don't need to be a CEO to make a difference. And many of these approaches work even in our personal lives when we're making decisions about where to live, what job to take, and how to work together as a family.
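
The SPIES method mentioned above (Subjective Probability Interval Estimates) works, as I understand it, by having you spread probabilities over bins that cover the whole range of possible outcomes instead of stating one interval directly; the interval is then read off that distribution. Below is a rough sketch of that idea in Python. The bin edges, probabilities, and function name are illustrative assumptions, not taken from the book.

```python
# Rough sketch of deriving an interval from SPIES-style bin judgments.
# All numbers are made up for illustration (wave heights in meters).

bins = [(0, 2), (2, 4), (4, 6), (6, 8), (8, 10), (10, 15)]   # exhaustive bins
probs = [0.05, 0.10, 0.20, 0.30, 0.20, 0.15]                  # must sum to 1.0

def interval_from_bins(bins, probs, confidence=0.90):
    """Return the central interval capturing `confidence` of the probability mass."""
    tail = (1.0 - confidence) / 2.0
    cumulative = 0.0
    lower = upper = None
    for (lo, hi), p in zip(bins, probs):
        before = cumulative
        cumulative += p
        # Linearly interpolate inside the bin where each tail boundary falls.
        if lower is None and cumulative >= tail:
            lower = lo + (hi - lo) * (tail - before) / p
        if upper is None and cumulative >= 1.0 - tail:
            upper = lo + (hi - lo) * ((1.0 - tail) - before) / p
    return lower, upper

low, high = interval_from_bins(bins, probs)
print(f"Derived 90% interval: {low:.1f} to {high:.1f} meters")
```

With these made-up numbers, the derived 90% interval comes out around 2 to 13 meters, noticeably wider than the overconfident 7-to-10-meter guess from the P102 note.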
