Reading time approx. 5 minutes
Good morning. Let’s start this Tuesday with a conversation we both need to have. It’s about how we think when the skies turn dark. Because, as always, “It is perfectly possible to be both rational and wrong.” Today’s tool for the Chief Behavioral Officer:
A hurricane is heading (y)our way
I’ve been thinking a lot about hurricanes lately—about the decisions people make when they know disaster is coming. It’s fascinating how our minds can trick us into either acting too late or not at all, even when the risks are as plain as the radar map. Here’s a story that I think brings this to life.
In September 2017, Hurricane Irma was rapidly approaching Florida, with news and warnings flooding every channel. Despite the relentless coverage, many people in my neighborhood seemed unfazed, going about their usual weekend routines—mowing lawns, stocking up on BBQ supplies, or relaxing on their porches as though nothing unusual was coming. Gas tanks were filled and extra water bottles grabbed, but the idea of evacuating seemed like an overreaction. After all, storms had hit before, and they had never caused much damage in this area. Was leaving really necessary?
The answer, in hindsight, was a resounding yes. The normalcy bias—the tendency to believe that things will remain the same as they’ve always been—was at play. It wasn’t that anyone was consciously ignoring the danger; it was more that the idea of a catastrophic event felt distant, almost impossible.
But then the storm hit. Roads flooded, roofs were torn off, and the seemingly calm decisions made just days earlier quickly turned into regrets. Only when trees snapped and floodwaters rose did it become clear how much the storm had been underestimated.
It’s a familiar story—knowing the facts, hearing the warnings, yet somehow not fully grasping the reality of what’s coming until it’s too late.
How does it work? Science, baby!
The human brain is wired to protect us, but sometimes it does the opposite. When it comes to natural disasters, three key cognitive biases often lead us astray: normalcy bias, optimism bias, and the availability heuristic.
Normalcy bias, as I experienced during Hurricane Irma, convinces us that things will stay as they are. It’s a kind of mental inertia—if you’ve never faced a devastating hurricane, you assume you won’t this time either. Your brain sticks to past experiences, giving you a false sense of security.
Then there’s optimism bias—the tendency to believe that while bad things happen, they’ll probably happen to someone else. You may hear about hurricanes devastating other towns, but you think your house will be fine. This bias lulls us into a false sense of immunity, making us believe that we won’t be the ones caught in the storm’s path.
Lastly, the availability heuristic plays tricks on our risk perception. If we’ve seen dramatic coverage of a recent hurricane, we may overestimate the likelihood of the next one being just as catastrophic. On the flip side, if we’ve gone a while without a major storm, we tend to think we’re in the clear—even when forecasts say otherwise.
Together, these biases make it incredibly hard for people to make rational, timely decisions when facing natural disasters.
Why is this important?
The psychology behind these biases shows that when we rely on past experiences or gut feelings, we often underestimate danger in life-threatening situations like hurricanes. Normalcy bias makes us think “this storm will be like the others.” Optimism bias lets us believe “it won’t happen to me.” And the availability heuristic makes us remember only what’s freshest in our minds, not necessarily what’s most relevant.
Dealing with these biases is crucial because it directly impacts our safety during disasters. Failing to prepare or evacuate doesn’t just affect individuals—it stresses entire communities, puts lives at risk, and hampers emergency responses. That’s why governments and institutions need to frame their warnings in ways that break through these biases. It’s not just about issuing a hurricane warning; it’s about ensuring people act on it. And for us, recognizing when our own brain might be downplaying a threat could save our lives.
And now?
So, how can we approach the next storm—or any disaster—with clearer minds? Here are a few things we can do to combat these biases:
Personalize the threat: When you hear a warning, think about your home, your street, your family. Don’t let the abstract nature of disaster reports trick you into believing it’s happening “somewhere else.”
Start small, act now: Prepare in bite-sized steps before the storm even hits. Create a checklist. Gather your supplies over time rather than waiting for panic to set in.
Look at what your neighbors are doing: Social proof is powerful. If others in your area are boarding up windows and evacuating, ask yourself if they might know something you don’t. It’s often easier to move when you’re not moving alone.
Don’t rely on memory: Just because last year’s hurricane didn’t do much damage doesn’t mean this one won’t. Every storm is different, and so is the risk.
Bottom line
When hurricanes approach, cognitive biases like normalcy bias, optimism bias, and the availability heuristic can distort our perception of risk.
Checklist for staying prepared:
Acknowledge the threat: Don’t assume it won’t happen to you.
Start acting early: Gather supplies and create an evacuation plan before the pressure mounts.
Trust the experts: Heed weather warnings and predictions—don’t rely solely on your past experiences.
Check in with your community: Social proof can reinforce good decision-making during crises.
Reassess constantly: As conditions change, so should your response. Be ready to adapt, even if it feels inconvenient.
Chief Behavioral Officer Wanted
Where are daily management decisions still based on the myth of the purely rational human? Where can you be a Chief Behavioral Officer this week?
See you next Tuesday.
If you’d like to send us tips or feedback, email us at redaktion@cbo.news. Thank you!