Why do good leaders sometimes make bad decisions? The answer has to do with the remarkably complex human brain. Every day, the typical person makes somewhere between 2,000 and 10,000 decisions. Unable to cope with all the detail of each choice, our brains use shortcuts called cognitive biases. Biases work by choosing based on what has worked before, thereby avoiding having to analyse the merits of an argument each and every time. While often helpful, cognitive biases can also lead to mistakes with consequences ranging from the harmless (wearing mismatched clothes) to the catastrophic (allowing high-potential exposures to persist to the point of failure).
If we can learn to spot our own biases, we can understand the impact they have on safety decision-making. More importantly, we can adopt approaches to minimize them and support better decision-making throughout the organization.
Overcoming cognitive bias
The cognitive bias effect can be characterized as the tendency to make decisions or take actions based on limited acquisition and/or flawed processing of information, self-interest, overconfidence, or attachment to past experience. Among the most common types of cognitive biases are confirmation bias (favoring information that confirms our preconceptions), in-group bias (the perception that your team alone has the right answers), recency bias (overweighting the most recent information at the expense of older, relevant data), and the halo effect (allowing overall impressions of people to influence unrelated decisions about them and their abilities).
Here are three steps you can take to help overcome biases in yourself and your team:
- Raise awareness. Biases are etched into our DNA. Knowing that they exist, and that they can distort our thinking, will help lessen their impact. Post short articles on noticeboards and in newsletters. Educate safety and steering committees on the top five biases without boring them with lengthy, academic descriptions. Present relevant, engaging scenarios that are likely to trigger biases, such as disregarding near misses ('nothing bad happened last time, so it won't happen next time'). The objective is to encourage healthy discussion.
- Encourage inquiry and dissenting voices. We want people to speak up about safety, and there are many ways to surface concerns (such as near-miss reporting, card systems, or sharing concerns with direct supervisors). This practice needs to be deeply rooted in team and company culture. For example, at the senior manager level, designate someone before a meeting to argue against the proposition being discussed. Even if they're in favor of the decision, they must play devil's advocate. This encourages people to proactively offer opposing views and challenge conventional wisdom. For colleagues everywhere, it's important to promote the practice of speaking up about safety concerns by ensuring their opinions aren't ignored or punished.
- Promote collaboration. It's easier to see biases in others than to see them in ourselves. Cooperation breaks down barriers and exposes entrenched views ('this is the way we do things around here'). What mechanisms do you have in place for sharing ideas and working on initiatives across departments? Could you adapt toolbox talks, safety meetings, or town hall meetings to enable colleagues to recognize the characteristics and dangers of cognitive biases?
Better decisions save lives
Our brains take shortcuts to help us through the day. It's human nature to make the predictable mistakes defined by cognitive biases. We can identify and minimize them, but we need better strategies than simply saying, 'I will change this.' It's the task of every safety leader to identify and understand their key 'thinking exposures.' Raising awareness about them will help us make better decisions. Start by sharing this article with a colleague. Do it now rather than tomorrow. Don't fall into the trap of putting it on a to-do list. After all, the longer we wait to address exposures created by cognitive biases, the longer our colleagues remain at risk.