Behavioral Science Technology, Inc. (BST)

Identify Bias. Improve Safety.

Too much information arriving too quickly can crash even the most advanced computers. Luckily, our brains are equipped with a failsafe that guards against processing meltdowns. This failsafe is called cognitive bias, and it helps us make decisions based on past experience and knowledge.

Cognitive biases can be characterized as the tendency to make decisions and take action based on limited acquisition or flawed processing of information. To understand how this works, let's take a simple test: open a book at random and choose the first word you see. Is it more likely that the word will begin with the letter R, or that it will have an r as its third letter? Most people will respond that R is more likely to be the first letter.

Would it surprise you to know that there are three times as many words in the English language with r as the third letter as there are with r as the first? Simple math dictates that the random word we choose is more likely to have r as the third letter. But our brains take a shortcut, think of several words beginning with R, and reach a decision. Our brains have good intentions; they're trying to save us time. But the decision they come to is wrong.
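The comparison behind the letter-R example is easy to check mechanically: tally how often a letter occurs in each position across a word list. The sketch below uses a small, hand-picked sample list purely for illustration; reproducing the real English-language statistic would require a full dictionary.

```python
# Toy illustration of the "letter R" example: count how often a letter
# appears as the first versus the third letter across a word list.
# `sample` is a small illustrative list, not real corpus data.

def count_positions(words, letter):
    """Return (first_letter_count, third_letter_count) for `letter`."""
    first = sum(1 for w in words if len(w) >= 1 and w[0] == letter)
    third = sum(1 for w in words if len(w) >= 3 and w[2] == letter)
    return first, third

sample = ["road", "run", "car", "for", "word", "bird", "fort",
          "park", "hard", "barn", "rule"]

first, third = count_positions(sample, "r")
print(f"r first: {first}, r third: {third}")  # prints "r first: 3, r third: 8"
```

Even in this tiny sample, r-as-third-letter words outnumber r-as-first-letter words, yet the r-first words are the ones that come to mind quickly, which is exactly the shortcut the article describes.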

This type of error is NOT random or an accident. It follows a shape and a pattern, because humans tend to make inaccurate judgments about future probabilities in predictable ways. Some decisions are automatic, fast, and effortless. We don't think hard about brushing our teeth, tying our shoelaces, or walking across a room. We just perform these tasks 'computer-like.' Other decisions are slower, deliberate, and rational. Think of an effortless telephone conversation you recently had. You may have been walking leisurely along the street. But if a conversation becomes difficult or heated, you'll probably stop and stand still—it requires more effort to think and focus.

So how does this impact safety? Cognitive biases may even benefit us when performing low-risk everyday activities. In safety, however, cognitive bias can often lead to poor decision making.

Here are some common cognitive biases and the potential safety-specific impact they can have:

• Confirmation bias leads us to search for or interpret information that confirms our preconceptions. This can shut off our brain to alternative thoughts and probabilities. For example, if our statistics tell us we are 'safe,' confirmation bias ignores an increase in exposure.
• Recency bias is the tendency to rely on information that is the most high profile or recent, and therefore easiest to remember. This bias is common in high-pressure situations or when people are overconfident. For example, a recent incident could cause us to jump on the safety rollercoaster and call great progress into question, with a single data point making us doubt our success.
• The halo effect is when our overall impression of a person influences how we feel and think about their ideas. If someone has a history of good ideas, we may be tempted to automatically support their next idea—without necessarily applying appropriate rigor to our assessment. We judge the person, rather than the idea.