Cognitive biases shape our everyday experiences and influence our decision-making. But one bias, in particular, can be extremely dangerous – the outcome bias.

What is the outcome bias?

The outcome bias can make you focus on the end result and ignore the risks along the way. When we already know the outcome of a specific task, we can become blind to the dangers that might arise during the process.

“When people observe successful outcomes, they tend to focus on the results more than on the (often unseen) complex processes that led to them.” – Tinsley, Dillon, and Madsen

This usually happens when we have repeated a task and achieved a good result at the end. This gives us the impression that every time we undertake this task, the results will be good.

But lots of things can happen during the process.

For example, imagine you live in an area prone to flooding. So far, in the twenty years you have lived in your house, the floodwaters have never been close to reaching your property. Do you buy flood insurance?

Many people would say no. The floods have not affected you for twenty years. But then the following year you see exceptional levels of rain and the river banks burst, flooding your house.

Studies show that if a person experiences a near-miss, where they have escaped a potential hazard, they are less likely to take protective action. They adopt an “I was alright last time, it will be alright again” attitude.

Instead of evaluating the situation as it unfolds, they focus on past outcomes to inform their future decisions. But this is a rapidly changing world. So why do we feel the need to concentrate on the results rather than the process?

Why do we experience it?

Human beings are continually trying to make sense of the world, and to do this, we have to take shortcuts in our cognitive processing. We can’t evaluate every single new experience and try to decode it.

As a result, we learn cognitive shortcuts. With the outcome bias, the shortcut is to evaluate a situation against a previous one. If the previous situation had a good outcome, we chalk that up to a good decision. When we focus on the outcome, we are effectively reasoning in hindsight: it worked before, so it will work again.

But is the outcome bias such a bad thing? Surely learning from past experiences is a good thing for humans?

Yes, it is, but the problem with the outcome bias is that we are not learning from our previous experiences. We are simply replicating them. And that’s where it gets dangerous, because we fool ourselves into thinking that our decisions don’t matter, when of course they do.

Famous examples of disastrous outcome bias

Deepwater Horizon Oil Rig

In April 2010, a safety mechanism failed to contain a gas blowout on BP’s Deepwater Horizon rig in the Gulf of Mexico. The blowout ignited, killing 11 workers, and the rig sank, triggering a massive oil spill that devastated wildlife. The accident is one of the worst environmental disasters in American history.

But why did it happen? There had been warning signs.

The crew on the rig called it ‘the well from hell’ because of its numerous technical problems. For a start, the main pipe that led into the well didn’t have enough centralizers in place to keep it straight. In addition, the drillers had removed the drilling mud too soon, which left the pipe unstable.

The night before the blowout, the crew had performed a negative pressure test on the pipe to see if it was leaking oil and gas. This meant removing the heavy mud, replacing it with lighter seawater, and then shutting in the well to see whether pressure built up. A pressure build-up is a sure sign that oil and gas are seeping into the well.

The tests showed that pressure had indeed built up, but BP managers and the rig crew disagreed over the results, so the test was repeated. This time everyone agreed they had a good result, and many crew members went to bed.

But it wasn’t a good result.

Over the next few hours, hundreds of barrels of oil and gas leaked out and travelled up the pipe with increasing momentum. This roiling mass of pressure burst through the blowout preventer, the rig’s last safety barrier, and just kept going. Eventually, it ignited, blowing up the oil rig.

The ensuing investigation took years to complete, but it found a catalogue of errors that led to this disaster. BP executives had experienced dozens of near-misses in the industry, but with no major consequences.

However, each near-miss had come down to sheer luck or circumstance, such as a favourable wind direction or the use of different safety equipment, not good decision-making. Instead of raising alarms and being carefully investigated, each near-miss was taken as evidence that the safety procedures were working.

The Challenger Space Shuttle

Most of us can remember the horrific sight of the Challenger Space Shuttle breaking up in mid-air.

In January 1986, 73 seconds after launch, NASA’s Challenger space shuttle broke apart. All seven crew members were killed, including a teacher. The launch was broadcast live, with millions of spectators around the world watching. So what went so drastically wrong?

Investigators attributed the accident to the failure of an O-ring seal. This was a sealing ring meant to protect a joint in the shuttle’s right solid rocket booster, stopping extremely hot gases from leaking out of that joint.

However, the seal failed in the cold, and the escaping hot gases burned through the booster’s attachment and the external fuel tank. The tank ruptured, and the shuttle broke apart under the resulting aerodynamic forces.

Many people questioned why the launch went ahead, as the initial recommendation had been to cancel because of the extremely cold temperatures that day. However, the decision to launch was made anyway.

The subsequent investigation showed that doubts had already been raised about the O-rings, which had partially failed on previous flights. But those warnings were effectively ignored because the failures had never caused serious damage before.

In fact, Richard Feynman, a professor of theoretical physics and a member of the investigating commission, stated:

“There were many seals that didn’t have any problem, and so it is obviously a random effect. It depends upon whether or not you get a blowhole or you don’t get a blowhole. So if within a particular flight it happens that all six seals don’t get a blowhole, that’s no information.”

The Mars Climate Orbiter

And here is another example of NASA failing to properly investigate an anomaly in space travel, one that also led to disaster.

During its 1998–99 journey towards Mars, the Mars Climate Orbiter kept drifting off course. In fact, it drifted four times, and each time analysts on Earth had to make small adjustments to correct it.

Scientists did not try to find the cause of the drift in a $200 million spacecraft. Instead, they carried on correcting its trajectory. As it approached Mars, instead of entering orbit, it plunged too low and broke up in the atmosphere.

NASA investigators later discovered the cause: one piece of ground software reported thruster performance in English units (pound-force seconds), while the navigation software expected metric units (newton-seconds). The mistake would have been easy to rectify, but because the drift wasn’t causing much concern and could be corrected during the journey, no one looked into it.
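To make the failure mode concrete, here is a minimal sketch in Python of how a unit mismatch can slip across a software boundary unnoticed. The function names and figures are purely illustrative assumptions, not NASA’s actual code; only the conversion factor is real (1 pound-force second equals 4.44822 newton-seconds).

```python
# A minimal, hypothetical sketch of a unit-mismatch bug, in the spirit of
# the Mars Climate Orbiter failure. Names and values are illustrative.

LBF_S_TO_N_S = 4.44822  # 1 pound-force second = 4.44822 newton-seconds


def thruster_impulse_lbf_s() -> float:
    """Ground software: reports a burn's impulse in pound-force seconds."""
    return 150.0  # hypothetical burn


def apply_trajectory_correction(impulse_n_s: float) -> None:
    """Navigation software: expects the impulse in newton-seconds."""
    print(f"Correcting trajectory with {impulse_n_s:.1f} N*s of impulse")


# The bug: the value crosses the interface unconverted, so the navigation
# model misjudges every burn by a factor of about 4.45, producing a small,
# steady drift that can be "corrected" en route without being explained.
apply_trajectory_correction(thruster_impulse_lbf_s())

# The fix: convert explicitly at the boundary between the two systems.
apply_trajectory_correction(thruster_impulse_lbf_s() * LBF_S_TO_N_S)
```

The arithmetic is trivial; the danger lies in an interface where neither side states its units, and in treating the resulting drift as something to be corrected rather than explained.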

Four ways you can avoid outcome bias

Of course, we’re not all in charge of expensive oil rigs or space shuttles, but that doesn’t mean we shouldn’t be wary of outcome bias. Here are four ways to avoid it:

  • Don’t act under pressure

Feeling under pressure leads to hasty decisions, where we may be tempted to rely on the outcome rather than the procedure.

  • Don’t rely on previous experiences

Of course, it is natural to examine past scenarios and make judgements against them. But where near-misses are concerned, we should assess each case on its own merits.

  • Look at the cause, not the result

Again, this is difficult, but we need to see what is happening now, not concentrate on the results of what happened before.

  • When in doubt, assume the worst

Just because something worked out before doesn’t guarantee that it will have a favourable outcome again. Always assume the worst.

Final thoughts

Remember: just because you’ve had good results time and time again doesn’t mean you can predict the same results this time around.

References:

  1. onlinelibrary.wiley.com
  2. sas.upenn.edu
  3. hbr.org
