
How people use, and lose, preexisting biases.


From love and politics to health and finances, humans often make decisions that appear irrational, or that seem dictated by an existing bias or belief. But a new study uncovers a surprisingly rational feature of the human brain: a previously held bias can be set aside so that the brain can apply logical reasoning to the decision at hand. These findings highlight the weight the brain places on the accumulation of evidence during decision-making, as well as how prior knowledge is assessed and updated as the brain incorporates new evidence over time.

The researchers emphasize that this process is unconscious, not the result of consciously weighing information. The research was reported in Neuron.

What the researchers say: “As we interact with the world every day, our brains constantly form opinions and beliefs about our surroundings,” said the study’s senior author. “Sometimes knowledge is gained through education, or through feedback we receive. But in many cases we learn, not from a teacher, but from the accumulation of our own experiences. This study showed us how our brains help us to do that.”

As an example, consider an oncologist who must determine the best course of treatment for a patient diagnosed with cancer. Based on her prior knowledge and previous experience with cancer patients, she may already have an opinion about what combination of treatments (e.g., surgery, radiation and/or chemotherapy) to recommend, even before she examines this new patient's complete medical history. This is an unconscious bias.

But each new patient brings new information, or evidence, that must be weighed against the doctor's prior knowledge and experience. The central question the researchers asked was whether, and to what extent, that prior knowledge is modified when someone is presented with new or conflicting evidence.

To find out, the team asked human participants to watch a group of dots as they moved across a computer screen, like grains of sand blowing in the wind. Over a series of trials, participants judged whether each new group of dots tended to move to the left or right. This was a difficult judgment, as the movement patterns were not always immediately clear.

As new groups of dots were shown again and again across several trials, the participants were also given a second task: to judge whether the computer program generating the dots appeared to have an underlying bias.

Without telling the participants, the researchers had indeed programmed a bias into the computer: the movement of the dots was not evenly split between rightward and leftward motion, but instead was skewed toward one direction over the other.

“The bias varied randomly from one short block of trials to the next,” said the paper’s co-author. “By altering the strength and direction of the bias across different blocks of trials, we could study how people gradually learned the direction of the bias and then incorporated that knowledge into the decision-making process.”
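For readers who like to see the mechanics, here is a minimal Python sketch of the task structure as described above: short blocks, each with its own hidden directional bias, and noisy motion evidence on every trial. All parameter values and names are illustrative assumptions, not the study's actual stimulus code.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Illustrative task parameters (assumptions, not the study's stimulus code).
N_BLOCKS = 6                              # short blocks, each with a hidden bias
TRIALS_PER_BLOCK = 20
BIASES = [0.2, 0.35, 0.5, 0.65, 0.8]      # candidate P(rightward) per block
COHERENCES = [0.0, 0.064, 0.128, 0.256]   # motion strengths; 0.0 is pure noise

for block in range(N_BLOCKS):
    p_right = rng.choice(BIASES)          # hidden bias, redrawn each block
    correct = 0
    for trial in range(TRIALS_PER_BLOCK):
        direction = 1 if rng.random() < p_right else -1   # +1 right, -1 left
        coherence = rng.choice(COHERENCES)
        # One trial's noisy evidence; the observer judges only its sign.
        evidence = direction * coherence + rng.normal(0.0, 1.0)
        choice = 1 if evidence > 0 else -1
        correct += (choice == direction)
    print(f"block {block}: hidden P(right)={p_right:.2f}, "
          f"accuracy={correct / TRIALS_PER_BLOCK:.2f}")
```

Because the bias is redrawn every block, a participant who simply carried one fixed belief forward would do poorly; the task rewards continually re-estimating the bias from the evidence itself.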

The study took two approaches to evaluating how the bias was learned: first, implicitly, by monitoring the influence of the bias on participants' decisions and their confidence in those decisions; second, explicitly, by asking people to report the most likely direction of movement in a block of trials. Both approaches demonstrated that the participants used sensory evidence to update their beliefs about the directional bias of the dots, and they did so without being told whether their decisions were correct.

“Originally, we thought that people were going to show a confirmation bias, and interpret ambiguous evidence as favoring their preexisting beliefs,” said the researchers. “But instead we found the opposite: People were able to update their beliefs about the bias in a statistically optimal manner.”

The researchers argue that this occurred because the participants’ brains were considering two situations simultaneously: one in which the bias exists, and a second in which it does not.

“Even though their brains were gradually learning the existence of a legitimate bias, that bias would be set aside so as not to influence the person’s assessment of what was in front of their eyes when updating their belief about the bias,” said the lead author. “In other words, the brain performed counterfactual reasoning by asking ‘What would my choice and confidence have been if there were no bias in the motion direction?’ Only after doing this did the brain update its estimate of the bias.”
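One way to picture that counterfactual step is a toy Bayesian observer that first asks how confident it would be in "rightward" under a flat prior, as if no bias existed, and only then uses that counterfactual confidence to update its belief about the bias. The sketch below is a minimal illustration under an assumed Gaussian evidence model; the grid, noise model and function names are assumptions, not the paper's actual model code.

```python
import numpy as np

bias_grid = np.linspace(0.05, 0.95, 19)                 # candidate values of P(rightward)
belief = np.full(bias_grid.size, 1.0 / bias_grid.size)  # flat prior over the bias

def update_bias_belief(belief, evidence, coherence=0.128, noise=1.0):
    # Likelihood of the evidence under each motion direction (assumed Gaussian).
    like_right = np.exp(-0.5 * ((evidence - coherence) / noise) ** 2)
    like_left = np.exp(-0.5 * ((evidence + coherence) / noise) ** 2)
    # Counterfactual step: confidence in "rightward" computed with a FLAT prior
    # over direction, i.e. as if there were no bias at all.
    p_right_cf = like_right / (like_right + like_left)
    # Each candidate bias b predicts rightward motion with probability b;
    # score every candidate against the counterfactually assessed evidence.
    likelihood = bias_grid * p_right_cf + (1 - bias_grid) * (1 - p_right_cf)
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = update_bias_belief(belief, evidence=0.9)   # one rightward-leaning trial
bias_estimate = float(np.sum(bias_grid * belief))   # posterior mean of P(rightward)
```

Notably, scoring each candidate bias b against the counterfactual confidence is mathematically equivalent to the exact marginal likelihood b·P(evidence|right) + (1−b)·P(evidence|left), which is why setting the bias aside in this way still yields a statistically optimal update.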

The researchers were amazed at the brain’s ability to move between these multiple representations of reality with an almost mathematical quality.

“When we look hard under the hood, so to speak, we see that our brains are built pretty rationally,” said the researchers. “Even though that is at odds with all the ways that we know ourselves to be irrational.”

Irrationality, the authors hypothesize, may arise when the stories we tell ourselves influence the decision-making process.

“We tend to navigate particularly complex scenarios by telling stories, and perhaps this storytelling, when layered on top of the brain’s underlying rationality, plays a role in some of our more irrational decisions, whether that be what to eat for dinner, where to invest (or not invest) your money, or which candidate to choose.”

So, what? The main reason governments, businesses and others are rushing helter-skelter into the unknown future world of AI is that they are searching for rationality, for certainty in decision-making. It may be that this rationality is already there, and that the gut decisions we sometimes allow ourselves to make are as rational as, if not more rational than, anything AI can produce.

What now: Let’s make a start on putting AI aside and learning more about what those remarkable organs, our four brains (head, gut, heart and skin), can actually achieve, and how we can best make use of their collective power.

Why allow ourselves to be dominated by AI and its attendant algorithms when we don’t have to be?

Dr Bob Murray

Bob Murray, MBA, PhD (Clinical Psychology), is an internationally recognised expert in strategy, leadership, influencing, human motivation and behavioural change.

