
People only pay attention to new information when they want to

September 19, 2021


How often do you visit websites that you disagree with? If you watch MSNBC, do you also tune into Fox News? If you subscribe to the Sydney Morning Herald, do you also subscribe to the Australian? To the Daily Mail and the Guardian? How often do you really listen to the opinions of people you don’t agree with? The vast majority of people don’t.

A new paper in the Journal of the European Economic Association indicates that we tend to listen to people who tell us things we’d like to believe and to ignore people who tell us things we’d prefer not to be true. As a result, like-minded people tend to make one another more biased when they exchange beliefs.

While it would be reasonable to think that people form decisions based on evidence and experience alone, previous research has demonstrated that decision makers have “motivated beliefs.” They believe things in part because they would like those things to be true. Motivated beliefs (and the reasoning that leads to them) can generate serious biases. They have been proposed as an explanation for the proliferation of misinformation on online forums, and they may also help explain stock market behaviour. There’s a great deal of objective information available about financial markets, yet group decision making and mutual encouragement (e.g., the GameStop rally of early 2021) can produce bubbles and financial instability.

The researchers used laboratory experiments to study whether such biases in beliefs grow more severe when people exchange those beliefs with one another. They paired subjects based on their scores on an IQ test, so that both members of a pair had scores either above or below the median. The subjects then exchanged beliefs concerning a proposition both wanted to believe was true: that they were in the high-IQ group.

The experiment revealed that people who are pessimistic about being in the high-IQ group tend to become significantly more optimistic when matched with a more optimistic counterpart. An optimistic person, however, is unlikely to change their beliefs when matched with a more pessimistic counterpart. This effect is especially strong for people who actually are in the low-IQ group, where it produces severe biases. Overall, the results suggest that bias amplification occurs because people selectively attribute higher informational value to social signals that reinforce their pre-existing motivation to believe.
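That updating asymmetry can be made concrete with a toy simulation. The sketch below is not the paper’s model; it simply applies Bayes’ rule to a stream of 50/50 signals that carries no net information, with the assumed, illustrative parameters acc_good and acc_bad standing in for the perceived reliability of welcome and unwelcome signals. Treating welcome signals as more informative is enough to drag the average belief well above the agnostic starting point of 0.5.

    import random

    def bayes_update(p, good_news, accuracy):
        # Standard Bayes rule for a binary signal with a given perceived accuracy.
        if good_news:
            return p * accuracy / (p * accuracy + (1 - p) * (1 - accuracy))
        return p * (1 - accuracy) / (p * (1 - accuracy) + (1 - p) * accuracy)

    def simulate(rounds=10, trials=10000, acc_good=0.70, acc_bad=0.55):
        # Welcome signals are treated as informative (acc_good) while unwelcome
        # ones are treated as barely informative (acc_bad), so a balanced
        # signal stream still drags the belief upward on average.
        total = 0.0
        for _ in range(trials):
            p = 0.5  # start agnostic about being in the high-IQ group
            for _ in range(rounds):
                good_news = random.random() < 0.5  # the stream itself is uninformative
                p = bayes_update(p, good_news, acc_good if good_news else acc_bad)
            total += p
        return total / trials

    print(f"mean belief after asymmetric updating: {simulate():.2f}")

With symmetric perceived accuracies the mean belief stays at 0.5; the asymmetry alone produces the drift toward optimism.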

Halfway through the experiment, however, the researchers gave subjects an unbiased piece of information about which IQ group they were in. This was highly effective at removing the biases caused by the initial exchange of beliefs. The results therefore suggest that providing unbiased, reliable sources of information may reduce motivated beliefs in settings such as echo chambers and financial markets.

What the researchers say: “This experiment supports a lot of popular suspicions about why biased beliefs might be getting worse in the age of the internet,” said the lead author. “We now get a lot of information from social media, and we don’t know much about the quality of the information we’re getting. As a result, we’re often forced to decide for ourselves how accurate various opinions and sources of information are and how much stock to put in them. Our results suggest that people resolve this quandary by assigning credibility to sources that are telling us what we’d like to hear, and this can make biases due to motivated reasoning a lot worse over time.”

So, what? Many years ago, a prominent neuroscientist came up with the concept of the “perceptual filter.” This, he believed, was a neural complex in the orbitofrontal cortex of the brain, sitting just behind the eyes. When we hear someone say something that we disagree with, something that goes against our assumptions about ourselves or the world, we literally cease to listen. This, together with reloading (composing our response while the other person is still talking) and the famous amygdala hijack (ceasing to listen when we hear something we perceive as a threat), means that we only really hear about 40% of what other people say to us.

Since then, the concept of the perceptual filter has been shown to be broadly right (though its location is somewhat different). We only really hear what we agree with.

The researchers say that people changed their minds when presented with facts. Again, this has been shown to be the case in a number of studies, and on the face of it argues for the possibility of rationality. Unfortunately, a large number of prior studies have also shown that this “rational” change is usually short-lived: it disappears when people again interact with others who share the view that the facts contradict.


Dr Bob Murray

Bob Murray, MBA, PhD (Clinical Psychology), is an internationally recognised expert in strategy, leadership, influencing, human motivation and behavioural change.

