Superiority complex? People who claim superior beliefs exaggerate their own knowledge

Posted under Today's research

No one likes smug know-it-all friends, relatives or co-workers who believe their knowledge and beliefs are superior to everyone else’s.

But now a new study indicates what many people suspect: these know-it-all people are especially prone to overestimating what they actually know.

What the researchers say: Even after receiving feedback showing how little they knew about relevant facts, these people still claimed that their beliefs were objectively more correct than everyone else’s. On top of that, they were more likely to seek out new information in biased ways that confirmed their sense of superiority.

The study focused on people who profess “belief superiority” (the conviction that their views are superior to other viewpoints) as it relates to political issues. However, the researchers noted that people also claim belief superiority in a variety of other domains besides politics, such as the environment, religion, relationship conflicts, and even relatively trivial topics such as etiquette and personal preferences.

The researchers conducted several studies to answer two key questions about political belief superiority: Do people who think that their beliefs are superior actually have more knowledge about the issues they feel superior about? And do belief-superior people use superior strategies when seeking out new knowledge?

To answer the first question, participants reported their beliefs and feelings of belief superiority about several political topics. Researchers asked them how much they thought they knew about these topics and then had them complete quizzes testing their actual knowledge about those issues.

Across six studies and several political topics, people who were high in belief superiority thought that they knew a great deal about those topics. However, when the researchers compared this perceived knowledge to how much participants actually knew, they found that belief-superior people consistently overestimated their own knowledge.

“Whereas more humble participants sometimes even underestimated their knowledge, the belief-superior tended to think they knew a lot more than they actually did,” said the study’s lead author.

For the second question, researchers presented participants with news articles about a political topic and asked them to select which ones they would like to read. Half of the articles supported the participants’ own point of view, whereas the other half challenged their position.

Belief-superior people were significantly more likely than their modest peers to choose information that supported their beliefs. Furthermore, they were aware that they were seeking out biased information: when the researchers asked them what type of articles they had chosen, they readily admitted their bias for articles that supported their own beliefs.

“We thought that if belief-superior people showed a tendency to seek out a balanced set of information, they might be able to claim that they arrived at their belief superiority through reasoned, critical thinking about both sides of the issue,” the researchers said.

Instead, they found that these individuals strongly preferred information that supported their views, indicating that they were probably missing out on opportunities to improve their knowledge.

So why do people seem to shun opposing viewpoints? Researchers suggest that while some people insist that they are always right, all of us feel good when the beliefs we think are important are confirmed.

In other words, when a belief is strongly held, is tied to one’s identity or values, or is held with a sense of moral conviction, people are more likely to distance themselves from information and people that challenge their belief.

“Having your beliefs validated feels good, whereas having your beliefs challenged creates discomfort, and this discomfort generally increases when your beliefs are strongly held and important to you,” said the study’s co-author.

So, what? This study validates, yet again, a point that we have been making for some time: we don’t make decisions based on facts or reasoning. If we did, we would find it easier to change our views.

In fact, we all have fairly fixed views, and we all tend to seek out information that validates those views and to filter out information that doesn’t. This “perceptual filter”, as it is sometimes called, is one of the factors that makes it virtually impossible for us to really hear and take in some 60% of what people tell us. We tend to reject any statement that conflicts with our deeply held assumptions and beliefs.

As the researchers say, these beliefs and assumptions become part of who we are, part of our core personality. Any attack on them (or denial of them) becomes a threat to our very sense of self. When that happens, our sympathetic nervous system pushes us into fight, flight or freeze mode, and we literally cease to listen, to read or to take in information.

What now? One of the first rules of good dialogue is never to say that someone is wrong. They will likely see it as an attack, a danger signal. Much better to say “You’re right, of course. (pause) And…” The first part of that opens their brains up to safety and to new learning. It should be the first thing that any good leader or manager learns. Being right is nice, but often irrelevant and threatening.