
Facial expressions don't tell the whole story of emotion

February 23, 2020


Interacting with other people is almost always a game of reading cues and volleying back. We think a smile conveys happiness, so we offer a smile in return. We think a frown shows sadness, and maybe we attempt to cheer that person up.


Some businesses are even working on technology to determine customer satisfaction through facial expressions.


But new research suggests facial expressions might not be reliable indicators of emotion. In fact, it might be more accurate to say we should never trust a person’s face.


What the researchers say: “The question we really asked is: ‘Can we truly detect emotion from facial articulations?’” said the lead author. “And the basic conclusion is, no, you can’t.”


The team, whose work has focused on building computer algorithms that analyze facial expressions, presented their findings at the annual meeting of the American Association for the Advancement of Science.


The researchers analyzed the kinetics of muscle movement in the human face and compared those muscle movements with a person’s emotions. They found that attempts to detect or define emotions based on a person’s facial expressions were almost always wrong. The same had earlier been found to be true of “body language.”


“Everyone makes different facial expressions based on context and cultural background,” the researchers said. “And it’s important to realize that not everyone who smiles is happy. Not everyone who is happy smiles. I would even go to the extreme of saying most people who do not smile are not necessarily unhappy. And if you are happy for a whole day, you don’t go walking down the street with a smile on your face. You’re just happy.”


It is also true, they said, that sometimes people smile out of an obligation to social norms. This would not inherently be a problem, they said; people are certainly entitled to put on a smile for the rest of the world. But some companies have begun developing technology to recognize facial muscle movements and assign emotion or intent to those movements.


The research group that presented at AAAS analyzed some of those technologies and largely found them lacking.

“Some claim they can detect whether someone is guilty of a crime or not, or whether a student is paying attention in class, or whether a customer is satisfied after a purchase,” the lead author said. “What our research showed is that those claims are complete baloney. There’s no way you can determine those things. And worse, it can be dangerous.”


The danger, obviously, lies in the possibility of missing the real emotion or intent in another person, and then making decisions about that person’s future or abilities.


After analyzing data about facial expressions and emotion, the research team concluded that it takes more than expressions to correctly detect emotion.


In one experiment, the researchers showed study participants a picture cropped to display just a man’s face. The man’s mouth is open in an apparent scream; his face is bright red.


“When people looked at it, they would think, wow, this guy is super annoyed, or really mad at something, that he’s angry and shouting,” the lead researcher said. “But when participants saw the whole image, they saw that it was a soccer player who was celebrating a goal.”


And while the lead author said he is “a big believer” in developing computer algorithms that try to understand social cues and the intent of a person, he added that two things are important to know about that technology.


“One is you are never going to get 100 percent accuracy,” he said. “And the second is that deciphering a person’s intent goes beyond their facial expression, and it’s important that people (and the computer algorithms they create) understand that.”


So, what? OK, so one day we might invent a facial scan system that can accurately show what a person is feeling, or even, as some colleagues in the field tell me, broadly what they are thinking. The question is: why do we allow this? What persuades us to let people create this kind of technology and use it to surveil and control us?


Frankly, I don’t care whether the AI systems that judge if someone will be a repeat offender, a purchaser of certain goods, or compliant with what the state tells them become more accurate. That’s not the point. The point is that creating them in the first place is unethical and simply wrong.


None of us should have to live in a world where we’re subject to accurate or inaccurate facial recognition and mood and thought prediction systems.


Got that off my chest!

Dr Bob Murray

Bob Murray, MBA, PhD (Clinical Psychology), is an internationally recognised expert in strategy, leadership, influencing, human motivation and behavioural change.


Join our tribe

Subscribe to Dr. Bob Murray’s Today’s Research, a free weekly roundup of the latest research in a wide range of scientific disciplines. Explore leadership, strategy, culture, business and social trends, and executive health.
