Do not believe claims that facial-recognition technology can accurately identify people’s emotions, advised several scientists at the 2020 AAAS Annual Meeting in Seattle.
Claims that a photo of a face can be easily interpreted rest on the flawed hypothesis that we always smile when we are happy and always scowl when we are angry, said Aleix Martinez, professor of electrical and computer engineering at The Ohio State University.
“There’s no way that technology will ever be able to detect emotions that you’re experiencing following that approach,” Martinez said.
Research shows that, on average, people scowl only 30% of the time that they are angry, said Lisa Feldman Barrett, professor of psychology at Northeastern University. The rest of the time, they make other faces when they are angry, she said. Additionally, people may scowl for other reasons – “they scowl when they’re concentrating, they scowl when someone tells them a bad joke, they scowl when they have gas,” she said.
A scowl is “not the expression of anger, it’s an expression of anger that people will show in certain circumstances,” Barrett said. This holds for other types of facial expressions and emotions, too.
“Any AI that is claiming to detect a scowl and interpreting it as detecting anger has some real problems,” said Barrett.
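Barrett’s 30% figure makes the problem concrete. As a rough illustration – and only an illustration, since the base rate of anger and the rate of non-angry scowling below are assumed numbers, not figures from the talk – a quick Bayes’ rule calculation shows that even a perfect scowl detector would usually be wrong when it labels a scowl as anger:

```python
# A minimal Bayes' rule sketch of why "scowl detected, therefore anger" fails.
# Only the 30% figure comes from the research cited above; the base rate of
# anger and the rate of non-angry scowling are illustrative assumptions.

p_scowl_given_anger = 0.30     # from the research: people scowl in ~30% of angry moments
p_anger = 0.05                 # assumed: a person is angry ~5% of the time
p_scowl_given_no_anger = 0.10  # assumed: scowling while concentrating, at bad jokes, etc.

# Bayes' rule: P(anger | scowl)
numerator = p_scowl_given_anger * p_anger
denominator = numerator + p_scowl_given_no_anger * (1 - p_anger)
p_anger_given_scowl = numerator / denominator

print(f"P(anger | scowl) = {p_anger_given_scowl:.2f}")  # ~0.14
```

Under these assumed rates, fewer than one in seven detected scowls would actually reflect anger – the detector sees scowls, not anger, exactly as Barrett describes.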
Much more than facial movement goes into communicating our emotions. Other nonverbal factors include body pose, body movement and hormonal responses like those that cause a face to flush with embarrassment or excitement, said Martinez.
Martinez offered an example of how much context matters. When he showed people a photo of a red-faced man with his mouth wide open and his eyes nearly closed, most thought the man was extremely angry, his research showed. Yet anyone viewing the context – that the subject was a soccer player – could infer that he was displaying excitement while celebrating a goal.
A mix-up like this may have low stakes, but so-called emotion-recognition technology has a far larger reach. Technology that interprets facial movements is making inroads in a number of sectors, where errors could have serious, even dangerous outcomes, said Martinez. AI is already used in some classrooms, in the judicial system and in hiring, he noted. Many of these systems learn from U.S. and European data dominated by white faces. Such inputs could, for instance, disadvantage job candidates of other races, Martinez said.
“I think we have to take seriously the context in which this AI is being used,” said Barrett.
Seth Pollak, professor of psychology at the University of Wisconsin-Madison, shared research about the origins of our ability to understand facial expressions and emotions. For several decades, scientists thought that infants came into the world with a “kernel” of understanding about emotions like happiness, sadness and anger, Pollak said. To the contrary, babies do not express specific emotions; they have a distress system that broadcasts whether they are OK or not, he noted.
Children then learn to differentiate emotions beyond good or bad, and research shows that even very brief exposure to contextual information leads very young children to change how they categorize their inferences about other people’s emotions, Pollak said.
“It looks like we actually don’t need to be born with this knowledge. Human brains are actually able to figure out patterns and make inferences about what might be happening at a very, very sophisticated computational level with actually very little experience,” said Pollak.
Computer algorithms cannot yet infer emotions as accurately as people do, but scientists ask whether they could soon make progress by incorporating contextual information the way humans do.
Most likely not, said Martinez. To begin with, no scientific study has so far demonstrated that computers can reliably recognize emotions from faces – any claims to that effect are entirely anecdotal, he said. And accounting for the other factors that shape the nonverbal communication of our emotions adds an even greater challenge. Computers are very good at tasks with a clear definition, he said, but emotions are incredibly complex.
Said Martinez, “It’s not like a game of Go or chess.”