Amazon And Microsoft Claim AI Can Read Human Emotions. Experts Say the Science Is Shaky



Facial recognition technology is being tested by businesses and governments for everything from policing to employee timesheets. Even more granular results are on their way, promise the companies behind the technology: automatic emotion recognition could soon help robots understand humans better, or detect road rage in drivers.

But experts warn that algorithms attempting to interpret facial expressions could be based on uncertain science. The warnings appear in the annual report of the AI Now Institute, a nonprofit that studies the impact of AI on society. The report also includes recommendations for the regulation of AI and greater transparency in the industry.

“The problem is now AI is being applied in a lot of social contexts. Anthropology, psychology, and philosophy are all incredibly relevant, but this is not the training of people who come from a technical [computer science] background,” says Kate Crawford, co-founder of AI Now, distinguished research professor at NYU, and principal researcher at Microsoft Research. “Essentially the narrowing of AI has produced a kind of guileless acceptance of particular strands of psychological literature that have been shown to be suspect.”

Crawford and the AI Now report refer to the system commonly used to codify facial expressions into seven core emotions, originating with psychologist Paul Ekman. His work studying facial expressions in communities separated from modern society suggested that facial expressions are universal. The idea of universal facial expressions is convenient for AI researchers, since much of the artificial intelligence in use today works by sorting complex inputs, like images or sounds, into a fixed set of categories.

Amazon sells facial-recognition software that promises the ability to detect emotions, as does Microsoft. The list of emotions that Microsoft’s Emotion API claims to discern is exactly the list Ekman identified as having universal facial expressions. And it’s already used by companies like Pivothead, which makes smart glasses to livestream an employee’s point of view, according to Microsoft’s website.
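To get a concrete sense of what these products offer, here is a minimal sketch of querying Amazon’s Rekognition service for its emotion labels using the boto3 Python SDK. The image filename is a placeholder, and the snippet assumes AWS credentials are already configured:

```python
import boto3

# Rekognition's DetectFaces operation returns, among other facial
# attributes, a list of emotion labels with confidence scores.
client = boto3.client("rekognition")

with open("face.jpg", "rb") as image:  # placeholder image file
    response = client.detect_faces(
        Image={"Bytes": image.read()},
        Attributes=["ALL"],  # "ALL" requests emotions and other attributes
    )

for face in response["FaceDetails"]:
    for emotion in face["Emotions"]:
        # e.g. "HAPPY: 97.3%" -- a model score, not a ground truth
        print(f"{emotion['Type']}: {emotion['Confidence']:.1f}%")
```

Each label comes back with a confidence score, but that score only measures how well a face fits the model’s fixed list of categories, which is precisely the assumption researchers are now questioning.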

But the way emotions are expressed on the face could be much more contextual than previously believed, calling the validity of Ekman’s categories into question. “A knitted brow may mean someone is angry, but in other contexts it means they are thinking, or squinting in bright light,” wrote psychologist Lisa Feldman Barrett in the Financial Times in 2017. “A hypothetical emotion-reading robot would need tremendous knowledge and context to guess someone’s emotional experiences.”

Nevertheless, the technology is being deployed on the assumption that it works: this kind of emotional facial recognition is already showing up in cars, phones, and security cameras.

“Any simplistic mapping of a facial expression onto basic emotional categories through AI is likely to reproduce the errors of an outdated scientific paradigm,” the report says.