
The idea that AI can 'read' the emotions on your face is profoundly flawed

Can you read her emotions? Can a computer? Oli Scarff / Getty

Companies are constantly looking for new ways to respond to and preempt our behavior, hoping to sell us more products and tailor services to our (un)conscious needs.


The next big thing is "affective computing," wherein algorithms supposedly read the emotions on our faces and tell companies what we want.

But at the heart of affective computing are three misguided ideas about human emotions. 

Picture a violin soloist in the spotlight in Carnegie Hall. She is four bars from the end of a cadenza she has practiced for twenty years. She's one with the violin — her face and body are frozen, and only her bow arm and callused fingers are moving. This is the high point of her entire career. How is she feeling? Is she happy? 

If technology can capture how she's feeling, then tech companies can start answering — and monetizing — questions that have until now been the domain of philosophy and the arts. Questions like "What is happiness?" 


The developers of affective computing claim that reading a sequence of facial expressions in great detail, often in finer detail than the human eye can perceive, allows them to map a series of emotional states over time.

Since these businesses are taking on questions philosophers have debated for centuries, a good way to assess their value is philosophical due diligence.

At the heart of affective computing are three misguided ideas about what human emotions are:

  1. All humans have inner states consisting of both rational thoughts and emotional sentiments that make us who we are.
  2. We are able to grasp and perfectly communicate both our rational and emotional inner states to others.
  3. Humans base their actions primarily on these rational thoughts and emotional sentiments.

Let's take them one by one.

All humans have inner states consisting of both rational thoughts and emotional sentiments that make us who we are.

This first assumption comes from the mind/body dualism of René Descartes, who wrote "I think, therefore I am." On this view, our thoughts and feelings are what make us unique as individuals, and we understand the world by thinking about it and responding to it with emotions.

For centuries, philosophers (most notably Martin Heidegger) have attacked the idea of mind/body dualism. Heidegger suggested that the way we understand the world is by being involved in it, physically, emotionally, and socially.

Heidegger, and the philosophers influenced by him, argued that you could never isolate and explore inner states as if they were individual butterfly wings kept under glass. Look again at our violinist: How much of the meaning of her big moment — the years of self-doubt, the sacrifices and sore fingers — comes through her face? During this high point in her life, she's not even smiling.


We are able to grasp and perfectly communicate both our rational and emotional inner states to others.

The second assumption is worse than the first. Affective computing assumes that emotions are a fixed set of discrete reactions to the world. In this scenario, anger looks the same in every situation, and so do disgust and passion. The ability of high-resolution cameras to detect "micro-expressions" is analogous to an electron microscope's ability to see the individual atoms in a molecule.

But just because they look the same doesn't mean these emotions are the same. The French existentialist Jean-Paul Sartre argued that people are only ever angry at something. To understand someone's anger, we need to understand what they're angry at. The fleeting signals that pass across our faces, those "micro-expressions," are only a small part of that story.

Technology will give us ever more knowledge about the granular workings of facial expressions, even as we lose more and more understanding of the cultural context of our emotions. The more affective computing flattens individual experiences into atomized, indexed data points, the more we, as a culture, lose our sensitivity to their genuine meaning.

Humans base their actions primarily on rational thoughts and emotional sentiments.

The Pixar movie "Inside Out" takes this idea as its premise, with Joy, Fear, Anger, Disgust, and Sadness fighting for control over an eleven-year-old girl. The girl, Riley, makes decisions based on directions from her emotions. The movie agrees with the philosopher David Hume that reason is "a slave to the passions," and disagrees with Aristotle, who called humans "rational animals." But Aristotle and Hume would both agree that, whether rational or irrational, there's someone at the controls.


Recent developments in thinking about human consciousness suggest, however, that the head isn't always doing the driving. We have bodies, too, and physical context is a crucial determinant of how we will behave.

Physical context is key to decision-making: an emotional response to a commercial in the warm, dark room of a focus group may bear no relation to the way that same commercial is perceived at home or on a subway platform.

Norman masons used soaring columns and stained glass to keep people quiet in church; house managers on a talk show keep the set cold to make the audience laugh more. And how relevant is your emotional state when you make an impulse buy in the grocery store, grabbing the brightly lit, perfectly chilled soda that's right at your fingertips?

There are also wider concerns. What benefit would the rest of us gain by having our every facial expression atomized into stratified bits of data? Isn't this akin to a privatized surveillance state?


Philosophical due diligence tells us that our emotions are meaningless when taken out of context. An economy built on monetizing these isolated emotions rests on shaky foundations.

Christian Madsbjerg is a founding partner of ReD Associates, a strategy and innovation consulting firm based in the human sciences. Jonathan Lowndes is a Senior Consultant and social theorist at ReD Associates.
