
Algorithms might be able to read our faces, but can they read our emotions? Using biofeedback, like our expressions or heart rate, computers can guess how we might be feeling. But it’s not that easy. The range of human experiences, social norms, and cultural differences all complicate things.

When you look into this mirror, it guesses how you feel and writes poetry in response to that feeling. Does it get it right?

Can machines perceive our emotions? The field that investigates this question is called affective computing: the development of systems that can recognise, simulate, or express emotion.

For a machine to recognise a human emotion, it needs to analyse inputs like:

  • EEG (electrical activity of the brain)
  • Facial expressions 
  • Speech
  • Breathing 
  • Heart rate 
  • Galvanic skin response (changes in the skin’s electrical conductivity, linked to sweating)
  • Gestures

All of these approaches rest on the assumptions that emotional expression is universal across cultures, and that each emotional category has a hardwired set of physical indicators, like a biological fingerprint. The short sketch below shows how literal that assumption becomes once it is written as code.
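
As a minimal sketch, here is what the fingerprint assumption looks like as a toy program. This is not the system behind Mirror Ritual; every signal, threshold, and label below is invented for illustration. Real affective-computing systems use trained models rather than hand-written rules, but they rest on the same premise: a fixed mapping from bodily signals to emotion categories.

    # A toy "biological fingerprint" classifier (Python).
    # All fields, thresholds, and labels are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Biofeedback:
        heart_rate_bpm: float       # heart rate, beats per minute
        skin_conductance_us: float  # galvanic skin response, microsiemens
        smile_intensity: float      # 0.0 (neutral face) to 1.0 (broad smile)

    def classify_emotion(reading: Biofeedback) -> str:
        # Hard-codes the idea that each emotion has one fixed physical
        # signature, ignoring culture, context, and individual variation.
        if reading.smile_intensity > 0.6:
            return "happy"
        if reading.heart_rate_bpm > 100 and reading.skin_conductance_us > 8.0:
            return "afraid"
        if reading.heart_rate_bpm > 100:
            return "excited"
        return "calm"

    print(classify_emotion(Biofeedback(72, 3.1, 0.8)))   # prints "happy"
    print(classify_emotion(Biofeedback(115, 9.4, 0.1)))  # prints "afraid"

Notice that every number in the sketch is a design choice dressed up as biology: a raised heart rate could just as easily mean excitement, fear, or a flight of stairs.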

Newer models in neuroscience propose instead that emotions are constructed: they aren’t fixed or innate, but are generated by the brain as it constantly predicts how you’re feeling.

And neuroscience aside, emotions are not universal across cultures. The words we use to describe our feelings shift with social and cultural context, and with what each of our languages is able to express.

This project is a critique of measuring emotion through biofeedback alone. Rather than simply measuring your emotional state, Mirror Ritual co-creates it through your interaction with the work. What does the algorithm think you’re feeling? Is it accurate?


Audio description is available for both Mirror Ritual and Biometric Mirror.

Accessibility symbols: wheelchair access, open captions, hearing loop, and audio description.