Salt Ideas essay #4: computers that know how you’re feeling

We’re proud to bring you the fourth of the Salt Ideas Essays: 15 pieces of expert thought leadership on the innovations and ideas that will change the world for the better. Affective computing aims to build devices that sense, interpret, adapt to and influence human emotions, writes Daniel McDuff, director of research at Affectiva.

Have you ever felt frustrated using technology? What would it be like if your computer did not simply repeat the same confusing error message over and over again? Or what if it knew to only give notifications when you were not concentrating on another task? Thinking bigger, could it help make accurate medical diagnoses based on understanding changes in your emotions and behaviour over time? That is the potential that new research into emotionally aware devices can offer. Soon we will think our old machines were stupid for failing to recognise how we felt.

Our cellphones and laptops have amazing cognitive abilities, from searching through thousands of files in a fraction of a second to predicting complex weather and climate patterns to recognising human voice commands. They also have powerful sensors, small enough to fit in our pockets, which can capture rich information about their surroundings and how they are being used. Our cellphones have far more computing power than the rockets that put men on the moon, yet they lack emotional intelligence.

The field of affective computing, born in the late 1990s at MIT, aims to build devices that sense, interpret, adapt to and influence human emotions. In the early days the hardware for capturing emotional signals was cumbersome, but the electronics we use in our daily lives (wristwatches, phones, laptops) now have the sensing and computing ability to capture and process these cues in real time.

Affective computing is at the intersection of computer science, psychology and design. One of the first problems that researchers began to address was the automatic measurement of facial expressions using cameras.
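
To make this concrete, here is a minimal sketch in Python of the very first step a camera-based system performs: locating a face in a video frame. The open-source OpenCV library is an assumption chosen for illustration (the essay names no specific software); a real affective pipeline would follow detection with facial landmark tracking and expression classification.

    # A minimal sketch of the first step in camera-based expression analysis:
    # locating a face. OpenCV and its bundled Haar cascade are assumptions
    # for illustration; they are not named in the essay.
    import cv2

    def detect_faces(frame):
        """Return bounding boxes (x, y, w, h) for faces in a BGR frame."""
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    capture = cv2.VideoCapture(0)  # default webcam
    ok, frame = capture.read()
    if ok:
        for (x, y, w, h) in detect_faces(frame):
            print(f"Face found at x={x}, y={y}, size {w}x{h}")
    capture.release()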

After 20 years of research this technology is now being commercialised and integrated into games, interactive apps and education platforms. Moving beyond the face, the recent availability of wearable sensors has made it possible to track physiological signals such as heart and breathing patterns. Soon affective apps will not only respond to what we express outwardly but also to how we feel on the inside.
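
To give a flavour of what such tracking involves, the rough sketch below turns a raw pulse-style signal into a heart-rate estimate by counting peaks over a known time window. The synthetic signal, the 50 Hz sample rate and the peak-spacing threshold are all illustrative assumptions, not details from the essay.

    # Rough sketch: estimate heart rate from a pulse-like (PPG) signal by
    # counting peaks. The signal and sample rate are invented for illustration.
    import numpy as np
    from scipy.signal import find_peaks

    FS = 50                       # assumed sensor sample rate (Hz)
    t = np.arange(0, 10, 1 / FS)  # 10 seconds of samples
    # Fake a 72 bpm pulse wave (1.2 Hz) with a little noise.
    signal = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)

    # Require peaks at least ~0.4 s apart (caps the estimate near 150 bpm).
    peaks, _ = find_peaks(signal, distance=int(0.4 * FS))
    bpm = 60 * len(peaks) / (t[-1] - t[0])
    print(f"Estimated heart rate: {bpm:.0f} bpm")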

Voice cues and body gestures also encode emotional information; only when signals from all these channels are combined will we get a complete picture of a person’s state. This technology still needs time to mature.
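
As a toy example of what combining channels might look like, the sketch below fuses per-channel emotion scores with a simple weighted average, a basic form of “late fusion”. The channels, scores and weights are invented for illustration, not taken from any particular system.

    # Toy "late fusion": combine per-channel valence scores (face, voice,
    # physiology) with a weighted average. All values here are invented.
    def fuse_valence(scores, weights):
        """Weighted average of per-channel valence scores in [-1, 1]."""
        total = sum(weights[ch] for ch in scores)
        return sum(scores[ch] * weights[ch] for ch in scores) / total

    scores = {"face": 0.6, "voice": 0.2, "physiology": -0.1}
    weights = {"face": 0.5, "voice": 0.3, "physiology": 0.2}
    print(f"Fused valence: {fuse_valence(scores, weights):+.2f}")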

In real life emotions can be very subtle and vary dramatically between individuals. There are social norms that result in people from different cultures, genders and age groups expressing themselves differently. Think about the differences between how a teenager and parent communicate their emotions. Building emotional intelligence into devices is as much about interpreting behaviours through a social lens as it is about detecting the behaviours themselves. In recent work at MIT and spin-out Affectiva we have started to build computer models of emotion based on expressions from millions of people around the world who chose to contribute their data online.

So what will the future look like if electronics can sense and respond to our emotions? Gartner forecasts that 4.9 billion connected things will be in use in 2015. Once these “things” start to communicate, all of them will be aware of how we are feeling. Imagine if the mirror you looked into every morning was tracking how your sleep patterns were affecting your mood, or your home lighting system adapted to help you battle your winter blues. Eventually emotion-sensing capabilities will be built directly into the hardware itself – an “emotion chip” in every device.

Potentially the biggest benefits of this technology lie in improving mental health and wellbeing. Those suffering from depression could use it to track their emotions over time and better understand their treatment. Children with autism could practise recognising emotional states using interactive games.

As technology advances it is important that people have the choice to use these new types of interfaces.

Developers, engineers and researchers need to think carefully about the social norms that are created around emotionally aware devices; protecting people’s privacy is vital. In everyday life we have the ability to mask our emotions, an important social tool, and it is critical that people retain this ability.

A computer sensing human emotions is very different from a computer or robot feeling emotions. Such a capability raises many other questions; however, as devices become more and more a part of our lives this side of “robot ethics” will become increasingly relevant.

Emotionally aware devices offer the potential to dramatically improve our wellbeing and to merge seamlessly into the fabric of everyday life. We will look back on our current computers and not miss their lack of empathy.

GAUGING THE MOOD

    • As emotive computing develops, “robot ethics” will become increasingly relevant.
    • Soon apps will not only respond to what we express outwardly but also how we feel on the inside.
    • In everyday life we have the ability to mask our emotions, an important social tool; it is critical that people retain this ability.

 

ABOUT DANIEL MCDUFF

Daniel McDuff has a PhD from the MIT Media Lab. His current research is at the intersection of emotion and computer science. He is interested in sensors and algorithms that can recognise human affect and in building technologies that make people’s lives better.
