Affective computing: how emotional machines are about to take over our lives


Madhumita Murgia

January 18, 2016

From robots anticipating our desires to wristbands that help autistic children speak – the way we engage with technology is changing, finds Madhumita Murgia

In a quiet breakfast cafe, on a sunny October morning in Boston, I am watching a gang of five animated emotions control the thoughts of a little girl called Riley. On an iPad screen, the green character called Disgust springs into action, making Riley overturn her plate of broccoli in a fit of revulsion, and I gasp. When Riley’s father tries to pacify her by pretending her spoon is an aeroplane, I giggle. All the while, the iPad is reading my emotions.

‘Emotional engagement: HIGH’, the screen reads, once the 30-second clip of Pixar’s film Inside Out has ended. On a scale of one to 100, I mostly registered high levels of enjoyment, according to the iPad. During the bit where broccoli goes flying everywhere, my surprise levels go through the roof, mixed in with a little bit of dislike.

‘I didn’t see your face register any dislike; that must be a mistake,’ says my companion, the inventor of the emotion-reading app. ‘I don’t like broccoli, so I may have grimaced,’ I say, surprised that the app could pick up my micro-expressions. ‘Aha!’ she says, pleased. ‘That’s what it’s really looking for.’

Showing off her invention, which has been 10 years in the making, is Rana el Kaliouby, an Egyptian-born computer scientist. El Kaliouby studied human-computer interaction in Cairo in 1993, before it became fashionable to analyse our relationships with our devices. “We used to talk about social robots that could respond to your emotions and it all seemed so far out. Computer cameras were massive webcams. But it only took about 10 years for it all to become real,” she says.

“I became convinced you couldn’t build a truly intelligent computer without having emotional capabilities like humans do”
Rosalind Picard

The emotion-sensing app was built by her start-up Affectiva, which was spun out of the Massachusetts Institute of Technology’s (MIT) maverick Media Lab – a place where designers, computer scientists, artists, architects and neuroscientists pool ideas. Its ‘anti-disciplinary’ collaborations have led to products that belong firmly in the future – from foldable cars to social robots – and resulted in much-loved spin-offs such as Guitar Hero and the Kindle.

The idea behind Affectiva was to create a computer that could recognise a range of subtle human emotions, based on facial expressions. The company’s work is part of a now-growing field of research known as ‘affective computing’, the scientific effort to give electronic devices emotional intelligence so that they can respond to our stubbornly human feelings and make our lives better.

Currently the big hype in computer science is around artificial intelligence – imbuing computers with the ability to learn from data and make rational decisions in areas such as financial trading or healthcare. From September to December 2014, just nine AI companies raised $201.6 million from Silicon Valley investors who all want in on the gold rush.

But scientists like El Kaliouby think emotion-sensing is as important for a machine’s intelligence as data-driven rationality. ‘It’s not just about human-computer interaction. I realised that by making machines have emotional intelligence, our own communication could become better,’ she says.

Today the idea has started to take root in the public imagination. Another Media Lab roboticist, Cynthia Breazeal, has built Jibo, a Disney cartoon-like family robot that can perform simple tasks such as reading a story to a child at bedtime or giving voice reminders from a to-do list. It recognises faces and can have simple conversations, and its emotions are powered by Affectiva software.

Picard, for her part, had been determined never to study feelings – they were too irrational and ‘girly’. “How to sabotage your career in one easy step? Start working on emotion!” she says, laughing. “I was afraid people wouldn’t take me seriously.” Yet when she dug into it, she found emotion was one of the key ingredients of intelligent perception – it tells humans what to pay attention to and what to ignore.

But in her quest to build an artificially intelligent computer, the scientist, now a professor at MIT, kept running across emotions. “I became convinced you couldn’t build a truly intelligent computer without having emotional capabilities like humans do,” she says.

Once Picard had decided to found her lab on this principle, she began to measure heart fluctuations, skin conductance, muscle tension, pupil dilation and facial muscles in order to figure out which changes in our body consistently relate to emotions. “We started wiring ourselves up with all these electrodes, pretty hideous-looking, then taking all our data and crunching it,” she recalls.

But it was worth it. “Lo and behold, we found that within a person over a long period of time there were consistent patterns that related to several emotions,” she says. “We could teach their wearable computer to recognise those patterns.” In other words, a computer with a camera could start to learn how to take lots of different data points from your face, and map them to a smile or a frown. This was the first step towards the product built by Affectiva.
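As a rough illustration of that mapping step – the features and labelled examples below are invented for illustration, not Affectiva’s actual model – a minimal nearest-centroid classifier might look like this:

```python
import math

# Hypothetical feature vectors: each face frame is reduced to a couple of
# numeric measurements (say, mouth-corner lift and brow lowering).
# The labelled examples are invented, not real training data.
TRAINING = {
    "smile": [(0.8, 0.1), (0.9, 0.2), (0.7, 0.0)],
    "frown": [(0.1, 0.9), (0.2, 0.8), (0.0, 0.7)],
}

def centroid(points):
    """Average each coordinate across the example points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(features):
    """Return the expression whose centroid lies nearest the new frame."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify((0.85, 0.15)))  # a mouth-corner-lift pattern -> "smile"
```

Real systems use far richer features and learned models, but the principle is the same: many measurements per frame, mapped to a labelled expression.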

Co-founded by Picard and El Kaliouby, who was a researcher in her lab, Affectiva is one of the most successful companies in facial-expression analysis – it is backed by $20 million and has customers ranging from the BBC to Disney. Picard has since left to work on a new emotional computer that focuses on medical conditions such as autism and epilepsy, while El Kaliouby has taken over the reins as Affectiva’s chief scientific officer.

El Kaliouby has that rare quality of putting you at ease instantly. Her face is open and warm, with a dazzling smile, and she is happy to share details of her private life within minutes of meeting: she had a long-distance marriage for many years, and is currently divorced with two kids – one plays the harp and the other does tae kwon do. She checks that I’ve eaten breakfast. Her emotion-reading software would probably rate her emotional intelligence ‘high’. “I was always particularly interested in the face, because I’m very expressive and Egyptians in general are very expressive people,” she explains.

While Affectiva has been focused on commercial applications, Picard decided to go back to the area that most fascinated her: emotion-sensing wearables for healthcare.

In the early days of her research one of Picard’s neighbours was asking about her work and she explained it to him as “teaching computers to recognise facial expressions, to try and understand emotion.” He asked, “Could you help my brother? He has autism and he has the same difficulties.”

The more Picard read about autism, the more she began to realise that an emotion-decoder could help autistic people interact better with others. Meanwhile, El Kaliouby was still finishing her PhD at the University of Cambridge, where she, too, had come across the strange parallels between people with autism and computers. She began building a system, which she called Mind Reader, that could recognise emotions and act as an emotional crutch for people with autism by giving them feedback.

When she joined Picard’s lab, they put the software into a pair of glasses with a little in-built camera. “It looked a lot like Google Glass, which came much later,” El Kaliouby laughs. The glasses were tested on children with varying degrees of autism, ranging from highly functional to non-verbal, at the Groden Center in Rhode Island.

The newest device is known as the E4, designed in collaboration with an Italian start-up called Empatica that is focused on medical-grade wearables. The $1,690 device, which has recently gone on sale to the public, has already been used to study stress, autism, epilepsy, PTSD and depression in clinical studies with Nasa, Intel, Microsoft and MIT among others.

As I wrap it tightly round my wrist, it buzzes when it connects to an app on Picard’s iPhone and starts streaming my biometric data: my temperature, blood-volume pulse, plus electrodermal activity that could indicate stress. One of the primary uses of the E4 is to predict dangerous epileptic seizures at home. ‘It was a completely accidental finding,’ Picard says.

Over Christmas in 2011, one of her undergraduate students took two autism wristbands home for his little brother, who couldn’t speak. He wanted to know what was stressing him out. “Over the holiday I was looking at his data on my screen and every day looked normal, but suddenly one of the wristbands went through the roof and the other didn’t respond at all.”
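The kind of screening that would flag such a surge can be sketched very simply – the window size and threshold below are illustrative assumptions, not Empatica’s algorithm:

```python
# Toy sketch: screen a wristband's electrodermal-activity (EDA) stream
# for sudden surges relative to a short rolling baseline.

def detect_surges(samples, window=5, factor=3.0):
    """Flag indices whose value exceeds `factor` times the mean
    of the preceding `window` samples."""
    surges = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if baseline > 0 and samples[i] > factor * baseline:
            surges.append(i)
    return surges

# A flat signal with one dramatic spike, like the wristband
# that "went through the roof":
eda = [0.4, 0.5, 0.4, 0.5, 0.4, 0.5, 4.8, 0.5, 0.4]
print(detect_surges(eda))  # [6]
```

Clinical seizure detection combines several signals (EDA plus motion, in the published work on the E4’s predecessors) and far more careful statistics, but the core idea – spotting a departure from a personal baseline – is the same.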

Emotion-sensing is also being used to understand chronic pain. With a motion-capture system, similar to Microsoft’s Xbox Kinect, the researcher Nadia Berthouze and her students can recreate an animated version of a patient’s movements: standing upright, reaching forwards, bending to touch the ground and straightening up again. They also use two sensors to measure muscle activity in the back and neck.

By comparing these models to those of healthy people’s movements, Berthouze can create computer algorithms to differentiate levels of pain. ‘Ultimately we want to develop a low-cost wearable system that could be embedded in trousers, in shoes or a jacket to monitor pain levels, and help people feel better by recommending physiotherapy exercises,’ she says.
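A crude sketch of how such a comparison might work – the features, reference values and thresholds below are invented for illustration, not Berthouze’s actual model:

```python
# Hypothetical movement features compared against healthy reference
# values; deviations are combined into a coarse pain-level estimate.
# All numbers are invented.

HEALTHY_MEANS = {"bend_angle_deg": 90.0, "reach_cm": 40.0, "back_emg": 0.2}
HEALTHY_STDS = {"bend_angle_deg": 10.0, "reach_cm": 5.0, "back_emg": 0.05}

def pain_level(features):
    """Average each feature's z-score magnitude and bucket the result."""
    z = [abs(features[k] - HEALTHY_MEANS[k]) / HEALTHY_STDS[k]
         for k in HEALTHY_MEANS]
    score = sum(z) / len(z)
    if score < 1.0:
        return "low"
    if score < 2.5:
        return "moderate"
    return "high"

# A guarded movement pattern: shallow bend, short reach, tense back.
print(pain_level({"bend_angle_deg": 55.0, "reach_cm": 22.0, "back_emg": 0.5}))
```

A wearable version of such a system would learn these reference patterns from data rather than hand-set them, which is what makes the algorithms Berthouze describes non-trivial.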

Affective scientists such as El Kaliouby, Picard and Berthouze all agree that emotionally intelligent devices will soon become a part of our daily lives. Already, wearables such as the Apple Watch can do rudimentary ‘emotion’ measurements of your heart rate. And examples of emotionally aware devices are popping up in unexpected places.

‘Even my toothbrush actually smiles at me if I brush for two minutes,’ Picard says, laughing. ‘I know it’s just a little algorithm with a timer, but I still think, I can brush another 15 seconds to get that smile!’

Next, your smartphone could come with a little emotion chip, just like the GPS chip that provides a real-time location service. It might tell you to avoid scheduling an important meeting when you seem tired, or suggest taking a break when your attention wanders.

At home, your emotion-sensing refrigerator could tell you to resist the ice cream today, based on your stress levels, or your car could warn you to drive slowly this morning because you seem upset. ‘We are going to see an explosion of richness in this area,’ Picard says. ‘The age of the emotional machines – it’s coming.’

This article was written by Madhumita Murgia from The Daily Telegraph and was legally licensed through the NewsCred publisher network.

There is 1 comment

  • steven freilich - 01/19/2016 21:04
    There is much about ourselves we cannot reach or don't have access to – akin to a blood-pressure reading revealing anxiety unknown to us. Computers can help us reach ourselves, when up to now we have had to rely solely, to use attachment-theory vernacular, on the 'mirroring of others'. While this is often helpful, the problem is that other humans have their own agendas and issues that get triggered, so getting support or feedback from another human becomes murky. One doesn't know how accurate (versus 'funhouse') the mirror they are standing in front of is. Computers can bring their own mirrors, helping with the bias issue above and adding more 'reflectors'. People with autism will be helped, but there are many other applications, including potentially helping families lower the staggering divorce rate. Many people with intimacy issues – from anger management to a limited (though not autism-level) capacity for empathy, for being mirrors themselves – could get essential feedback. And they would get it in real time, where now they have to wait for a marriage-counselling session. It would also take less effort to 'prove to them' what they are missing from their partner and/or what they are 'giving off'. It's biofeedback, but much more 'in vivo' (real life) and with many more applications. My research on social support with multiple supporters was the first to show, with regression analysis, that interactions intended to be supportive can have unintended negative effects – to the point of altering the results of the study – because of the negative/aversive element that is always present, and often undetected, whenever one person tries to help another. Steve Freilich, Ph.D.
