Feeling AI

Feature Story

We all develop emotional connections to the devices we use: the smartphone that is a constant companion, or the office printer that is a constant source of frustration. Soon, these machines might be able to respond in kind.

The term AI is a little misleading. Artificial intelligences are capable of some incredible feats: in a few minutes they can parse huge datasets that would take a human being a lifetime, and they can out-strategise or outperform humans at the top of their fields. But artificial intelligences are not really so intelligent. Sure, they’re increasingly adept at image recognition and recreating speech; both tasks have been mastered by AI far more quickly than industry experts thought possible just half a decade ago. And yet, for all their abilities, artificial intelligences are only ‘intelligent’ in the most superficial sense. AI is not capable of self-awareness; it cannot reflect upon its own feelings because, of course, it has none. We have all shouted at a misbehaving piece of tech or become angry at a slow-moving computer, but this is merely down to the human capacity to project our own emotions onto dumb, unfeeling machines. We can get frustrated by machines, and sometimes we may even be pleasantly surprised when one performs its task well. But our feelings towards artificial intelligences are never reciprocated. That is, until now.

Ford, the world’s fifth-largest car manufacturer, has recently unveiled an EU-funded project that will allow vehicles to detect and interpret the emotions of drivers and passengers. Why would you want a car to know how its occupants are feeling? If the car’s onboard AI detects that the driver is tired or distracted, or perhaps about to let their frustration get the better of them, the car can switch to self-driving mode and take control for a few vital seconds. A potentially life-saving intervention based on an emotional response. All of this can be done by leveraging existing technology: face recognition is now accurate enough that a car’s onboard cameras can interpret the driver’s facial expressions and infer their emotional state, while wearable tech such as heart rate and heat sensors can connect to the car’s onboard computer to monitor the driver’s emotions in real time.
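To make the idea concrete, the intervention described above amounts to a simple decision rule over the signals the car can sense. The sketch below is purely illustrative: the emotion labels, thresholds and names are assumptions for the sake of example, not details of Ford’s actual system.

```python
# Illustrative sketch only: labels, thresholds and field names are
# assumptions, not details of any real in-car system.

from dataclasses import dataclass

@dataclass
class DriverState:
    emotion: str        # e.g. output of an onboard facial-expression classifier
    heart_rate: int     # beats per minute, e.g. from a connected wearable
    eyes_on_road: bool  # e.g. from camera-based gaze tracking

def should_take_control(state: DriverState) -> bool:
    """Decide whether the car should briefly switch to self-driving mode."""
    if not state.eyes_on_road:
        return True                       # driver is distracted
    if state.emotion in ("drowsy", "angry"):
        return True                       # fatigue or frustration detected
    if state.heart_rate > 120:
        return True                       # possible acute stress
    return False
```

In a real vehicle a rule like this would run continuously against a stream of sensor readings; the point here is simply that an emotional inference can feed directly into a safety decision.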

The benefits for road safety are obvious, and incentives such as reduced insurance premiums are sure to help popularise this technology. However, the potential applications for emotionally-sensitive AI go far beyond the confines of motor vehicles.

Just as emotionally-sensitive AI can look out for the safety of drivers, it can watch over workers on a factory floor, automatically stopping heavy machinery if it sees that a worker has become distracted. It could also be applied to any kind of workplace to oversee interpersonal relations between workers. After all, people who are getting along will work more productively together, and emotionally-sensitive AI could spot potential conflicts or disagreements before they escalate.

There are broader uses for this technology too. As reported in SIGNED issue 17, retailers are increasingly turning to facial recognition technology to improve the experience of visiting their stores. By integrating emotionally-sensitive AI into their existing facial recognition infrastructure, retailers will be able to assess the efficacy of their operations: are customers relaxed or harried in the store? Do they find the retail experience stressful, boring or exhilarating? With this data, retailers can tweak the shopping experience to help maximise profit.

And emotionally-sensitive AI has the potential to help us in our leisure time too. Film producers spend millions showing movies to test audiences to gauge their reaction, recutting and even reshooting parts of a movie depending on the response. But, using traditional questionnaires or focus groups, it is notoriously difficult to tell how authentic a test audience’s reaction is. With emotionally-sensitive AI, there’s no need to ask questions or stage focus groups: the technology will be able to ascertain whether a film is engaging or alienating its audience simply by monitoring them during the screening.

Emotionally-sensitive AI is also likely to find its way into domestic spaces. Digital assistants are increasingly common in our homes, yet their biggest weakness is an inability to understand the emotional context of their users’ voice commands. The user inevitably issues instructions to their digital assistant in varying tones, depending on whether they’re in a rush, relaxed, or dealing with an emergency. Being able to understand the implications of these different vocal tones will allow digital assistants to serve up more relevant, better tailored information based on the user’s needs.

AI is increasingly common in the classroom, as computers play an ever more central role in education. It is likely, therefore, that emotionally-sensitive AI will eventually fulfil the role of classroom assistant, helping the teacher to care for the emotional needs of their charges and to make sure students are concentrating and behaving themselves.

All of these potential applications of emotionally-sensitive AI will rely on technology that can sense vital signs from the people it is monitoring. Fitness and activity trackers are increasingly popular and can gather much of the data that emotionally-sensitive AI will require, but not everyone wears one, and not everyone who does will volunteer to connect their activity data to whichever AI infrastructure wants to use it. So unobtrusive sensing technology is vital to the future success of emotionally-sensitive AI. Facial recognition cameras are a good example, being reasonably non-invasive. Infrared cameras can accurately measure the changes in an individual’s body heat that accompany emotional responses. And researchers at MIT have shown that a camera no more sensitive than the one found in a smartphone can remotely detect a subject’s heart rate based on minuscule changes in their skin colour.
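The camera-based heart-rate trick works by treating the average skin colour in each video frame as a faint periodic signal whose dominant frequency is the pulse. The toy sketch below illustrates that idea using a simulated per-frame green-channel signal rather than real camera footage; the function name and all the numbers are illustrative assumptions, not the MIT researchers’ actual code.

```python
# Toy illustration of remote heart-rate estimation: recover the pulse
# from tiny periodic colour changes. We simulate the per-frame mean
# green-channel value instead of reading a real camera feed.

import numpy as np

def estimate_heart_rate(green_means: np.ndarray, fps: float) -> float:
    """Estimate heart rate in bpm from a per-frame green-channel signal."""
    centred = green_means - green_means.mean()       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(centred))
    freqs = np.fft.rfftfreq(len(centred), d=1.0 / fps)
    # Only consider plausible human heart rates (40-180 bpm).
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Simulate a 10-second clip at 30 fps with a 72 bpm pulse buried in noise.
np.random.seed(0)
fps, bpm = 30.0, 72.0
t = np.arange(0, 10, 1 / fps)
green = 0.5 * np.sin(2 * np.pi * (bpm / 60) * t) + np.random.normal(0, 0.2, t.size)
print(round(estimate_heart_rate(green, fps)))  # prints 72, the simulated pulse
```

Real systems face harder problems, such as head movement and lighting changes, but the core principle is this: a heartbeat is periodic, so it shows up as a spectral peak that no questionnaire could hide.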

The technology needed to make all of these applications of emotionally-sensitive AI a reality is already reasonably well-established, which means that in the coming few years we’re likely to see a lot more of it.