Humans are emotional beings, and emotions are at the heart of all our actions. We like to imagine ourselves as rational beings; after all, Aristotle defined the human as the only animal endowed with reason. And yet it is our emotions that guide our behavior, our decisions, and our actions.
As the etymology of the word "emotion" tells us, emotions set us in motion. They are the physical and psychological response, often complex and more or less intense, to external stimuli. Feelings, for their part, are the product of emotions once we try to rationalize them. For example, a defective product will trigger an emotion such as anger, which leaves me with a negative feeling toward the brand.
Everything must be experiential!
“89% of companies expect to compete primarily on the basis of customer experience, up from 36% four years ago.” (Gartner)
Companies have understood that controlling emotions can be the key to selling, which is why they take such a close interest in them. This is the basis of “experiential marketing”, where every interaction with the customer must be a memorable experience. Think of the effort a company like Apple puts into something as seemingly insignificant as its packaging, relative to the value of the product.
Use cases are multiplying, sometimes flirting with the limits of ethics:
- Marketing: agencies integrate the measurement of emotions into their analyses
- Media, brands and products: measuring a panel’s reactions to a creation (a film, an advertisement, a speech, etc.) makes it possible to evaluate its potential. The same goes for the product tests that guide design.
- Human resources management: analyzing the behavior and facial expressions of candidates or employees during interviews makes it possible to detect potential risks.
- Well-being: a growing number of connected devices let you monitor yourself, like the Dreem headband that “combines the most effective methods, from biofeedback to neuromodulation, to improve your daily sleep”.
Our emotions: a new playground for AI
Mastering emotions means, on the one hand, understanding them and, on the other, generating them:
- To understand them is to be able to measure the impact of the environment on individuals. How is my product perceived? My advertising campaign? And so on.
- To generate them is to be able to arouse a given emotion in an individual. This is the grail of all commercial communication.
“Everything is artificial, to some extent” (Andy Warhol)
The state of the art in scientific research does not yet make it possible to truly understand emotions, as Professor Laurence Devillers, who leads a research team on the emotional and social dimensions of spoken interactions, reminds us.
But even without understanding emotions, artificial intelligence makes it possible to measure their effects, their traces. A new term has been coined for this new playground for algorithms: emotional or affective computing. According to Crone Consulting, the global market for emotion analysis is expected to reach $10 billion by 2020, up from $20 million in 2015.
With the data available, and with machine learning techniques in general and deep learning in particular, we can build models to measure emotions.
In this race to measure, all data are fair game:
- Text: the most commonly used data. Comments, conversations, reviews on specialized sites: everything goes.
- Images and videos: probably the richest source of data for measuring emotions. The latest artificial intelligence technologies make it possible to evaluate faces and eye movements, to analyze gestures, and even to track breathing. It is hard to imagine how many signals can be extracted from a single video. What remains is to give them meaning.
- Voice: like images and videos, the voice carries a great deal of information about a person’s emotional state. Intonation, rhythm, breathing: everything is material for analysis.
- Sensors: ever more numerous, they make it possible to measure signals that were previously inaccessible. With the rise of connected watches, for example, heart rate, sweating and body movements can be measured unobtrusively and almost continuously.
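To make the text channel concrete, here is a deliberately naive lexicon-based sentiment scorer. The word lists are invented for the example, and real systems rely on trained models rather than hand-built lexicons, but the idea of turning raw text into a signed score is the same:

```python
# Toy lexicon-based sentiment scorer.
# The word lists below are illustrative, not a real sentiment lexicon.
POSITIVE = {"love", "great", "amazing", "happy", "excellent"}
NEGATIVE = {"angry", "defective", "bad", "terrible", "hate"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: -1 fully negative, +1 fully positive."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I love this amazing product"))      # 1.0
print(sentiment_score("The defective item made me angry")) # -1.0
```

A trained model replaces the fixed word lists with weights learned from labeled examples, but it still maps text to this kind of score.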
Combine all these data sources and you get a fairly accurate measure of emotions. What remains is to make sense of these measurements.
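One simple way to combine the channels is late fusion: score each modality independently, then take a weighted average. The weights below are hypothetical, chosen only to illustrate the mechanics:

```python
# Toy "late fusion" of per-modality emotion scores.
# The weights are hypothetical, not taken from any published model.
WEIGHTS = {"text": 0.2, "face": 0.4, "voice": 0.25, "sensors": 0.15}

def fuse(scores: dict) -> float:
    """Weighted average over the modalities actually observed."""
    seen = {m: s for m, s in scores.items() if m in WEIGHTS}
    total_w = sum(WEIGHTS[m] for m in seen)
    return sum(WEIGHTS[m] * s for m, s in seen.items()) / total_w

# Anger estimated independently from three channels:
print(fuse({"face": 0.9, "voice": 0.7, "text": 0.4}))
```

Renormalizing over the observed modalities means the estimate degrades gracefully when a channel (say, the camera) is missing.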
After measurement, on the way to generating emotions
For the moment this field is in its infancy, but there is no shortage of players investing in it. Google, for example, is working on Google Duplex, a voice chatbot whose artificial voice and verbal tics give the illusion of dealing with a sentient being. The demonstration is impressive on this point.
Another front is the race for creative AI. Olivier Reynaud, for example, wants to build an AI capable of generating Oscar-winning films.
The control of emotions still faces limits:
- Ethical limit: this is probably the strongest one. What do we want to do with these new capabilities? China is charting a path straight out of a dystopia.
- Technical limit: the deep learning approaches heavily used in this field require large amounts of data that companies rarely have at their disposal. With transfer learning, which consists of taking an already trained model and specializing it, this limit is becoming less severe.
- Creative limit: we are still at the beginning, and for the moment “emotional” technologies are mainly used to measure the impact of an action or a message. But the potential of these technologies goes far beyond that.
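The transfer learning idea behind the technical limit can be sketched in a few lines: freeze a pretrained feature extractor and train only a small head on the new, smaller dataset. Everything below is synthetic (the "pretrained" weights are random, the data is generated), purely to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained network: a frozen projection.  In a real
# setting these weights come from a model trained on a large corpus;
# here they are random, purely to illustrate the mechanics.
W_frozen = rng.normal(size=(10, 8))

def features(x):
    """Frozen feature extractor: W_frozen is never updated."""
    return np.tanh(x @ W_frozen / np.sqrt(10))

# Tiny synthetic "emotion" dataset (binary labels).
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Trainable head: logistic regression on the frozen features,
# fitted by plain gradient descent.
F = features(X)
w, b = np.zeros(8), 0.0
lr = 0.5
for _ in range(500):
    p = 1 / (1 + np.exp(-(F @ w + b)))
    w -= lr * (F.T @ (p - y)) / len(y)
    b -= lr * (p - y).mean()

acc = ((1 / (1 + np.exp(-(F @ w + b))) > 0.5) == (y == 1)).mean()
print(f"training accuracy of the head: {acc:.2f}")
```

Only the small head (9 parameters here) is learned from the target data, which is why transfer learning softens the data-hunger of deep models.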