
Building the IBM Emotive Droid


Animating emotion.

This is a project to show off the use of affective computing (emotional AI) in the Watson suite, IBM’s collection of cloud-based AI APIs. We hooked this into the Watson SDK for Unity, which allowed us to bring it into a 3D environment.

The goal here was simple: to create an animated object that reacts to a user’s voice and understands the emotion it conveys.

First, we needed Watson Speech to Text to detect when somebody is talking and to transcribe what is being said. We also used the transcription to pick up on certain keywords, in order to trigger other actions such as clearing and resetting the animation.
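To make that step concrete, here is a minimal sketch of how an incoming transcript might be routed; the class, method and keyword names are illustrative assumptions, and the actual Watson SDK callbacks are left out:

```csharp
// Minimal sketch: routing a final transcript string from speech-to-text.
// Class, method and keyword names are hypothetical, not the project's actual code.
public class TranscriptHandler
{
    // Keywords that trigger housekeeping actions rather than emotion analysis.
    private static readonly string[] ResetKeywords = { "reset", "clear" };

    // Called whenever a final transcript arrives from the speech-to-text service.
    public void OnTranscript(string transcript)
    {
        string text = transcript.ToLowerInvariant();

        foreach (string keyword in ResetKeywords)
        {
            if (text.Contains(keyword))
            {
                ResetAnimation();
                return;
            }
        }

        // Otherwise, hand the text on for tone analysis.
        AnalyseTone(text);
    }

    private void ResetAnimation() { /* clear buildups, return the Droid to idle */ }
    private void AnalyseTone(string text) { /* send the text to Tone Analyser */ }
}
```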

Once we had the text, we could then run it through Watson Tone Analyser, which allowed us to pick up the emotional content of the speech. We used the five-factor model from modern psychology, which assumes there are five core emotions: joy, sadness, fear, disgust and anger. Any other feeling is seen as a mixture of one or more of these core five.

Importantly, we can also measure how confident we are in this emotional content: Watson Tone Analyser produces a confidence score for each emotion every time we analyse text. If the system is confident enough, and the user expresses enough of a particular emotion, the Droid in our animation scene will react to it.
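As a rough sketch of that confidence gate (the enum mirrors the five core emotions above; the type names and the 0.75 threshold are illustrative assumptions rather than values from the project):

```csharp
// Minimal sketch: gating reactions on per-emotion confidence scores.
// The 0.75 threshold and the type names are assumed for illustration.
using System.Collections.Generic;

public enum CoreEmotion { Joy, Sadness, Fear, Disgust, Anger }

public static class ToneGate
{
    private const double ConfidenceThreshold = 0.75; // assumed value

    // scores: one confidence score per emotion, as returned for a piece of text.
    // Returns the emotions expressed strongly enough for the Droid to react to.
    public static IEnumerable<CoreEmotion> ConfidentEmotions(
        IDictionary<CoreEmotion, double> scores)
    {
        foreach (var pair in scores)
        {
            if (pair.Value >= ConfidenceThreshold)
            {
                yield return pair.Key;
            }
        }
    }
}
```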

Modeling Emotion

To model emotion inside the project, we introduced the concept of an emotional buildup, alongside several thresholds.

As you talk, and Watson Tone Analyser tells us you are expressing a certain emotion, we add to the emotional buildup of that feeling. This also triggers decay in the other emotions, and if an emotional buildup hits one of our thresholds, a new emotional state is reached.

In other words, if the Droid is sad and you say happy things, it becomes happier and the sadness buildup begins to decay. If you say enough happy things, the joy buildup will hit one of the thresholds and the Droid’s emotional state will change from sad to happy.
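A minimal, self-contained sketch of that buildup/decay/threshold logic might look like the following (it reuses the CoreEmotion enum from the earlier sketch; the gain, decay and threshold values are illustrative assumptions rather than the project’s actual numbers):

```csharp
// Minimal sketch of the buildup/decay/threshold idea described above.
// Gain, decay and threshold values are assumed for illustration.
using System;
using System.Collections.Generic;
using System.Linq;

public class EmotionModel
{
    private const float Gain = 0.2f;       // how much one confident utterance adds
    private const float Decay = 0.05f;     // how much the other emotions lose
    private const float Threshold = 1.0f;  // buildup needed to switch state

    private readonly Dictionary<CoreEmotion, float> buildup =
        Enum.GetValues(typeof(CoreEmotion)).Cast<CoreEmotion>()
            .ToDictionary(e => e, e => 0f);

    public CoreEmotion CurrentState { get; private set; } = CoreEmotion.Joy;

    // Called for each emotion the Tone Analyser is confident about.
    public void Express(CoreEmotion emotion)
    {
        foreach (var e in buildup.Keys.ToList())
        {
            if (e == emotion)
                buildup[e] += Gain;                            // build the expressed emotion
            else
                buildup[e] = Math.Max(0f, buildup[e] - Decay); // decay the others
        }

        if (buildup[emotion] >= Threshold && emotion != CurrentState)
        {
            CurrentState = emotion;   // a threshold was crossed: switch state
            buildup[emotion] = 0f;    // start building afresh in the new state
            // In the Unity scene, this is where the Droid's looping animation would change.
        }
    }
}
```

Resetting the buildup after a state change keeps the Droid from flip-flopping between emotions on a single borderline utterance.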

The full code, showing how it was done, is available in the repository linked below.

Results

Here you can see my amazing colleague Amara Graham (Keller) giving the code a whirl. We picked a little Droid for the project, and gave it a very simple (and looping) animation for each emotion. Watch the video to see it in action, or download the code and try it yourself!

https://github.com/GwilymNewton/IBM-Emotive-Droid-Demo
