This is a project to show off the use of affective computing (emotional AI) in the Watson suite, IBM’s collection of cloud-based AI APIs. We hooked this into the Watson SDK for Unity, which allowed us to work in a 3D environment.
The goal here was simple: to create an animated object that reacts to a user’s voice, and understands the emotion it is conveying.
First, we needed Watson Speech to Text to detect when somebody is talking, and then to transcribe what is being said. We also used it to pick up on certain keywords, in order to trigger other actions like clearing and resetting the animation.
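To give a feel for the keyword step, here is a minimal sketch in Python (the project itself uses C# in Unity). The keyword-to-action mapping and function name are illustrative, not taken from the project; Watson Speech to Text can also do keyword spotting server-side via its `keywords` parameter.

```python
# Illustrative mapping from spoken trigger words to animation actions.
# These names are hypothetical, not from the original project.
TRIGGER_ACTIONS = {
    "clear": "clear_animation",
    "reset": "reset_animation",
}

def spot_keywords(transcript: str) -> list:
    """Return the actions triggered by any keywords in a transcript."""
    words = transcript.lower().split()
    return [action for word, action in TRIGGER_ACTIONS.items() if word in words]
```

Each time a transcript comes back from Speech to Text, it gets scanned for the trigger words before any tone analysis happens.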
Once we had the text, we could then run it through Watson Tone Analyser. This allowed us to pick up on the emotional content of the speech. We used the five-factor model from modern psychology, which assumes there are five core emotions: Joy, Sadness, Fear, Disgust, and Anger. Any other feelings are seen as a mixture of one or more of these core five.
Importantly, we can also measure how confident we are in this emotional content: Watson Tone Analyser produces a confidence score for each emotion every time we analyse text. If the system is confident, and the user expresses enough of a particular emotion, the Droid in our animation scene reacts to it.
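A minimal sketch of that confidence check, again in Python rather than the project's C#. The response shape mirrors Tone Analyser's JSON (a `document_tone` object holding a list of `tone_id`/`score` entries); the 0.75 threshold is an illustrative value, not the project's actual tuning.

```python
# Illustrative threshold: only act on emotions Watson is fairly sure about.
CONFIDENCE_THRESHOLD = 0.75

def confident_tones(tone_response: dict) -> dict:
    """Keep only the emotions whose confidence score clears the threshold."""
    return {
        tone["tone_id"]: tone["score"]
        for tone in tone_response.get("document_tone", {}).get("tones", [])
        if tone["score"] >= CONFIDENCE_THRESHOLD
    }

# Example: a response where only "joy" is confident enough to react to.
response = {"document_tone": {"tones": [
    {"tone_id": "joy", "score": 0.92},
    {"tone_id": "sadness", "score": 0.31},
]}}
```

Everything below the threshold is simply ignored, so a half-hearted mumble doesn't swing the Droid's mood.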
To model emotion inside the project, we created a concept of emotional buildup, alongside several thresholds.
As you talk, and Watson Tone Analyser tells us you are expressing a certain emotion, we add to the emotional buildup of that feeling. This also triggers decay in the other emotions, and if an emotional buildup hits one of our thresholds, a new emotional state is reached.
In other words, if the Droid is sad and you say happy things, it becomes happier and the “sadness” emotion begins to decay. If you say enough happy things, it will hit one of the thresholds and its emotional state will change from being sad to being happy.
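The buildup-and-decay idea above can be sketched in a few lines of Python. The class name and all the constants (the buildup step, decay step, and threshold) are illustrative assumptions; the project's real values and C# implementation may differ.

```python
# A minimal sketch of the buildup/decay/threshold model. All tuning
# values here are hypothetical, not the project's actual numbers.
EMOTIONS = ["joy", "sadness", "fear", "disgust", "anger"]
BUILDUP_STEP = 0.2   # added to the emotion the speaker is expressing
DECAY_STEP = 0.1     # subtracted from every other emotion
THRESHOLD = 0.6      # buildup needed to switch emotional state

class EmotionModel:
    def __init__(self, initial_state="sadness"):
        self.buildup = {e: 0.0 for e in EMOTIONS}
        self.state = initial_state

    def hear(self, emotion: str) -> str:
        """Register one confidently detected emotion; return the new state."""
        for e in EMOTIONS:
            if e == emotion:
                self.buildup[e] = min(1.0, self.buildup[e] + BUILDUP_STEP)
            else:
                self.buildup[e] = max(0.0, self.buildup[e] - DECAY_STEP)
        # Crossing the threshold flips the Droid into a new emotional state.
        if self.buildup[emotion] >= THRESHOLD and self.state != emotion:
            self.state = emotion
        return self.state
```

With these example numbers, a sad Droid hearing three confidently joyful utterances in a row crosses the threshold and switches to a happy state, while its sadness quietly decays away in the background.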
Here are some excerpts from the code, showing how it was done.
Here you can see my amazing colleague Amara Graham (Keller) giving the code a whirl. We picked a little Droid for the project, and gave it a very simple (and looping) animation for each emotion. Watch the video to see it in action, or download the code and try it yourself!