March 24, 2016 | Written by: Colin McCabe
Categorized: Events | Watson
I had the opportunity to ask the winning team of last week's SXSW Hackathon Championship a few questions. Their app, Cognitunes, was the judges' favorite for best Consumer Music app, winning $3,500 and the attention of the developer community. Sean Cascketta took me through the team's approach and how Watson was crucial to the product's development.
Who is your team, and what is your background?
Our team is three people – Gus Ireland, Aaron Austin, and myself. Gus and Aaron are both software engineers, working at Rackspace and General Motors respectively. I am a 4th-year undergraduate Computer Science student at The University of Texas at Austin. I knew Gus beforehand since we are both members of a civic tech organization called Open Austin, a local brigade of Code for America. Gus and I met Aaron for the first time at the hackathon, but he was eager to work with us when he heard we were going to make a project with the Amazon Echo.
How did the idea come about?
When we found out that the Amazon Alexa team had demo Echo units available, we knew we wanted to make something using the slick voice interface of the Echo. Gus cleverly suggested that we put a twist on the Echo’s existing music player capabilities by automatically selecting music to play for a user based on their mood. Before we set the idea in stone, we looked for a way to run sentiment analysis beyond just positive or negative. That was when we talked to the IBM Watson team at the hackathon, and discovered that the Alchemy API was just what we needed.
How did you use Alchemy API?
Once we figured out how to use the Alexa Skills Kit API to transform a user's speech into text, we needed a way to use that text to understand how a user was feeling. We used the Alchemy API's Emotion Analysis service, which takes a text sample and scores it across five emotions (fear, anger, joy, disgust, and sadness), each from 0 to 1 depending on the perceived strength of the emotion. We then used the emotion with the highest score to determine what type of music to play.
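The selection step the team describes can be sketched in a few lines of Python. Note the emotion scores below are hard-coded for illustration (in the real app they would come from the Emotion Analysis service), and the emotion-to-genre mapping is a hypothetical example, not the team's actual playlist logic:

```python
# Hypothetical mapping from the five Emotion Analysis categories
# to a music genre. The real Cognitunes mapping is not published here.
EMOTION_TO_GENRE = {
    "anger": "uplifting house",
    "sadness": "mellow acoustic",
    "joy": "upbeat pop",
    "fear": "calm ambient",
    "disgust": "classic rock",
}

def pick_genre(emotion_scores):
    """Return a genre for the highest-scoring emotion.

    emotion_scores: dict mapping each emotion name to a score in [0, 1],
    as returned by an emotion-analysis service.
    """
    strongest = max(emotion_scores, key=emotion_scores.get)
    return EMOTION_TO_GENRE[strongest]

# Example: a frustrated utterance scores highest on "anger",
# so the app would queue up something uplifting.
scores = {"anger": 0.82, "sadness": 0.10, "joy": 0.03,
          "fear": 0.02, "disgust": 0.01}
print(pick_genre(scores))  # "uplifting house"
```

Picking the single strongest emotion keeps the behavior predictable during a live demo; a blended approach (weighting several emotions at once) would be a natural next step.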
What did the judges think?
We definitely got a strong positive response from the judges, especially some laughs when the Echo played “Don’t You Worry Child” by Swedish House Mafia in response to “I’m so mad that I had to wait in line for two hours yesterday”. Above everything else, our demo’s best quality was simplicity — all we used for our demo was an Amazon Echo. There was no other device or even a PowerPoint involved. It literally spoke for itself.
What’s next for you?
I can tell you that all of us are thrilled to dig deeper into the possibilities of the Amazon Alexa ecosystem. We believe the frictionless voice interface has immense market potential, especially for people who have difficulty using existing desktop and mobile applications. One thing that I would like to build is an Alexa skill utilizing Chef Watson. How cool would it be if you could tell the Echo what ingredients you have, and it replies with ingredient pairings or recipe suggestions from Chef Watson?
Check out Cognitunes on GitHub, and follow Sean on Twitter.