April 19, 2016 | Written by: Lexie Komisar
The SXSW Music Hackathon, which straddles the interactive and music portions of the conference, is one of the most anticipated events of SXSW, bringing together technologists and artists to create solutions for real problems in the music industry. IBM Watson was an API partner, as was Quantone, a UK-based music intelligence software company that has integrated Watson into its MusicGeek API. MusicGeek™ is built with IBM’s Watson technology, making it the first music platform to use cognitive computing. Watson “reads” blogs and biographies (at a rate of 800 million pages per second), building an understanding of the music world based on expert opinion and the guidance of our content team.
This year, more than 500 developers applied for a coveted 150 hackathon spots to compete within three hack categories: Commerce, Creation, and Consumer. Tech luminaries from Amazon, TechCrunch, Pandora and 500 Startups judged the event.
Over the course of 24 hours, developers from across the United States came together to build game-changing technology to help push the music industry forward. Three high-profile artists were in residence (Alex Ebert of Edward Sharpe and the Magnetic Zeros, recording artist and producer Ryan Leslie, and Kiran Gandhi of Madame Gandhi and M.I.A.), working with the hackers and inspiring them to reconsider the intersection between music and technology. In fact, two of the three hackathon winners, Cognitunes and SamPack, were built with Watson APIs.
Brian Newsom from SamPack shares how his team built the winning application with Watson in less than 24 hours.
How did you come up with your idea for the application?
Bret and I are both musicians. We were frustrated with how difficult it can be to get just the right sound for our compositions. We see a lot of our contemporaries illegally downloading sounds because they are so expensive to purchase, so we wanted to make creating high quality, unique, royalty free sounds more approachable and intuitive.
Which APIs did you use and what did you create?
Bret and I built a product called SamPack. SamPack is for aspiring producers and musicians who can’t afford professional sample packs and don’t have the technical knowledge to produce their own. SamPack accepts plain language descriptions of sounds (e.g. “dark, mellow synth”) and then algorithmically generates musical samples (audio files) matching that description. Effectively, we are using Watson’s Tone Analyzer API and translating the emotions and tones present into the musical domain. Output from the Tone Analyzer, combined with some of our own algorithms, maps that quantitative analysis of the text onto different musical elements, filters, and effects, which are then synthesized from scratch into the finished sounds.
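To make the idea concrete, here is a minimal sketch of how emotion scores from a Tone Analyzer-style response might be mapped to synthesis parameters. The tone scores, parameter names, and mapping rules below are all illustrative assumptions; SamPack’s actual algorithms are not described in detail here.

```python
# Hypothetical sketch: translating Tone Analyzer-style emotion scores
# (each 0.0-1.0) into synthesizer settings. The specific parameters and
# weightings are invented for illustration.

def tones_to_synth_params(tones):
    """Map emotion scores to a small set of synth parameters."""
    joy = tones.get("joy", 0.0)
    sadness = tones.get("sadness", 0.0)
    anger = tones.get("anger", 0.0)

    return {
        # Brighter emotions open the low-pass filter; darker ones close it.
        "filter_cutoff_hz": 500 + 4000 * joy - 300 * sadness,
        # A harsher waveform for angrier text.
        "waveform": "sawtooth" if anger > 0.5 else "sine",
        # Sadness slows the attack for a mellower envelope.
        "attack_ms": 10 + 400 * sadness,
    }

# A description like "dark, mellow synth" might score high on sadness:
params = tones_to_synth_params({"sadness": 0.8, "joy": 0.1})
```

From parameters like these, an audio engine could then render the sample from scratch, so every generated sound is unique and royalty free.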
What was it like working with Watson APIs?
We were really impressed with how accurate the tone analyzer was, in addition to the breadth of information it provides with relatively little input. Natural language processing is quite complex, and it is amazing to translate user input into predictable, digestible data.
What’s next for your application?
Bret and I are continuing to work on SamPack after participating in the SXSW Music Hackathon Incubator with advising from some of the top minds in music tech. We are currently partnering with professional musicians to use our product, as well as making it more accessible and powerful.
Visit DevPost to learn more about the hackathon teams and the exact code used for their projects.