Posted in: Almaden 30, Cognitive Computing, IBM Research-Almaden, IBM Research-Austin

Beats by AI

This post is part of a series recognizing unique IBM Research projects and their unexpected ties to pop culture, with “30” or “1986” being the common thread. The series will run once a week, celebrating the 30th anniversary of IBM Research – Almaden in San Jose, CA.

In the spirit of this series’ “30” theme, this installment delves into IBM’s exploration of the 30 major and minor keys in music. These keys are not the black and white keys on a piano: a music key is the group of notes that makes up a scale, and keys are often associated with certain moods or emotions. (Written notation allows up to seven sharps or seven flats, giving 15 major key signatures and 15 relative minors – hence 30.) For example, songs in minor keys tend to evoke a sad or dark mood, while songs in major keys sound more cheerful and bright.
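The count of 30 falls out of the notation itself. A minimal sketch in Python, simply enumerating the standard key signatures (the note names and relative-minor pairings below are ordinary music theory, not anything from Watson Beat):

```python
# Why Western notation yields 30 keys: 15 key signatures x 2 modes.
# Three of the 15 are enharmonic duplicates of others (B/Cb, F#/Gb, C#/Db),
# which is why 15 written major keys cover only 12 sounding ones.

SHARP_MAJORS = ["C", "G", "D", "A", "E", "B", "F#", "C#"]  # 0 to 7 sharps
FLAT_MAJORS = ["F", "Bb", "Eb", "Ab", "Db", "Gb", "Cb"]    # 1 to 7 flats

majors = SHARP_MAJORS + FLAT_MAJORS  # 15 major keys

# Each major key shares its signature with a relative minor
# whose tonic sits a minor third below.
minors = ["A", "E", "B", "F#", "C#", "G#", "D#", "A#",  # relatives of the sharp majors
          "D", "G", "C", "F", "Bb", "Eb", "Ab"]         # relatives of the flat majors

all_keys = [f"{k} major" for k in majors] + [f"{k} minor" for k in minors]
print(len(all_keys))  # 30
```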

IBM’s AI foray into music

IBM researchers Janani Mukundan and Richard Daskas developed a cognitive technology, Watson Beat, that learns the nuances and characteristics of all 30 music keys and uses its extensive knowledge of music theory to compose new songs in particular moods or styles. Mukundan, who holds a PhD in Computer Engineering, and Daskas, a professional musician, want to give the world an artificial intelligence app that can help anyone be musically creative.


Watson Beat composes music by “listening” to at least 20 seconds of music, and then creates new tracks of melodies, ambient sounds, and beats based on what it learned from the original sample – whether the user wrote the sample or drew it from other songs. Depending on which mood a user asks Watson Beat to emulate, the program chooses specific instrumentation and composes the new song in a certain key to create a sense of emotion or style. Mere seconds after hearing the sample, Watson Beat produces a new song that is entirely different from it, though users can adjust how similar (or not) the new composition sounds to the original.
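The mood-driven choices described above can be pictured as a lookup from mood to key and instrumentation, plus a user-controlled similarity knob. The sketch below is purely illustrative: Watson Beat’s actual internals and mappings are not public, so the preset names, key choices, and the `compose` function here are all assumptions.

```python
import random

# Hypothetical mood presets -- illustrative only, not Watson Beat's real mappings.
MOOD_PRESETS = {
    "amped up":       {"keys": ["E major", "A major"], "instruments": ["synth lead", "drum kit"]},
    "spooky":         {"keys": ["D minor", "B minor"], "instruments": ["theremin", "strings"]},
    "dark and moody": {"keys": ["C minor", "F minor"], "instruments": ["cello", "ambient pads"]},
    "middle eastern": {"keys": ["D minor", "G minor"], "instruments": ["oud", "frame drum"]},
}

def compose(mood, similarity, seed=None):
    """Pick a key and instrumentation for the requested mood.

    `similarity` (0.0-1.0) stands in for the user's knob controlling how
    close the new piece stays to the 20-second input sample.
    """
    rng = random.Random(seed)
    preset = MOOD_PRESETS[mood]
    return {
        "key": rng.choice(preset["keys"]),
        "instruments": list(preset["instruments"]),
        "similarity_to_sample": max(0.0, min(1.0, similarity)),
    }
```

For example, `compose("spooky", 0.3)` would return one of the spooky preset’s minor keys with its instrumentation, clamping the similarity knob into range.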

And because Watson Beat never composes the same song twice, well, the possibilities are endless. “Watson Beat is a cognitive technology that will inspire anyone, from professional musicians to hobbyists, to be able to compose their own music,” Mukundan said in a live demo of Watson Beat on Facebook.

Though Watson Beat has so far learned to compose music in only four moods – amped up, spooky, dark and moody, and Middle Eastern – Mukundan and Daskas hope to release it as an open-source application by the end of 2016 so that users everywhere can add their own instruments, styles, and moods to the project (listen to samples).

About the author: Kelly Shi is a Communications intern at IBM Research-Almaden, where she is excited to produce media content about IBM researchers and their teams. In the fall, she will return to Northwestern University and will continue working towards her BA in Communication Studies.


  • Jim says:

    I’ll grab your RSS feed right away, as I can’t find your e-mail
    subscription link or newsletter service. Do you have any?
    Kindly let me know so that I can subscribe.

  • Mike says:

    I absolutely love this story. I’m a huge fan of AI-augmented creative production, and happen to be a homegrown songwriter and producer myself. Love this stuff. I’d love to try using AI Beats to complement an existing song I’ve recorded, for example.

    Cool stuff!

  • Stephen Perelgut says:

    It would be nice to include a snippet – maybe the 20-second inspiring piece and another 20 seconds for each style of inspired music.


  • AWESOME! Please lemme see it working! I’m a music producer and bloody love new technologies!

  • Wonderful post, very resourceful and insightful. Loved using the Watson Analytics too!

    Be Well

    Ananth V

  • -->
    Kelly Shi, IBM Research-Almaden intern

    Kelly Shi

    Communications Intern, IBM Research - Almaden