This post is part of a series recognizing unique IBM Research projects and their unexpected ties to pop culture, with “30” or “1986” being the common thread. The series will run once a week, celebrating the 30th anniversary of IBM Research – Almaden in San Jose, CA.

In the spirit of this series’ “30” theme, this installment delves into IBM’s exploration of the 30 major and minor keys in music. These keys are not the black and white keys on a piano: a musical key is the group of notes that makes up a scale, and keys are often associated with particular moods or emotions. Counting enharmonically equivalent spellings (such as F-sharp major and G-flat major), there are 15 major and 15 minor key signatures, for a total of 30. Songs in minor keys tend to evoke a sad or dark mood, while songs in major keys sound more cheerful and bright.
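As a quick illustration (not part of IBM’s tooling), the count of 30 keys can be enumerated in a few lines of Python, walking the circle of fifths from seven flats to seven sharps:

```python
# Enumerate the 15 major and 15 minor key signatures (0-7 flats through
# 0-7 sharps); counting enharmonic spellings, the total is the 30 keys
# discussed above.

# Major keys around the circle of fifths, from 7 flats (Cb) to 7 sharps (C#).
MAJOR_KEYS = ["Cb", "Gb", "Db", "Ab", "Eb", "Bb", "F",
              "C", "G", "D", "A", "E", "B", "F#", "C#"]

# Each major key shares its signature with a relative minor a minor third below.
MINOR_KEYS = ["Ab", "Eb", "Bb", "F", "C", "G", "D",
              "A", "E", "B", "F#", "C#", "G#", "D#", "A#"]

print(len(MAJOR_KEYS), len(MINOR_KEYS), len(MAJOR_KEYS) + len(MINOR_KEYS))
# 15 15 30
```

Twelve of each would suffice on a piano keyboard, but written music distinguishes enharmonic spellings, which is how the total reaches 30.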

IBM’s AI foray into music

IBM researchers Janani Mukundan and Richard Daskas developed a cognitive technology called Watson Beat, which has learned the nuances and characteristics of all 30 musical keys and uses its extensive knowledge of music theory to compose new songs in particular moods and styles. Mukundan, who holds a PhD in Computer Engineering, and Daskas, a professional musician, want to give the world an artificial intelligence app that can help anyone be musically creative.


Watson Beat composes music by “listening” to at least 20 seconds of music – whether the user wrote it or drew on other samples and songs – and then creates new tracks of melodies, ambient sounds, and beats based on what it learned from the original sample. Depending on which mood the user asks Watson Beat to emulate, the program chooses specific instrumentation and composes the new song in a particular key to convey that emotion or style. Mere seconds after hearing the sample, Watson Beat produces a new song that is entirely different from it, though users can adjust how closely the new composition resembles the original.
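To make the idea of mood-conditioned composition concrete, here is a purely illustrative sketch in Python. The mood names, keys, and instruments below are assumptions chosen for the example – they are not Watson Beat’s actual mappings or implementation:

```python
# Toy sketch of mood-conditioned composition: given a requested mood,
# pick a key and an instrument palette. All mappings here are invented
# for illustration and are NOT Watson Beat's real internals.
import random

MOOD_SETTINGS = {
    "spooky":         {"keys": ["D minor", "B minor"],
                       "instruments": ["theremin", "strings"]},
    "amped up":       {"keys": ["E major", "A major"],
                       "instruments": ["electric guitar", "drums"]},
    "dark and moody": {"keys": ["C minor", "F# minor"],
                       "instruments": ["cello", "piano"]},
}

def choose_palette(mood, seed=None):
    """Pick a key and instrumentation suited to the requested mood."""
    rng = random.Random(seed)  # seeded for reproducibility in tests/demos
    settings = MOOD_SETTINGS[mood]
    return {"key": rng.choice(settings["keys"]),
            "instruments": settings["instruments"]}

print(choose_palette("spooky", seed=1))
```

A real system would of course go far beyond a lookup table – learning the characteristics of each key from data – but the sketch captures the described behavior: the mood request drives both the key and the instrumentation of the output.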

And because Watson Beat never composes the same song twice, well, the possibilities are endless. “Watson Beat is a cognitive technology that will inspire anyone, from professional musicians to hobbyists, to be able to compose their own music,” Mukundan said in a live demo of Watson Beat on Facebook.

Watson Beat has so far learned to compose music in only four moods – amped up, spooky, dark and moody, and Middle Eastern – but Mukundan and Daskas hope to release it as an open-source application by the end of 2016 so that users everywhere can add their own instruments, styles, and moods to the project (listen to samples).

About the author: Kelly Shi is a Communications intern at IBM Research-Almaden, where she is excited to produce media content about IBM researchers and their teams. In the fall, she will return to Northwestern University and will continue working towards her BA in Communication Studies.

