Chord Recognition in Beatles Songs

While a graduate student at MIT’s Media Lab, I collaborated with office-mate Victor Adán to explore whether we could train a machine to recognize chord changes in music. We tried several models, including Support Vector Machines, Neural Networks, Hidden Markov Models, and a few variations of Maximum Likelihood systems.

We narrowed the larger problem to Beatles tunes and trained our systems on 16 songs from three of their albums. The systems processed 2,700 training samples, 150 validation samples, and 246 testing samples. Our most successful system, a Support Vector Machine, achieved 68% accuracy on the test set.
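For readers curious what such a system looks like in practice, here is a minimal sketch of the standard frame-wise approach, not our original code: extract 12-dimensional chroma (pitch-class) features from the audio, then train an SVM on labeled frames. The librosa and scikit-learn libraries, the RBF kernel, and the tiny three-chord vocabulary below are illustrative assumptions, not our actual configuration.

    # A hypothetical sketch of frame-wise chord recognition: chroma
    # (pitch-class) features fed to an SVM classifier. Library choices
    # and parameters are illustrative, not the original setup.
    import numpy as np
    import librosa
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def chroma_frames(path, sr=22050, hop_length=4096):
        """Return one 12-dimensional chroma vector per analysis frame."""
        y, _ = librosa.load(path, sr=sr)
        # Each column of the chromagram is the energy in one pitch class.
        return librosa.feature.chroma_stft(y=y, sr=sr, hop_length=hop_length).T

    # Stand-in training data: in a real system each row would be a chroma
    # frame from an annotated recording, paired with its chord label.
    rng = np.random.default_rng(0)
    X_train = rng.random((2700, 12))                      # 2700 training frames
    y_train = rng.choice(["C:maj", "G:maj", "A:min"], 2700)

    # Scale the features, then fit an RBF-kernel SVM on the labeled frames.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_train, y_train)

    # Frame-wise chord predictions for an unseen recording:
    # predictions = clf.predict(chroma_frames("some_song.wav"))

Frame-wise predictions like these are noisy around chord boundaries, which is one reason we also tried Hidden Markov Models: an HMM can smooth the frame sequence using chord-transition probabilities rather than classifying each frame in isolation.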

Our intention was to further research that could lead to applications such as automatic transcription, live tracking for improvisation, and computer-assisted (synthetic) performers. Our models extended the work presented in the following papers:

  • Musical Key Extraction from Audio, Steffen Pauws
  • Chord Segmentation and Recognition using EM-Trained Hidden Markov Models, Alexander Sheh and Daniel P.W. Ellis
  • SmartMusicKIOSK: Music Listening Station with Chorus-Search Function, Masataka Goto
  • A Chorus-Section Detecting Method for Musical Audio Signals, Masataka Goto
