This is a late, shameless self-plug for a project I worked on in Fall 2010 with Denise Chew (a PhD student in Human-Centered Computing) for a Music Technology class taught by Parag Chordia*.
In MusEEGk, participants wearing an EEG headset control a musical step sequencer to produce simple, repetitive tunes. Imagine this, only seizure-inducing and with a far less intuitive note-selection process (participants had to stare at a single note for up to 12 seconds to select or deselect it). Great fun!
I’ll be revisiting this project in January 2012, having spent the last year familiarizing myself with the Clojure+Overtone+Processing development environment. Here’s hoping we can find a way to make the note-selection process faster without sacrificing too much in the way of accuracy!
Many thanks to the individuals in Melody Moore Jackson’s BrainLab who lent us access to their machines and educated us well enough to prevent any test subjects from being electrocuted.
*- I highly recommend attending any of Parag’s lectures if you can.