CHI Needs Music!

I’m on my way back from Glasgow where I just attended CHI 2019. The conference is known for being the biggest research conference within the Human-Computer Interaction community. This year I was honored to be a presenter together with my brilliant colleague, Emma Frid. If you are interested in interactive multimodal installations, please have a look at our paper “Sound Forest – an exploratory evaluation of an interactive installation“. I will write another post about the paper, so let’s get to the point:

First of all: CHI is huge! I was told we were 3800 delegates from all over the world. The biggest companies in the field were there, including Google, Facebook, Adobe, Spotify, Mozilla and many more. From Monday to Thursday there were over 20 seminars in parallel, adding up to something like 1000 presentations! The schedule can be found here: https://confer.csail.mit.edu/chi2019/schedule. I went immediately to the filter and searched for “music”. I didn’t find that many talks, so I added “sound”, which together with “tactile” made most of my days busy enough.

Besides the talks, there was also a big hall where large companies shared the space with new startups and lots of researchers eager to share their thoughts and results with others. One of the most interesting posters to me was Jacob Harrison’s “Accessible Instruments”: http://instrumentslab.org/research/accessible-instruments.html.

Looking at the conference as a whole, I would say the trends definitely point towards artificial intelligence and virtual reality. Even different aspects of “robots” seem to attract a lot of attention. One aspect of CHI I really liked was the playful attitude towards design. You could find anything from very useful tools for people with disabilities to more provocative studies on what an Internet for dogs would look like.

Zooming in a bit more on my own topic, Sound and Music Computing, I left with some thoughts:
It seems to me that there is more interest in haptics than in sound, and more interest in sound than in music. This leaves the topic of music in interaction design more or less out, and when someone does involve “music”, it tends to be sine waves, white noise or MIDI notes controlled by an interactive system. The result is very rarely something I would consider “music”.

My conclusion is that there is a huge area still to be explored when it comes to integrating more “normal” music into interactive environments. So, let’s roll up our sleeves and see what we can do to contribute to this area. It’s too big to be left unexplored!