In 1999 Microsoft released a product called “DirectMusic Producer”, whose stated purpose was to be “THE Authoring Tool for Interactive Music”. Loaded with features, it promised to solve the challenges of composing, producing and integrating music for games. For various reasons, few musicians or composers understood the purpose of the product or how to use it, and it never became a standard. 17 years have passed. Development of DirectMusic Producer has stopped, and no other product with the same capacity or complexity has appeared. Composers of music for games are often stuck with tools that are limiting and tricky to use. This puts the complex task of composing, producing and integrating music for games out of reach for most traditional composers. As a result, much of the knowledge and experience from composing music for film cannot be applied to music for games.
Since the first day I encountered this dilemma, I have been intrigued and challenged to solve it. I have worked on both technical solutions (see the “iMusic” documentation in the reference section) and pedagogical strategies at KMH, where I am responsible for all courses related to music in interactive environments. There are many interesting, independent initiatives around the world addressing parts of this challenge, both in industry and in research, but an important area between music, technology and pedagogy remains unexplored and undocumented.
Important companies contributing to this field include Audiokinetic, Firelight Technologies, Cycling ’74 (Max/MSP), Elias Software and Dinahmoe, the latter two located in Stockholm. Centres for related research include the Audio Engineering Society (AES), Queen Mary University of London (Centre for Digital Music / Interaction Sound and Music) and the Massachusetts Institute of Technology. There are also interesting conferences and inter-university research initiatives on game music, such as Ludomusicology (UK), GameSoundCon (LA, US) and Game Music Connect (UK).
My goal is to map out current technologies and strategies used to produce and integrate music for interactive media, and to compare my results with the ideas and needs expressed by composers and music producers. I will also develop prototypes aimed at answering needs not yet met, and cooperate with different artists to generate new forms of art not yet experienced. Through these prototypes and new works, I will examine the need for music integration solutions that have not yet been asked for.
In the research I will examine and evaluate the gap between currently available technologies and the features and possibilities composers request. I will also discuss and suggest methodologies and approaches for teaching adaptive music, as well as propose new features and structures for the future development of music integration tools.
I will survey currently used tools for integrating music into games, focusing on how well their features meet the composer’s ideas and the requirements of the music.
I will conduct in-depth interviews and focus groups with composers and producers in the film and game industries.
I will use my current Interactive Music Framework (based on the Web Audio API) to build test benches and environments for trying out and evaluating the ideas that the studies bring to the surface.
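As an illustration of the kind of logic such a test bench must exercise, here is a minimal sketch of one core operation in adaptive music: quantizing a transition to the next bar boundary so that a musical change lands on a downbeat rather than mid-phrase. The function name and parameters are my own illustrative assumptions, not part of the iMusic framework.

```javascript
// Given the current playback time (seconds), compute the time of the next
// bar boundary, so a transition or crossfade can be scheduled musically.
// tempoBpm and beatsPerBar describe the currently playing cue;
// startTime is when the cue began (in the same clock, e.g. AudioContext time).
function nextBarTime(now, tempoBpm, beatsPerBar, startTime = 0) {
  const barDuration = (60 / tempoBpm) * beatsPerBar; // seconds per bar
  const elapsed = now - startTime;
  const barsPlayed = Math.floor(elapsed / barDuration);
  return startTime + (barsPlayed + 1) * barDuration;
}

// Example: at 120 BPM in 4/4, a bar lasts 2 s; at t = 5.2 s the next
// bar boundary falls at t = 6 s.
console.log(nextBarTime(5.2, 120, 4)); // → 6
```

In a Web Audio context, the returned time could then feed scheduling methods such as `AudioParam.setValueAtTime` or `AudioBufferSourceNode.start` to crossfade two stems exactly on the downbeat.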
The results will add to the knowledge in the area of adaptive music and contribute to the field of music in interactive media and virtual reality. Students from music production, composition and performance will have ample opportunities to participate in the study. KMH in general, and “Music and Media Production” in particular, will benefit from the outcomes through new methodologies, articles, projects and technologies. The industry will benefit from a greater understanding of the challenges a traditional composer faces when entering the world of adaptive music, as well as from suggestions for new features, structures and approaches for possible new products.