As video games have grown from crude circuit-based artefacts into a multibillion-dollar worldwide industry, video-game music has become increasingly adaptive. Composers have had to adopt new techniques to move beyond the traditional, event-based approach, in which music consists largely of looped audio tracks and can therefore become overly repetitive. Moreover, looped tracks scale poorly to the design of today's games, whose narratives have become increasingly complex and nonlinear. This paper surveys experience-driven procedural music generation, outlining possible ways forward for the dynamic generation of music and audio according to user gameplay metrics.
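To make the idea of driving music from gameplay metrics concrete, the following is a minimal sketch, not taken from the paper: it assumes hypothetical metrics (player health, enemy density) and maps them to illustrative music parameters (tempo, intensity, mode). Real adaptive-audio middleware exposes similar parameter-driven control, but all names and weightings here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class GameplayMetrics:
    """Hypothetical per-frame gameplay statistics (illustrative names)."""
    player_health: float   # 0.0 (dead) .. 1.0 (full)
    enemy_density: float   # 0.0 .. 1.0, proportion of nearby enemies

@dataclass
class MusicParameters:
    """Illustrative targets an adaptive score might interpolate toward."""
    tempo_bpm: float
    intensity: float       # 0.0 .. 1.0, could drive instrument layering
    mode: str              # "major" (calm) or "minor" (tense)

def adapt_music(m: GameplayMetrics) -> MusicParameters:
    # A single "danger" signal rises as health drops and enemies close in;
    # the 0.6/0.4 weights are arbitrary illustrative choices.
    danger = min(1.0, 0.6 * (1.0 - m.player_health) + 0.4 * m.enemy_density)
    return MusicParameters(
        tempo_bpm=90.0 + 60.0 * danger,          # 90 bpm calm .. 150 bpm frantic
        intensity=danger,
        mode="minor" if danger > 0.5 else "major",
    )
```

In practice such a mapping would feed a generator or mixer each frame, smoothing parameter changes over time so the score evolves continuously with play rather than switching abruptly at scripted events.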