We present a multi-contextual interactive project that aims to create an open, real-time musical environment extending a natural ambient environment, namely an urban landscape. The basic idea is to convert in real time the input from a camera sensor recording an urban landscape (in our first test case: Tokyo at night) into an electronic musical soundscape. This is achieved by treating each blinking light as a rhythm source (or by combining several lights into a single source to obtain more complex rhythmic structures), associating it with a sound from a sound bank, and possibly adding sound effects. This generates many layers of intertwined rhythms that represent the pulse of the city through its blinking lights. The project therefore encompasses an artistic creation process embedding image tracking and artificial intelligence, with original inductive tempo-tracking and beat-prediction algorithms. We extract pertinent image and symbolic descriptors, such as pulsation and rhythm features, in order to synchronize both the musical and control worlds with the natural visual environment. The key idea is to achieve an emergent rhythmic process for musical creation and generative music.
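The core pipeline described above (blinking light → onset events → tempo estimate) could be sketched as follows. This is a minimal illustration, not the authors' implementation: the brightness signal, threshold value, and the use of a median inter-onset interval as a tempo estimate are all assumptions made for the example.

```python
import numpy as np

def detect_onsets(brightness, threshold=0.5):
    """Return frame indices where a light switches from off to on.

    `brightness` is a hypothetical per-frame intensity series for one
    tracked light; a rising edge through `threshold` is taken as an onset.
    """
    on = brightness > threshold
    # Rising edges: off in the previous frame, on in the current one.
    return np.flatnonzero(~on[:-1] & on[1:]) + 1

def estimate_period(onsets):
    """Crude tempo estimate: median inter-onset interval, in frames."""
    if len(onsets) < 2:
        return None
    return float(np.median(np.diff(onsets)))

# Simulated light blinking with a 10-frame period (5 frames on, 5 off).
frames = np.tile(np.r_[np.ones(5), np.zeros(5)], 8)
onsets = detect_onsets(frames)        # onsets at frames 10, 20, ..., 70
period = estimate_period(onsets)      # 10.0 frames per blink
```

In a real system, each detected period would drive one rhythmic layer, triggering a sample from the sound bank on every predicted beat; several such layers combined yield the intertwined rhythms the abstract describes.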