This paper presents an interactive installation that employs flocking algorithms to produce music and visuals. The user's motions are captured by a video camera and influence the flock's behaviour. Each agent moving in a virtual 3D space controls a MIDI instrument whose playing style depends on the agent's state. In this system, the user acts as a conductor influencing the flock's musical activity. In addition to gestural interaction, the acoustic properties of the system can be modified on the fly through an intuitive GUI. The acoustic and visual output of the system results from the combination of the flock's and the user's behaviour; at the behavioural level, it therefore mixes natural and artificial reality. The system has been designed to run on a variety of computational configurations, ranging from small laptops to exhibition-scale installations.
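To make the underlying mechanism concrete, the following is a minimal sketch of one update step of a classic boids-style flock (separation, alignment, cohesion) of the kind the abstract refers to. All names, weights, and radii here are illustrative assumptions, not taken from the paper; the paper's actual rules and parameters may differ.

```python
import numpy as np

def flock_step(pos, vel, dt=0.1, radius=2.0,
               w_sep=1.5, w_ali=1.0, w_coh=1.0, max_speed=2.0):
    """One update of a generic boids flock in 3D.

    pos, vel: (N, 3) arrays of agent positions and velocities.
    Weights and neighbourhood radius are illustrative only.
    """
    n = len(pos)
    acc = np.zeros_like(vel)
    for i in range(n):
        diff = pos - pos[i]                  # vectors to every other agent
        dist = np.linalg.norm(diff, axis=1)
        near = (dist > 0) & (dist < radius)  # neighbours within the radius
        if not near.any():
            continue
        # Separation: steer away from close neighbours, weighted by proximity.
        sep = -(diff[near] / dist[near, None] ** 2).sum(axis=0)
        # Alignment: steer towards the neighbours' average velocity.
        ali = vel[near].mean(axis=0) - vel[i]
        # Cohesion: steer towards the neighbours' centre of mass.
        coh = pos[near].mean(axis=0) - pos[i]
        acc[i] = w_sep * sep + w_ali * ali + w_coh * coh
    vel = vel + dt * acc
    # Clamp speed so the flock stays controllable.
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel = np.where(speed > max_speed, vel / speed * max_speed, vel)
    return pos + dt * vel, vel
```

In an installation like the one described, per-agent state after each step (e.g. speed, position, neighbour count) could then be mapped to MIDI parameters such as pitch or velocity, and the camera input would enter as an extra steering force.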