Abstract:
To date, the integration of brain-computer interfaces and mixed reality headsets in Internet of Musical Things (IoMusT) performance ecosystems has received remarkably little attention from the research community. To bridge this gap, in this paper we present BCHJam: an IoMusT-based performance ecosystem composed of performers, audience members, brain-computer interfaces, smart musical instruments, and mixed reality headsets. In BCHJam, one or more musicians are fitted with a brain-computer music interface (BCMI) that allows them to actively or passively control the processing of their instrument's audio. Moreover, the BCMI's signal controls mixed reality visual effects displayed in the headsets worn by audience members. All the components of BCHJam communicate over a Wi-Fi network via Open Sound Control messages. We refined the system through a series of test performance sessions, which resulted in the creation of a signal-quality filter that improved the musician's experience, along with a tuning of the control parameters. The developed ecosystem was validated through the realization of a musical performance. We provide a critical reflection on the achieved results and discuss the lessons learned while developing this first-of-its-kind IoMusT performance ecosystem.
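For illustration, the following minimal sketch shows how a BCMI-derived control value might be forwarded over the Wi-Fi network as an Open Sound Control message using the python-osc library; the host, port, and address pattern are assumptions for illustration only, not the actual BCHJam configuration.

    # Minimal sketch: forwarding a BCMI control value as an OSC message.
    # Host, port, and address pattern below are illustrative assumptions.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("192.168.0.42", 9000)      # assumed endpoint (smart instrument or headset bridge)
    alpha_power = 0.37                                   # e.g., a normalized EEG band-power value from the BCMI
    client.send_message("/bchjam/control/alpha", alpha_power)

In such a setup, the smart musical instrument and the mixed reality headsets would subscribe to the relevant address patterns and map the received values to audio processing and visual effect parameters, respectively.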
Recent advances in technologies at the confluence of the Internet of Things and music have led to the emergence of the Internet of Musical Things (IoMusT) paradigm [1].