A new method for self-supervised sensorimotor learning of sound source localization is presented, which allows a simulated listener to learn an auditorimotor map online from the sensorimotor experience provided by an auditory evoked behavior. The map represents the auditory space and is used to estimate the azimuthal direction of sound sources. The learning consists mainly of non-linear dimensionality reduction of sensorimotor data. Our results show that an auditorimotor map can be learned from both real and simulated data, and that online learning leads to accurate estimates of azimuthal source direction.
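The core idea can be illustrated with a minimal sketch: simulated binaural cues (interaural time and level differences, a hypothetical spherical-head approximation, not the paper's actual sensorimotor data) are collected for sources at different azimuths, and a non-linear dimensionality reduction (here Laplacian eigenmaps on a k-nearest-neighbor graph, a stand-in for whichever method the paper uses) recovers a one-dimensional map that orders the sources by azimuth. All names, parameters, and cue models below are illustrative assumptions.

```python
import numpy as np

def binaural_cues(azimuths, d=0.18, c=343.0):
    """Hypothetical interaural cues for a head of diameter d (m), sound speed c (m/s)."""
    itd = (d / c) * np.sin(azimuths) * 1000.0   # interaural time difference (ms)
    ild = 6.0 * np.sin(azimuths)                # crude interaural level difference (dB)
    return np.column_stack([itd, ild])

def laplacian_eigenmap_1d(X, k=8):
    """1-D non-linear embedding: Fiedler vector of the k-NN graph Laplacian."""
    n = len(X)
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(D2[i])[1:k + 1]] = 1.0            # connect k nearest neighbors
    W = np.maximum(W, W.T)                                 # symmetrize adjacency
    L = np.diag(W.sum(1)) - W                              # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1]                                      # second eigenvector = 1-D map

rng = np.random.default_rng(0)
az = np.sort(rng.uniform(-np.pi / 2, np.pi / 2, 120))     # source azimuths (rad)
X = binaural_cues(az) + rng.normal(0.0, 0.01, (120, 2))   # noisy sensor observations
embedding = laplacian_eigenmap_1d(X)
# The learned 1-D map should vary monotonically with azimuth (up to sign),
# so its correlation with the true azimuth should be strong:
r = np.corrcoef(embedding, az)[0, 1]
```

Because the embedding is only defined up to sign and scale, a real system would calibrate it against motor feedback (e.g., head turns toward the source) to read out azimuth in degrees; here we only check that the map preserves the ordering of source directions.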