Measuring and Predicting Multisensory Reaction Latency: A Probabilistic Model for Visual-Auditory Integration


Abstract:

Virtual/augmented reality (VR/AR) devices offer both immersive imagery and sound. With these wide-field cues, we can simultaneously acquire and process visual and auditory signals to quickly identify objects, make decisions, and take action. While vision often takes precedence in perception, our visual sensitivity degrades in the periphery. In contrast, auditory sensitivity can exhibit the opposite trend due to the elevated interaural time difference. What occurs when these senses are integrated simultaneously, as is common in VR applications such as 360° video watching and immersive gaming? We present a computational and probabilistic model to predict VR users' reaction latency to visual-auditory multisensory targets. To this end, we first conducted a psychophysical experiment in VR that measured reaction latency by tracking the onset of eye movements. Experiments with numerical metrics and user studies with naturalistic scenarios showcase the model's accuracy and generalizability. Lastly, we discuss potential applications, such as measuring whether a target's appearance duration is sufficient in immersive video playback, and suggesting optimal spatial layouts for AR interface design.
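The paper's model itself is not reproduced on this page, but a classic probabilistic baseline for visual-auditory reaction latency is the "race model": each modality independently accumulates evidence, and the response is triggered by whichever finishes first. A minimal Monte Carlo sketch of that idea follows; the distribution choices and parameters are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of simulated trials

# Hypothetical unimodal reaction-latency distributions (seconds).
# Lognormal is a common choice for reaction times; the medians
# (0.25 s visual, 0.30 s auditory) are illustrative only.
visual_rt = rng.lognormal(mean=np.log(0.25), sigma=0.3, size=n)
auditory_rt = rng.lognormal(mean=np.log(0.30), sigma=0.3, size=n)

# Race model: the multisensory response latency is the minimum of
# the two independent unimodal latencies on each trial.
multisensory_rt = np.minimum(visual_rt, auditory_rt)

print(f"visual mean RT:       {visual_rt.mean():.3f} s")
print(f"auditory mean RT:     {auditory_rt.mean():.3f} s")
print(f"multisensory mean RT: {multisensory_rt.mean():.3f} s")
```

Under this model the bimodal condition is never slower on average than the faster unimodal condition, which is the qualitative speed-up the abstract attributes to integrating visual and auditory cues.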
Published in: IEEE Transactions on Visualization and Computer Graphics ( Volume: 30, Issue: 11, November 2024)
Page(s): 7364 - 7374
Date of Publication: 09 September 2024

PubMed ID: 39250397


1 Introduction

In immersive applications, such as VR gaming or 360° video watching, we synchronously integrate and interpret information from multiple independent sensory sources, including vision and hearing. The speed at which we integrate these multisensory cues and act on them dictates our ability to complete a task in time. Imagine we are playing a VR game: could we fail to notice an approaching enemy because our attention shifts too slowly?

Cites in Papers - IEEE (1)

1. Xi Peng, Kenneth Chen, Iran Roman, Juan Pablo Bello, Qi Sun, Praneeth Chakravarthula, "Perceptually-Guided Acoustic 'Foveation'", 2025 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 450-460, 2025.

Cites in Papers - Other Publishers (1)

1. Nenad Marković, Aleksandar Trifunović, Tijana Ivanišević, Sreten Simović, "The Influence of Vehicle Color on Speed Perception in Nighttime Driving Conditions", Sustainability, vol. 17, no. 8, pp. 3591, 2025.
