Acoustic Cues Increase Situational Awareness in Accident Situations: A VR Car-Driving Study


Abstract:

Our work for the first time evaluates the effectiveness of visual and acoustic warning systems in an accident situation using a realistic, immersive driving simulation. In a first experiment, 70 participants were trained to complete a course at high speed. The course contained several forks where a wrong turn would lead to the car falling off a cliff and crashing – these forks were indicated either with a visual warning sign alone for a first, no-sound group, or with a combined visual and auditory warning cue for a second, sound group. In a testing phase, right after the warning signals were given, trees suddenly fell on the road, leaving the (fatal) turn open. Importantly, in the no-sound group, 18 out of 35 people still chose this turn, whereas in the sound group only 5 out of 35 did so – the added sound therefore produced a large and significant increase in situational awareness. We found no other differences between the groups concerning age, physiological responses, or driving experience. In a second, replication experiment, the setup was repeated with another 70 participants without emphasis on driving speed. The results fully confirmed the previous findings, with 17 out of 35 people in the no-sound group versus only 6 out of 35 in the sound group choosing the turn to the cliff. With these two experiments using a one-shot design to avoid premeditation and to test naïve, rapid decision-making, we provide clear evidence for the advantage of combined visual-auditory in-vehicle warning systems for promoting situational awareness.
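The reported group differences (18/35 vs. 5/35 in Experiment 1, 17/35 vs. 6/35 in Experiment 2 choosing the fatal turn) can be checked for significance with a two-sided Fisher's exact test on the 2x2 contingency tables. The sketch below is our own illustration in plain Python; the paper does not publish its analysis code, and the choice of test here is an assumption.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Rows: groups (no-sound, sound); columns: outcome
    (chose the fatal turn / did not). Returns the two-sided p-value
    by summing all hypergeometric probabilities that are no larger
    than that of the observed table.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c  # fixed margins

    def prob(k):
        # P(top-left cell = k) under fixed margins (hypergeometric)
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = prob(a)
    lo = max(0, row1 + col1 - n)
    hi = min(row1, col1)
    # small tolerance guards against floating-point ties
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs * (1 + 1e-9))

# Experiment 1: 18/35 (no-sound) vs. 5/35 (sound) chose the fatal turn
p1 = fisher_exact_two_sided(18, 17, 5, 30)
# Experiment 2: 17/35 (no-sound) vs. 6/35 (sound)
p2 = fisher_exact_two_sided(17, 18, 6, 29)
print(p1, p2)  # both well below the conventional 0.05 threshold
```

Both p-values fall well under 0.05, consistent with the abstract's claim of a significant effect of the added sound in each experiment.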
Published in: IEEE Transactions on Intelligent Transportation Systems ( Volume: 23, Issue: 4, April 2022)
Page(s): 3281 - 3291
Date of Publication: 16 November 2020

I. Introduction

Human error is the most common cause of car accidents: according to the National Highway Traffic Safety Administration’s 2015 crash statistics report, 94 percent of all crashes are related to driver error, and, importantly, 41 percent of these crashes involve some sort of attentional lapse or distraction of the driver [1]. Such distractions can originate from both in-vehicle and outside events. Concerning in-vehicle distractions, the use of mobile phones while driving is a considerable drain on attentional resources, known to reduce, for example, braking speed [2] and to lead to overall poorer driving performance [3]. Similarly, one-time distractions from external stimuli may also cause the driver to withdraw attention from the driving task: these can, for example, be due to salient advertisements, the sudden appearance of animals [4], or even a small object on the road, which can cause a critical accident if the driver focuses on avoiding the object rather than keeping track of nearby cars [5].

References
[1] S. Singh, “Critical reasons for crashes investigated in the national motor vehicle crash causation survey,” Traffic Saf. Facts Crash Stats, Nat. Highway Traffic Saf. Admin., Washington, DC, USA, Tech. Rep. HS 812 115, 2015.
[2] D. L. Strayer, F. A. Drews, and W. A. Johnston, “Cell phone-induced failures of visual attention during simulated driving,” J. Experim. Psychol., Appl., vol. 9, no. 1, pp. 23–32, 2003.
[3] D. E. Haigney, R. G. Taylor, and S. J. Westerman, “Concurrent mobile (cellular) phone use and driving performance: Task demand characteristics and compensatory processes,” Transp. Res. F, Traffic Psychol. Behav., vol. 3, no. 3, pp. 113–121, Sep. 2000.
[4] M. A. Regan, J. D. Lee, and T. W. Victor, Driver Distraction and Inattention: Advances in Research and Countermeasures (Human Factors in Road and Rail Transport), no. 1. Burlington, VT, USA: Ashgate, 2013, p. 440.
[5] K. Pammer and C. Blink, “Attentional differences in driving judgments for country and city scenes: Semantic congruency in inattentional blindness,” Accident Anal. Prevention, vol. 50, pp. 955–963, Jan. 2013.
[6] S. G. Klauer, T. A. Dingus, V. L. Neale, J. D. Sudweeks, and D. J. Ramsey, “The impact of driver inattention on near-crash/crash risk: An analysis using the 100-car naturalistic driving study data,” Nat. Highway Traffic Saf. Admin., Washington, DC, USA, Tech. Rep. HS 810 594, 2006.
[7] M. Staubach, “Factors correlated with traffic accidents as a basis for evaluating advanced driver assistance systems,” Accident Anal. Prevention, vol. 41, no. 5, pp. 1025–1033, Sep. 2009.
[8] M. A. Nees and B. N. Walker, “Auditory displays for in-vehicle technologies,” Rev. Hum. Factors Ergonom., vol. 7, no. 1, pp. 58–99, Sep. 2011.
[9] X. Yan, Y. Liu, and Y. Xu, “Effect of audio in-vehicle red light–running warning message on driving behavior based on a driving simulator experiment,” Traffic Injury Prevention, vol. 16, no. 1, pp. 48–54, Jan. 2015.
[10] W. Xiang, X. Yan, J. Weng, and X. Li, “Effect of auditory in-vehicle warning information on drivers’ brake response time to red-light running vehicles during collision avoidance,” Transp. Res. F, Traffic Psychol. Behav., vol. 40, pp. 56–67, Jul. 2016.
[11] C. L. Baldwin, “Acoustic and semantic warning parameters impact vehicle crash rates,” in Proc. 13th Int. Conf. Auditory Display, Montreal, QC, Canada, 2007.
[12] J. Fagerlönn, S. Lindberg, and A. Sirkka, “Combined auditory warnings for driving-related information,” presented at the Audio Mostly Interact. Sound, Thessaloniki, Greece, 2015.
[13] R. Mohebbi, R. Gray, and H. Z. Tan, “Driver reaction time to tactile and auditory rear-end collision warnings while talking on a cell phone,” Hum. Factors, J. Hum. Factors Ergonom. Soc., vol. 51, no. 1, pp. 102–110, Feb. 2009.
[14] C. Ho and C. Spence, The Multisensory Driver: Implications for Ergonomic Car Interface Design. Farnham, U.K.: Ashgate, 2012.
[15] B. Mok, “Emergency, automation off: Unstructured transition timing for distracted drivers of automated vehicles,” in Proc. IEEE 18th Int. Conf. Intell. Transp. Syst., Sep. 2015, pp. 2458–2464.
[16] M. Saffarian, J. C. De Winter, and J. W. Senders. (2017). The Effect of a Short Occlusion Period on Subsequent Braking Behavior: A Driving Simulator Study. [Online]. Available: https://www.researchgate.net/publication/314658202
[17] C. D. Wickens, “Multiple resources and mental workload,” Hum. Factors, J. Hum. Factors Ergonom. Soc., vol. 50, no. 3, pp. 449–455, Jun. 2008.
[18] C. L. Baldwin, Auditory Cognition and Human Performance: Research and Applications. Boca Raton, FL, USA: Taylor & Francis, 2012, p. 325.
[19] M. R. Endsley, “Toward a theory of situation awareness in dynamic systems,” Hum. Factors, J. Hum. Factors Ergonom. Soc., vol. 37, no. 1, pp. 32–64, Mar. 1995.
[20] S. D. Kreibig, “Autonomic nervous system activity in emotion: A review,” Biol. Psychol., vol. 84, no. 3, pp. 394–421, Jul. 2010.
[21] T. Zimasa, S. Jamson, and B. Henson, “Are happy drivers safer drivers? Evidence from hazard response times and eye tracking data,” Transp. Res. F, Traffic Psychol. Behav., vol. 46, pp. 14–23, Apr. 2017.
[22] N. A. Stanton, Handbook of Human Factors and Ergonomics Methods. Boca Raton, FL, USA: CRC Press, 2005.
[23] M. Peruzzini, M. Tonietti, and C. Iani, “Transdisciplinary design approach based on driver’s workload monitoring,” J. Ind. Inf. Integr., vol. 15, pp. 91–102, Sep. 2019.
[24] F. Faul, E. Erdfelder, A.-G. Lang, and A. Buchner, “G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences,” Behav. Res. Methods, vol. 39, no. 2, pp. 175–191, May 2007.
[25] D. J. Simons and C. F. Chabris, “Gorillas in our midst: Sustained inattentional blindness for dynamic events,” Perception, vol. 28, no. 9, pp. 1059–1074, 1999.
[26] G. A. Alvarez and P. Cavanagh, “The capacity of visual short-term memory is set both by visual information load and by number of objects,” Psychol. Sci., vol. 15, no. 2, pp. 106–111, Feb. 2004.
[27] K. Pammer, J. Bairnsfather, J. Burns, and A. Hellsing, “Not all hazards are created equal: The significance of hazards in inattentional blindness for static driving scenes,” Appl. Cognit. Psychol., vol. 29, no. 5, pp. 782–788, Sep. 2015.
[28] D. L. Strayer, J. Turrill, J. R. Coleman, E. V. Ortiz, and J. M. Cooper, “Measuring cognitive distraction in the automobile II: Assessing in-vehicle voice-based interactive technologies,” AAA Found. Traffic Saf., Washington, DC, USA, Tech. Rep., 2014.
[29] S. A. Brewster, “Using non-speech sound to overcome information overload,” Displays, vol. 17, nos. 3–4, pp. 179–189, May 1997.
[30] J. Maciej and M. Vollrath, “Comparison of manual vs. speech-based interaction with in-vehicle information systems,” Accident Anal. Prevention, vol. 41, no. 5, pp. 924–930, Sep. 2009.
