The distribution of tasks and stimuli across multiple modalities has been proposed as a means to support multitasking in data-rich environments. In particular, the tactile channel, and more specifically communication via tactile/haptic icons, has received considerable interest in recent years. Past research has primarily examined the impact of concurrent task modality on the effectiveness of tactile information presentation. In contrast, the present study investigates to what extent the interpretation of complex tactile patterns ("tactons") is affected by another attribute of information: the processing code of concurrent tasks. Participants decoded tactons composed of temporal patterns of vibrations (categorical data) while concurrently interpreting one of two types of visual task stimuli, requiring either spatial or categorical processing, in a driving simulation. Compared to single-task performance, both dual-task conditions showed a performance decrement. As predicted by multiple resource theory, this decrement was significantly larger when the tacton task was paired with the visual task requiring categorical (as compared to spatial) processing. The findings from this study can serve as input to multidimensional quantitative models of timesharing performance. From an applied perspective, the results suggest that nonspatially encoded tactons would be preferable in environments that rely heavily on spatial processing, such as car cockpits or flight decks.