A wearable twenty-channel electrotactile vocoder was used to transform audio speech stimuli into tactile patterns via a linear display on the abdomen, analogous to a frequency-to-spatial transform with increased resolution in the F2 region. A two-choice discrimination task, with simultaneous auditory and tactile feedback, was used to train and test hearing subjects on the tactile discrimination of monosyllabic words having minimal phonemic differences. In subsequent studies, the perception of words embedded in sentences and in connected discourse was tested. Results, and implications for the processing of speech information via the tactile mode, will be discussed.
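The frequency-to-spatial mapping described above can be sketched in code. The sketch below assigns each of the twenty channels a frequency band, packing extra channels into the F2 region so that spatial resolution is highest there. All specific values (the 100-8000 Hz span, the 900-2500 Hz F2 range, the half-share of channels given to F2, the log spacing) are illustrative assumptions, not the published device parameters.

```python
import math

def band_edges(n_channels=20, lo=100.0, hi=8000.0,
               f2_lo=900.0, f2_hi=2500.0, f2_share=0.5):
    """Return n_channels + 1 band-edge frequencies in Hz.

    A fixed share of the channels is packed into the assumed F2
    region; the remaining channels are spread log-uniformly over
    the low and high flanks in proportion to their log-frequency
    extent. Every numeric default is a hypothetical choice.
    """
    n_f2 = int(round(n_channels * f2_share))
    n_rest = n_channels - n_f2
    log_low = math.log(f2_lo / lo)
    log_high = math.log(hi / f2_hi)
    n_low = max(1, round(n_rest * log_low / (log_low + log_high)))
    n_high = n_rest - n_low

    def logspace(a, b, n):
        # n log-spaced points from a up to (but excluding) b
        return [a * (b / a) ** (i / n) for i in range(n)]

    return (logspace(lo, f2_lo, n_low)
            + logspace(f2_lo, f2_hi, n_f2)
            + logspace(f2_hi, hi, n_high)
            + [hi])

def channel_for(freq, edges):
    """Map a frequency (Hz) to its 0-based tactile channel index."""
    for i in range(len(edges) - 1):
        if edges[i] <= freq < edges[i + 1]:
            return i
    return len(edges) - 2  # clamp to the top channel
```

In a full vocoder each band's output energy would drive the corresponding electrode's stimulation intensity; this sketch only shows the place coding, with adjacent channels near F2 separated by smaller frequency steps than channels at the spectral extremes.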