Summary form only given. In 1987 Alan Kay pointed out that the development of interfaces to computers is going backwards. Whereas children begin building their understanding of the world through embodied exploration, then “advance” to learning through images and finally reach the “sophisticated” state of symbolic understanding, our interaction with computers began by being symbolic, progressed through the GUI and only now is reaching the point where we can explore computational environments through movement and touch. While this has certainly been true for mainstream computing, interaction for people with visual impairments got stuck at the symbolic phase. We still use command lines and keyboard shortcuts to navigate through our interfaces, and they still communicate with us through words (either spoken or, very occasionally, through the symbolic code of Braille). In many ways, therefore, blind computer users have not been able to take advantage of the revolution in information presentation facilitated by computation that has changed the lives of their sighted peers. And this is not for lack of trying to find alternative ways for blind individuals to access non-textual information. There is a long history of attempts to build refreshable tactile displays and, more recently, of using haptic feedback to present graphs, maps and 3D models to blind computer users. But in most cases the main stumbling block has been the cost and availability of suitable hardware for this purpose. The search is on, therefore, for a low-cost full-page refreshable tactile display that can convey both text and graphics. Such a display would allow for the presentation of spatial information such as curves and lines in graphs, shapes of objects in pictures and maps, and the display of spatially-dependent Braille codes for mathematics and music.
A second part of the process of designing effective tactile and/or haptic information displays is in understanding how to convey spatial and relational information through touch. Here too, there is a substantial body of work on haptic spatial perception, haptic exploration, and on the perception of tactile diagrams and images upon which we as designers can draw. In this talk, I will introduce our own bid to find the Holy Braille. I will first review relevant work in both hardware design and haptic perception. I will then discuss our design process for an interactive product that takes full account of the many perceptual and cognitive layers inherent in presenting information in tangible form to users who have little or no vision. In particular I will discuss the knotty issue of understanding the differences between sensory substitution, symbolic translation, and semantic interpretation, and the potential pitfalls of misunderstanding the relationship between these three categories of information presentation.