This paper introduces an architecture for speech- and auditory cue-based route instructions. The server-side architecture is composed of a set of web services and allows different route services to be connected behind a single interface. Route instructions can be transferred between the server and the client as recorded speech, vibration patterns, auditory cues, and text. The OpenLS Route Service schema is extended so that route instruction responses can include references to recorded speech, auditory cues, vibration patterns, encoded textual instructions, and brief instructions. The auditory cues may include auditory icons, earcons, or spearcons. The textual instructions are based on the Speech Synthesis Markup Language (SSML), so a text-to-speech engine on the client side can render them automatically. The presented holistic approach aims to increase the accessibility of route services, especially for visually impaired and elderly people.
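To make the extension concrete, the following is a minimal sketch of what an extended OpenLS-style route instruction response might look like, assuming the multimodal references described above. The element names outside standard SSML (`RecordedSpeech`, `AuditoryCue`, `VibrationPattern`, `BriefInstruction`) and all file paths are illustrative assumptions, not the paper's actual schema; only the `<speak>` body follows the real SSML specification.

```xml
<!-- Hypothetical sketch of one extended route instruction.
     Non-SSML element names and hrefs are assumptions for illustration. -->
<RouteInstruction>
  <Instruction>Turn left onto Main Street</Instruction>
  <BriefInstruction>Left, Main St</BriefInstruction>
  <!-- References to pre-recorded speech and non-speech resources -->
  <RecordedSpeech href="audio/turn_left_main.mp3"/>
  <AuditoryCue type="earcon" href="audio/earcon_left.wav"/>
  <VibrationPattern pattern="short short long"/>
  <!-- SSML instruction a client-side text-to-speech engine can render -->
  <speak version="1.0"
         xmlns="http://www.w3.org/2001/10/synthesis"
         xml:lang="en-US">
    In 50 meters, turn <emphasis level="strong">left</emphasis>
    onto Main Street.<break time="300ms"/>
  </speak>
</RouteInstruction>
```

In such a design, a client would pick whichever modality it supports: play the recorded file, trigger the vibration pattern, sound the earcon, or feed the SSML body to its local synthesizer.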