People use mobile web applications in a variety of contexts, typically on the go, while engaged in other tasks such as walking, jogging, or driving. Conventional visual user interfaces efficiently support quick scanning of a page, but they can easily cause distractions and accidents. This problem is intensified when web information services are rich and highly structured in content and navigation architecture. To support a graceful evolution of web systems from a conventional to an aural experience, we introduce ANFORA (Aural Navigation Flows On Rich Architectures), a framework for designing mobile web systems based on automated, semi-controlled aural navigation flows that users can listen to while engaged in a secondary activity (e.g., walking). We demonstrate a set of design rules that can govern salient aural interactions with large web architectures. Our approach opens a new paradigm for aural web systems that can complement existing visual interfaces, and it has the potential to inform new technologies, navigation models, design tools, and methods in the area of aural web information access. As a case study, we are applying ANFORA to the domain of web-based newscasting.