Augmented Reality (AR) applications offer great potential to support users in solving tasks, e.g., in scenarios from the domain of assembly, maintenance, and repair. AR applications enrich the physical world around the user with additional, useful virtual information. However, the integration of physical objects into AR user interfaces poses a challenge to developers. In addition, the information displayed in such an environment often depends strongly on the user's current task. In this paper we present an approach to specify, at the design level, the integration of real objects into AR user interfaces and the task-dependent visualization of AR user interface elements. To describe user tasks, the AR user interface structure, and the relations between them, we use UML activity diagrams in combination with the Scene Structure and Integration Modelling Language (SSIML), a visual language that supports the description of 3D user interface structures. Furthermore, code can be generated from the visual models. The proposed concepts are illustrated by an AR application example from the domain of assembly and lead to a new language called SSIML/AR.