Optical tracking using ARToolkit provides the base technology for a wealth of augmented reality applications. However, marker-based optical tracking with a single camera has drawbacks: a marker must be fully visible to the camera at all times to produce tracking output, so occlusion by other objects and the camera's limited field of view constrain the area that ARToolkit can track effectively. To improve tracking availability, our approach shares tracking data from multiple hosts across the network. Pairwise camera-to-camera relationships are established automatically as soon as any marker is seen by two cameras, independent of the cameras' placement (e.g., worn by a user, or mounted at a "hot spot" location to improve tracking in that area). The setup is completely dynamic: both cameras can move continuously, and there is no "special marker" that must be seen by both cameras; as soon as any one marker in the system is visible to both cameras, all missing tracking information can be calculated from the data sent over the network. In this paper we describe such a multi-host configuration in detail, together with the automatic selection of the best marker to use as a reference point, the system's dataflow, scalability and accuracy issues, and future work such as automatic configuration of the system.
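The core geometric idea, establishing a pairwise camera-to-camera relationship through a marker seen by both cameras and then transferring poses of markers visible to only one of them, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function and variable names are ours, and marker poses are idealized as 4x4 homogeneous transforms (ARToolkit itself reports 3x4 transformation matrices).

```python
import numpy as np

def relative_camera_pose(T_a_marker, T_b_marker):
    # Both cameras observe the same marker. Composing camera A's
    # observation with the inverse of camera B's yields the transform
    # that maps points from camera B's frame into camera A's frame.
    return T_a_marker @ np.linalg.inv(T_b_marker)

def transfer_pose(T_a_from_b, T_b_target):
    # A marker visible only to camera B can now be expressed in
    # camera A's frame via the established pairwise relationship.
    return T_a_from_b @ T_b_target

def pose(translation):
    # Helper: a pure-translation homogeneous transform, standing in
    # for a real tracked marker pose.
    T = np.eye(4)
    T[:3, 3] = translation
    return T

# Camera A sees the shared marker 1 m ahead of it;
# camera B sees the same marker 1 m to its left.
T_aM = pose([0.0, 0.0, 1.0])
T_bM = pose([-1.0, 0.0, 0.0])
T_ab = relative_camera_pose(T_aM, T_bM)

# Sanity check: transferring B's view of the shared marker through the
# pairwise relationship must reproduce A's own view of that marker.
print(np.allclose(transfer_pose(T_ab, T_bM), T_aM))
```

In a running system the same `T_ab` would be applied to every marker camera B reports, which is what lets a host fill in tracking data for markers outside its own camera's view.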