A distributed testbed designed to support the development of a multi-sensory (vision and tactile) system for investigations in "active perception" of three-dimensional objects is presented. Active perception means not only seeing and feeling objects but also manipulating and probing them. The nucleus of the testbed is a network of heterogeneous computers that supports both low-level real-time control processes and high-level knowledge-based systems. The testbed's programming environment facilitates the construction and execution of a distributed multi-sensory system from sequential programs written in different programming languages.
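The abstract gives no implementation details, but the core idea it describes, independently written sequential sensor programs coordinated over a network into one multi-sensory system, can be sketched in miniature. The following is a hypothetical illustration (all names and the use of in-process queues in place of real network links are assumptions, not the paper's design): two stand-in sensor workers report observations to a coordinator through message passing.

```python
import queue
import threading

# Hypothetical sketch: two sequential "sensor" programs (vision, tactile)
# run as independent workers and report observations to a coordinator
# over message queues, standing in for processes on networked hosts.

def vision_worker(out_q):
    # Stand-in for a sequential vision program producing observations.
    for frame in range(3):
        out_q.put(("vision", f"edge-map-{frame}"))
    out_q.put(("vision", None))  # end-of-stream marker

def tactile_worker(out_q):
    # Stand-in for a sequential tactile program probing the object.
    for probe in range(3):
        out_q.put(("tactile", f"contact-{probe}"))
    out_q.put(("tactile", None))  # end-of-stream marker

def coordinator():
    # Merges messages from both sensors, much as a distributed
    # environment would merge results from programs that may be
    # written in different languages.
    q = queue.Queue()
    threads = [threading.Thread(target=w, args=(q,))
               for w in (vision_worker, tactile_worker)]
    for t in threads:
        t.start()
    finished, log = set(), []
    while len(finished) < 2:
        sensor, msg = q.get()
        if msg is None:
            finished.add(sensor)
        else:
            log.append((sensor, msg))
    for t in threads:
        t.join()
    return log

observations = coordinator()
print(len(observations))  # 3 observations from each of the 2 sensors
```

In a real testbed the queues would be network channels between heterogeneous machines, and each worker could be a separate program in its own language; the sketch only shows the coordination pattern.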