One way of interacting with a virtual environment is through visual perception. This paper presents research on modeling an artificial vision system for highly mobile autonomous agents that is capable of dynamic obstacle avoidance and active perception. The robust performance of the system is demonstrated in artificial animals with directable, foveated eyes situated in a physics-based virtual environment. Through simulated active perception, each agent controls its eyes and body by continuously analyzing photorealistic binocular retinal image streams. The vision system estimates optical flow, computes stereo disparity, and segments looming targets in the low-resolution visual periphery, while controlling eye movements to track an object fixated in the high-resolution fovea. It matches segmented targets against mental models of colored objects of interest to decide whether they are harmless or represent dangerous obstacles. The latter are localized, enabling the artificial animal to exercise the sensorimotor control necessary to support complex behaviors such as predation and obstacle avoidance.
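The abstract mentions stereo disparity computation over binocular retinal streams but does not specify the algorithm used. As a rough illustration of the idea, the sketch below implements a generic sum-of-absolute-differences (SAD) block-matching disparity estimator; the function name, block size, and disparity range are assumptions for illustration, not the paper's method.

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=16):
    """Estimate a horizontal disparity map from a rectified stereo pair.

    For each pixel in the left image, slide a (block x block) window
    leftward across the right image and pick the shift d with the
    smallest sum of absolute differences (SAD). This is a generic
    sketch, not the system described in the abstract.

    left, right: 2D grayscale arrays of equal shape.
    Returns an integer disparity map (0 where matching is skipped).
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_sad, best_d = None, 0
            for d in range(max_disp + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                sad = np.abs(patch.astype(np.int64)
                             - cand.astype(np.int64)).sum()
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[y, x] = best_d
    return disp
```

Given a synthetic pair in which the right image is the left image shifted by a known amount, the interior of the recovered map matches that shift, which is the basic signal a stereo module of this kind would feed into depth and obstacle localization.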
Date of Conference: 2002