Real-time visual behaviors with a binocular active vision system

Authors:
J. Batista, P. Peixoto, H. Araujo (Dept. of Electr. Eng., Coimbra Univ., Portugal)

An active vision system must enable the implementation of reactive visual processes and elementary visual behaviors in real time. In this paper we describe the real-time implementation of several visual behaviors in an active vision system. Issues related to the real-time implementation are discussed, in particular the modelling of the measurements made in the image. Even though a fully calibrated system is not required in most applications, we also describe a methodology for calibrating the camera head that takes advantage of its degrees of freedom. These calibration parameters are used to evaluate the performance of the system. Another important issue in the operation of active binocular vision heads is their integration into more complex robotic systems. We claim that higher levels of autonomy and integration can be obtained by designing the system architecture around the concept of purposive behavior. At the lower levels we treat vision as a sensor and integrate it into control systems (both feedforward and servo loops); several visual processes run in parallel, computing measures relevant to the control process. At the higher levels the architecture is modeled as a state transition system. Finally, we show how this architecture can be used to implement a pursuit behavior using optical flow. Vergence control can be performed simultaneously using the same visual processes.
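The abstract's architecture (a state transition system at the higher level, a proportional servo loop driven by image measurements at the lower level) can be illustrated with a minimal sketch. This is not the authors' implementation: the class name `PursuitController`, the two-state machine, and the gain/threshold values are all assumptions chosen only to make the idea concrete.

```python
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()   # no target: head stays still (or scans)
    PURSUIT = auto()  # target acquired: servo on its image-plane flow

class PursuitController:
    """Toy pursuit behavior driven by optical flow (all names hypothetical)."""

    def __init__(self, gain=0.5, detect_thresh=0.1, lose_thresh=0.02):
        self.state = State.SEARCH
        self.gain = gain                    # proportional servo gain (assumed)
        self.detect_thresh = detect_thresh  # flow magnitude that triggers pursuit
        self.lose_thresh = lose_thresh      # flow magnitude below which target is lost

    def step(self, flow):
        """flow: (u, v) mean optical flow of the candidate target region.

        Returns a (pan_rate, tilt_rate) command intended to cancel the
        target's image-plane motion, i.e. keep it foveated.
        """
        speed = (flow[0] ** 2 + flow[1] ** 2) ** 0.5
        if self.state is State.SEARCH:
            if speed > self.detect_thresh:
                self.state = State.PURSUIT  # transition: target detected
            return (0.0, 0.0)
        if speed < self.lose_thresh:
            self.state = State.SEARCH       # transition: target lost
            return (0.0, 0.0)
        # Servo loop: rotate the head proportionally to the measured flow.
        return (self.gain * flow[0], self.gain * flow[1])
```

In the same spirit, vergence control could reuse these flow measurements: the horizontal disparity between the flows measured by the left and right cameras would drive a second proportional loop on the vergence axis.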

Published in:

1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems

Date of Conference:

8-11 Dec 1996