Visual feedback by using a CNN chip prototype system

Authors: P. Arena, A. Basile, L. Fortuna, A. Virzi (Dipt. Elettrico Elettronico e Sistemistico, Catania Univ., Italy)

Robot locomotion control relies on a set of sensors that, using information from the environment, allow the robot to adapt its locomotion scheme or trajectory in real time. When the robot's goal is to reach a target in a non-structured environment, the best approach is visual control realized by a fast image-processing system. The fast parallel image processing of the CNN-UM cP4000 chip prototype yields good performance even in a real-time control problem. The robot controlled by the implemented CNN visual feedback has a hexapod configuration, and its locomotion system is also implemented by a multi-layer CNN structure. In this paper, a CNN approach to both locomotion generation and visual control of the bio-inspired robot is presented.
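The visual processing described above builds on the standard Chua-Yang Cellular Neural Network cell dynamics. As an illustrative sketch only (not the paper's actual templates or the cP4000 implementation), the following Python simulates a single CNN layer by forward-Euler integration; the edge-detection templates `A`, `B`, `z` in the demo are the classic ones from the CNN template library, assumed here purely for demonstration.

```python
import numpy as np

def cnn_output(x):
    """Standard piecewise-linear CNN output: y = 0.5 * (|x + 1| - |x - 1|)."""
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def template_sum(img, T):
    """Correlate a 3x3 template T over the cell grid (zero-padded borders)."""
    p = np.pad(img, 1)
    out = np.zeros_like(img)
    for di in range(3):
        for dj in range(3):
            out += T[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out

def cnn_run(u, A, B, z, dt=0.05, steps=200):
    """Forward-Euler integration of the Chua-Yang cell equation
    dx/dt = -x + A*y + B*u + z, returning the settled output y."""
    x = np.zeros_like(u)
    for _ in range(steps):
        x += dt * (-x + template_sum(cnn_output(x), A)
                   + template_sum(u, B) + z)
    return cnn_output(x)

# Demo with the classic edge-detection templates (an assumption for
# illustration, not taken from the paper). Convention: black = +1, white = -1.
A = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], dtype=float)
B = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
u = -np.ones((5, 5))
u[1:4, 1:4] = 1.0             # a 3x3 black square on a white background
y = cnn_run(u, A, B, z=-1.0)  # outline stays black, interior turns white
```

With these templates the network settles so that only the boundary cells of the black square remain at +1, which is the kind of fast, fully parallel per-pixel operation a CNN-UM chip performs in hardware.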

Published in:

Proceedings of the 7th IEEE International Workshop on Cellular Neural Networks and Their Applications (CNNA 2002)

Date of Conference:

22-24 July 2002