
Hebbian learning of visually directed reaching by a robot arm


6 Author(s)
Yiwen Wang; Hong Kong Univ. of Sci. & Technol., Kowloon, China; Tingfan Wu; Orchard, G.; Dudek, P.

We describe a robotic system, consisting of an arm and an active vision system, that learns to align its sensory and motor maps so that it can successfully reach the tip of its arm to touch the point where it is looking. The system uses an unsupervised Hebbian learning algorithm and learns the alignment by watching its arm wave in front of its eyes. After watching for 25 minutes, the maps are sufficiently well aligned that the system can execute the desired reaching behavior.

Published in:

Biomedical Circuits and Systems Conference, 2009. BioCAS 2009. IEEE

Date of Conference:

26-28 Nov. 2009