Integrating multi-sensory input in the body model — An RNN approach to connect visual features and motor control

1 Author(s): Schilling, M.; Int. Comput. Sci. Inst., Berkeley, CA, USA

An internal model of one's own body can be assumed to be a central and early representation, as such a model is already required in simple behavioural tasks. Growing evidence shows that such grounded internal models are also applied in higher-level tasks: internal models appear to be recruited in the service of cognitive function. Understanding what another person is doing seems to rely on the ability to step into that person's shoes and map the observed action onto one's own action system. This rules out dedicated, highly specialised models and instead presupposes a flexible internal model that can be applied in different contexts and fulfil different functions. Here, we present a recurrent neural network approach to such an internal body model. The model can be used in the context of movement control, e.g. in reaching tasks, but can also be employed as a predictor, e.g. for planning ahead. The introduced extension allows visual features to be integrated into the kinematic model. Simulation results show how, in this way, the model can be utilised in perception.
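To make the idea of a recurrent kinematic body model concrete, the sketch below shows a toy relaxation network in the spirit of "mean of multiple computations" (MMC) recurrent models often used for body models of this kind. It is an illustrative assumption, not the paper's actual architecture: a planar arm with two segment vectors l1 and l2 and an end-effector vector r, linked by the kinematic equation r = l1 + l2. Each variable is repeatedly recomputed from that equation and averaged with its own previous value; clamping r to a target makes the network relax towards a consistent posture, i.e. it acts as an inverse model, while leaving r free and setting the segments turns it into a forward predictor.

```python
import numpy as np

# Illustrative MMC-style recurrent relaxation for a 2-segment planar arm.
# Constraint: r = l1 + l2. Segment-length normalisation, which a full body
# model would need, is omitted here for brevity.

def mmc_step(l1, l2, r, damping=2.0):
    """One recurrent update: each variable is re-derived from the kinematic
    equation and blended with its previous value (the 'mean' of MMC)."""
    l1_new = (damping * l1 + (r - l2)) / (damping + 1)
    l2_new = (damping * l2 + (r - l1)) / (damping + 1)
    r_new = (damping * r + (l1 + l2)) / (damping + 1)
    return l1_new, l2_new, r_new

# Start from an arbitrary posture and clamp the end effector to a target:
# the network settles into a posture whose segments sum to the target.
l1 = np.array([1.0, 0.0])
l2 = np.array([1.0, 0.0])
target = np.array([1.0, 1.0])

r = target.copy()
for _ in range(100):
    l1, l2, r = mmc_step(l1, l2, r)
    r = target.copy()  # clamping the sensory input drives the relaxation

print(np.round(l1 + l2, 3))  # → [1. 1.], i.e. the clamped target
```

The same recurrent structure can be driven from either end, which is what makes such a model reusable across control, prediction and, with additional sensory equations of the kind the paper introduces, perception.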

Published in:

The 2011 International Joint Conference on Neural Networks (IJCNN)

Date of Conference:

July 31 - Aug. 5, 2011