Implementation of real time spatial mapping in robotic systems through self-organizing neural networks

3 Author(s)
Morellas, V. (Dept. of Mech. Eng., Minnesota Univ., Minneapolis, MN, USA); Minners, J.; Donath, M.

This paper presents a methodology that allows an autonomous agent, i.e., a mobile robot, to learn and build maps of its operating environment by relying only on its range sensors. The maps, described with respect to the robot's inertial frame, are developed in real time by correlating robot position and sensory data; this feature characterizes part of the uniqueness of the authors' approach. These maps are topologically isomorphic to the maps created for the same room(s) by humans. The methodology exploits the principle of self-organization, implemented as an artificial neural network module that processes incoming sensor range data. The generation of environmental maps can be visualized as an elastic string of neurons, whereby every neuron represents a finite portion of the physical world. This elastic string stretches dynamically so as to take on the shape of the environment, a unique characteristic of the authors' methodology. In this respect, the neural net provides a discretized representation of the “continuous” physical environment as the latter is seen through the robot's own sensors. Experiments, focused on indoor applications, have successfully demonstrated the ability of a robot to build maps of geometrically complex environments. The results presented in this paper, compared with the authors' earlier efforts, show a significant improvement in that every single sensor data point contributes equally to the location of the neurons of the spatial map at the end of the learning process. This is important because the authors wish to minimize the effect of the order in which data points are processed.
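The elastic-string idea maps naturally onto a one-dimensional Kohonen-style self-organizing map whose neurons are 2-D positions in the robot's inertial frame. The sketch below is a minimal illustration of that principle only, not the authors' implementation: the function name, parameter values, and the batch-style update (chosen here because it makes every data point contribute equally regardless of presentation order) are assumptions made for clarity.

```python
# Minimal 1-D self-organizing map ("elastic string" of neurons) fitted to
# 2-D range-sensor hits expressed in the robot's inertial frame.
# Illustrative sketch only: names, parameters, and the batch update rule are
# assumptions, not the authors' implementation.
import numpy as np

def fit_elastic_string(points, n_neurons=50, n_epochs=30,
                       sigma0=10.0, sigma_min=0.5):
    """Fit a chain of neurons to a cloud of (x, y) range readings.

    points : (N, 2) array of sensor hits in the inertial frame.
    Returns an (n_neurons, 2) array of neuron positions tracing the
    shape of the sensed environment.
    """
    # Initialise the string along the diagonal of the data's bounding box.
    lo, hi = points.min(axis=0), points.max(axis=0)
    t = np.linspace(0.0, 1.0, n_neurons)[:, None]
    neurons = lo + t * (hi - lo)

    idx = np.arange(n_neurons)
    for epoch in range(n_epochs):
        # Shrink the neighbourhood radius as training proceeds.
        sigma = sigma0 * (sigma_min / sigma0) ** (epoch / max(n_epochs - 1, 1))
        # Batch accumulation: every data point contributes once per epoch,
        # so the final map does not depend on the order of the points.
        num = np.zeros_like(neurons)
        den = np.zeros(n_neurons)
        for p in points:
            winner = np.argmin(np.linalg.norm(neurons - p, axis=1))
            h = np.exp(-0.5 * ((idx - winner) / sigma) ** 2)  # neighbourhood weights
            num += h[:, None] * p
            den += h
        upd = den > 1e-12                 # leave unvisited neurons in place
        neurons[upd] = num[upd] / den[upd][:, None]
    return neurons

if __name__ == "__main__":
    # Synthetic "room": noisy range hits along two perpendicular walls.
    rng = np.random.default_rng(1)
    wall1 = np.column_stack([np.linspace(0.0, 4.0, 100), np.zeros(100)])
    wall2 = np.column_stack([np.full(100, 4.0), np.linspace(0.0, 3.0, 100)])
    hits = np.vstack([wall1, wall2]) + rng.normal(0.0, 0.02, size=(200, 2))
    string = fit_elastic_string(hits)
    print(string[:5])  # first few neuron positions tracing the walls
```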

Published in:

Proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems: Human Robot Interaction and Cooperative Robots (Volume 1)

Date of Conference:

5-9 Aug 1995