Mobile robot navigation using neural networks and nonmetrical environmental models

2 Author(s)
Meng, M.; Robot Vision Lab., Purdue Univ., West Lafayette, IN, USA; Kak, A.C.

A reasoning and control architecture for vision-guided navigation that makes a robot more humanlike is presented. This system, called NEURO-NAV, discards the more traditional geometrical representation of the environment and instead uses a semantically richer nonmetrical representation in which a hallway is modeled by the order of appearance of various landmarks and by adjacency relationships. With such a representation, it becomes possible for the robot to respond to commands such as 'follow the corridor and turn right at the second T junction'. This capability is achieved by an ensemble of neural networks whose activation and deactivation are controlled by a rule-based supervisory controller. The individual neural networks in the ensemble are trained to interpret visual information and perform primitive navigational tasks such as hallway following and landmark detection.
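
To make the described representation and control flow concrete, the following is a minimal Python sketch, assuming hypothetical class names (HallwayModel, Supervisor) and a drastically simplified rule set; it is not the paper's implementation, and the actual NEURO-NAV networks interpret camera imagery rather than symbolic landmark labels as done here.

```python
# Hypothetical sketch: a nonmetrical hallway model (landmarks in order of
# appearance plus adjacency) and a rule-based supervisor that decides which
# primitive behaviour is active.  All names are illustrative assumptions,
# not taken from the NEURO-NAV system itself.
from dataclasses import dataclass, field


@dataclass
class HallwayModel:
    """Nonmetrical model: ordered landmarks and adjacency relationships."""
    landmarks: list                               # e.g. ["door", "T-junction", ...]
    adjacent: dict = field(default_factory=dict)  # landmark index -> adjoining hallway ids


class Supervisor:
    """Rule-based controller that activates/deactivates primitive behaviours
    such as hallway following, landmark detection, and turning."""

    def __init__(self, model, target_landmark, target_count):
        self.model = model
        self.target = target_landmark      # e.g. "T-junction"
        self.remaining = target_count      # e.g. 2 for "the second T junction"
        self.active = "hallway_following"  # start by following the corridor

    def step(self, detected_landmark):
        """One control cycle: consume the landmark detector's output and
        choose which behaviour should be active next."""
        if detected_landmark == self.target:
            self.remaining -= 1
            if self.remaining == 0:
                self.active = "turn_right"  # goal landmark reached: switch behaviour
        return self.active


if __name__ == "__main__":
    hallway = HallwayModel(landmarks=["door", "T-junction", "door", "T-junction"])
    # Command: "follow the corridor and turn right at the second T junction"
    sup = Supervisor(hallway, target_landmark="T-junction", target_count=2)
    for seen in hallway.landmarks:          # simulated landmark-detector outputs
        print(seen, "->", sup.step(seen))
```

In this toy version, the supervisor only switches a label; in the architecture the abstract describes, that switching would instead activate or deactivate the trained neural networks responsible for each primitive task.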

Published in:

IEEE Control Systems (Volume: 13, Issue: 5)