
Towards a fully-autonomous vision-based vehicle navigation system in outdoor environments

3 Author(s)
Moghadam, P. ; Sch. of Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore, Singapore ; Wijesoma, W.S. ; Moratuwage, M.D.P.

Colour stereo vision is the primary perception system of most Unmanned Ground Vehicles (UGVs); it provides not only 3D perception of the terrain but also its colour and texture information. The downside of present stereo vision technologies and processing algorithms is that they are limited by the cameras' field of view and maximum range, which can cause the vehicles to get caught in cul-de-sacs. The philosophy underlying the framework proposed in this paper is to use near-field stereo vision information, together with the terrain's appearance, to train a classifier that classifies far-field terrain well beyond the stereo range for each incoming image. We propose an online, self-supervised learning method that learns far-field terrain traversability and adapts to unknown environments without using hand-labelled training data. The method described in this paper enhances current near-to-far learning techniques by automating the selection of which learning strategy to use, from among several strategies, based on the nature of the incoming real-time training data. Promising results obtained using real datasets from the DARPA LAGR program are presented, and the performance is evaluated using hand-labelled ground truth.
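The sketch below illustrates the general near-to-far self-supervised idea described in the abstract: near-field image patches are labelled automatically from stereo geometry, an appearance-based classifier is updated online with those labels, and the classifier then predicts traversability for far-field patches beyond stereo range. It is a minimal illustration under assumptions, not the authors' pipeline: the function names (patch_features, near_field_labels, process_frame), the mean-colour descriptor, the height-variance labelling rule with its 0.05 threshold, and the choice of an incrementally trained linear classifier are all hypothetical.

```python
# Minimal sketch of generic near-to-far self-supervised terrain learning.
# Feature choice, labelling rule, and classifier are illustrative assumptions,
# not the method from the paper.
import numpy as np
from sklearn.linear_model import SGDClassifier

def patch_features(image, patch):
    """Hypothetical appearance descriptor: mean colour of an image patch."""
    y0, y1, x0, x1 = patch
    return image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)

def near_field_labels(stereo_points, patches):
    """Hypothetical geometric self-labelling: a near-field patch is traversable (1)
    if the stereo points falling in it have low height variance, obstacle (0)
    otherwise, and unlabelled (-1) when stereo gives no coverage."""
    labels = []
    for patch in patches:
        pts = stereo_points.get(patch)
        if pts is None:
            labels.append(-1)
        else:
            labels.append(1 if np.std(pts[:, 2]) < 0.05 else 0)
    return np.array(labels)

# Online classifier, updated once per incoming frame.
classifier = SGDClassifier(loss="log_loss")

def process_frame(image, stereo_points, near_patches, far_patches):
    # 1. Self-supervise: label near-field patches from stereo geometry.
    y = near_field_labels(stereo_points, near_patches)
    X = np.array([patch_features(image, p) for p in near_patches])
    mask = y >= 0
    if mask.any():
        classifier.partial_fit(X[mask], y[mask], classes=[0, 1])
    # 2. Predict traversability of far-field patches from appearance alone.
    X_far = np.array([patch_features(image, p) for p in far_patches])
    return classifier.predict(X_far)
```

In this sketch the per-frame update plays the role of online adaptation; the paper's additional contribution of automatically choosing among several learning strategies based on the incoming training data is not represented here.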

Published in:

2010 11th International Conference on Control Automation Robotics & Vision (ICARCV)

Date of Conference:

7-10 Dec. 2010