Robust environment perception based on occupancy grid maps for autonomous vehicle

Authors: N. Suganuma, T. Matsui (Institute of Science and Engineering, Kanazawa University, Ishikawa, Japan)

Autonomous vehicle systems, which can drive without a human driver on board, have been developed all over the world to reduce driving workload, traffic accidents, and so on. Recently, such techniques have also come to be regarded as playing an important role in energy saving. To navigate an autonomous vehicle in a complex environment, static objects must be extracted precisely and robustly. Moreover, for dynamic objects it is not sufficient to consider only their current positions; estimating their motion and predicting their future positions are also strongly demanded. In this paper, to address these problems, we propose an environment perception method using a LIDAR based on occupancy grid maps.
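For readers unfamiliar with the underlying representation, the following is a minimal sketch of the standard log-odds occupancy grid update that LIDAR-based grid mapping builds on. It is not the authors' implementation: the grid size, cell resolution, inverse sensor model probabilities, and the ray-tracing helper are all assumed values chosen for illustration only.

import numpy as np

# Generic 2-D log-odds occupancy grid (textbook formulation, not the paper's method).
# Cells hit by a LIDAR return are pushed toward "occupied"; cells the beam
# passes through are pushed toward "free".

L_OCC = np.log(0.7 / 0.3)    # assumed inverse sensor model: P(occ | hit) = 0.7
L_FREE = np.log(0.3 / 0.7)   # assumed P(occ | beam passes through) = 0.3

class OccupancyGrid:
    def __init__(self, size=200, resolution=0.2):
        self.res = resolution              # metres per cell (assumed value)
        self.log_odds = np.zeros((size, size))
        self.origin = size // 2            # sensor starts at the grid centre

    def to_cell(self, x, y):
        return (int(round(x / self.res)) + self.origin,
                int(round(y / self.res)) + self.origin)

    def update(self, sensor_xy, hit_xy):
        """Integrate one LIDAR beam: free space along the ray, occupied at its end."""
        sx, sy = self.to_cell(*sensor_xy)
        hx, hy = self.to_cell(*hit_xy)
        for cx, cy in bresenham(sx, sy, hx, hy)[:-1]:
            self.log_odds[cx, cy] += L_FREE
        self.log_odds[hx, hy] += L_OCC

    def probability(self):
        """Convert log-odds back to per-cell occupancy probability."""
        return 1.0 - 1.0 / (1.0 + np.exp(self.log_odds))

def bresenham(x0, y0, x1, y1):
    """Integer line rasterisation: cells traversed from (x0, y0) to (x1, y1)."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err = dx - dy
    x, y = x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

# Example: integrate a fan of beams from the sensor at the origin
# (the 15 m range returns are fabricated purely for illustration).
grid = OccupancyGrid()
for angle in np.linspace(-np.pi / 4, np.pi / 4, 20):
    rng = 15.0
    grid.update((0.0, 0.0), (rng * np.cos(angle), rng * np.sin(angle)))
print(grid.probability().max())   # cells at the returns approach P ~ 0.7

Static structure then corresponds to cells whose occupancy probability stays high across scans, while cells whose occupancy changes between scans are candidates for dynamic objects; the paper's contribution concerns doing this extraction, motion estimation, and prediction robustly.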

Published in:

Proceedings of SICE Annual Conference 2010

Date of Conference:

18-21 Aug. 2010