This paper presents a new technique for updating a probabilistic spatial occupancy grid map using a forward sensor model. Unlike the currently popular inverse sensor models, forward sensor models can be determined experimentally and represent sensor characteristics more faithfully. The formulation is applicable to both 2D and 3D range sensors and avoids some of the theoretical and practical problems associated with current approaches that use forward models. As an illustration of this procedure, a new prototype 3D forward sensor model is derived using a beam represented as a spherical sector. Furthermore, this model is used for the fusion of point clouds obtained from different 3D sensors, in particular time-of-flight sensors (Swiss-ranger, laser range finders) and stereo vision cameras. Several techniques are described for an efficient data-structure representation and implementation. The range beams from the different sensors are fused in a common local Cartesian occupancy map. Experimental results of this fusion are presented and evaluated using a Hough transform performed on the grid.
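To make the distinction concrete, the following is a minimal sketch (not the paper's actual derivation) of a forward-model Bayesian update along a single discretized beam. The forward model here is an assumed Gaussian likelihood `p(z | first occupied cell = k)`; the function name, the 1D discretization, and the choice of `sigma` are illustrative assumptions, and special cases such as max-range/no-return readings are omitted for brevity.

```python
import numpy as np

def forward_update(prior, z_cell, sigma=1.0):
    """Posterior occupancy of cells along one beam, given a range reading.

    prior  : prior occupancy probabilities of cells 0..n-1 along the beam
    z_cell : index of the cell corresponding to the measured range z
    sigma  : std. dev. (in cells) of the assumed Gaussian forward model
    """
    n = len(prior)
    # P(all cells before k are free), assuming cell independence
    free_before = np.concatenate(([1.0], np.cumprod(1.0 - prior[:-1])))
    # P(beam first hits cell k) = P(free before k) * P(k occupied)
    p_first = free_before * prior
    # forward model: likelihood of reading z given the beam stopped at k
    lik = np.exp(-0.5 * ((np.arange(n) - z_cell) / sigma) ** 2)
    # posterior over the hit cell (Bayes rule, normalized)
    post_first = p_first * lik
    post_first /= post_first.sum()
    # P(hit occurred strictly before cell i): those cells are unobserved
    hit_before = np.concatenate(([0.0], np.cumsum(post_first)[:-1]))
    # cell i is occupied if it is the hit cell, or lies behind the hit
    # cell and therefore keeps its prior occupancy probability
    return post_first + prior * hit_before

# usage: uniform 0.5 prior over 10 cells, reading at cell 5
post = forward_update(np.full(10, 0.5), z_cell=5)
```

With this update the cell at the measured range gains occupancy probability, cells in front of it are driven towards free, and cells behind the return retain essentially their prior, which is the qualitative behavior a forward sensor model is meant to capture.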