This paper introduces an automatic approach for estimating the registration parameters between successive viewpoints visited by a laser range sensor. The proposed technique operates directly on the raw range measurements and requires neither an external device for pose estimation nor sophisticated feature extraction or triangular mesh computation. Assuming only object rigidity and some overlap between the scanned areas, the approach estimates the full set of six parameters that define a geometrical transformation in three-dimensional space. A compact modified Gauss sphere representation is used to encode a simple planar patch approximation of the object's surface and to validate the mapping between measurements collected from different viewpoints. The technique also exploits this compact surface representation to estimate the rotation and translation parameters between sensor viewpoints successively. This decoupling yields a significant reduction in computational workload while providing sufficient accuracy for most robot navigation applications. The performance of the proposed approach is demonstrated in an experimental context using real range measurements collected from a series of viewpoints.
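The decoupled rotation-then-translation estimation mentioned above can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes correspondences between planar-patch normals (points on the Gauss sphere, which are translation-invariant) and between surface points are already known, and uses a standard SVD-based (Kabsch) rotation fit followed by a centroid-difference translation.

```python
import numpy as np

def estimate_rotation(normals_a, normals_b):
    """Fit the rotation aligning normals_a to normals_b (Kabsch/SVD).

    Unit surface normals on the Gauss sphere are unaffected by
    translation, so the rotation can be solved first, on its own.
    normals_a, normals_b: (n, 3) arrays of corresponding unit normals.
    """
    H = normals_a.T @ normals_b                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

def estimate_translation(points_a, points_b, R):
    """With the rotation fixed, translation is the centroid residual."""
    return points_b.mean(axis=0) - R @ points_a.mean(axis=0)
```

Solving the 3-parameter rotation and 3-parameter translation problems separately, rather than a joint 6-parameter optimization, is the source of the computational saving the abstract refers to.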