This paper demonstrates how a robot can perform global localisation using a panoramic mirror in conjunction with a rich 3D model of the environment and a particle filter. The 3D model is built from Riegl Z420i laser range scans taken at various positions in the environment and registered together. An appearance-based map is then created by sampling the 3D model at various poses, and Haar wavelets are used to extract a global scene signature for each pose. In our experiments, in an environment measuring about 50 × 50 metres, the robot localises on average to within 1 metre of the true position and within 5 degrees of the true orientation.
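The core idea — weighting pose hypotheses by the similarity between an observed appearance signature and signatures predicted from the map, then resampling — can be sketched as a minimal particle filter. This is an illustrative stand-in, not the paper's implementation: `map_signature` below is a synthetic pose-to-feature function substituting for Haar-wavelet signatures sampled from the 3D laser model, and all parameter values (particle count, noise scales) are assumptions chosen so the sketch runs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the appearance map: the paper samples
# Haar-wavelet signatures from a 3D laser model; here a simple synthetic
# pose-to-signature function plays that role so the filter is runnable.
def map_signature(x, y, theta):
    return np.array([x / 50.0, y / 50.0, np.cos(theta), np.sin(theta)])

def localise(observation, n_particles=5000, iterations=50):
    """Global localisation: particles start uniform over a 50 x 50 m area."""
    particles = np.column_stack([
        rng.uniform(0, 50, n_particles),         # x position (m)
        rng.uniform(0, 50, n_particles),         # y position (m)
        rng.uniform(0, 2 * np.pi, n_particles),  # heading (rad)
    ])
    for _ in range(iterations):
        # Weight each particle by appearance similarity at its hypothesised pose.
        sigs = np.array([map_signature(*p) for p in particles])
        err = np.linalg.norm(sigs - observation, axis=1)
        w = np.exp(-err**2 / (2 * 0.1**2))
        w /= w.sum()
        # Resample in proportion to weight, then add a little diffusion
        # so the particle set keeps exploring around good hypotheses.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx] + rng.normal(0, [0.2, 0.2, 0.05],
                                                (n_particles, 3))
        particles[:, 2] %= 2 * np.pi
    return particles.mean(axis=0)

# Simulated query: observe the appearance signature at an unknown true pose
# and recover that pose from the particle set.
true_pose = (20.0, 30.0, 1.0)
estimate = localise(map_signature(*true_pose))
```

With an informative signature function, the particle cloud collapses onto the true pose within a few tens of iterations; in the paper, the discriminative power of the Haar-wavelet signature plays the same role.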