Local feature matching has become a commonly used method for comparing images. For mobile robots, a reliable method for comparing images can constitute a key component of localization tasks. In this paper we present a mobile robot localization system based on local feature matching of omnidirectional images. In particular, we address appearance-based topological localization by comparing common feature-extraction methods (SIFT and SURF) to select robust features for matching the current robot view against reference images. Our datasets, each consisting of a large number of omnidirectional images, were acquired at different times of day (under different lighting conditions) and with dynamic content in large outdoor environments (over 80,000 m²). Two different approaches, winner-take-all (WTA) and Monte Carlo localization (MCL), were used to evaluate performance, which is, in general, satisfactory. In particular, the use of Monte Carlo particle filtering improves topological localization results for all datasets with all algorithms.
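To illustrate the Monte Carlo approach to topological localization described above, the following is a minimal sketch, not the paper's implementation: it assumes the environment is a graph of reference locations, that each observation yields per-node similarity scores (e.g. counts of matched SIFT/SURF features), and that motion is modeled as a random step to an adjacent node. All names (`topological_mcl`, `neighbors`, `observations`) are hypothetical.

```python
import random
from collections import Counter

def topological_mcl(observations, neighbors, n_particles=200, seed=0):
    """Hypothetical sketch of Monte Carlo localization over a topological map.

    observations: list of dicts mapping node -> similarity score
                  (e.g. number of matched local features for that node).
    neighbors:    dict mapping each node to its adjacent nodes.
    Returns the most likely node after the final observation.
    """
    rng = random.Random(seed)
    nodes = list(neighbors)
    # Initialize particles uniformly over all reference locations.
    particles = [rng.choice(nodes) for _ in range(n_particles)]
    for obs in observations:
        # Motion update: each particle stays put or moves to a random neighbor.
        particles = [rng.choice([p] + neighbors[p]) for p in particles]
        # Measurement update: weight particles by the image-similarity score.
        weights = [obs.get(p, 0.0) + 1e-6 for p in particles]
        # Resample particles in proportion to their weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    # The mode of the particle set is the localization estimate.
    return Counter(particles).most_common(1)[0][0]
```

Unlike a WTA scheme, which would simply pick the single best-matching reference image at each step, the particle filter integrates evidence over successive observations, which is consistent with the improvement the abstract reports for MCL.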