This paper presents experimental results on the localization of a mobile robot equipped with high-frequency relative sensors and a low-frequency absolute sensor. Two relative sensors are used: wheel-based odometry and visual odometry. The absolute sensor is a vision-based landmark detector that computes the pose of the robot relative to a pre-mapped visual beacon. This would be a simple sensor fusion problem, solvable with standard recursive estimators, were it not for two additional characteristics of the beacon detector: (1) since we assume a monocular vision system and a planar visual mark, the localization problem admits up to four possible solutions; and (2) the frequency with which the robot encounters a visual mark is very low (0.01 Hz or less). To account for these characteristics, we propose a particle filter with a very precise prediction step (obtained by combining the two available odometry sensors) and a correction step that handles the multi-modal nature of the data. Besides the sensor fusion algorithm, the paper also describes the development of the visual sensors used in the localization process.
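The scheme described above can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: the fused-odometry prediction step and the multi-modal correction step are reduced to a planar toy example, and all function names, noise parameters, and the mixture-of-modes weighting are assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, odom_delta, noise_std):
    """Propagate particles [x, y, theta] with a fused odometry increment.

    odom_delta = (dx, dy, dtheta) in the robot frame; noise_std models
    the (small) residual uncertainty of the combined wheel + visual
    odometry. (Illustrative noise model, not the paper's.)
    """
    dx, dy, dtheta = odom_delta
    theta = particles[:, 2]
    particles[:, 0] += dx * np.cos(theta) - dy * np.sin(theta)
    particles[:, 1] += dx * np.sin(theta) + dy * np.cos(theta)
    particles[:, 2] += dtheta
    particles += rng.normal(0.0, noise_std, particles.shape)
    return particles

def correct(particles, pose_hypotheses, sigma):
    """Multi-modal correction: a monocular view of a planar beacon yields
    up to four candidate absolute poses, so the likelihood is a mixture
    over all hypotheses rather than a single Gaussian."""
    weights = np.zeros(len(particles))
    for hyp in pose_hypotheses:
        d2 = np.sum((particles[:, :2] - hyp[:2]) ** 2, axis=1)
        weights += np.exp(-0.5 * d2 / sigma ** 2)
    weights /= weights.sum()
    # Systematic resampling on the mixture weights.
    n = len(particles)
    idx = np.searchsorted(np.cumsum(weights),
                          (rng.random() + np.arange(n)) / n)
    return particles[idx]

# Toy run: many cheap prediction steps, then one rare beacon sighting
# that returns two ambiguous pose hypotheses.
particles = rng.normal(0.0, 0.1, (500, 3))
for _ in range(10):
    particles = predict(particles, (0.1, 0.0, 0.0), 0.005)
particles = correct(particles,
                    [np.array([1.0, 0.0, 0.0]),
                     np.array([-1.0, 0.0, np.pi])],
                    sigma=0.2)
```

Because the prediction step is precise, the particle cloud stays tight between beacon sightings, and the correction step discards the spurious pose hypotheses (here, the one at x = -1) simply because no particles lie near them.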