We present a framework for localization from very low resolution omnidirectional image data using regression techniques. Previous related methods are constrained to image data labeled with exact position information acquired in the training phase. We relax this constraint and propose to learn local heteroscedastic Gaussian processes by accumulating odometry data, which can be acquired easily. The processes serve as a probabilistic map for predicting the recording positions of newly acquired images by fusing the uncertain training data. In contrast to many feature-based approaches, our framework does not rely on explicit correspondences across images or across positions, and it imposes only very weak assumptions on the type and quality of the image representations.
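To illustrate the core idea of Gaussian process regression from image descriptors to recording positions, here is a minimal sketch. It is not the paper's method: it uses a standard homoscedastic GP (scikit-learn) rather than local heteroscedastic processes, and all data, descriptor dimensions, and noise levels are synthetic assumptions for demonstration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical training data: low-resolution image descriptors paired with
# noisy odometry-derived 2-D positions (synthetic stand-ins for the
# paper's setting).
n_train = 50
descriptors = rng.normal(size=(n_train, 8))        # 8-D image representation
positions = descriptors[:, :2] @ np.array([[1.0, 0.3], [0.2, 1.0]]) \
    + 0.05 * rng.normal(size=(n_train, 2))         # noisy recording positions

# One GP per output coordinate; the WhiteKernel term absorbs the
# (here homoscedastic) position noise from odometry.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gps = [
    GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    .fit(descriptors, positions[:, d])
    for d in range(2)
]

# Predict the recording position of a newly acquired image, with a
# per-coordinate uncertainty estimate.
query = rng.normal(size=(1, 8))
preds = [gp.predict(query, return_std=True) for gp in gps]
mean_xy = np.array([m[0] for m, _ in preds])
std_xy = np.array([s[0] for _, s in preds])
print("predicted position:", mean_xy, "std:", std_xy)
```

A heteroscedastic variant, as in the paper, would additionally let the noise level vary with the input, so that regions mapped with less reliable odometry receive wider predictive distributions.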