Pose estimation of mobile devices is useful for a wide variety of applications, including augmented reality and geo-tagging. Even though most of today's cell phones are equipped with sensors such as GPS, accelerometers, and gyros, the pose estimated via these sensors is often inaccurate, particularly in urban environments. In this paper, we describe an image-based localization algorithm for estimating the pose of cell phones in urban environments. Our proposed approach solves for a homography transformation matrix between the cell phone image and a matching database image, constrained by the change in orientation obtained from the cell phone gyro and augmented with 3D information from the database. The resulting pose estimate improves upon the readings from the phone's GPS and compass. We characterize the performance of this approach on a dataset in Oakland, CA: for a query set of 92 images, our computed location is within 10 meters of ground truth for 92% of queries, and our computed yaw is within 10 degrees for 96%, compared to 31% and 26% respectively for the cell phone's GPS and compass.
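To illustrate the core geometric step described above, the following is a minimal sketch of estimating a homography between a query image and a database image from point correspondences, using the standard Direct Linear Transform. All point coordinates and the ground-truth matrix here are synthetic assumptions for demonstration; a real pipeline would obtain correspondences from feature matching, use RANSAC for robustness, and apply the gyro and 3D constraints described in the paper.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct Linear Transform: estimate a 3x3 H such that dst ~ H @ src
    (in homogeneous coordinates), given >= 4 point correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on the
        # 9 entries of H (flattened row-wise).
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the arbitrary scale

# Synthetic correspondences: points in the database image (pixels) ...
db_pts = np.array([[100.0, 100.0], [500.0, 120.0], [480.0, 400.0],
                   [120.0, 380.0], [300.0, 250.0], [220.0, 160.0]])

# ... mapped into the query image by an assumed ground-truth homography
# (stand-in for the true cell-phone-to-database transformation).
H_true = np.array([[0.90, 0.05, 10.0],
                   [-0.04, 0.95, 5.0],
                   [1e-4, 2e-5, 1.0]])

ph = np.hstack([db_pts, np.ones((len(db_pts), 1))]) @ H_true.T
query_pts = ph[:, :2] / ph[:, 2:3]

H_est = estimate_homography(db_pts, query_pts)
```

With exact correspondences the recovered matrix matches the ground truth up to numerical precision; with noisy feature matches one would instead minimize reprojection error over inliers.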