
360ViewPET: View Based Pose EsTimation for Ultra-Sparse 360-Degree Cameras



Abstract:

Immersive virtual tours based on 360-degree cameras, showing famous outdoor scenery, are becoming increasingly desirable due to travel costs, pandemics, and other constraints. For a tour to feel immersive, a user must receive the view that accurately corresponds to her position and orientation in the virtual space as she moves through it, and this requires the cameras' orientations to be known. Outdoor tours deploy numerous, ultra-sparse cameras across a wide area, which makes camera pose estimation challenging; techniques such as SLAM, which require mobile or densely placed cameras, are therefore not applicable. In this paper we present a novel strategy called 360ViewPET, which automatically estimates the relative poses of two stationary, ultra-sparse (15 meters apart) 360-degree cameras using one equirectangular image taken by each camera. Our experiments show that it achieves accurate pose estimation, with a mean error as low as 0.9 degrees.
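The abstract does not spell out the paper's view-based pipeline, but the geometry it builds on is standard: every pixel of an equirectangular panorama corresponds to a bearing (unit direction vector) on the camera's viewing sphere, and matched bearings between two panoramas constrain the relative pose. The sketch below shows that standard pixel-to-bearing mapping, followed by a rotation-only alignment of matched bearings via the Kabsch/SVD solution to Wahba's problem. This is an illustrative simplification, not 360ViewPET itself: the alignment step ignores the 15-meter translation between cameras, and the function names and NumPy interface are our own assumptions.

```python
import numpy as np

def pixel_to_bearing(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit bearing vector.

    Standard equirectangular projection: u spans 360 degrees of
    longitude, v spans 180 degrees of latitude (top row = +90 deg).
    """
    lon = (u / width) * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (v / height) * np.pi
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def relative_rotation(bearings_a, bearings_b):
    """Estimate the rotation R such that R @ a_i ~ b_i for matched bearings.

    Kabsch/SVD solution to Wahba's problem: (N, 3) arrays in, (3, 3)
    rotation matrix out. NOTE: this ignores the translation (parallax)
    between the two cameras, so for cameras 15 m apart it is only a
    rough illustration, not the paper's algorithm.
    """
    H = bearings_a.T @ bearings_b              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

Feeding matched feature locations from the two panoramas (e.g., SIFT correspondences) through pixel_to_bearing and then relative_rotation yields a rough relative orientation; the degree-level accuracy the paper reports would additionally require accounting for the translation between the cameras, which this sketch deliberately omits.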
Date of Conference: 29 November 2021 - 01 December 2021
Date Added to IEEE Xplore: 10 January 2022
Conference Location: Naples, Italy

