A robust top-down approach for rotation estimation and vanishing points extraction by catadioptric vision in urban environment

4 Author(s)
Bazin, J.-C. ; RCV Lab., KAIST, Daejeon ; Inso Kweon ; Demonceaux, C. ; Vasseur, P.

A key requirement for unmanned aerial vehicle (UAV) applications is attitude stabilization of the aircraft, which requires knowledge of its orientation. It is now well established that traditional navigation equipment, such as GPS or INS, suffers from several disadvantages, which is why some works have proposed vision-based approaches to the problem. Catadioptric vision in particular is increasingly used, since it gathers much more information from the environment than traditional perspective cameras and therefore improves the robustness of UAV attitude estimation. Rotation estimation from conventional and catadioptric images has been extensively studied. While interesting results can be obtained, existing methods have non-negligible limitations, such as difficult feature matching (e.g. repeated textures, blurring, or illumination changes) or high computational cost (e.g. vanishing point extraction or analysis in the frequency domain). To overcome these limitations, this paper presents a top-down approach for estimating the rotation and extracting the vanishing points in catadioptric images. This new framework is accurate and runs in real time. To obtain ground-truth data, we also calibrate our catadioptric camera against a gyroscope. Finally, experimental results on a real video sequence are presented and compared to the ground truth provided by the gyroscope.
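To illustrate the link between vanishing points and rotation that the abstract relies on: once vanishing directions have been extracted as unit vectors on the sphere in two frames, the inter-frame rotation can be recovered by aligning the two direction sets. The sketch below uses the standard Kabsch/SVD alignment; it is an assumption for illustration, not the paper's actual top-down algorithm, and the function name `rotation_from_vanishing_dirs` is hypothetical.

```python
import numpy as np

def rotation_from_vanishing_dirs(dirs_a, dirs_b):
    """Find R minimizing sum ||R a_i - b_i||^2 over corresponding unit
    directions (Kabsch method). dirs_a, dirs_b: (N, 3) arrays."""
    A = np.asarray(dirs_a, dtype=float)
    B = np.asarray(dirs_b, dtype=float)
    H = A.T @ B                        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])         # guard against a reflection solution
    return Vt.T @ D @ U.T

# Usage: recover a known 15-degree yaw from three orthogonal
# (Manhattan-world-style) vanishing directions.
def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R_true = rot_z(np.deg2rad(15.0))
dirs_a = np.eye(3)                     # directions in frame A
dirs_b = (R_true @ dirs_a.T).T         # same directions seen from frame B
R_est = rotation_from_vanishing_dirs(dirs_a, dirs_b)
assert np.allclose(R_est, R_true, atol=1e-9)
```

Because vanishing directions are viewpoint-independent (they depend only on orientation), this alignment gives the rotation without any translation estimate, which is one reason vanishing points are attractive for attitude estimation.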

Published in:

2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2008)

Date of Conference:

22-26 Sept. 2008