
Estimation of automotive pitch, yaw, and roll using enhanced phase correlation on multiple far-field windows


Abstract:

The online estimation of yaw, pitch, and roll of a moving vehicle is an important ingredient for systems which estimate egomotion and the 3D structure of the environment from video captured in a moving vehicle. We present an approach to estimate these angular changes from monocular visual data, based on the fact that the apparent motion of far-distant points does not depend on translation, but only on the current rotation of the camera. The presented approach does not require features (corners, edges, …) to be extracted. It also allows the frame-to-frame illumination changes to be estimated in parallel, and thus largely stabilizes the estimation of image correspondences and motion vectors, which are most often the central entities needed for computing scene structure, distances, etc. The method is significantly less complex and much faster than a full egomotion computation from features, such as PTAM [6], but it can be used to provide motion priors and reduce search spaces for more complex methods which perform a complete analysis of egomotion and the dynamic 3D structure of the scene in which a vehicle moves.
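
As a rough illustration of the underlying principle, the following sketch applies standard phase correlation to two far-field image windows from consecutive frames; under a small-angle assumption, the horizontal and vertical components of the recovered pixel shift approximate the yaw and pitch increments (roll would require combining the shifts of several windows). This is a minimal sketch and not the enhanced phase correlation described in the paper; the focal length f_pixels, the window selection, and the sign conventions are illustrative assumptions.

import numpy as np

def phase_correlation_shift(win_prev, win_cur, eps=1e-9):
    # Normalized cross-power spectrum: magnitude normalization discards gain
    # changes, and any offset change only affects the DC bin.
    F1 = np.fft.fft2(win_prev)
    F2 = np.fft.fft2(win_cur)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + eps
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates to signed shifts (dy, dx) in pixels.
    dims = np.array(corr.shape, dtype=float)
    shifts = np.array(peak, dtype=float)
    shifts[shifts > dims / 2] -= dims[shifts > dims / 2]
    return shifts

def shift_to_angles(dy, dx, f_pixels):
    # Small-angle approximation: the image shift of a far-field window is
    # caused by rotation alone; vertical shift ~ pitch, horizontal shift ~ yaw.
    pitch = np.arctan2(dy, f_pixels)
    yaw = np.arctan2(dx, f_pixels)
    return pitch, yaw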
Date of Conference: 28 June 2015 - 01 July 2015
Date Added to IEEE Xplore: 27 August 2015
Print ISSN: 1931-0587
Conference Location: Seoul, Korea (South)

I. Motivation and Related Work

In Advanced Driver Assistance Systems (ADAS) scenarios, we have to deal with video streams which are strongly affected by vibrations and the steering of the vehicle. Furthermore, such video streams are often affected by significant fluctuations of the illumination, leading to strong changes of gain and offset in the camera. Thus, we may assume neither stable motion nor gray-value constancy (temporal photometric stability). Both the illumination changes and the (angular) vibrations make the determination of matches (correspondences) between subsequent frames difficult, as they negatively affect visual (feature) tracking. On the other hand, since the translational part of the vehicle motion changes only slowly, an estimate of the instantaneous angular motion provides a good initial estimate of the full instantaneous motion, which alleviates tracking.
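
To make the gain-and-offset model concrete, a minimal sketch follows. It assumes an affine photometric change win_cur ≈ a·win_prev + b between corresponding windows and recovers (a, b) by ordinary least squares; the synthetic window contents and the gain 1.7 / offset 0.2 are illustrative values, and this is not the paper's actual joint estimation scheme.

import numpy as np

def estimate_gain_offset(win_prev, win_cur):
    # Least-squares fit of the affine photometric model win_cur ~ a * win_prev + b.
    A = np.stack([win_prev.ravel(), np.ones(win_prev.size)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, win_cur.ravel(), rcond=None)
    return a, b

# Usage on synthetic data: same content, but gain 1.7 and offset 0.2 applied.
rng = np.random.default_rng(0)
prev = rng.random((64, 64))
cur = 1.7 * prev + 0.2
print(estimate_gain_offset(prev, cur))   # -> approximately (1.7, 0.2)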
