Previous methods for estimating the motion of an observer through a static scene require that image velocities can be measured. For motion through a cluttered 3D scene, however, measuring optical flow is problematic because of the high density of depth discontinuities. This paper introduces a method for estimating motion through a cluttered 3D scene that does not measure velocities at individual points. Instead, the method measures a distribution of velocities over local image regions. We show that motion through a cluttered scene produces a bowtie pattern in the power spectra of local image regions. We show how to estimate the parameters of the bowtie for different image regions and how to use these parameters to estimate observer motion. We demonstrate the method on synthetic and real image sequences.
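To make the bowtie idea concrete, the following is a minimal illustrative sketch (not the paper's estimation method). A texture translating at image speed v concentrates its spatiotemporal power spectrum on a plane through the origin whose slope is v; a cluttered scene contains a range of speeds, so the union of these planes fans out into a bowtie. The sketch below builds a synthetic two-layer "cluttered" sequence (a fast near layer and a slow far layer, mimicking motion parallax), computes the 3D power spectrum of the space-time block with NumPy, and checks that the energy lies on the union of the two motion planes. All names, sizes, and the sign convention `w_t = v * w_x` are choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, W = 16, 8, 32          # frames, image height, image width
v_near, v_far = 4, 2         # image speeds (pixels/frame) of two depth layers

# Two random textures translating at different speeds, standing in for a
# cluttered scene where near surfaces move faster than far ones.
def make_layer():
    return np.tile(rng.standard_normal(W), (H, 1))

b1, b2 = make_layer(), make_layer()
frames = np.stack([np.roll(b1, -v_near * t, axis=1) +
                   np.roll(b2, -v_far * t, axis=1) for t in range(T)])

# 3D power spectrum of the space-time block (axes: t, y, x).
P = np.abs(np.fft.fftn(frames)) ** 2

# Each translating layer puts its energy on the plane w_t = v * w_x.
# Discrete frequencies wrap modulo 1, hence the wrapped distance below.
wt = np.fft.fftfreq(T)[:, None, None]
wx = np.fft.fftfreq(W)[None, None, :]

def on_plane(v, tol=1e-6):
    d = np.abs(((wt - v * wx + 0.5) % 1.0) - 0.5)
    return d < tol

# Union of the two motion planes: a two-slope "bowtie" in the (w_x, w_t) slice.
bowtie = on_plane(v_near) | on_plane(v_far)
frac = (P * bowtie).sum() / P.sum()   # fraction of power on the bowtie
```

With circularly shifting layers the spectrum is exactly supported on the two planes, so `frac` comes out essentially 1; for real footage the energy would instead cluster around a filled bowtie whose opening angle reflects the range of depths in the region.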