In this paper, inertial orientation measurements are exploited to compensate the rotational degrees of freedom of an aerial vehicle carrying a perspective camera that takes a sequence of images of the ground plane. It is known that, in the pure-translation case, full homographies reduce to planar homologies, and the relative scene depth of two points equals the reciprocal ratio of their image distances to the focus of expansion (FOE). The first part of this paper covers trajectory recovery for an airship carrying a perspective camera imaging the ground plane, expressed as a series of relative poses between successive camera positions; this is commonly called "visual odometry". Previous results showed that the ratio of heights over the ground plane in two views, and thus the altitude component of the trajectory, can be computed more accurately; here these results are extended to recover the full 3D camera trajectory. In the second part, the same rotation-compensated imagery is exploited in the mapping domain: from pixel correspondences between successive images, the height of points over the ground plane is recovered and placed in a digital elevation model (DEM) grid, performing 3D mapping from monocular aerial images. These results may be useful in the SLAM context.
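The depth/FOE property stated above can be checked numerically. The sketch below (not code from the paper; the focal length, translation, and scene point are arbitrary assumed values) projects one scene point with a pinhole camera before and after a pure translation and confirms that the depth ratio between the two views equals the reciprocal ratio of the image distances to the FOE:

```python
import numpy as np

# Sketch, assuming a pinhole camera with focal length f and a pure
# translation t between the two views (no rotation). Verifies:
#   Z2 / Z1 == d1 / d2,
# where Z1, Z2 are the point's depths in the two views and d1, d2 its
# image distances to the focus of expansion (FOE).

f = 1.0                          # focal length (arbitrary units)
t = np.array([0.3, -0.1, 0.5])   # assumed camera translation between views

def project(P):
    """Pinhole projection of a 3D point given in the camera frame."""
    return f * P[:2] / P[2]

# The FOE is the image of the translation direction.
foe = f * t[:2] / t[2]

P1 = np.array([2.0, 1.5, 10.0])  # scene point in the first camera frame
P2 = P1 - t                      # same point in the second camera frame

x1, x2 = project(P1), project(P2)
d1 = np.linalg.norm(x1 - foe)    # image distance to FOE, first view
d2 = np.linalg.norm(x2 - foe)    # image distance to FOE, second view

print(P2[2] / P1[2], d1 / d2)    # the two ratios coincide
```

The identity follows because, under pure translation, each corresponding point moves radially along the line through the FOE, with (x2 - foe) = (Z1/Z2)(x1 - foe); this is the planar-homology structure the abstract refers to.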