Linear representations and linear dimension reduction techniques are very common in signal and image processing. Many such applications reduce to problems of stochastic optimization or statistical inference on the set of all subspaces of a fixed dimension, i.e. a Grassmann manifold. Central to solving them is the computation of an "exponential" map (for constructing geodesics) and its inverse on a Grassmannian. Here we suggest efficient techniques for these two steps and illustrate two applications: (i) for image-based object recognition, we define and seek an optimal linear representation using a Metropolis-Hastings type stochastic search algorithm on a Grassmann manifold; (ii) for statistical inference, we illustrate the computation of sample statistics, such as means and variances, on a Grassmann manifold.
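As a minimal sketch of the two geometric steps mentioned above, the following NumPy code implements the standard SVD-based formulas for the exponential map and its inverse on a Grassmannian, with a subspace represented by an n-by-p matrix of orthonormal columns. These are the classical closed-form expressions from the Riemannian-geometry literature, not necessarily the specific efficient variants the paper develops; the function names are illustrative.

```python
import numpy as np

def grassmann_exp(X, H):
    """Exponential map on the Grassmannian.

    X : (n, p) orthonormal basis of the base subspace.
    H : (n, p) tangent vector at X, satisfying X.T @ H = 0.
    Returns an orthonormal basis of the subspace reached by following
    the geodesic from span(X) in direction H for unit time.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Geodesic formula: exp_X(H) = X V cos(S) V^T + U sin(S) V^T
    return X @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt

def grassmann_log(X, Y):
    """Inverse exponential map: tangent vector H at X with
    span(exp_X(H)) = span(Y), assuming X.T @ Y is invertible
    (i.e. the two subspaces are in generic position)."""
    n, _ = X.shape
    # Project Y onto the complement of span(X), normalized so X^T part is I.
    A = (np.eye(n) - X @ X.T) @ Y @ np.linalg.inv(X.T @ Y)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Principal angles between the subspaces are arctan of the singular values.
    return U @ np.diag(np.arctan(s)) @ Vt
```

A quick consistency check is that `grassmann_exp(X, grassmann_log(X, Y))` returns a basis spanning the same subspace as `Y`, which can be verified by comparing the orthogonal projectors `Y @ Y.T`.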