Multi-dimensional data, such as image patterns, image sequences, and brain signals, are often represented by variance-covariance matrices or their eigenspaces in order to capture their intrinsic variations. For example, in face or object recognition problems, variations due to illumination and camera angle can be represented by eigenspaces. The set of such eigenspaces forms the Grassmann manifold, and simple distance measures on the Grassmann manifold, such as the projection metric, have been used in previous studies. In linear spaces, however, when the distribution of patterns is not isotropic, statistical distances such as the Mahalanobis distance are more appropriate and often outperform simple distances. In this paper, we introduce the Mahalanobis distance on the Grassmann manifold. Experimental results on two tasks, an object recognition problem and brain signal processing, demonstrate the advantages of the proposed distance measure.
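As background for the projection metric mentioned above, here is a minimal sketch (not the paper's proposed Mahalanobis-type distance) of how that baseline metric is commonly computed between two subspaces represented by orthonormal basis matrices. The function name and the NumPy-based formulation are illustrative assumptions, not taken from the paper; the formula uses the standard identity relating the projection metric to the principal angles between subspaces.

```python
import numpy as np

def projection_metric(U1, U2):
    """Projection metric between two p-dimensional subspaces of R^n.

    U1, U2: (n, p) matrices with orthonormal columns, each spanning a
    subspace, i.e. a point on the Grassmann manifold G(n, p).

    Uses d_P^2 = sum_i sin^2(theta_i) = p - ||U1^T U2||_F^2, where the
    theta_i are the principal angles between the two subspaces; the
    singular values of U1^T U2 are cos(theta_i).
    """
    p = U1.shape[1]
    # Singular values of U1^T U2 give the cosines of the principal angles.
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    s = np.clip(s, -1.0, 1.0)  # guard against tiny numerical overshoot
    return np.sqrt(max(p - np.sum(s ** 2), 0.0))

# Identical subspaces are at distance 0; orthogonal 1-D subspaces of R^2
# are at distance sin(90 degrees) = 1.
e1 = np.array([[1.0], [0.0]])
e2 = np.array([[0.0], [1.0]])
print(projection_metric(e1, e1))  # -> 0.0
print(projection_metric(e1, e2))  # -> 1.0
```

Because this metric treats all principal angles equally, it is isotropic by construction, which is precisely the limitation that motivates a Mahalanobis-type distance on the manifold.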