Mahalanobis distance on Grassmann manifold and its application to brain signal processing

Authors: Yoshikazu Washizawa (The University of Electro-Communications, Japan); Seiji Hotta

Multi-dimensional data, such as image patterns, image sequences, and brain signals, are often represented by their variance-covariance matrices or the corresponding eigenspaces in order to capture their intrinsic variations. For example, in face or object recognition, variations due to illumination and camera angle can be represented by eigenspaces. The set of such eigenspaces forms a Grassmann manifold, and simple distance measures on the Grassmann manifold, such as the projection metric, have been used in previous research. In linear spaces, however, when the distribution of patterns is not isotropic, statistical distances such as the Mahalanobis distance are more appropriate and often outperform simple distances. In this paper, we introduce the Mahalanobis distance on the Grassmann manifold. Two experiments, an object recognition problem and a brain signal processing task, demonstrate the advantages of the proposed distance measure.
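The abstract contrasts the proposed distance with the conventional projection metric, which compares two k-dimensional eigenspaces via the principal angles between them. The sketch below illustrates only that baseline: building an eigenspace from a covariance matrix and evaluating the projection metric sqrt(k - ||U1^T U2||_F^2). It is a minimal illustration under my own assumptions (function names, toy data, and the NumPy-based implementation are mine), not the paper's proposed Mahalanobis distance, whose construction is not given in this excerpt.

```python
import numpy as np

def principal_subspace(X, k):
    """Orthonormal basis (d x k) of the k-dimensional principal subspace of
    the data matrix X (variables in rows, samples in columns), taken from the
    top eigenvectors of its variance-covariance matrix."""
    C = np.cov(X)                                   # d x d covariance matrix
    w, V = np.linalg.eigh(C)                        # eigenvalues in ascending order
    top = np.argsort(w)[::-1][:k]                   # indices of the k largest eigenvalues
    return V[:, top]

def projection_metric(U1, U2):
    """Projection metric between two k-dimensional subspaces with orthonormal
    bases U1, U2 (both d x k): sqrt(k - ||U1^T U2||_F^2), i.e. the square root
    of the sum of squared sines of the principal angles."""
    k = U1.shape[1]
    cosines = np.linalg.svd(U1.T @ U2, compute_uv=False)  # cosines of principal angles
    cosines = np.clip(cosines, 0.0, 1.0)
    return np.sqrt(max(k - np.sum(cosines**2), 0.0))

# Toy usage with random data; in the paper's setting the columns of X would be
# image patterns or brain-signal samples for one class or trial.
rng = np.random.default_rng(0)
X1 = rng.standard_normal((10, 200))
X2 = rng.standard_normal((10, 200))
U1, U2 = principal_subspace(X1, 3), principal_subspace(X2, 3)
print(projection_metric(U1, U2))
```

The projection metric treats all principal-angle directions equally; the paper's contribution, by analogy with the Mahalanobis distance in linear spaces, is to account for anisotropy in the distribution of subspaces, which this baseline does not do.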

Published in: 2012 IEEE International Workshop on Machine Learning for Signal Processing

Date of Conference: 23-26 September 2012