Among subspace learning methods, Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are classic ones. PCA maximizes the total scatter across all classes; however, a data set with a small between-class scatter and a large within-class scatter can still have a large total scatter. This conflicts with the Maximum Margin Criterion (MMC), which maximizes the between-class scatter while minimizing the within-class scatter. To address this conflict, we propose a dynamic subspace learning method that balances the objectives of PCA and MMC simultaneously by searching for the best weighting coefficient. We evaluate the method by classification on two tumor microarray datasets: a simple t-test is first used for gene selection, the proposed method is then applied for feature extraction, and finally KNN and SVM classifiers are adopted to assess its effectiveness. Results show that the new feature extractors are effective and stable.
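The abstract does not give the exact objective, but one plausible reading of "balancing PCA and MMC with a coefficient" is to project onto the leading eigenvectors of a convex combination of the total-scatter matrix (PCA's objective) and the between-minus-within scatter matrix (MMC's objective). The sketch below illustrates that reading; the function name, the combination form, and the parameter `alpha` are assumptions, not the authors' published formulation.

```python
import numpy as np

def dynamic_subspace(X, y, n_components=2, alpha=0.5):
    """Hypothetical sketch of a PCA/MMC-balancing projection.

    Projects X onto the top eigenvectors of
        M = alpha * St + (1 - alpha) * (Sb - Sw),
    where St is the total scatter (PCA objective) and
    Sb - Sw is the MMC objective (between- minus within-class scatter).
    alpha is the balancing coefficient the paper searches over.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    mean = X.mean(axis=0)

    # Total scatter matrix (what PCA maximizes)
    St = (X - mean).T @ (X - mean)

    # Between- and within-class scatter matrices (what MMC uses)
    Sb = np.zeros_like(St)
    Sw = np.zeros_like(St)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        Sw += (Xc - mc).T @ (Xc - mc)

    # Convex combination of the two objectives
    M = alpha * St + (1.0 - alpha) * (Sb - Sw)

    # M is symmetric, so eigh applies; keep the largest eigenvalues
    vals, vecs = np.linalg.eigh(M)
    W = vecs[:, np.argsort(vals)[::-1][:n_components]]
    return X @ W, W
```

With `alpha = 1` this reduces to standard PCA, and with `alpha = 0` to plain MMC; the paper's "dynamic" search would sweep `alpha` and keep the value giving the best downstream KNN/SVM accuracy.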
2011 Seventh International Conference on Natural Computation (ICNC), Volume 1
Date of Conference: 26-28 July 2011