We propose a feature selection algorithm for classification problems. The algorithm searches for a feature subset that maximizes the separability between Gaussian clusters. To avoid the cost of exhaustively searching all feature subsets, we adopt a backward elimination strategy. Our feature selection algorithm can be applied to a full-search classifier to obtain a single global subspace; however, a single global subspace may not capture local behavior well. We therefore realize multiple-subspace clustering by applying our dimension reduction algorithm within a tree-structured classifier. Experimental results show that the resulting classifier not only removes irrelevant features but also improves classification performance.
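The backward elimination procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a simple Fisher-style per-feature separability criterion (between-class variance over within-class variance) as a stand-in for the paper's Gaussian-cluster separability measure, and greedily drops the feature whose removal hurts that criterion least.

```python
import numpy as np

def separability(X, y, feats):
    """Fisher-style separability of the feature subset `feats`:
    sum over features of between-class variance / within-class variance.
    (A stand-in for the paper's Gaussian-cluster separability measure.)"""
    Xs = X[:, feats]
    classes = np.unique(y)
    overall = Xs.mean(axis=0)
    between = np.zeros(len(feats))
    within = np.zeros(len(feats))
    for c in classes:
        Xc = Xs[y == c]
        mean_c = Xc.mean(axis=0)
        between += len(Xc) * (mean_c - overall) ** 2
        within += ((Xc - mean_c) ** 2).sum(axis=0)
    return float((between / (within + 1e-12)).sum())

def backward_eliminate(X, y, k):
    """Start from all features; repeatedly remove the single feature whose
    removal yields the highest remaining separability, until k are left.
    This avoids the exponential cost of exhaustive subset search."""
    feats = list(range(X.shape[1]))
    while len(feats) > k:
        # Score each candidate removal by the separability of what remains.
        scores = [(separability(X, y, [f for f in feats if f != g]), g)
                  for g in feats]
        _, worst = max(scores)
        feats.remove(worst)
    return feats
```

For the multiple-subspace variant, the same routine would be invoked at each internal node of the tree-structured classifier, so that each node selects its own local subspace for the data reaching it.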