To improve the classification performance of k-NN, this paper presents a classifier, called k-NS, based on the Euclidean distances from a query sample to the nearest subspaces, each spanned by the k nearest samples of a single class. A simple discriminant for computing these distances is derived from the geometric meaning of the Gramian, and the numerical stability of the discriminant is guaranteed by embedding Tikhonov regularization. The proposed classifier, k-NS, assigns a query sample to the class whose corresponding subspace is closest. Because the Gramian involves only inner products, the classifier extends naturally to the high-dimensional feature spaces induced by kernel functions. Experimental results on 13 publicly available benchmark datasets show that k-NS is quite promising compared with several other nearest-neighbor-based classifiers in terms of training and test accuracy and efficiency.
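The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact discriminant: for each class, take the k training samples nearest to the query, form the Gramian of those samples, and compute the squared Euclidean distance from the query to their linear span, with a Tikhonov term (the `lam` parameter, a hypothetical name) stabilizing the solve. All function names here are assumptions for illustration.

```python
import numpy as np

def knn_subspace_distance(x, X_class, k, lam=1e-6):
    """Squared distance from query x to the subspace spanned by the
    k nearest samples of one class (sketch of the k-NS idea)."""
    d = np.linalg.norm(X_class - x, axis=1)
    Nk = X_class[np.argsort(d)[:k]]      # (k, dim) k nearest same-class samples
    G = Nk @ Nk.T                        # Gramian: inner products only
    kx = Nk @ x                          # inner products of query with neighbors
    # ||x||^2 minus the projection term; Tikhonov term lam*I keeps
    # the solve stable when the Gramian is (near-)singular
    return float(x @ x - kx @ np.linalg.solve(G + lam * np.eye(k), kx))

def k_ns_predict(x, X_train, y_train, k=3, lam=1e-6):
    """Assign x to the class whose k-nearest-sample subspace is closest."""
    classes = np.unique(y_train)
    dists = [knn_subspace_distance(x, X_train[y_train == c], k, lam)
             for c in classes]
    return classes[int(np.argmin(dists))]
```

Because `G` and `kx` consist solely of inner products, replacing them with kernel evaluations yields the kernelized variant the abstract mentions.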