Abstract:
As an effective method for XOR problems, the generalized eigenvalue proximal support vector machine (GEPSVM) has recently gained widespread attention, and many variants of it have been proposed. Although these variants strengthen classification performance to varying extents, the number of fitting hyperplanes per class remains, as in GEPSVM, limited to just one. Intuitively, a single hyperplane seems insufficient, especially for datasets with complex feature structures. This article therefore focuses on extending the fitting hyperplanes for each class from a single one to multiple ones. However, such an extension of the original GEPSVM is not trivial, and even if it were possible, the elegant solution via generalized eigenvalues would no longer be guaranteed. To address this issue, we first make a simple yet crucial transformation of the optimization problem of GEPSVM and then propose a novel multiplane convex proximal support vector machine (MCPSVM), in which a set of hyperplanes determined by the features of the data is learned for each class. We adopt a strictly (geodesically) convex objective to characterize this optimization problem; thus, a more elegant closed-form solution is obtained, which requires only a few lines of MATLAB code. Moreover, MCPSVM is more flexible in form and extends naturally and seamlessly to feature weighting learning, whereas GEPSVM and its variants can hardly be extended in this way. Extensive experiments on benchmark and large-scale image datasets demonstrate the advantages of our MCPSVM.
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 34, Issue: 8, August 2023)