The standard 2-norm support vector machine (SVM) is known for its good performance in classification and regression problems. In this paper, the 1-norm support vector machine is considered, and a novel smoothing-function method for support vector classification (SVC) and support vector regression (SVR) is proposed to overcome drawbacks of earlier methods, which are complex, subtle, and sometimes difficult to implement. First, using the Karush-Kuhn-Tucker complementarity condition from optimization theory, an unconstrained non-differentiable optimization model is built. Then a smooth approximation algorithm based on a differentiable function is given. Finally, the data sets are trained with a standard unconstrained optimization method. The algorithm is fast and insensitive to the initial point. Theoretical analysis and numerical results illustrate that the smoothing-function method for SVMs is feasible and effective.
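The abstract does not specify which differentiable smoothing function is used, but a common choice in the smooth-SVM literature is the softplus approximation of the non-differentiable plus function max(x, 0) that arises from the unconstrained reformulation. The sketch below is an illustration of that general technique, not the paper's exact formulation; the smoothing parameter name `alpha` is an assumption.

```python
import math

def smoothed_plus(x: float, alpha: float) -> float:
    """Differentiable approximation of max(x, 0).

    Uses p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)),
    written in a numerically stable form. As alpha grows, p converges
    uniformly to the plus function, so a smooth unconstrained solver
    (e.g. Newton or BFGS) can be applied to the reformulated SVM objective.
    Note: alpha here is an illustrative name, not notation from the paper.
    """
    # Stable softplus: avoids overflow in exp for large |alpha * x|
    return max(x, 0.0) + (1.0 / alpha) * math.log1p(math.exp(-alpha * abs(x)))

# The approximation error is at most log(2)/alpha, attained at x = 0.
for alpha in (1.0, 10.0, 100.0):
    print(alpha, smoothed_plus(2.0, alpha), smoothed_plus(-2.0, alpha))
```

Because the smoothed objective is twice differentiable, standard unconstrained optimization routines apply directly, which is what makes this family of methods simple to implement compared with constrained QP solvers.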