This paper proposes two strategies for selecting the kernel parameter (sigma) and the penalty coefficient (C) of Gaussian support vector machines (SVMs). Viewing model parameter selection as a recognition problem in visual systems, we derive a direct setting formula for the kernel parameter by finding a visual scale at which the global and local structures of the given data set are preserved in the feature space and the difference between the two structures is maximized. In addition, we propose a heuristic algorithm for selecting the penalty coefficient by identifying the classification extent of each training datum during the sequential minimal optimization (SMO) procedure, a well-developed and widely used algorithm for SVM training. We evaluate the proposed strategies in a series of experiments on 13 benchmark problems and three real-world data sets, comparing them with the traditional fivefold cross-validation (5-CV) method and the recently developed radius-margin bound (RM) method. The evaluation shows that the new strategies outperform the existing methods in both efficiency and generalization capability, and that their performance is uniform and stable across data sets.
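To make the parameter-selection problem concrete, the following sketch contrasts the standard 5-CV baseline mentioned above with a direct, data-driven setting of the kernel parameter. It uses scikit-learn's `SVC` (an assumption about tooling, not part of the paper), and the median-pairwise-distance rule for sigma is a common illustrative heuristic, not the visual-scale formula derived in this work.

```python
# Illustrative sketch: direct data-driven gamma vs. 5-fold CV grid search
# for an RBF-kernel SVM. The median-distance rule below is a generic
# stand-in heuristic, NOT the paper's visual-scale setting formula.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Direct heuristic: sigma = median pairwise Euclidean distance,
# giving gamma = 1 / (2 * sigma^2) for the Gaussian kernel.
d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
sigma = np.median(d[np.triu_indices_from(d, k=1)])
gamma_direct = 1.0 / (2.0 * sigma ** 2)
svm_direct = SVC(kernel="rbf", gamma=gamma_direct, C=1.0).fit(X, y)

# Baseline: exhaustive 5-fold cross-validation over a (gamma, C) grid,
# which requires training many SVMs rather than one.
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"gamma": np.logspace(-3, 1, 5), "C": np.logspace(-1, 2, 4)},
    cv=5,
)
grid.fit(X, y)
print(f"direct gamma = {gamma_direct:.4f}")
print(f"5-CV best params = {grid.best_params_}")
```

The efficiency gap motivating the paper is visible even in this toy setting: the direct rule fits a single model, while the 5-CV grid trains one model per (gamma, C, fold) combination.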