As in most Data Mining procedures, tuning the parameters of a Support Vector Machine (SVM) is a critical, though not sufficiently explored, issue. The default approach is a grid search in the parameter space, which becomes prohibitively time-consuming even when only a few parameters must be tuned. For this reason, various metaheuristics have recently been proposed as alternatives for models involving a larger number of parameters. In this paper we customize a continuous Variable Neighborhood Search (VNS) to tune the SVM parameters. Our framework is general enough to address, with the very same method, several popular SVM parameter models encountered in the literature. We exploit the structure of the underlying optimization problem by expressing parameter tuning as a collection of nested problems that can be solved sequentially. The only algorithmic requirement is a routine which, for given parameters, finds the SVM classifier; hence, as soon as an SVM library or any routine for solving linearly constrained convex quadratic optimization problems is available, our approach is applicable. The experimental results show the usefulness of our tuning method for different SVM parameter models analyzed in the literature: we can address tuning problems whose dimensions make grid search infeasible, while our rather general approach yields classification accuracy comparable to that of ad-hoc benchmark tuning methods designed for specific SVM parameter models.
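To illustrate the idea that only a routine returning the SVM classifier for given parameters is required, the sketch below wraps a basic continuous VNS loop around scikit-learn's SVC, tuning (log10 C, log10 gamma) of an RBF kernel against cross-validated error. It is a minimal sketch under stated assumptions, not the paper's actual algorithm: the kernel, dataset, search box, neighborhood radii, and random local search are all illustrative choices.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # illustrative dataset
LB = np.array([-3.0, -6.0])   # lower bounds on (log10 C, log10 gamma), assumed
UB = np.array([6.0, 3.0])     # upper bounds, assumed

def cv_error(theta):
    # Objective: cross-validated error of the SVM trained with the given parameters.
    clf = SVC(C=10.0 ** theta[0], gamma=10.0 ** theta[1], kernel="rbf")
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

def shake(theta, k):
    # Draw a random point in the k-th neighborhood (radius grows with k).
    return np.clip(theta + 0.5 * k * rng.uniform(-1.0, 1.0, size=theta.shape), LB, UB)

def local_search(theta, f_theta, trials=10):
    # First-improvement local search using small random perturbations.
    for _ in range(trials):
        cand = np.clip(theta + 0.2 * rng.uniform(-1.0, 1.0, size=theta.shape), LB, UB)
        f_cand = cv_error(cand)
        if f_cand < f_theta:
            theta, f_theta = cand, f_cand
    return theta, f_theta

# Basic VNS loop: shake in progressively larger neighborhoods, improve locally,
# and return to the first neighborhood whenever the incumbent improves.
best = np.array([0.0, -1.0])
f_best = cv_error(best)
for _ in range(5):
    k = 1
    while k <= 4:
        cand = shake(best, k)
        cand, f_cand = local_search(cand, cv_error(cand))
        if f_cand < f_best:
            best, f_best = cand, f_cand
            k = 1
        else:
            k += 1

print("log10(C), log10(gamma):", best, " CV error:", round(f_best, 4))

The same loop applies unchanged if SVC is replaced by any other SVM solver, which is the portability property the abstract emphasizes.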