A Linearly Convergent Linear-Time First-Order Algorithm for Support Vector Classification with a Core Set Result

We present a simple, first-order approximation algorithm for the support vector classification problem. Given a pair of linearly separable data sets and $\epsilon \in (0,1)$, the proposed algorithm computes a separating hyperplane whose margin is within a factor of $(1-\epsilon)$ of that of the maximum-margin separating hyperplane. We discuss how our algorithm can be extended to nonlinearly separable and inseparable data sets. The running time of our algorithm is linear in the number of data points and in $1/\epsilon$. In particular, the number of support vectors computed by the algorithm is bounded above by $O(\zeta/\epsilon)$ for all sufficiently small $\epsilon > 0$, where $\zeta$ is the square of the ratio of the distances between the farthest and closest points in the two data sets. Furthermore, we establish that our algorithm exhibits linear convergence. We adopt the real number model of computation in our analysis.
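In standard notation (ours, not necessarily that of the report), the guarantee can be stated as follows. For linearly separable data sets $P, Q \subset \mathbb{R}^n$, the maximum margin $\rho^*$ equals half the distance between the convex hulls of the two sets,
\[
2\rho^* \;=\; \min_{u \in \operatorname{conv}(P),\; v \in \operatorname{conv}(Q)} \|u - v\|,
\]
and a $(1-\epsilon)$-approximate solution is a separating hyperplane whose margin $\rho$ satisfies $\rho \ge (1-\epsilon)\,\rho^*$. Under one reading of the definition above, the core set parameter is
\[
\zeta \;=\; \left( \frac{\max_{p \in P,\, q \in Q} \|p - q\|}{\min_{p \in P,\, q \in Q} \|p - q\|} \right)^{2}.
\]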

Citation

Bilkent University, Department of Industrial Engineering, Technical Report, April 2008.
