Optimal distance separating halfspace

One recently proposed criterion to separate two datasets in discriminant analysis is to use a hyperplane which minimises the sum of distances to it from all the misclassified data points. Here all distances are measured by some fixed norm, while misclassification means lying on the wrong side of the hyperplane, or … Read more
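
As a minimal sketch of this criterion (the notation is assumed here, not taken from the abstract): write the hyperplane as $H=\{x : w^\top x = b\}$ and measure distances in a norm whose dual norm is $\|\cdot\|^\circ$, so that $d(x,H)=|w^\top x-b|/\|w\|^\circ$. For labelled points $(x_i,y_i)$ with $y_i\in\{+1,-1\}$, a point is misclassified when $y_i(w^\top x_i-b)<0$, and the criterion reads
$$\min_{w\neq 0,\,b}\ \frac{1}{\|w\|^\circ}\sum_i \max\bigl(0,\,-y_i(w^\top x_i-b)\bigr).$$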

Optimal expected-distance separating halfspace

One recently proposed criterion to separate two datasets in discriminant analysis is to use a hyperplane which minimises the sum of distances to it from all the misclassified data points. Here all distances are measured by some fixed norm, while misclassification means lying on the wrong side of the hyperplane, … Read more
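
Under the same assumed notation as the sketch above, the expected-distance variant replaces the finite sum by an expectation: if the two classes are described by probability measures $\mu^+$ and $\mu^-$ rather than finite samples, the objective becomes
$$\min_{w\neq 0,\,b}\ \frac{1}{\|w\|^\circ}\left(\int \max\bigl(0,\,-(w^\top x-b)\bigr)\,d\mu^+(x)+\int \max\bigl(0,\,w^\top x-b\bigr)\,d\mu^-(x)\right),$$
i.e. the expected distance to the hyperplane of a misclassified point.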

A New Computational Approach to Density Estimation with Semidefinite Programming

Density estimation is a classical and important problem in statistics. The aim of this paper is to develop a new computational approach to density estimation based on semidefinite programming (SDP), a technology developed in optimization over the last decade. We express a density as the product of a nonnegative polynomial and a base density … Read more
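
A sketch of the kind of SDP this leads to (the exact construction is in the paper; the notation here is assumed): write the density as $f(x)=p(x)\,q(x)$, where $q$ is the base density and $p$ a polynomial that must be nonnegative on the support. Nonnegativity can be enforced through a sum-of-squares representation $p(x)=z(x)^\top Q\,z(x)$, with $z(x)$ a vector of monomials and $Q\succeq 0$, while the normalisation $\int p(x)\,q(x)\,dx=1$ is linear in $Q$; fitting the data then becomes a semidefinite program over $Q$.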

Gradient Projection Methods for Quadratic Programs and Applications in Training Support Vector Machines

Gradient projection methods based on the Barzilai-Borwein spectral steplength choices are considered for quadratic programming problems with simple constraints. Well-known nonmonotone spectral projected gradient methods and variable projection methods are discussed. For both approaches the behavior of different combinations of the two spectral steplengths is investigated. A new adaptive steplength alternating rule is proposed, … Read more
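
The flavour of these methods can be seen in the sketch below: a projected gradient iteration with alternating Barzilai-Borwein steplengths for the box-constrained QP $\min_x \tfrac12 x^\top A x - b^\top x$, $l\le x\le u$. The function name and the simple even/odd alternation are illustrative assumptions; the adaptive alternating rule proposed in the paper is not reproduced here.

    import numpy as np

    def projected_bb_qp(A, b, lo, hi, x0, max_iter=200, tol=1e-8):
        """Projected gradient method with alternating Barzilai-Borwein
        steplengths for  min 0.5*x'Ax - b'x  s.t.  lo <= x <= hi,
        with A symmetric.  Illustrative sketch only."""
        project = lambda z: np.clip(z, lo, hi)
        x = project(np.asarray(x0, dtype=float))
        g = A @ x - b                       # gradient of the quadratic
        alpha = 1.0                         # initial steplength
        for k in range(max_iter):
            x_new = project(x - alpha * g)  # projected gradient step
            s = x_new - x
            if np.linalg.norm(s) < tol:
                break
            g_new = A @ x_new - b
            y = g_new - g
            sy = s @ y
            if sy <= 0:                     # safeguard: fall back to unit step
                alpha = 1.0
            elif k % 2 == 0:                # BB1 steplength
                alpha = (s @ s) / sy
            else:                           # BB2 steplength
                alpha = sy / (y @ y)
            x, g = x_new, g_new
        return x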

The Maximum Box Problem and its Application to Data Analysis

Given two finite sets of points $X^+$ and $X^-$ in $\mathbb{R}^n$, the maximum box problem consists of finding an interval (“box”) $B=\{x : l \leq x \leq u\}$ such that $B\cap X^-=\emptyset$ and the cardinality of $B\cap X^+$ is maximized. A simple generalization can be obtained by instead maximizing a weighted sum of the elements … Read more
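
For concreteness, here is a small helper (names and the weighting convention are assumptions, not taken from the paper) that evaluates a candidate box: it returns the (weighted) number of positive points inside $B$ and rejects any box containing a negative point. Searching over $(l,u)$ for the best such box is the combinatorial problem the paper addresses.

    import numpy as np

    def box_value(l, u, X_pos, X_neg, weights=None):
        """Value of the box B = {x : l <= x <= u}: the (weighted) count of
        points of X_pos inside B, or -inf if B contains a point of X_neg."""
        X_pos, X_neg = np.asarray(X_pos), np.asarray(X_neg)
        in_box = lambda X: np.all((X >= l) & (X <= u), axis=1)
        if X_neg.size and in_box(X_neg).any():
            return -np.inf                      # infeasible: B meets X^-
        mask = in_box(X_pos)
        if weights is None:
            return int(mask.sum())              # plain cardinality
        return float(np.asarray(weights)[mask].sum())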

Semismooth Support Vector Machines

The linear support vector machine can be posed as a quadratic program in a variety of ways. In this paper, we look at a formulation using the two-norm for the misclassification error that leads to a positive definite quadratic program with a single equality constraint when the Wolfe dual is taken. The quadratic term is … Read more
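
One standard way to write this down (the paper's exact formulation may differ in details): the primal
$$\min_{w,b,\xi}\ \tfrac12\|w\|_2^2+\tfrac{C}{2}\sum_i\xi_i^2\quad\text{s.t.}\quad y_i(w^\top x_i+b)\ge 1-\xi_i,$$
has Wolfe dual
$$\max_{\alpha\ge 0}\ \sum_i\alpha_i-\tfrac12\,\alpha^\top\Bigl(Q+\tfrac1C I\Bigr)\alpha\quad\text{s.t.}\quad y^\top\alpha=0,\qquad Q_{ij}=y_i y_j\,x_i^\top x_j,$$
where the $\tfrac1C I$ term coming from the two-norm penalty makes the quadratic term positive definite and $y^\top\alpha=0$ is the single equality constraint.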

Interior point methods for massive support vector machines

We investigate the use of interior point methods for solving quadratic programming problems with a small number of linear constraints where the quadratic term consists of a low-rank update to a positive semi-definite matrix. Several formulations of the support vector machine fit into this category. An interesting feature of these particular problems is the volume … Read more
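
A standard piece of linear algebra for this structure (the paper's own treatment may differ): if the quadratic term can be written as $Q=D+VV^\top$ with $D$ easy to invert and $V$ of low rank $k$, the Sherman-Morrison-Woodbury identity
$$(D+VV^\top)^{-1}=D^{-1}-D^{-1}V\,(I+V^\top D^{-1}V)^{-1}V^\top D^{-1}$$
lets each linear system arising in the interior point iterations be solved via a small $k\times k$ system instead of factorising a matrix whose size grows with the training set.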