Adaptive Constraint Reduction for Convex Quadratic Programming

We propose an adaptive, constraint-reduced, primal-dual interior-point algorithm for convex quadratic programming with many more inequality constraints than variables. We reduce the computational effort by assembling, instead of the exact normal-equation matrix, an approximate matrix from a well-chosen index set that includes the indices of the constraints that appear most critical. Starting with a … Read more
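
The constraint-reduction idea can be illustrated with a short sketch. The code below is a minimal sketch, not the authors' algorithm: the slack-based selection rule and all names are illustrative assumptions. It assembles the normal-equation matrix from only the q constraints currently deemed most critical, instead of from all m of them, at cost O(q n^2) rather than O(m n^2).

    import numpy as np

    def reduced_normal_matrix(H, A, s, lam, q):
        """Approximate normal-equation matrix H + A_Q^T D_Q A_Q, where Q
        indexes the q constraints with the smallest slacks (an illustrative
        proxy for 'most critical') and D_Q = diag(lam_Q / s_Q)."""
        Q = np.argsort(s)[:q]                   # adaptive working set
        D = lam[Q] / s[Q]                       # diagonal scaling on the working set
        return H + A[Q].T @ (D[:, None] * A[Q])

    rng = np.random.default_rng(0)
    m, n, q = 5000, 20, 100                     # many more constraints than variables
    H, A = np.eye(n), rng.standard_normal((m, n))
    s = rng.uniform(0.1, 10.0, m)               # slacks
    lam = rng.uniform(0.1, 1.0, m)              # dual multipliers
    M = reduced_normal_matrix(H, A, s, lam, q)  # built from 100 of 5000 rows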

Multi-Standard Quadratic Optimization Problems

A Standard Quadratic Optimization Problem (StQP) consists of maximizing a (possibly indefinite) quadratic form over the standard simplex. Likewise, in a multi-StQP we have to maximize a (possibly indefinite) quadratic form over the Cartesian product of several standard simplices (of possibly different dimensions). Two convergent, monotone interior-point methods are established. Further, we prove an … Read more
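
For reference, the two problems can be written in their standard forms (consistent with the abstract's description):

    \max_{x \in \Delta_n} \; x^\top Q x,
    \qquad \Delta_n := \{\, x \in \mathbb{R}^n : x \ge 0,\ e^\top x = 1 \,\}
    \quad \text{(StQP)}

    \max_{x \in \Delta_{n_1} \times \cdots \times \Delta_{n_k}} \; x^\top Q x
    \quad \text{(multi-StQP)}

with Q symmetric and possibly indefinite, and x partitioned into k blocks in the multi-StQP case.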

A First-Order Interior-Point Method for Linearly Constrained Smooth Optimization

We propose a first-order interior-point method for linearly constrained smooth optimization that unifies and extends the first-order affine-scaling method and the replicator-dynamics method for standard quadratic programming. Global convergence and, in the case of quadratic programs, (sub)linear convergence rate and iterate convergence results are derived. Numerical experience on simplex-constrained problems with 1000 variables is reported. … Read more
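
The replicator-dynamics iteration referred to here has, for an StQP with a nonnegative symmetric matrix Q, the classical multiplicative form sketched below (an illustrative sketch of the textbook iteration, not the unified method of the paper). Shifting Q by a multiple of ee^T makes its entries nonnegative without changing the maximizers on the simplex, and the objective x^T Q x is then nondecreasing along the iterates.

    import numpy as np

    def replicator_step(x, Q):
        """One replicator-dynamics step for max x^T Q x over the simplex;
        assumes Q is symmetric with nonnegative entries."""
        y = x * (Q @ x)        # componentwise reweighting
        return y / y.sum()     # renormalize: the iterate stays on the simplex

    rng = np.random.default_rng(1)
    n = 1000                                   # problem size used in the abstract
    Q = rng.random((n, n)); Q = (Q + Q.T) / 2  # symmetric, nonnegative
    x = np.full(n, 1.0 / n)                    # start at the barycenter
    for _ in range(200):
        x = replicator_step(x, Q)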

Support Vector Regression for imprecise data

In this work, a regression problem is studied in which the elements of the database are sets with certain geometrical properties. In particular, our model can handle data affected by some kind of noise or uncertainty, interval-valued data, and databases with missing values. The proposed formulation is based on the … Read more
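
For orientation, the classical epsilon-insensitive support vector regression problem that such set-valued extensions typically start from reads as follows (the paper's actual formulation for imprecise data may differ):

    \min_{w,\,b,\,\xi,\,\xi^*} \; \tfrac12 \|w\|^2 + C \sum_{i=1}^m (\xi_i + \xi_i^*)
    \quad \text{s.t.} \quad
    y_i - w^\top x_i - b \le \varepsilon + \xi_i, \quad
    w^\top x_i + b - y_i \le \varepsilon + \xi_i^*, \quad
    \xi_i,\, \xi_i^* \ge 0.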

Adaptive Constraint Reduction for Training Support Vector Machines

A support vector machine (SVM) determines whether a given observed pattern lies in a particular class. The decision is based on prior training of the SVM on a set of patterns with known classification, and training is achieved by solving a convex quadratic programming problem. Since there are typically many training patterns, … Read more
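
The convex quadratic program solved during training is, in its standard dual form (shown here for context; constraint reduction targets exactly this kind of problem, where the number of training patterns m is large):

    \max_{\alpha} \; \sum_{i=1}^m \alpha_i
    - \tfrac12 \sum_{i,j=1}^m \alpha_i \alpha_j \, y_i y_j \, K(x_i, x_j)
    \quad \text{s.t.} \quad 0 \le \alpha_i \le C, \quad \sum_{i=1}^m y_i \alpha_i = 0.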

Relaxing the Optimality Conditions of Box QP

We present semidefinite relaxations of nonconvex, box-constrained quadratic programming, which incorporate the first- and second-order necessary optimality conditions. We compare these relaxations with a basic semidefinite relaxation due to Shor, particularly in the context of branch-and-bound to determine a globally optimal solution, where it is shown empirically that the new relaxations are significantly stronger. We … Read more
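
For a box QP of the form min x^T Q x + c^T x over 0 <= x <= e, the basic Shor-type relaxation that serves as the baseline replaces the product x x^T by a lifted matrix variable X; one common form is shown below (the paper strengthens such relaxations with optimality-condition constraints):

    \min \; \langle Q, X \rangle + c^\top x
    \quad \text{s.t.} \quad
    \begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0,
    \qquad \operatorname{diag}(X) \le x \le e,

where diag(X) <= x encodes the valid inequality x_i^2 <= x_i on the box [0, 1]^n.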

Classification problems with imprecise data through separating hyperplanes

We consider a supervised classification problem in which the elements to be classified are sets with certain geometrical properties. In particular, this model can be applied to data affected by some kind of noise and to interval-valued data. Two classification rules, a fuzzy one and a crisp one, are defined … Read more
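
One natural way to build a separating hyperplane for set-valued observations, in the spirit of this model (a sketch under that assumption, not necessarily the paper's exact rules), is to require the margin condition to hold for every point of each set X_i:

    \min_{w,\,b} \; \tfrac12 \|w\|^2
    \quad \text{s.t.} \quad y_i \,(w^\top x + b) \ge 1
    \quad \text{for all } x \in X_i,\; i = 1, \dots, m,

which for intervals or balls X_i reduces to finitely many constraints involving the support function of each set.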

Optimal solutions for unrelated parallel machines scheduling problems using convex quadratic reformulations

In this work, we take advantage of powerful quadratic programming theory to obtain optimal solutions of scheduling problems. We apply a methodology that starts, in contrast to more classical approaches, by formulating three unrelated parallel-machine scheduling problems as 0–1 quadratic programs under linear constraints. By construction, these quadratic programs are non-convex. Therefore, before … Read more
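
A standard convexification device for 0–1 quadratic programs, in the spirit of the reformulation step described here (a minimal sketch; the paper's actual reformulation may use a more refined perturbation), exploits the identity x_i^2 = x_i for binary variables: shifting the quadratic part by mu*I and compensating in the linear part leaves the objective unchanged on {0,1}^n while making it convex.

    import numpy as np

    def convexify(Q, c):
        """Return (Q', c') with Q' PSD such that x^T Q' x + c'^T x equals
        x^T Q x + c^T x for all binary x, using x_i^2 = x_i."""
        mu = max(0.0, -np.linalg.eigvalsh(Q).min())  # smallest uniform diagonal shift
        n = Q.shape[0]
        return Q + mu * np.eye(n), c - mu * np.ones(n)

    Q = np.array([[0.0, 2.0],
                  [2.0, 0.0]])                       # indefinite quadratic part
    Qc, cc = convexify(Q, np.zeros(2))               # Qc is PSD; values agree on {0,1}^2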

Exploiting separability in large-scale linear support vector machine training

Linear support vector machine training can be represented as a large quadratic program. We present an efficient and numerically stable interior-point algorithm for this problem, which requires only O(n) operations per iteration. By exploiting the separability of the Hessian, we provide a unified approach, from an optimization perspective, to 1-norm classification, 2-norm … Read more
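
The separability being exploited is, in essence, the diagonal-plus-low-rank structure of the linear-SVM Hessian. The sketch below (illustrative linear algebra under that assumption, not the paper's exact algorithm) solves a system with such a matrix via the Sherman-Morrison-Woodbury formula in O(n k^2) time, which is O(n) per iteration when the feature dimension k is fixed.

    import numpy as np

    def solve_diag_plus_lowrank(d, V, b):
        """Solve (D + V V^T) x = b with D = diag(d) > 0 and V of shape (n, k), using
        (D + V V^T)^{-1} = D^{-1} - D^{-1} V (I + V^T D^{-1} V)^{-1} V^T D^{-1}."""
        Dinv_b = b / d
        Dinv_V = V / d[:, None]
        S = np.eye(V.shape[1]) + V.T @ Dinv_V          # small k x k capacitance matrix
        return Dinv_b - Dinv_V @ np.linalg.solve(S, V.T @ Dinv_b)

    rng = np.random.default_rng(2)
    n, k = 100_000, 50
    d = rng.uniform(1.0, 2.0, n)
    V = rng.standard_normal((n, k))
    x = solve_diag_plus_lowrank(d, V, rng.standard_normal(n))  # never forms the n x n matrix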

A conic duality Frank–Wolfe type theorem via exact penalization in quadratic optimization

The famous Frank–Wolfe theorem ensures attainability of the optimal value for quadratic objective functions over a (possibly unbounded) polyhedron if the objective values on the feasible set are bounded. This theorem does not hold in general for conic programs where linear constraints are replaced by more general convex constraints like positive-semidefiniteness or copositivity conditions, despite the fact that the … Read more
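
For reference, the classical polyhedral statement is:

    \text{If } q(x) = \tfrac12\, x^\top Q x + c^\top x \text{ and } P \subseteq \mathbb{R}^n
    \text{ is a polyhedron with } \inf_{x \in P} q(x) > -\infty,
    \text{ then } \min_{x \in P} q(x) \text{ is attained.}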