Hybrid MPI/OpenMP parallel support vector machine training

Support Vector Machines are a powerful machine learning technology, but the training process involves a dense quadratic optimization problem and is computationally challenging. A parallel implementation of Support Vector Machine training has been developed, using a combination of MPI and OpenMP. Using an interior point method for the optimization and a reformulation that avoids the … Read more
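
The abstract is truncated above, so the following is only a rough illustration rather than the authors' code: a minimal C sketch of the hybrid pattern, with hypothetical sizes and data, in which the rows of a matrix are block-distributed across MPI processes, OpenMP threads handle each process's local matrix-vector product, and an MPI reduction combines the per-process results. This is the kind of kernel that dominates each iteration of an interior point method.

    /* Minimal hybrid MPI + OpenMP sketch (not the authors' implementation).
     * Computes y_local = A_local * x with OpenMP threads inside each MPI rank,
     * then a global reduction of ||y||^2 across ranks.
     * Compile with: mpicc -fopenmp hybrid_matvec.c -o hybrid_matvec
     */
    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        const int n = 4096;               /* hypothetical global dimension      */
        const int local_rows = n / size;  /* assume size divides n, for brevity */

        double *A = malloc((size_t)local_rows * n * sizeof(double));
        double *x = malloc((size_t)n * sizeof(double));
        double *y = malloc((size_t)local_rows * sizeof(double));
        for (int i = 0; i < local_rows * n; i++) A[i] = 1.0 / n;  /* dummy data */
        for (int j = 0; j < n; j++) x[j] = 1.0;

        /* On-node parallelism: OpenMP threads share the local rows. */
        #pragma omp parallel for
        for (int i = 0; i < local_rows; i++) {
            double s = 0.0;
            for (int j = 0; j < n; j++) s += A[(size_t)i * n + j] * x[j];
            y[i] = s;
        }

        /* Cross-node parallelism: MPI reduction of the distributed result. */
        double local_dot = 0.0, global_dot = 0.0;
        for (int i = 0; i < local_rows; i++) local_dot += y[i] * y[i];
        MPI_Allreduce(&local_dot, &global_dot, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

        if (rank == 0) printf("||y||^2 = %g\n", global_dot);
        free(A); free(x); free(y);
        MPI_Finalize();
        return 0;
    }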

Dynamic Evolution for Risk-Neutral Densities

Option price data is often used to infer risk-neutral densities for future prices of an underlying asset. Given the prices of a set of options on the same underlying asset with different strikes and maturities, we propose a nonparametric approach for estimating the evolution of the risk-neutral density in time. Our method uses bicubic splines … Read more
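
For context, and independently of the specific spline construction in the paper, the standard Breeden-Litzenberger relation is what ties a risk-neutral density $q(K,T)$ to observed call prices $C(K,T)$ across strikes $K$ and maturities $T$ (with $r$ the risk-free rate, assumed constant here):
\[
q(K,T) \;=\; e^{rT}\,\frac{\partial^{2} C(K,T)}{\partial K^{2}} .
\]
Differentiating a smooth fit of the price surface twice in the strike, for instance the bicubic-spline fit mentioned above, therefore yields a density estimate at every maturity, and the evolution of that density in $T$ is what the proposed method tracks.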

A globally convergent primal-dual interior-point filter method for nonlinear programming: new filter optimality measures and computational results

In this paper we modify the original primal-dual interior-point filter method proposed in [18] for the solution of nonlinear programming problems. We introduce two new optimality filter entries that are based on the objective function, and thus better suited to the purposes of minimization, and we propose conditions for using inexact Hessians. We show that the global convergence … Read more
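
As a reminder of the filter mechanism (standard notation, not the paper's specific entries): a filter stores pairs of an infeasibility measure $\theta$ and an optimality measure $\varphi$, and a trial point $x^{+}$ is acceptable only if, for every pair $(\theta_j,\varphi_j)$ already in the filter,
\[
\theta(x^{+}) \;\le\; (1-\gamma_{\theta})\,\theta_j
\qquad \text{or} \qquad
\varphi(x^{+}) \;\le\; \varphi_j - \gamma_{\varphi}\,\theta_j ,
\]
with small constants $\gamma_{\theta},\gamma_{\varphi}\in(0,1)$. The contribution described above is to base $\varphi$ on the objective function itself rather than on a stationarity measure, which is what makes the entries better suited to minimization.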

Quadratic regularizations in an interior-point method for primal block-angular problems

One of the most efficient interior-point methods for some classes of primal block-angular problems solves the normal equations by a combination of Cholesky factorizations and preconditioned conjugate gradient for, respectively, the block and linking constraints. Its efficiency depends on the spectral radius—in [0,1)—of a certain matrix in the definition of the preconditioner. Spectral radius close … Read more
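
To fix ideas (a generic sketch of the structure, not the paper's exact notation), a primal block-angular problem has a constraint matrix with diagonal blocks $N_1,\dots,N_k$ plus linking constraints, and each interior-point iteration solves normal equations of the form
\[
A \;=\;
\begin{bmatrix}
N_1 & & & \\
 & \ddots & & \\
 & & N_k & \\
L_1 & \cdots & L_k & I
\end{bmatrix},
\qquad
A\,\Theta\,A^{\top}\,\Delta y \;=\; g ,
\]
where $\Theta$ is the positive diagonal scaling of the current iteration. In the arrangement described above, the block part of $A\Theta A^{\top}$ is handled by Cholesky factorizations, while the Schur complement associated with the linking constraints is solved by preconditioned conjugate gradients; the spectral radius in $[0,1)$ mentioned in the abstract governs how quickly those inner iterations converge.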

An FPTAS for Minimizing the Product of Two Non-negative Linear Cost Functions

We consider a quadratic programming (QP) problem ($\Pi$) of the form $\min x^T C x$ subject to $Ax \ge b$, where $C\in {\mathbb R}^{n\times n}_+$, $\mathrm{rank}(C)=1$, and $A\in {\mathbb R}^{m\times n}$, $b\in {\mathbb R}^m$. We present an FPTAS for this problem by reformulating the QP ($\Pi$) as a parametrized LP and “rounding” the optimal solution. … Read more
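
The rank-one assumption is what connects the QP to the product in the title: a nonnegative rank-one matrix can be factored as an outer product of two nonnegative vectors, so (in our notation, as a brief restatement rather than a quotation from the paper)
\[
C \;=\; c\,d^{\top},\quad c,d \ge 0
\;\;\Longrightarrow\;\;
x^{\top} C x \;=\; (c^{\top}x)\,(d^{\top}x),
\]
and minimizing $x^{\top}Cx$ over $\{x : Ax \ge b\}$ is exactly minimizing the product of the two linear cost functions $c^{\top}x$ and $d^{\top}x$ over that polyhedron. Fixing the value of one factor turns the problem into a linear program in $x$, which is the kind of parametrized LP the abstract refers to.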

An LPCC Approach to Nonconvex Quadratic Programs

Filling a gap in nonconvex quadratic programming, this paper shows that the global resolution of a feasible quadratic program (QP), which is not known a priori to be bounded or unbounded below, can be accomplished in finite time by solving a linear program with linear complementarity constraints, i.e., an LPCC. Alternatively, this task can be … Read more
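
For readers unfamiliar with the connection, a generic sketch (not the paper's exact construction): the first-order conditions of the QP $\min_x \tfrac12 x^{\top}Qx + c^{\top}x$ subject to $Ax \ge b$ are
\[
Qx + c - A^{\top}\lambda = 0,
\qquad
0 \le \lambda \;\perp\; Ax - b \ge 0 ,
\]
and at any point satisfying them the identity $\tfrac12 x^{\top}Qx + c^{\top}x = \tfrac12\,(c^{\top}x + b^{\top}\lambda)$ holds, so the QP objective becomes a linear function of $(x,\lambda)$ over the complementarity set. Minimizing that linear function subject to the conditions above is a linear program with linear complementarity constraints, i.e., an LPCC of the kind referred to above.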

A computational study of the use of an optimization-based method for simulating large multibody systems

The present work aims at comparing the performance of several quadratic programming (QP) solvers for simulating large-scale frictional rigid-body systems. Traditional time-stepping schemes for the simulation of multibody systems are formulated as linear complementarity problems (LCPs) with copositive matrices. Such LCPs are generally solved by means of Lemke-type algorithms and solvers such as the PATH solver … Read more
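
For reference, and in generic notation rather than the specific matrices of any one time-stepping scheme, the subproblem referred to above asks for a vector $z$ of contact impulses and auxiliary variables satisfying the linear complementarity conditions
\[
0 \;\le\; z \;\perp\; Mz + q \;\ge\; 0 ,
\]
where $M$ is copositive rather than positive semidefinite; that is why Lemke-type pivoting methods and the PATH solver are the traditional tools, and why recasting the step as a quadratic program opens the door to the QP solvers compared in this study.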

Constraint propagation on quadratic constraints

This paper considers constraint propagation methods for continuous constraint satisfaction problems consisting of linear and quadratic constraints. All methods can be applied after suitable preprocessing to arbitrary algebraic constraints. The basic new techniques consist in eliminating bilinear entries from a quadratic constraint, and solving the resulting separable quadratic constraints by means of a sequence of … Read more
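
A small illustration of propagation on a separable quadratic constraint (our own example; the abstract is truncated before the details): given
\[
\sum_{i=1}^{n}\big(a_i x_i^{2} + b_i x_i\big) \;\le\; u,
\qquad x_i \in [\underline{x}_i,\overline{x}_i],
\]
every individual term $q_k(x_k)=a_k x_k^{2}+b_k x_k$ must satisfy
\[
q_k(x_k) \;\le\; u \;-\; \sum_{i\ne k}\;\min_{x_i\in[\underline{x}_i,\overline{x}_i]} q_i(x_i),
\]
and solving this univariate quadratic inequality in closed form may shrink the interval $[\underline{x}_k,\overline{x}_k]$. Sweeping over the variables and constraints, and repeating until no interval changes appreciably, is the propagation loop; eliminating bilinear entries first is what reduces a general quadratic constraint to this separable situation.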

A conjugate-gradient based approach for approximate solutions of quadratic programs

This paper deals with the numerical behaviour and convergence properties of a recently presented column generation approach for the optimization of so-called step-and-shoot radiotherapy treatment plans. The approach and variants of it have been reported to be efficient in practice, finding near-optimal solutions by generating only a small number of columns. The impact of different restrictions … Read more
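
For completeness, and as background notation rather than a description of the paper's particular model, the conjugate-gradient iterations referred to in the title are the standard recurrences for minimizing a convex quadratic $\tfrac12 x^{\top}Hx - g^{\top}x$ (equivalently, solving $Hx=g$): with $r_0 = g - Hx_0$ and $p_0 = r_0$,
\[
\alpha_k=\frac{r_k^{\top}r_k}{p_k^{\top}Hp_k},\quad
x_{k+1}=x_k+\alpha_k p_k,\quad
r_{k+1}=r_k-\alpha_k Hp_k,\quad
\beta_k=\frac{r_{k+1}^{\top}r_{k+1}}{r_k^{\top}r_k},\quad
p_{k+1}=r_{k+1}+\beta_k p_k .
\]
Stopping these iterations early is one natural way to obtain approximate QP solutions; how such approximations and other restrictions affect the column generation scheme is what the computational study examines.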

Kernel Support Vector Regression with imprecise output

We consider a regression problem in which uncertainty affects the dependent variable of the elements of the database. A model based on the standard epsilon-Support Vector Regression approach is given, where two hyperplanes need to be constructed to predict the interval-valued dependent variable. By using the Hausdorff distance to measure the error between predicted and … Read more
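
To make the setup concrete (standard definitions; the pairing with the paper's exact model is only sketched here): when each response is an interval $Y=[y^{L},y^{U}]$ rather than a single number, a natural error measure between a predicted interval and an observed one is the Hausdorff distance, which for real intervals reduces to
\[
d_{H}\big([a,b],[c,d]\big) \;=\; \max\big\{\,|a-c|,\;|b-d|\,\big\},
\]
so an epsilon-Support Vector Regression style model can fit two functions, one tracking the lower endpoints and one the upper endpoints (the two hyperplanes mentioned above, in feature space), and penalize only those deviations that exceed epsilon in this distance.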