A Perturbed Sums of Squares Theorem for Polynomial Optimization and its Applications

We consider a property of positive polynomials on a compact set with a small perturbation. When applied to a Polynomial Optimization Problem (POP), the property implies that the optimal value of the corresponding SemiDefinite Programming (SDP) relaxation with sufficiently large relaxation order is bounded from below by $(f^\ast - \epsilon)$ and from above by $f^\ast$ … Read more
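
In symbols, the lower-bound part of the property reads as follows; the symbol $\rho_r$ for the optimal value of the order-$r$ SDP relaxation is our notation, not the authors', and the upper bound is left as stated above since the sentence is truncated in this listing.

```latex
% Lower-bound part of the property, in our notation:
% \rho_r = optimal value of the order-r SDP relaxation, f^\ast = optimum of the POP.
f^\ast - \epsilon \;\le\; \rho_r
\qquad \text{for all sufficiently large relaxation orders } r .
```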

Optimal scaling of the ADMM algorithm for distributed quadratic programming

This paper presents optimal scaling of the alternating directions method of multipliers (ADMM) algorithm for a class of distributed quadratic programming problems. The scaling corresponds to the ADMM step-size and relaxation parameter, as well as the edge-weights of the underlying communication graph. We optimize these parameters to yield the smallest convergence factor of the algorithm. … Read more
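
As a rough illustration of the two tuning parameters in question, here is a minimal, generic ADMM iteration for a nonnegativity-constrained QP, written with an explicit step-size `rho` and relaxation parameter `alpha`; the distributed setting and the edge-weight optimization from the paper are not reproduced, and the problem data are illustrative only.

```python
import numpy as np

# Generic ADMM sketch for  min 0.5 x'Qx + q'x  s.t.  x >= 0,
# split as x = z with g(z) the indicator of the nonnegative orthant.
# rho is the ADMM step-size and alpha the relaxation parameter, the two
# quantities (besides the edge weights) that the paper tunes.
def admm_qp(Q, q, rho=1.0, alpha=1.0, iters=300):
    n = len(q)
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    M = Q + rho * np.eye(n)                     # system matrix for the x-update
    for _ in range(iters):
        x = np.linalg.solve(M, -q + rho * (z - u))
        x_hat = alpha * x + (1 - alpha) * z     # over-/under-relaxation
        z = np.maximum(x_hat + u, 0.0)          # projection onto x >= 0
        u = u + x_hat - z
    return z

# Toy usage with random positive definite Q
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); Q = A @ A.T + np.eye(5); q = rng.standard_normal(5)
print(admm_qp(Q, q))
```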

Embedded Online Optimization for Model Predictive Control at Megahertz Rates

Faster, cheaper, and more power efficient optimization solvers than those currently offered by general-purpose solutions are required for extending the use of model predictive control (MPC) to resource-constrained embedded platforms. We propose several custom computational architectures for different first-order optimization methods that can handle linear-quadratic MPC problems with input, input-rate, and soft state constraints. We … Read more
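
For concreteness, a typical first-order method for this problem class is a projected fast gradient scheme applied to a condensed, input-constrained QP; the sketch below is a generic software version under assumed data (H, f, and box bounds), not the custom hardware architectures the paper proposes.

```python
import numpy as np

# Projected fast gradient method for a condensed linear-quadratic MPC problem
#   min 0.5 u'Hu + f'u   s.t.   lb <= u <= ub,
# using the constant momentum coefficient for strongly convex H. Problem data
# (H, f, lb, ub) are assumed inputs, not taken from the paper.
def fast_gradient_box(H, f, lb, ub, iters=100):
    eigs = np.linalg.eigvalsh(H)
    L, mu = eigs[-1], eigs[0]                   # Lipschitz / strong-convexity constants
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    u = np.zeros(len(f)); y = u.copy()
    for _ in range(iters):
        u_next = np.clip(y - (H @ y + f) / L, lb, ub)   # gradient step + box projection
        y = u_next + beta * (u_next - u)
        u = u_next
    return u
```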

The Trust Region Subproblem with Non-Intersecting Linear Constraints

This paper studies an extended trust region subproblem (eTRS) in which the trust region is the intersection of the unit ball with m linear inequality constraints. When m=0, m=1, or m=2 and the linear constraints are parallel, it is known that the eTRS optimal value equals the optimal value of a particular convex relaxation, which is solvable in polynomial … Read more
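
In generic form (our notation), the subproblem in question reads:

```latex
% Extended trust region subproblem (eTRS), generic statement with our notation:
% a possibly nonconvex quadratic over the unit ball intersected with
% m linear inequality constraints.
\min_{x \in \mathbb{R}^n} \; x^{T} A x + 2\, b^{T} x
\quad \text{s.t.} \quad \|x\| \le 1, \qquad c_i^{T} x \le d_i, \;\; i = 1, \dots, m .
```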

On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems

It is well known that the conjugate gradient method and a quasi-Newton method, using any well-defined update matrix from the one-parameter Broyden family of updates, produce identical iterates on a quadratic problem with positive-definite Hessian. This equivalence does not hold for arbitrary quasi-Newton methods. We define precisely the conditions on the update matrix in the … Read more
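
The classical equivalence referred to in the first sentence is easy to check numerically: on a strictly convex quadratic, conjugate gradients and BFGS with exact line searches (started from the same point, with the identity as initial inverse Hessian approximation) generate the same iterates. A small self-contained check on randomly generated data might look as follows.

```python
import numpy as np

# Compare CG and BFGS (exact line search) iterates on 0.5 x'Ax - b'x.
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                  # positive definite Hessian
b = rng.standard_normal(n)

def cg(A, b, iters):
    x = np.zeros(len(b)); r = b - A @ x; p = r.copy(); xs = [x.copy()]
    for _ in range(iters):
        alpha = (r @ r) / (p @ A @ p)
        x = x + alpha * p
        r_new = r - alpha * (A @ p)
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
        xs.append(x.copy())
    return xs

def bfgs_exact(A, b, iters):
    x = np.zeros(len(b)); H = np.eye(len(b)); g = A @ x - b; xs = [x.copy()]
    for _ in range(iters):
        p = -H @ g
        alpha = -(g @ p) / (p @ A @ p)       # exact line search on the quadratic
        s = alpha * p
        x = x + s
        g_new = A @ x - b
        y = g_new - g
        rho = 1.0 / (y @ s)
        I = np.eye(len(b))
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        g = g_new
        xs.append(x.copy())
    return xs

for xc, xb in zip(cg(A, b, n), bfgs_exact(A, b, n)):
    print(np.linalg.norm(xc - xb))           # differences stay at rounding level
```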

Quadratic Outer Approximation for Convex Integer Programming

We present a quadratic outer approximation scheme for solving general convex integer programs, where suitable quadratic approximations are used to underestimate the objective function instead of classical linear approximations. The resulting surrogate problem consists of minimizing a function given as the maximum of finitely many convex quadratic functions having the same … Read more
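
In generic terms (notation ours), the surrogate problem has the form:

```latex
% Generic form of the surrogate problem: minimize the pointwise maximum of
% finitely many convex quadratic underestimators q_k over the feasible set X.
\min_{x \in X \cap \mathbb{Z}^n} \; \max_{k = 1, \dots, K} q_k(x),
\qquad q_k(x) = x^{T} Q_k x + c_k^{T} x + \gamma_k, \quad Q_k \succeq 0 .
```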

Quadratic combinatorial optimization using separable underestimators

Binary programs with a quadratic objective function are NP-hard in general, even if the linear optimization problem over the same feasible set is tractable. In this paper, we address such problems by computing quadratic global underestimators of the objective function that are separable but not necessarily convex. Exploiting the binary constraint on the variables, a … Read more
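
The key observation being exploited can be sketched as follows (notation ours): a separable underestimator, even a nonconvex one, becomes linear on binary variables because $x_i^2 = x_i$, so minimizing it over the feasible set reduces to the tractable linear optimization problem mentioned in the first sentence.

```latex
% Separable underestimator of a quadratic objective on binary variables
% (coefficients a_i, b_i are our notation):
x^{T} Q x + c^{T} x \;\ge\; \sum_{i=1}^{n} \bigl( a_i x_i^{2} + b_i x_i \bigr)
\;=\; \sum_{i=1}^{n} (a_i + b_i)\, x_i
\qquad \text{for all } x \in \{0,1\}^{n} .
```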

Hybrid LP/SDP Bounding Procedure

The principal idea of this paper is to exploit Semidefinite Programming (SDP) relaxation within the framework provided by Mixed Integer Nonlinear Programming (MINLP) solvers when tackling Binary Quadratic Problems (BQP). SDP relaxation is well-known to provide strong bounds for BQP in practice. However, the method is not typically implemented in many state-of-the-art MINLP solvers based … Read more
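
For reference, the SDP relaxation that yields those strong BQP bounds is usually stated in the following standard form (generic statement, notation ours):

```latex
% Standard SDP relaxation of  min x'Qx + c'x  over  x in {0,1}^n:
% the binary condition x_i^2 = x_i becomes diag(X) = x, and the rank-one
% matrix xx^T is relaxed to X via the semidefiniteness constraint.
\min_{x,\,X} \;\; \langle Q, X \rangle + c^{T} x
\quad \text{s.t.} \quad \operatorname{diag}(X) = x, \qquad
\begin{pmatrix} 1 & x^{T} \\ x & X \end{pmatrix} \succeq 0 .
```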

Obtaining Quadratic Models of Noisy Functions

When derivatives of a nonlinear objective function are unavailable, many derivative-free optimization algorithms rely on interpolation-based models of the function. But what if the function values are contaminated by noise, as in most of the simulation-based problems typically encountered in this area? We propose to obtain linear and quadratic models by using knowledge … Read more
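
As a generic illustration of the setting (not the specific construction in the paper), a quadratic model of a noisy function can be obtained by least-squares regression on sampled function values rather than exact interpolation:

```python
import numpy as np

# Fit a quadratic model  m(x) = c + g'x + sum_{i<=j} h_ij x_i x_j
# to noisy samples by least squares; points and values are assumed inputs.
def quadratic_model(points, values):
    pts = np.asarray(points, dtype=float)
    n = pts.shape[1]
    rows = []
    for x in pts:
        quad = [x[i] * x[j] for i in range(n) for j in range(i, n)]
        rows.append(np.concatenate(([1.0], x, quad)))
    coef, *_ = np.linalg.lstsq(np.array(rows), np.asarray(values), rcond=None)
    return coef   # constant term, gradient entries, then upper-triangular quadratic terms

# Example: noisy samples of f(x) = x1^2 + x2
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(30, 2))
y = X[:, 0] ** 2 + X[:, 1] + 0.01 * rng.standard_normal(30)
print(quadratic_model(X, y))
```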

Primal-dual relationship between Levenberg-Marquardt and central trajectories for linearly constrained convex optimization

We consider the minimization of a convex function on a compact polyhedron defined by linear equality constraints and nonnegative variables. We define the Levenberg-Marquardt (L-M) and central trajectories starting at the analytic center and using the same parameter, and show that they satisfy a primal-dual relationship, being close to each other for large values of … Read more
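
One common way to formalize the two curves (our paraphrase; the paper's precise definitions may differ) is as minimizers of a proximally regularized problem and a log-barrier problem, indexed by the same parameter $\mu$ and both tending to the analytic center $x^0$ as $\mu \to \infty$:

```latex
% Hedged paraphrase, not necessarily the paper's exact definitions.
% For  min f(x)  s.t.  Ax = b,  x >= 0,  with analytic center x^0:
x_{\mathrm{LM}}(\mu) = \arg\min_{Ax = b,\; x \ge 0} \; f(x) + \tfrac{\mu}{2}\,\|x - x^{0}\|^{2},
\qquad
x_{\mathrm{C}}(\mu) = \arg\min_{Ax = b,\; x > 0} \; f(x) - \mu \sum_{i=1}^{n} \ln x_i .
```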