A new branch-and-bound algorithm for standard quadratic programming problems

In this paper we propose convex and LP bounds for Standard Quadratic Programming (StQP) problems and employ them within a branch-and-bound approach. We first compare different bounding strategies for StQPs, both in terms of the quality of the bound and of the computation times. It turns out that the polyhedral bounding strategy is the best … Read more
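
As a point of reference for the kind of problem involved, here is a minimal Python sketch of the StQP itself together with two very cheap bounds (the smallest entry of Q as a lower bound, the best simplex vertex as an upper bound). These are illustrative only and are not the convex or polyhedral bounds studied in the paper.

```
# The Standard Quadratic Program:  min x^T Q x  s.t.  e^T x = 1, x >= 0.
# Cheap illustrative bounds (not the paper's bounds) that a branch-and-bound
# scheme could start from.
import numpy as np

def stqp_cheap_bounds(Q):
    """Return (lower, upper) bounds on min_{x in simplex} x^T Q x.

    Lower: min_ij Q_ij, since sum_ij Q_ij x_i x_j >= (min Q) * (sum_i x_i)^2 = min Q.
    Upper: min_i Q_ii, the objective value at the best simplex vertex e_i.
    """
    Q = np.asarray(Q, dtype=float)
    return Q.min(), np.diag(Q).min()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    Q = (A + A.T) / 2                    # random symmetric instance
    lb, ub = stqp_cheap_bounds(Q)
    print(f"lower bound {lb:.3f} <= optimal value <= upper bound {ub:.3f}")
```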

Asymptotical Analysis of a SAA Estimator for Optimal Value of a Two Stage Problem with Quadratic Recourse

In this paper, we first consider the stability analysis of a convex quadratic programming problem and its restricted Wolfe dual in which all parameters in the problem are perturbed. We demonstrate the upper semi-continuity of solution mappings for the primal problem and the restricted Wolfe dual problem and establish the Hadamard directional differentiability of the … Read more
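
For readers unfamiliar with sample average approximation (SAA) estimators of optimal values, the following toy sketch shows how such an estimator is computed from a sample. The two-stage problem below, with a closed-form quadratic recourse, is my own illustration and is not the problem class analyzed in the paper.

```
# SAA estimator of the optimal value of a toy two-stage problem:
#   min_x  x^2 + E[ v(x, xi) ],   v(x, xi) = min_y { 0.5 y^2 + (xi - x) y } = -0.5 (xi - x)^2.
# The recourse value is evaluated in closed form and the first stage is minimized on a grid.
import numpy as np

def saa_optimal_value(samples, xs=np.linspace(-3.0, 3.0, 2001)):
    recourse = -0.5 * (samples[None, :] - xs[:, None]) ** 2   # closed-form recourse values
    objective = xs ** 2 + recourse.mean(axis=1)               # sample average objective
    return objective.min()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    for n in (10, 100, 1000, 10000):                          # SAA value as the sample grows
        xi = rng.normal(loc=1.0, scale=1.0, size=n)
        print(n, saa_optimal_value(xi))
```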

Exploiting Problem Structure in Optimization under Uncertainty via Online Convex Optimization

In this paper, we consider two paradigms that are developed to account for uncertainty in optimization models: robust optimization (RO) and joint estimation-optimization (JEO). We examine recent developments on efficient and scalable iterative first-order methods for these problems, and show that these iterative methods can be viewed through the lens of online convex optimization (OCO). … Read more
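
As a reminder of the OCO machinery being invoked, a minimal projected online gradient descent over the probability simplex is sketched below; the reductions of RO and JEO to this template are the subject of the paper and are not reproduced here.

```
# Projected online gradient descent (a standard OCO algorithm), kept generic:
# each round supplies a gradient oracle and the iterate is projected back onto the simplex.
import numpy as np

def project_simplex(v):
    # Euclidean projection onto {x >= 0, sum x = 1} (sort-based algorithm).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def online_gradient_descent(grads, x0, step=0.1):
    # grads: sequence of per-round gradient oracles g_t(x); returns the iterate history.
    x, history = x0, []
    for t, g in enumerate(grads, start=1):
        history.append(x)
        x = project_simplex(x - (step / np.sqrt(t)) * g(x))
    return history

if __name__ == "__main__":
    n = 3
    targets = np.eye(n)
    grads = [lambda x, e=targets[t % n]: 2 * (x - e) for t in range(50)]
    xs = online_gradient_descent(grads, np.ones(n) / n)
    print("final iterate:", xs[-1])
```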

An inexact dual logarithmic barrier method for solving sparse semidefinite programs

A dual logarithmic barrier method for solving large, sparse semidefinite programs is proposed in this paper. The method avoids any explicit use of the primal variable X and therefore is well-suited to problems with a sparse dual matrix S. It relies on inexact Newton steps in dual space which are computed by the conjugate gradient … Read more
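
The flavor of an inexact Newton step computed by conjugate gradients can be sketched generically as follows; the gradient and Hessian-vector product below are placeholders for a smooth convex model, not the dual barrier quantities used in the paper.

```
# Inexact Newton step: approximately solve H(y) d = -grad(y) with a few CG iterations,
# accessing the Hessian only through matrix-vector products.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def inexact_newton_step(grad, hess_vec, y, maxiter=25):
    g = grad(y)
    H = LinearOperator((y.size, y.size), matvec=lambda v: hess_vec(y, v))
    d, _info = cg(H, -g, maxiter=maxiter)     # truncated CG = inexact Newton direction
    return d

if __name__ == "__main__":
    # Toy strongly convex model: f(y) = 0.5 * y^T A y - b^T y.
    rng = np.random.default_rng(2)
    M = rng.standard_normal((50, 50))
    A, b = M @ M.T + 50 * np.eye(50), rng.standard_normal(50)
    y = np.zeros(50)
    for _ in range(5):
        y = y + inexact_newton_step(lambda y: A @ y - b, lambda y, v: A @ v, y)
    print("residual norm:", np.linalg.norm(A @ y - b))
```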

A Two-Stage Active-Set Algorithm for Bound-Constrained Optimization

In this paper, we describe a two-stage method for solving optimization problems with bound constraints. It combines the active-set estimate described in [Facchinei and Lucidi, 1995] with a modification of the non-monotone line search framework recently proposed in [De Santis et al., 2012]. In the first stage, the algorithm exploits a property of the active-set … Read more
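
A schematic and deliberately simplified version of the two-stage idea is sketched below; the crude active-set estimate and the monotone backtracking used here stand in for the Facchinei-Lucidi estimate and the non-monotone line search framework of the paper.

```
# Stage 1: estimate and fix the active bounds; stage 2: take a line-search step
# in the remaining free variables, projected back onto the box.
import numpy as np

def estimate_active_set(x, g, lb, ub, eps=1e-6):
    # Guess active bounds: near a bound with the gradient pointing outside the box.
    lower = (x - lb <= eps) & (g > 0)
    upper = (ub - x <= eps) & (g < 0)
    return lower, upper

def two_stage_step(f, grad, x, lb, ub, step=1e-1):
    g = grad(x)
    lower, upper = estimate_active_set(x, g, lb, ub)
    x = x.copy()
    x[lower], x[upper] = lb[lower], ub[upper]          # stage 1: fix estimated active variables
    free = ~(lower | upper)
    d = np.zeros_like(x)
    d[free] = -grad(x)[free]                           # stage 2: move in the free variables
    t = step
    while f(np.clip(x + t * d, lb, ub)) > f(x) - 1e-4 * t * np.dot(d, d) and t > 1e-12:
        t *= 0.5                                       # simple monotone backtracking
    return np.clip(x + t * d, lb, ub)

if __name__ == "__main__":
    lb, ub = np.zeros(3), np.ones(3)
    target = np.array([-0.5, 0.5, 1.5])
    f = lambda x: 0.5 * np.sum((x - target) ** 2)
    grad = lambda x: x - target
    x = np.full(3, 0.5)
    for _ in range(50):
        x = two_stage_step(f, grad, x, lb, ub)
    print("approximate solution:", x)                  # expected near [0.0, 0.5, 1.0]
```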

Step lengths in BFGS method for monotone gradients

In this paper, we consider how to directly apply the BFGS method to finding a zero of any given monotone gradient and thus suggest new conditions to locate the corresponding step lengths. The suggested conditions involve a curvature condition and use only gradient computations. Furthermore, they can guarantee convergence without any other restrictions. Finally, preliminary … Read more
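
One plausible reading of such gradient-only step-length rules is a Wolfe-type curvature test, sketched below; the doubling rule and tolerances shown are illustrative assumptions of mine and are not the exact conditions proposed in the paper.

```
# BFGS applied to finding a zero of a monotone gradient g, with the step length
# chosen from gradient evaluations only, so that the curvature condition s^T y > 0 holds.
import numpy as np

def bfgs_zero(g, x0, c2=0.9, iters=50):
    n = x0.size
    H = np.eye(n)                                      # inverse-Hessian approximation
    x, gx = x0.astype(float), g(x0)
    for _ in range(iters):
        if np.linalg.norm(gx) < 1e-10:
            break
        d = -H @ gx
        t = 1.0
        # For a monotone g, phi'(t) = g(x + t d)^T d is nondecreasing in t, so doubling t
        # eventually satisfies the Wolfe-type curvature test below (gradients only).
        while np.dot(g(x + t * d), d) < c2 * np.dot(gx, d) and t < 1e6:
            t *= 2.0
        gnew = g(x + t * d)
        s, y = t * d, gnew - gx
        x, gx = x + t * d, gnew
        if np.dot(y, s) <= 1e-14:                      # safeguard: skip update in degenerate cases
            continue
        rho = 1.0 / np.dot(y, s)
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)         # standard BFGS inverse update
    return x

if __name__ == "__main__":
    # Monotone gradient of f(x) = 0.5 x^T A x - b^T x with A positive definite.
    A = np.array([[3.0, 1.0], [1.0, 2.0]]); b = np.array([1.0, -1.0])
    print(bfgs_zero(lambda x: A @ x - b, np.zeros(2)))
```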

Online First-Order Framework for Robust Convex Optimization

Robust optimization (RO) has emerged as one of the leading paradigms to efficiently model parameter uncertainty. The recent connections between RO and problems in statistics and machine learning call for solving RO problems at ever larger scales. However, the traditional approaches for solving RO formulations based on building and solving robust counterparts or … Read more
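
The online first-order viewpoint can be illustrated on a toy robust problem in which the decision player runs gradient descent against an adversary running projected gradient ascent over the uncertainty set. The instance below (ball uncertainty around a linear cost plus a quadratic regularizer) is my own illustration, not an example from the paper.

```
# Gradient descent-ascent with averaging on  min_x max_{||u|| <= r} (c+u)^T x + 0.5 ||x||^2.
import numpy as np

def robust_gda(c, r, steps=5000, eta=0.01):
    n = c.size
    x, u, x_avg = np.zeros(n), np.zeros(n), np.zeros(n)
    for _ in range(steps):
        grad_x = c + u + x                       # d/dx of the saddle function
        grad_u = x                               # d/du of the saddle function
        x = x - eta * grad_x
        u = u + eta * grad_u
        if np.linalg.norm(u) > r:                # keep the adversary inside ||u|| <= r
            u = r * u / np.linalg.norm(u)
        x_avg += x / steps                       # averaged decision iterate
    return x_avg

if __name__ == "__main__":
    c = np.array([2.0, -1.0, 0.5]); r = 0.5
    x = robust_gda(c, r)
    worst_case = c @ x + r * np.linalg.norm(x) + 0.5 * x @ x
    print("worst-case objective at averaged iterate:", worst_case)
    print("optimum of the robust counterpart      :", -0.5 * (np.linalg.norm(c) - r) ** 2)
```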

Robust optimization of noisy blackbox problems using the Mesh Adaptive Direct Search algorithm

Blackbox optimization problems are often contaminated with numerical noise, and direct search methods such as the Mesh Adaptive Direct Search (MADS) algorithm may get stuck at solutions artificially created by the noise. We propose a way to smooth out the objective function of an unconstrained problem using previously evaluated function values, rather than resampling points. … Read more
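
A hedged sketch of the smoothing idea: estimate the objective at a point by a distance-weighted average of values already stored in the evaluation cache, instead of resampling the noisy blackbox. The Gaussian weights and bandwidth below are assumptions made for illustration; the paper's construction may differ.

```
# Cache every (point, noisy value) pair and return a kernel-weighted average of nearby values.
import numpy as np

class SmoothedCache:
    def __init__(self, bandwidth=0.5):
        self.points, self.values, self.h = [], [], bandwidth

    def evaluate(self, blackbox, x):
        y = blackbox(x)                                   # one noisy evaluation, stored for reuse
        self.points.append(np.asarray(x, float)); self.values.append(y)
        return self.smoothed(x)

    def smoothed(self, x):
        P, v = np.array(self.points), np.array(self.values)
        w = np.exp(-np.sum((P - x) ** 2, axis=1) / (2 * self.h ** 2))
        return float(w @ v / w.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    noisy = lambda x: float(np.sum(np.square(x)) + 0.1 * rng.standard_normal())
    cache = SmoothedCache()
    for _ in range(200):
        cache.evaluate(noisy, rng.uniform(-1, 1, size=2))
    print("smoothed value near the optimum:", cache.smoothed(np.zeros(2)))
```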

Towards Simulation Based Mixed-Integer Optimization with Differential Equations

We propose a decomposition-based method for solving mixed-integer nonlinear optimization problems with “black-box” nonlinearities, where the latter may arise, e.g., from differential equations or expensive simulation runs. The method alternates between solving a mixed-integer linear master problem and a separation problem that iteratively refines the mixed-integer linear relaxation of the nonlinearity. We prove that … Read more
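
The alternation between a master problem and a refinement step resembles a cutting-plane or outer-approximation loop. The toy sketch below (a single integer variable, cuts built from finite differences of a convex "blackbox" function, and SciPy's MILP solver as the master) is only meant to convey that structure, not the paper's method.

```
# Outer-approximation loop: solve a MILP master over current cuts, then separate a new cut
# from the blackbox nonlinearity at the master solution, until the relaxation is tight.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

def blackbox(x):                      # stand-in for an expensive simulation
    return (x - 2.3) ** 2

def blackbox_slope(x, eps=1e-4):      # finite-difference slope, since no derivative is exposed
    return (blackbox(x + eps) - blackbox(x - eps)) / (2 * eps)

def outer_approximation(max_iters=20, tol=1e-6):
    c = np.array([0.0, 1.0])                           # variables (x, z); minimize z >= f(x)
    integrality = np.array([1, 0])                     # x integer, z continuous
    bounds = Bounds([0.0, -1e3], [5.0, 1e3])
    cut_rows, cut_lbs = [], []
    x_k = 0.0
    for _ in range(max_iters):
        f_k, g_k = blackbox(x_k), blackbox_slope(x_k)
        cut_rows.append([-g_k, 1.0])                   # cut: z >= f_k + g_k (x - x_k)
        cut_lbs.append(f_k - g_k * x_k)
        res = milp(c, constraints=LinearConstraint(np.array(cut_rows), cut_lbs, np.inf),
                   integrality=integrality, bounds=bounds)
        x_k, z_k = res.x
        if blackbox(x_k) - z_k <= tol:                 # relaxation is tight: stop
            break
    return x_k, blackbox(x_k)

if __name__ == "__main__":
    print(outer_approximation())
```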

Algorithm XXX: SC-SR1: MATLAB Software for Solving Shape-Changing L-SR1 Trust-Region Subproblems

We present a MATLAB implementation of the shape-changing symmetric rank-one (SC-SR1) method that solves trust-region subproblems when a limited-memory symmetric rank-one (L-SR1) matrix is used in place of the true Hessian matrix. The method takes advantage of two shape-changing norms [4, 3] to decompose the trust-region subproblem into two separate problems. Using one of … Read more
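
With a shape-changing infinity norm, the subproblem separates along the eigendirections of the quadratic model. The dense Python sketch below illustrates that separation only; the SC-SR1 software works with the compact L-SR1 representation rather than a full eigendecomposition, and this is not its code.

```
# Trust-region subproblem  min g^T s + 0.5 s^T B s  under an infinity-norm constraint on the
# eigencoordinates of B: after B = P diag(lam) P^T, it splits into independent 1-D problems.
import numpy as np

def shape_changing_inf_subproblem(B, g, delta):
    lam, P = np.linalg.eigh(B)
    gt = P.T @ g
    v = np.empty_like(gt)
    for i, (li, gi) in enumerate(zip(lam, gt)):
        if li > 0:
            v[i] = np.clip(-gi / li, -delta, delta)     # convex 1-D piece: clipped Newton step
        elif gi != 0:
            v[i] = -np.sign(gi) * delta                 # concave/linear piece: go to the boundary
        else:
            v[i] = delta if li < 0 else 0.0             # tie-break when the gradient component is zero
    return P @ v

if __name__ == "__main__":
    B = np.array([[2.0, 0.0], [0.0, -1.0]])             # indefinite model Hessian
    g = np.array([1.0, 0.5])
    s = shape_changing_inf_subproblem(B, g, delta=1.0)
    print("step:", s, "model decrease:", g @ s + 0.5 * s @ B @ s)
```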