BFGS convergence to nonsmooth minimizers of convex functions

The popular BFGS quasi-Newton minimization algorithm converges globally on smooth convex functions under reasonable conditions, a result proved by Powell in 1976; we consider its implications for functions that are not smooth. In particular, an analogous convergence result holds for functions, like the Euclidean norm, that are nonsmooth at the minimizer. … Read more
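A minimal numerical illustration of this behavior (not the authors' experiments; the function, starting point, and tolerance below are arbitrary choices) runs SciPy's BFGS on the Euclidean norm:

```python
import numpy as np
from scipy.optimize import minimize

# f(x) = ||x||_2 is convex but nonsmooth exactly at its minimizer x* = 0.
f = lambda x: np.linalg.norm(x)
# Gradient x / ||x||, valid away from the origin, where BFGS evaluates it.
grad = lambda x: x / np.linalg.norm(x)

x0 = np.array([1.0, -2.0, 0.5])
res = minimize(f, x0, jac=grad, method="BFGS", options={"gtol": 1e-12})
print(res.x, f(res.x))   # iterates land (numerically) at the origin
```

Because the gradient has unit norm everywhere away from the origin, the gradient-based stopping test never fires and SciPy typically terminates with a precision-loss message; the point is that the iterates themselves approach the nonsmooth minimizer.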

Speed optimization over a path with heterogeneous arc costs

The speed optimization problem over a path seeks a speed for each arc of a given path so as to minimize the total cost, while respecting the time-window constraint at each node and the speed limit on each arc. In maritime transportation, the cost represents fuel cost or emissions, so the study of this problem … Read more
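As a toy sketch of this problem (all data below are hypothetical; arc cost is modeled as $k_a d_a v_a^2$, which follows from the common cubic-in-speed fuel-rate assumption), a small instance can be solved with SciPy:

```python
import numpy as np
from scipy.optimize import minimize

d = np.array([100.0, 150.0, 80.0])   # arc lengths
k = np.array([0.9, 1.0, 1.3])        # heterogeneous fuel-cost coefficients
vmin, vmax = 8.0, 20.0               # speed limits on every arc
# Time windows [earliest, latest] for arrival at each node after the origin.
tw = np.array([[0.0, 14.0], [10.0, 30.0], [18.0, 40.0]])

def cost(v):
    # Fuel rate ~ k v^3 per unit time implies cost ~ k d v^2 per arc.
    return float(np.sum(k * d * v**2))

def arrivals(v):
    return np.cumsum(d / v)          # arrival time at each node

cons = [
    {"type": "ineq", "fun": lambda v: arrivals(v) - tw[:, 0]},  # not too early
    {"type": "ineq", "fun": lambda v: tw[:, 1] - arrivals(v)},  # not too late
]
res = minimize(cost, x0=np.full(3, 12.0), bounds=[(vmin, vmax)] * 3,
               constraints=cons, method="SLSQP")
print(res.x, res.fun)
```

Waiting at nodes is not modeled here; the "not too early" constraint turns the instance into a pure speed choice.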

MPC as a DVI: Implications on Sampling Rates and Accuracy

We show that the evolution of a dynamical system driven by controls obtained by solving an embedded optimization problem (as done in MPC) can be cast as a differential variational inequality (DVI). The DVI abstraction reveals that standard sampled-data MPC implementations (in which the control law is computed using states that are sampled … Read more
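A schematic sampled-data MPC loop of the kind analyzed (a hypothetical double-integrator example; the DVI formulation itself is not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize

h = 0.1                                  # sampling period (zero-order hold)
A = np.array([[1.0, h], [0.0, 1.0]])     # discretized double integrator
B = np.array([[0.5 * h**2], [h]])
N = 10                                   # prediction horizon

def mpc_control(x):
    # Embedded finite-horizon optimal control problem for the sampled state x.
    def obj(u):
        z, J = x.copy(), 0.0
        for uk in u:
            J += z @ z + 0.1 * uk**2
            z = A @ z + (B * uk).ravel()
        return J + 10.0 * (z @ z)        # terminal penalty
    u = minimize(obj, np.zeros(N), method="L-BFGS-B",
                 bounds=[(-1.0, 1.0)] * N).x
    return u[0]                          # apply only the first control

x = np.array([1.0, 0.0])
for _ in range(50):                      # closed loop: sample, solve, hold
    u0 = mpc_control(x)
    x = A @ x + (B * u0).ravel()
print(x)                                 # state driven toward the origin
```

The control is computed from the state sampled at the start of each period and held constant until the next sample, which is the implementation pattern whose accuracy the DVI viewpoint addresses.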

Solving sparse polynomial optimization problems with chordal structure using the sparse, bounded-degree sum-of-squares hierarchy

The sparse bounded-degree sum-of-squares (sparse-BSOS) hierarchy of Weisser, Lasserre and Toh [arXiv:1607.01151, 2016] constructs a sequence of lower bounds for a sparse polynomial optimization problem. Under some assumptions, the authors prove that the sequence converges to the optimal value. In this paper, we modify the hierarchy to deal with problems containing equality … Read more
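To see the kind of structure being exploited, consider a hypothetical correlative-sparsity pattern and its chordal clique decomposition (this only illustrates the sparsity graph; the sparse-BSOS hierarchy itself is not implemented here):

```python
import networkx as nx

# Correlative-sparsity graph of f(x) = f1(x1,x2) + f2(x2,x3) + f3(x3,x4):
# vertices are variables; edges join variables sharing a term.
G = nx.Graph([(1, 2), (2, 3), (3, 4)])

assert nx.is_chordal(G)
print(list(nx.chordal_graph_cliques(G)))   # cliques {1,2}, {2,3}, {3,4}
```

The sparse hierarchy then works with polynomials and SOS certificates supported on these small cliques of variables rather than on all variables at once.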

Comparison of Lasserre’s measure-based bounds for polynomial optimization to bounds obtained by simulated annealing

We consider the problem of minimizing a continuous function f over a compact set K. We compare the hierarchy of upper bounds proposed by Lasserre in [SIAM J. Optim. 21(3) (2011), pp. 864-885] to bounds that may be obtained from simulated annealing. We show that, when f is a polynomial and K a convex body, … Read more
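For intuition, Lasserre's order-$r$ upper bound admits a linear-algebra reformulation: minimizing $\int f \sigma \, d\mu$ over SOS densities $\sigma$ of degree at most $2r$ with $\int \sigma \, d\mu = 1$ reduces to the smallest generalized eigenvalue of the pair $(A, B)$, where $B$ is the moment matrix of $\mu$ and $A$ the moment matrix of $f \, d\mu$. A univariate sketch (the example $f(x) = x$ on $K = [-1, 1]$ with the Lebesgue measure is chosen purely for illustration):

```python
import numpy as np
from scipy.linalg import eigh

mom = lambda k: 2.0 / (k + 1) if k % 2 == 0 else 0.0  # int_{-1}^{1} x^k dx

r = 6
# B: moments of mu; A: moments of f dmu with f(x) = x.
B = np.array([[mom(i + j) for j in range(r + 1)] for i in range(r + 1)])
A = np.array([[mom(i + j + 1) for j in range(r + 1)] for i in range(r + 1)])

print(eigh(A, B, eigvals_only=True)[0])   # upper bound on min f = -1
```

The bound decreases toward the true minimum $-1$ as $r$ grows; how fast such bounds converge, relative to simulated annealing, is what the comparison concerns.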

An extension of Yuan’s Lemma and its applications in optimization

We prove an extension of Yuan’s Lemma to more than two matrices, as long as the set of matrices has rank at most 2. This is used to generalize the main result of [A. Baccari and A. Trad. On the classical necessary second-order optimality conditions in the presence of equality and inequality constraints. SIAM J. … Read more
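For reference, the classical two-matrix statement being extended, as it is usually stated (a sketch from memory, not quoted from the paper):

```latex
\begin{lemma}[Yuan, 1990]
Let $A, B \in \mathbb{S}^n$ be symmetric matrices such that
\[
  \max\{\, x^{\top} A x,\; x^{\top} B x \,\} \ \ge\ 0
  \qquad \text{for all } x \in \mathbb{R}^n.
\]
Then there exist $\mu_1, \mu_2 \ge 0$ with $\mu_1 + \mu_2 = 1$ such that
$\mu_1 A + \mu_2 B \succeq 0$.
\end{lemma}
```

The extension replaces the pair $A, B$ with finitely many matrices, under the rank condition mentioned above.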

Exploiting Negative Curvature in Deterministic and Stochastic Optimization

This paper addresses the question of whether it can be beneficial for an optimization algorithm to follow directions of negative curvature. Although some prior work has established convergence results for algorithms that integrate both descent and negative curvature directions, there has not yet been numerical evidence showing that such methods offer significant performance improvements. In … Read more
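A toy illustration of the basic mechanism (not the methods tested in the paper): on $f(x, y) = x^2 + y^4/4 - y^2/2$, gradient descent started on the $x$-axis converges to the saddle point at the origin, and a single step along a direction of negative curvature escapes it:

```python
import numpy as np

f = lambda z: z[0]**2 + 0.25 * z[1]**4 - 0.5 * z[1]**2
grad = lambda z: np.array([2.0 * z[0], z[1]**3 - z[1]])

z = np.array([1.0, 0.0])          # on the x-axis the gradient has no y-part
for _ in range(200):
    z = z - 0.1 * grad(z)         # stalls at the saddle (0, 0)
print(z, f(z))                    # ~ (0, 0), f = 0

H = np.array([[2.0, 0.0],
              [0.0, 3 * z[1]**2 - 1.0]])   # Hessian at the current point
lam, V = np.linalg.eigh(H)
z = z + 0.5 * V[:, 0]             # step along the most negative eigenvector
for _ in range(200):
    z = z - 0.1 * grad(z)
print(z, f(z))                    # ~ (0, +-1), f = -1/4: a local minimizer
```

Whether exploiting such directions pays off at scale, in both deterministic and stochastic regimes, is the empirical question the paper takes up.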

An Augmented Lagrangian Proximal Alternating Method for Sparse Discrete Optimization Problems

In this paper, an augmented Lagrangian proximal alternating (ALPA) method is proposed for two classes of large-scale sparse discrete constrained optimization problems, in which a sequence of augmented Lagrangian subproblems is solved by utilizing a proximal alternating linearized minimization framework and sparse projection techniques. Under the Mangasarian-Fromovitz and the basic constraint qualifications, we show that any … Read more
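One ingredient named above can be made concrete: the Euclidean projection onto the sparsity set $\{x : \|x\|_0 \le s\}$ keeps the $s$ largest entries in magnitude and zeroes the rest (a standalone sketch, not the full ALPA method):

```python
import numpy as np

def sparse_projection(x, s):
    """Euclidean projection onto {x : ||x||_0 <= s}."""
    z = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -s)[-s:]   # indices of s largest |x_i|
    z[idx] = x[idx]
    return z

print(sparse_projection(np.array([0.3, -2.0, 0.1, 1.5]), s=2))
# [ 0.  -2.   0.   1.5]
```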

Proximal Mapping for Symmetric Penalty and Sparsity

This paper studies a class of problems consisting of minimizing a continuously differentiable function penalized with the so-called $\ell_0$-norm over a symmetric set. These problems are hard to solve, yet prominent in many fields and applications. We first study the proximal mapping with respect to the $\ell_0$-norm over symmetric sets, and provide an efficient method … Read more
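In the unconstrained case (no symmetric set), the proximal mapping in question has a well-known closed form: the prox of $\lambda \|x\|_0$ is hard thresholding, keeping $y_i$ whenever $y_i^2/2 > \lambda$. A sketch of that special case (the paper's contribution is the harder version over symmetric sets):

```python
import numpy as np

def prox_l0(y, lam):
    # argmin_x 0.5 * ||x - y||^2 + lam * ||x||_0, solved componentwise:
    # keep y_i when |y_i| > sqrt(2 * lam), otherwise set it to zero.
    return np.where(np.abs(y) > np.sqrt(2.0 * lam), y, 0.0)

print(prox_l0(np.array([0.5, -1.2, 2.0]), lam=0.5))   # [ 0.  -1.2  2. ]
```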

Direct search based on probabilistic feasible descent for bound and linearly constrained problems

Direct search is a methodology for derivative-free optimization whose iterations are characterized by evaluating the objective function using a set of polling directions. In deterministic direct search applied to smooth objectives, these directions must somehow conform to the geometry of the feasible region and typically consist of positive generators of approximate tangent cones (which then … Read more
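A schematic poll-based loop with randomly generated directions under bound constraints (an illustrative sketch, not the algorithm analyzed in the paper; feasibility is enforced here by simple clipping rather than by polling along positive generators of tangent cones):

```python
import numpy as np

rng = np.random.default_rng(0)

def direct_search(f, x, lb, ub, alpha=1.0, m=2, tol=1e-8, max_iter=2000):
    fx = f(x)
    for _ in range(max_iter):
        if alpha < tol:
            break
        improved = False
        D = rng.standard_normal((m, x.size))         # random polling directions
        D /= np.linalg.norm(D, axis=1, keepdims=True)
        for d in np.vstack([D, -D]):                 # poll both d and -d
            y = np.clip(x + alpha * d, lb, ub)       # stay in the box
            fy = f(y)
            if fy < fx - 1e-4 * alpha**2:            # sufficient decrease
                x, fx, improved = y, fy, True
                break
        alpha = 2.0 * alpha if improved else 0.5 * alpha
    return x, fx

f = lambda x: np.sum((x - 0.3) ** 2)
print(direct_search(f, x=np.zeros(3), lb=np.zeros(3), ub=np.ones(3)))
```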