Numerical Solution of Optimal Control Problems with Switches, Switching Costs and Jumps

In this article, we present a framework for the numerical solution of optimal control problems constrained by ordinary differential equations that can run in (finitely many) different modes, where a change of mode incurs an additional switching cost in the objective and, whenever the system changes its mode, jumps in the differential states are …
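Schematically, the problem class is the following (notation ours, filling in only what the truncated abstract states: finitely many modes, switching costs, and state jumps):

\[
\min_{u,\,\sigma}\;\int_0^T \ell_{\sigma(t)}\bigl(x(t),u(t)\bigr)\,\mathrm{d}t \;+\; \sum_{j} c_{\sigma(t_j^-)\,\sigma(t_j^+)}
\quad\text{s.t.}\quad
\dot x(t) = f_{\sigma(t)}\bigl(x(t),u(t)\bigr),\qquad
x(t_j^+) = x(t_j^-) + \Delta_{\sigma(t_j^-)\,\sigma(t_j^+)},
\]

where \(\sigma(t)\) selects one of finitely many modes, the \(t_j\) are the switching times, \(c\) collects the switching costs, and \(\Delta\) the state jumps.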

On limited-memory quasi-Newton methods for minimizing a quadratic function

The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give two classes of limited-memory quasi-Newton Hessian approximations that generate search directions parallel to those of the method of preconditioned conjugate gradients, and hence give finite termination on quadratic optimization problems. The Hessian approximations …
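For context, the baseline these limited-memory approximations are built to match is the method of conjugate gradients with exact linesearch, which terminates in at most n steps on an n-dimensional quadratic with positive definite Hessian. A minimal sketch (ours, not the paper's method):

```python
import numpy as np

def cg_exact_linesearch(H, b, x0, tol=1e-12):
    """Conjugate gradients on f(x) = 0.5 x^T H x - b^T x with exact
    linesearch; terminates in at most n steps for SPD H."""
    x = x0.copy()
    g = H @ x - b                         # gradient of the quadratic
    d = -g
    for _ in range(len(b)):
        if np.linalg.norm(g) <= tol:
            break
        Hd = H @ d
        alpha = -(g @ d) / (d @ Hd)       # exact minimizer along d
        x = x + alpha * d
        g_new = H @ x - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher–Reeves; same as Polak–Ribière here
        d = -g_new + beta * d
        g = g_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
H = A @ A.T + 5 * np.eye(5)               # positive definite Hessian
b = rng.standard_normal(5)
x = cg_exact_linesearch(H, b, np.zeros(5))
print(np.linalg.norm(H @ x - b))          # ~0: finite termination up to rounding
```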

Understanding the Acceleration Phenomenon via High-Resolution Differential Equations

Gradient-based optimization algorithms can be studied from the perspective of limiting ordinary differential equations (ODEs). Motivated by the fact that existing ODEs do not distinguish between two fundamentally different algorithms—Nesterov’s accelerated gradient method for strongly convex functions (NAG-SC) and Polyak’s heavy-ball method—we study an alternative limiting process that yields high-resolution ODEs. We …
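In discrete form, the two methods differ only in where the gradient is evaluated, which is exactly the distinction the low-resolution ODE washes out. A sketch on a strongly convex quadratic (step-size and momentum choices are ours, in a common parameterization):

```python
import numpy as np

# Strongly convex quadratic test problem (our choice, not the paper's).
H = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ H @ x
grad = lambda x: H @ x

mu, L = 1.0, 100.0                                    # strong convexity / smoothness
s = 1.0 / L                                           # step size
beta = (1 - np.sqrt(mu * s)) / (1 + np.sqrt(mu * s))  # momentum coefficient

# Polyak heavy-ball: momentum plus a gradient evaluated at x_k.
x = x_prev = np.ones(3)
for _ in range(200):
    x, x_prev = x + beta * (x - x_prev) - s * grad(x), x
print("heavy-ball:", f(x))

# NAG-SC: the gradient is taken at the extrapolated point y_k instead.
x = x_prev = np.ones(3)
for _ in range(200):
    y = x + beta * (x - x_prev)
    x, x_prev = y - s * grad(y), x
print("NAG-SC:    ", f(x))
```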

On the Convergence to Stationary Points of Deterministic and Randomized Feasible Descent Directions Methods

We study the class of nonsmooth nonconvex problems in which the objective is to minimize the difference between a continuously differentiable function (possibly nonconvex) and a convex (possibly nonsmooth) function over a convex polytope. This general class contains many types of problems, including difference of convex functions (DC) problems, and as such, can be used …
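One classical way to produce feasible descent directions for f = g - h over a polytope is to linearize the objective at the current point (gradient of g minus a subgradient of h) and solve a linear program over the polytope, conditional-gradient style. A sketch of that generic building block using SciPy (not necessarily the authors' exact scheme):

```python
import numpy as np
from scipy.optimize import linprog

def fd_step(x, grad_g, subgrad_h, A, b, t):
    """One conditional-gradient-style feasible descent step for
    f = g - h over the polytope {x : A x <= b}."""
    c = grad_g(x) - subgrad_h(x)          # linearization of f at x
    res = linprog(c=c, A_ub=A, b_ub=b, bounds=[(None, None)] * len(x))
    return x + t * (res.x - x)            # convex combination stays feasible

# Toy DC instance (ours): g(x) = 0.5||x||^2 smooth, h(x) = |x_0| convex,
# over the box [-1, 1]^2 written as A x <= b.
A = np.vstack([np.eye(2), -np.eye(2)])
b = np.ones(4)
grad_g = lambda x: x
subgrad_h = lambda x: np.array([np.sign(x[0]), 0.0])

x = np.array([0.9, -0.7])
for k in range(50):
    x = fd_step(x, grad_g, subgrad_h, A, b, t=2 / (k + 2))  # diminishing steps
print(x)   # a point near the stationary point (1, 0)
```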

Convergence Rate Analysis of a Stochastic Trust Region Method via Supermartingales

We propose a novel framework for analyzing convergence rates of stochastic optimization algorithms with adaptive step sizes. This framework is based on analyzing properties of an underlying generic stochastic process, in particular by deriving a bound on the expected stopping time of this process. We utilize this framework to analyze the bounds on expected global …
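To fix ideas, the kind of algorithm the framework is aimed at looks like the toy stochastic trust-region loop below, where the radius plays the role of the adaptive step size and the iteration count at termination is one realization of the stopping time being bounded (all details here are ours, not the paper's):

```python
import numpy as np

def stoch_tr(f, grad_est, grad_true, x, delta=1.0, eps=1e-2,
             gamma=2.0, eta=0.1, max_iter=10_000):
    """Generic stochastic trust-region loop: delta_k is the adaptive step
    size, and the analysis bounds the expected hitting time of
    ||grad f(x_k)|| <= eps. Toy sketch, not the paper's algorithm."""
    for k in range(max_iter):
        if np.linalg.norm(grad_true(x)) <= eps:
            return x, k                       # k realizes the stopping time
        g = grad_est(x)                       # noisy gradient estimate
        s = -delta * g / np.linalg.norm(g)    # Cauchy-type step on a linear model
        rho = (f(x) - f(x + s)) / (delta * np.linalg.norm(g))
        if rho >= eta:
            x, delta = x + s, gamma * delta   # successful step: expand radius
        else:
            delta /= gamma                    # unsuccessful: shrink radius
    return x, max_iter

rng = np.random.default_rng(0)
f = lambda x: 0.5 * x @ x
x, k = stoch_tr(f, lambda x: x + 1e-3 * rng.standard_normal(3),
                lambda x: x, np.ones(3))
print(k)   # iterations until ||grad|| <= eps
```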

A non-monotone Inexact Restoration approach for minimization with orthogonality constraints

In this work we consider the problem of minimizing a differentiable functional restricted to the set of $n\times p$ matrices with orthonormal columns. This problem appears in several fields, such as statistics, signal processing, the global positioning system, machine learning, physics, and chemistry. We present an algorithm based on a recent non-monotone variation of the …
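Inexact Restoration alternates steps that improve optimality with steps that restore feasibility. For the feasible set of matrices with orthonormal columns, one standard restoration operator is the polar projection computed via the SVD, sketched below (a building block of this problem class, not the paper's full algorithm):

```python
import numpy as np

def restore_orthogonality(Y):
    """Map an arbitrary n x p matrix to the closest matrix with
    orthonormal columns (polar factor, computed via the SVD)."""
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

Y = np.random.default_rng(1).standard_normal((6, 3))
X = restore_orthogonality(Y)
print(np.allclose(X.T @ X, np.eye(3)))   # True: X is feasible again
```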

A Subsampling Line-Search Method with Second-Order Results

In many contemporary optimization problems such as those arising in machine learning, it can be computationally challenging or even infeasible to evaluate an entire function or its derivatives. This motivates the use of stochastic algorithms that sample problem data, which can jeopardize the guarantees obtained through classical globalization techniques in optimization such as a trust …
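A minimal example of the tension the abstract describes: a backtracking Armijo linesearch in which both the gradient and the linesearch function values come from a random subsample, so the classical monotone-decrease guarantee holds only for the sampled objective. All choices below (least-squares loss, batch size, constants) are ours for illustration:

```python
import numpy as np

def subsampled_armijo_step(xs, ys, w, batch=32, c1=1e-4, rng=None):
    """One gradient step with backtracking Armijo linesearch, where the
    gradient and the linesearch values use a random subsample of the data."""
    rng = rng or np.random.default_rng()
    idx = rng.choice(len(ys), size=batch, replace=False)
    X, y = xs[idx], ys[idx]
    loss = lambda v: 0.5 * np.mean((X @ v - y) ** 2)   # sampled objective
    g = X.T @ (X @ w - y) / batch                      # subsampled gradient
    t, f0 = 1.0, loss(w)
    while loss(w - t * g) > f0 - c1 * t * (g @ g):     # Armijo on the sample only
        t *= 0.5
    return w - t * g

rng = np.random.default_rng(0)
xs = rng.standard_normal((1000, 10))
ys = xs @ rng.standard_normal(10)
w = np.zeros(10)
for _ in range(100):
    w = subsampled_armijo_step(xs, ys, w, rng=rng)
```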

Non-monotone Inexact Restoration Method for nonlinear programming

This paper deals with a new variant of the Inexact Restoration Method of Fischer and Friedlander (COAP, 46, pp. 333–346, 2010). We propose an algorithm that replaces the monotone line-search performed in the tangent phase (with regard to the penalty function) by a non-monotone one. Convergence to feasible points satisfying the approximate gradient projection …
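The generic non-monotone rule in question (Grippo–Lampariello–Lucidi style) accepts a step if it decreases the objective relative to the maximum over the last M accepted values rather than the current value. A sketch of that rule in isolation (the paper applies it to a penalty function inside the tangent phase):

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, history, M=5, c1=1e-4):
    """Non-monotone linesearch: compare against the maximum of the last M
    function values instead of f(x). Requires a descent direction, g @ d < 0."""
    ref = max(history[-M:])                        # non-monotone reference value
    t = 1.0
    while f(x + t * d) > ref + c1 * t * (g @ d):   # relaxed Armijo condition
        t *= 0.5
    x_new = x + t * d
    history.append(f(x_new))
    return x_new

f = lambda x: (x ** 2).sum()
x = np.array([2.0, -1.0])
history = [f(x)]
for _ in range(10):
    g = 2 * x
    x = nonmonotone_armijo(f, x, -g, g, history)
```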

Dynamic Optimization with Convergence Guarantees

We present a novel direct transcription method to solve optimization problems subject to nonlinear differential and inequality constraints. In order to provide numerical convergence guarantees, it is sufficient for the functions that define the problem to satisfy boundedness and Lipschitz conditions. Our assumptions are the most general to date; we do not require uniqueness, differentiability …
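Direct transcription in its simplest form discretizes states and controls, imposes the dynamics as equality constraints, and hands the resulting NLP to a generic solver. A toy sketch with explicit Euler on the problem of steering x from 1 to 0 with dynamics xdot = u while minimizing the integral of u^2 (ours for illustration; the paper's scheme and convergence assumptions are more general):

```python
import numpy as np
from scipy.optimize import minimize

N, T = 20, 1.0
h = T / N

def unpack(z):
    return z[:N + 1], z[N + 1:]          # states x_0..x_N, controls u_0..u_{N-1}

def objective(z):
    _, u = unpack(z)
    return h * np.sum(u ** 2)            # discretized integral of u^2

def dynamics(z):
    x, u = unpack(z)
    return x[1:] - x[:-1] - h * u        # explicit Euler residual of xdot = u

cons = [{"type": "eq", "fun": dynamics},
        {"type": "eq", "fun": lambda z: unpack(z)[0][[0, -1]] - [1.0, 0.0]}]
res = minimize(objective, np.zeros(2 * N + 1), constraints=cons)
x_opt, u_opt = unpack(res.x)             # u_opt is close to the constant -1
```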

Non-convex min-max fractional quadratic problems under quadratic constraints: copositive relaxations

In this paper we address a min-max problem whose objectives are ratios of (not necessarily convex) quadratic functions over linear functions, on a feasible set described by linear and (not necessarily convex) quadratic functions. We propose a conic reformulation on the cone of completely positive matrices. By relaxation, a doubly nonnegative conic formulation is used to provide lower …
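In symbols, the problem class is (our generic notation, not the authors'):

\[
\min_{x \in \mathcal{F}} \;\max_{i=1,\dots,m}\;
\frac{x^{\top} A_i x + a_i^{\top} x + \alpha_i}{b_i^{\top} x + \beta_i},
\qquad
\mathcal{F} = \bigl\{ x \in \mathbb{R}^n_{+} : Cx = c,\; x^{\top} Q_j x + q_j^{\top} x \le r_j \bigr\},
\]

with the \(A_i\) and \(Q_j\) not required to be positive semidefinite. Lifting \(xx^{\top}\) to a matrix variable \(X\) gives the exact reformulation over the cone of completely positive matrices mentioned in the abstract; relaxing that cone to the doubly nonnegative cone \(\{X : X \succeq 0,\; X \ge 0\}\) yields the tractable formulation that provides the lower bounds.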