A Filter SQP Method: Local Convergence and Numerical Results

The work by Gould, Loh, and Robinson [“A filter method with unified step computation for nonlinear optimization”, SIAM J. Optim., 24 (2014), pp. 175–209] established global convergence of a new filter line-search method for finding local first-order solutions to nonlinear and nonconvex constrained optimization problems. A key contribution of that work was that the …
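For readers unfamiliar with the filter mechanism, the sketch below (in Python) illustrates the generic acceptance test that filter methods build on: a trial point is accepted only if its pair of constraint violation and objective value is not dominated, up to small margins, by any pair already stored in the filter. The margin parameters and function names are assumptions for illustration, not the specific envelope used by Gould, Loh, and Robinson.

def acceptable(theta, f, filter_entries, gamma_theta=1e-5, gamma_f=1e-5):
    """Return True if (theta, f) = (constraint violation, objective value)
    is not dominated, up to small margins, by any stored filter entry."""
    for theta_j, f_j in filter_entries:
        if theta >= (1.0 - gamma_theta) * theta_j and f >= f_j - gamma_f * theta_j:
            return False   # dominated: reject the trial point
    return True

def add_to_filter(theta, f, filter_entries):
    """Insert (theta, f) and discard entries it dominates."""
    filter_entries[:] = [(t, v) for (t, v) in filter_entries
                         if not (theta <= t and f <= v)]
    filter_entries.append((theta, f))

# Example: a trial point that improves both measures is acceptable.
F = [(1.0, 5.0), (0.1, 7.0)]
print(acceptable(0.05, 4.0, F))   # True
print(acceptable(1.0, 5.0, F))    # False (dominated by the first entry)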

Globally Convergent Primal-Dual Active-Set Methods with Inexact Subproblem Solves

We propose primal-dual active-set (PDAS) methods for solving large-scale instances of an important class of convex quadratic optimization problems (QPs). The iterates of the algorithms are partitions of the index set of variables; each partition corresponds to unique primal-dual variables that can be obtained by solving a (reduced) linear system. Algorithms of …
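To illustrate the partition-based iteration described above on the simplest member of the problem class, here is a minimal primal-dual active-set sketch (in Python) for a bound-constrained QP, min 0.5 x'Hx + c'x subject to x >= 0, with H symmetric positive definite. It is the textbook exact-subproblem iteration, not the inexact-subproblem variant proposed in the paper, and the test data are assumptions.

import numpy as np

def pdas_bound_qp(H, c, max_iter=50):
    """Primal-dual active-set iteration for min 0.5 x'Hx + c'x s.t. x >= 0.
    A marks variables fixed at their bound; the free variables come from a
    reduced linear system, the bound multipliers from the fixed rows."""
    n = len(c)
    A = np.zeros(n, dtype=bool)                          # start with all variables free
    for _ in range(max_iter):
        I = ~A
        x, z = np.zeros(n), np.zeros(n)
        x[I] = np.linalg.solve(H[np.ix_(I, I)], -c[I])   # reduced linear system
        z[A] = c[A] + H[np.ix_(A, I)] @ x[I]             # multipliers on the bound
        new_A = (x < 0) | (A & (z >= 0))                 # swap primal/dual infeasible indices
        if np.array_equal(new_A, A):
            break                                        # partition repeats: KKT point found
        A = new_A
    return x, z

H = np.array([[4.0, 1.0], [1.0, 3.0]])
c = np.array([-1.0, 2.0])
x, z = pdas_bound_qp(H, c)
print(x, z)   # x = (0.25, 0), z = (0, 2.25); complementarity x[i]*z[i] = 0 holds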

An Inertia-Free Filter Line-Search Algorithm for Large-Scale Nonlinear Programming

We present a filter line-search algorithm that does not require inertia information about the linear system to ensure global convergence. The proposed approach performs curvature tests along the search step to ensure descent. This feature permits more modularity in the linear algebra, enabling the use of a wider range of iterative and decomposition strategies. We …
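The curvature test mentioned above can be pictured with the following sketch (the threshold and names are assumptions, not the paper's exact test): given a trial step d and the Hessian, or an approximation, W, one checks whether d carries enough positive curvature before the step is trusted, and otherwise regularizes and recomputes.

import numpy as np

def sufficient_curvature(W, d, alpha=1e-8):
    """Generic curvature test: accept the step d only if d'Wd >= alpha * ||d||^2."""
    return d @ (W @ d) >= alpha * (d @ d)

W = np.array([[2.0, 0.0], [0.0, -1.0]])               # indefinite matrix
print(sufficient_curvature(W, np.array([1.0, 0.0])))  # True
print(sufficient_curvature(W, np.array([0.0, 1.0])))  # False: negative curvature detected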

Adaptive Augmented Lagrangian Methods: Algorithms and Practical Numerical Experience

In this paper, we consider augmented Lagrangian (AL) algorithms for solving large-scale nonlinear optimization problems that execute adaptive strategies for updating the penalty parameter. Our work is motivated by the recently proposed adaptive AL trust region method by Curtis et al. [“An adaptive augmented Lagrangian method for large-scale constrained optimization”, Math. Program., 152 (2015), pp. 201–245]. …
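For orientation, the sketch below shows the quantities such methods manipulate: the augmented Lagrangian for equality constraints c(x) = 0, the first-order multiplier update, and a conventional penalty increase when feasibility does not improve enough. The toy problem, tolerance, and update factor are assumptions; the adaptive strategies analyzed in the paper replace this simple rule.

import numpy as np
from scipy.optimize import minimize

# Assumed toy problem: min x1^2 + x2^2  subject to  x1 + x2 - 1 = 0.
f = lambda x: x[0]**2 + x[1]**2
c = lambda x: np.array([x[0] + x[1] - 1.0])

def aug_lag(x, y, mu):
    """Augmented Lagrangian L(x, y, mu) = f(x) + y'c(x) + (mu/2)||c(x)||^2."""
    cx = c(x)
    return f(x) + y @ cx + 0.5 * mu * cx @ cx

x, y, mu = np.zeros(2), np.zeros(1), 1.0
viol_prev = np.inf
for k in range(20):
    x = minimize(aug_lag, x, args=(y, mu)).x   # approximate AL minimization
    viol = np.linalg.norm(c(x))
    y = y + mu * c(x)                          # first-order multiplier update
    if viol > 0.25 * viol_prev:                # feasibility stalled:
        mu *= 10.0                             # conventional penalty increase
    viol_prev = viol
    if viol < 1e-8:
        break

print(x, y, mu)   # converges to x = (0.5, 0.5) with multiplier y = -1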

Fast Projection onto the Simplex and the l1 Ball

A new algorithm is proposed to project, exactly and in finite time, a vector of arbitrary size onto a simplex or an l1-norm ball. The algorithm is demonstrated to be faster than existing methods. In addition, a wrong statement in a paper by Duchi et al. is corrected and an adversary sequence for Michelot’s algorithm …
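For context, the classical O(n log n) sort-based projection onto the unit simplex is sketched below; it is the kind of baseline the paper is compared against, not the new algorithm it proposes.

import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}
    via the classical sort-and-threshold rule."""
    u = np.sort(v)[::-1]                          # sort in decreasing order
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / np.arange(1, len(v) + 1) > 0)[0][-1]
    tau = (1.0 - css[rho]) / (rho + 1)            # shift that enforces sum(x) = 1
    return np.maximum(v + tau, 0.0)

x = project_simplex(np.array([0.5, 1.2, -0.3]))
print(x, x.sum())    # [0.15, 0.85, 0.0], summing to 1

Projection onto an l1-norm ball of radius r reduces to the same routine: if the l1 norm of v is at most r the point is unchanged; otherwise one projects the vector of absolute values onto the simplex scaled to radius r and restores the signs.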

Robust Block Coordinate Descent

In this paper we present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses (approximate) partial second-order (curvature) information, so that the algorithm's performance is more robust when applied to highly nonseparable or ill-conditioned problems. We call the method Robust Coordinate Descent (RCD). At …
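As a plain point of reference for the method sketched above, the following randomized block coordinate descent uses the exact curvature of each block of a strongly convex quadratic (no composite term and none of the robustness refinements); the block size and test problem are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Assumed test problem: f(x) = 0.5 x'Ax - b'x with A symmetric positive definite.
n, block = 12, 3
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)
b = rng.standard_normal(n)

x = np.zeros(n)
for k in range(2000):
    idx = rng.choice(n, size=block, replace=False)   # draw a random block of coordinates
    g = A[idx] @ x - b[idx]                          # block gradient
    H = A[np.ix_(idx, idx)]                          # block (second-order) curvature
    x[idx] -= np.linalg.solve(H, g)                  # exact minimization over the block
print(np.linalg.norm(A @ x - b))                     # residual is driven toward zero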

Projection methods in quantum information science

We consider the problem of constructing quantum operations or channels, if they exist, that transform a given set of quantum states $\{\rho_1, \dots, \rho_k\}$ to another such set $\{\hat\rho_1, \dots, \hat\rho_k\}$. In other words, we must find a {\em completely positive linear map}, if it exists, that maps a given set of density matrices to …
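As background for the terminology, complete positivity can be tested through the Choi matrix: a linear map on d x d matrices is completely positive exactly when its Choi matrix is positive semidefinite. The sketch below applies this test to the identity channel and to the transpose map (the standard example of a map that is positive but not completely positive); the helper names are illustrative.

import numpy as np

def choi_matrix(phi, d):
    """Choi matrix J(Phi) = sum_{ij} E_ij (tensor) Phi(E_ij) of a map phi
    acting on d x d complex matrices."""
    J = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            E = np.zeros((d, d), dtype=complex)
            E[i, j] = 1.0
            J += np.kron(E, phi(E))
    return J

def is_completely_positive(phi, d, tol=1e-10):
    """Phi is completely positive iff its Choi matrix is positive semidefinite."""
    return bool(np.all(np.linalg.eigvalsh(choi_matrix(phi, d)) >= -tol))

print(is_completely_positive(lambda X: X, 2))     # identity channel: True
print(is_completely_positive(lambda X: X.T, 2))   # transpose map: False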

OSGA: A fast subgradient algorithm with optimal complexity

This paper presents an algorithm for approximately minimizing a convex function over simple, not necessarily bounded, convex domains, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which can be set to zero if only convexity is known). The worst …
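To make the oracle model concrete, the sketch below is a plain projected subgradient method with a decaying step size, not OSGA itself: the only information used per iteration is one subgradient (plus a function value to track the best point). The test problem and step-size rule are assumptions.

import numpy as np

# Assumed test problem: minimize f(x) = ||x - a||_1 over the box [-1, 1]^3.
a = np.array([2.0, -0.5, 0.3])
f = lambda x: np.abs(x - a).sum()
subgrad = lambda x: np.sign(x - a)            # one valid subgradient of f at x
project = lambda x: np.clip(x, -1.0, 1.0)     # projection onto the (simple) domain

x = np.zeros_like(a)
best = f(x)
for k in range(1, 5001):
    x = project(x - subgrad(x) / np.sqrt(k))  # projected subgradient step
    best = min(best, f(x))
print(best)   # approaches the optimal value 1.0, attained at x = (1, -0.5, 0.3)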

Distributed Optimization Methods for Large Scale Optimal Control

This thesis aims to develop and implement both nonlinear and linear distributed optimization methods that are applicable to, but not restricted to, the optimal control of distributed systems. Such systems are typically large scale, so well-established centralized solution strategies may be computationally too expensive or impossible to apply, and alternative control algorithms become necessary. …

Eigenvalue, Quadratic Programming, and Semidefinite Programming Relaxations for a Cut Minimization Problem

We consider the problem of partitioning the node set of a graph into $k$ sets of given sizes in order to \emph{minimize the cut} obtained by removing the $k$-th set. If the resulting cut has value $0$, then we have obtained a vertex separator. This problem is closely related to the graph partitioning problem. In …
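To make the objective concrete, the toy sketch below evaluates the quantity being minimized: after the $k$-th set is removed from a partition of the nodes, it counts the edges that still join two different remaining sets; a value of $0$ means the removed set is a vertex separator. The small example graph is an assumption for illustration.

def cut_after_removal(edges, parts):
    """parts = (S_1, ..., S_k): count edges joining two different sets among
    S_1, ..., S_{k-1} once the k-th set has been removed from the graph."""
    label = {v: i for i, S in enumerate(parts[:-1]) for v in S}
    return sum(1 for u, v in edges
               if u in label and v in label and label[u] != label[v])

# Path graph 1-2-3-4-5.
edges = [(1, 2), (2, 3), (3, 4), (4, 5)]
print(cut_after_removal(edges, ({1, 2}, {4, 5}, {3})))   # 0: {3} is a vertex separator
print(cut_after_removal(edges, ({1, 2}, {3, 4}, {5})))   # 1: removing {5} leaves a cut edge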