An Augmented Lagrangian based Algorithm for Distributed Non-Convex Optimization

This paper is about distributed derivative-based algorithms for solving optimization problems with a separable (potentially nonconvex) objective function and coupled affine constraints. A parallelizable method is proposed that combines ideas from the fields of sequential quadratic programming and augmented Lagrangian algorithms. The method negotiates shared dual variables that may be interpreted as prices, a concept …
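
As a rough illustration of the price-negotiation idea, here is a minimal dual-decomposition sketch in Python for $\min \sum_i f_i(x_i)$ subject to coupled affine constraints $\sum_i A_i x_i = b$. The quadratic local objectives, the fixed step size rho, and all data are illustrative placeholders, not the paper's algorithm.

    import numpy as np

    # Minimal dual-decomposition sketch for  min sum_i f_i(x_i)  s.t.  sum_i A_i x_i = b,
    # with hypothetical local objectives f_i(x) = 0.5*||x - c_i||^2.
    rng = np.random.default_rng(0)
    m, n, N = 3, 4, 5                                  # coupling rows, local dim, agents
    A = [rng.standard_normal((m, n)) for _ in range(N)]
    c = [rng.standard_normal(n) for _ in range(N)]
    b = rng.standard_normal(m)

    lam = np.zeros(m)                                  # shared dual variables ("prices")
    rho = 0.01                                         # dual ascent step size (heuristic)
    for k in range(500):
        # each agent minimizes its local Lagrangian 0.5*||x - c_i||^2 + lam^T A_i x
        x = [c_i - A_i.T @ lam for A_i, c_i in zip(A, c)]
        r = sum(A_i @ x_i for A_i, x_i in zip(A, x)) - b   # coupling residual
        lam += rho * r                                 # price update toward feasibility
    print("coupling residual norm:", np.linalg.norm(r))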

Directed modified Cholesky factorizations and convex quadratic relaxations

A directed Cholesky factorization of a symmetric interval matrix $\mathbf{A}$ consists of a permuted upper triangular matrix $R$ such that for all symmetric $A \in \mathbf{A}$, the residual matrix $A - R^T R$ is positive semidefinite with tiny entries. This must hold with full mathematical rigor, although the computations are done in floating-point arithmetic. Similarly, …
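
To make the residual condition concrete, the following non-rigorous Python sketch shifts the diagonal by a heuristic margin delta before factorizing, so that $A - R^T R$ comes out (close to) positive semidefinite. The choice of delta is an assumption; the paper's interval version certifies this rigorously despite floating-point rounding.

    import numpy as np

    def directed_cholesky_sketch(A):
        n = A.shape[0]
        delta = n * np.finfo(float).eps * np.linalg.norm(A, 2)  # heuristic margin
        L = np.linalg.cholesky(A - delta * np.eye(n))  # A - delta*I = L L^T
        return L.T                                     # upper triangular R

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    R = directed_cholesky_sketch(A)
    E = A - R.T @ R                                    # residual matrix
    print(np.linalg.eigvalsh(E))                       # tiny, nonnegative up to rounding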

A Globally Convergent Stabilized SQP Method: Superlinear Convergence

Regularized and stabilized sequential quadratic programming (SQP) methods are two classes of methods designed to resolve the numerical and theoretical difficulties associated with ill-posed or degenerate nonlinear optimization problems. Recently, a regularized SQP method has been proposed that allows convergence to points satisfying certain second-order KKT conditions (SIAM J. Optim., 23(4):1983–2010, 2013). The method is …
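
For orientation, the stabilized SQP subproblem for an equality-constrained problem $\min f(x)$ subject to $c(x) = 0$ is often written as the min-max QP below, where $H_k$ approximates the Hessian of the Lagrangian, $J$ is the constraint Jacobian, and $\mu_k > 0$ is the stabilization parameter. Sign and scaling conventions vary across papers, so treat this as a sketch rather than the exact subproblem of the cited method.

    \min_{d}\,\max_{\lambda}\;\; \nabla f(x_k)^T d + \tfrac{1}{2} d^T H_k d + \lambda^T \big( c(x_k) + J(x_k)\, d \big) - \tfrac{\mu_k}{2}\, \| \lambda - \lambda_k \|^2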

Local Convergence of an Algorithm for Subspace Identification from Partial Data

GROUSE (Grassmannian Rank-One Update Subspace Estimation) is an iterative algorithm for identifying a linear subspace of $\mathbb{R}^n$ from data consisting of partial observations of random vectors from that subspace. This paper examines local convergence properties of GROUSE, under assumptions on the randomness of the observed vectors, the randomness of the subset of elements observed at …
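
For readers new to GROUSE, the following Python sketch of a single update follows the standard rank-one Grassmannian step (after Balzano, Nowak, and Recht); the fixed step size eta and the small tolerances are illustrative assumptions, not choices made in this paper.

    import numpy as np

    def grouse_step(U, omega, v_omega, eta=0.1):
        """U: n x d orthonormal basis; omega: observed indices; v_omega: observed entries."""
        w, *_ = np.linalg.lstsq(U[omega], v_omega, rcond=None)  # fit observed rows
        p = U @ w                                      # prediction of the full vector
        r = np.zeros(U.shape[0])
        r[omega] = v_omega - U[omega] @ w              # residual on observed entries
        pn, rn, wn = np.linalg.norm(p), np.linalg.norm(r), np.linalg.norm(w)
        if rn < 1e-12 or wn < 1e-12:
            return U                                   # nothing to rotate
        t = eta * rn * pn                              # rotation angle
        U = U + ((np.cos(t) - 1) * p / pn + np.sin(t) * r / rn)[:, None] @ (w / wn)[None, :]
        return U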

A comparison of reduced and unreduced KKT systems arising from Interior Point methods

We address the iterative solution of symmetric KKT systems arising in the solution of convex quadratic programming problems. Two strictly related and well established formulations for such systems are studied with particular emphasis on the effect of preconditioning strategies on their relation. Constraint and augmented preconditioners are considered, and the choice of the augmentation matrix …
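
As a small concrete contrast between the two formulations, the Python sketch below assembles an unreduced 3x3 KKT system and the reduced (augmented) 2x2 system obtained by eliminating the complementarity block, and checks that both yield the same step. The random data and the sign conventions are placeholders, not the paper's setup.

    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 6, 3
    M = rng.standard_normal((n, n)); Q = M @ M.T       # convex quadratic term
    A = rng.standard_normal((m, n))
    x, z = rng.random(n) + 0.5, rng.random(n) + 0.5    # strictly positive iterates
    r1, r2, r3 = rng.standard_normal(n), rng.standard_normal(m), rng.standard_normal(n)
    X, Z, I = np.diag(x), np.diag(z), np.eye(n)

    # unreduced system in (dx, dy, dz)
    K3 = np.block([[Q, A.T, -I],
                   [A, np.zeros((m, m)), np.zeros((m, n))],
                   [Z, np.zeros((n, m)), X]])
    dx3, dy3, dz3 = np.split(np.linalg.solve(K3, np.concatenate([r1, r2, r3])), [n, n + m])

    # eliminate dz = X^{-1}(r3 - Z dx) to obtain the reduced (augmented) system
    K2 = np.block([[Q + np.diag(z / x), A.T],
                   [A, np.zeros((m, m))]])
    dx2, dy2 = np.split(np.linalg.solve(K2, np.concatenate([r1 + r3 / x, r2])), [n])
    print(np.allclose(dx3, dx2), np.allclose(dy3, dy2))   # True True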

iPiano: Inertial Proximal Algorithm for Nonconvex Optimization

In this paper we study an algorithm for solving a minimization problem composed of a differentiable (possibly nonconvex) and a convex (possibly nondifferentiable) function. The algorithm iPiano combines forward-backward splitting with an inertial force. It can be seen as a nonsmooth split version of Polyak's Heavy-ball method. A rigorous analysis of the algorithm …
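
The core iteration is easy to state: $x_{k+1} = \mathrm{prox}_{\alpha g}\big(x_k - \alpha \nabla f(x_k) + \beta (x_k - x_{k-1})\big)$. Below is a minimal Python sketch on a toy problem, with a smooth nonconvex $f$ plus an $\ell_1$ term; the constant step sizes are illustrative choices, not the paper's rules.

    import numpy as np

    # Toy iPiano run: f(x) = sum(log(1 + x_i^2)) (smooth, nonconvex, L = 2)
    # plus g = lam*||x||_1; a (step size) and b (inertia) satisfy a < 2*(1 - b)/L.
    def soft_threshold(v, t):                          # prox of t*||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def grad_f(x):
        return 2 * x / (1 + x**2)

    rng = np.random.default_rng(2)
    x_prev = x = rng.standard_normal(10)
    a, b, lam = 0.5, 0.4, 0.1
    for k in range(200):
        x_next = soft_threshold(x - a * grad_f(x) + b * (x - x_prev), a * lam)
        x_prev, x = x, x_next
    print(x)                                           # near the minimizer 0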

Projection Methods: An Annotated Bibliography of Books and Reviews

Projections onto sets are used in a wide variety of methods in optimization theory, but not every method that uses projections really belongs to the class of projection methods as we mean it here. Here projection methods are iterative algorithms that use projections onto sets while relying on the general principle that when a family …
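
A minimal instance of the template such methods share is alternating projections (POCS) onto two convex sets; in the Python sketch below, the hyperplane and the nonnegative orthant are illustrative choices.

    import numpy as np

    # Alternating projections between the hyperplane {x : a^T x = b}
    # and the nonnegative orthant.
    a, b = np.array([1.0, 2.0, -1.0]), 3.0

    def proj_hyperplane(x):
        return x - (a @ x - b) / (a @ a) * a

    def proj_orthant(x):
        return np.maximum(x, 0.0)

    x = np.array([-1.0, -1.0, 5.0])
    for _ in range(100):
        x = proj_orthant(proj_hyperplane(x))
    print(x, a @ x)                                    # a point near the intersection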

Levenberg-Marquardt methods based on probabilistic gradient models and inexact subproblem solution, with application to data assimilation

The Levenberg-Marquardt algorithm is one of the most popular algorithms for the solution of nonlinear least squares problems. Motivated by the problem structure in data assimilation, we consider in this paper the extension of the classical Levenberg-Marquardt algorithm to scenarios in which the linearized least squares subproblems are solved inexactly and/or the gradient model is …
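
To fix ideas, one classical Levenberg-Marquardt step solves $(J^T J + \mu I)\, d = -J^T r$; the Python sketch below solves this subproblem inexactly with a few truncated CG iterations, echoing the inexact-solve theme. The toy residual, fixed $\mu$, and iteration counts are illustrative assumptions.

    import numpy as np

    def residual(x):                                   # toy nonlinear residual, r(x) = 0 at the solution
        return np.array([x[0]**2 + x[1] - 1.0, x[0] - x[1]**2])

    def jacobian(x):
        return np.array([[2 * x[0], 1.0], [1.0, -2 * x[1]]])

    def cg(Amul, b, iters=5):                          # truncated CG: inexact subproblem solve
        x = np.zeros_like(b); r = b.copy(); p = r.copy()
        for _ in range(iters):
            rr = r @ r
            if rr < 1e-16:
                break
            Ap = Amul(p)
            alpha = rr / (p @ Ap)
            x += alpha * p
            r = r - alpha * Ap
            p = r + (r @ r / rr) * p
        return x

    x, mu = np.array([2.0, 2.0]), 1e-2
    for k in range(30):
        r, J = residual(x), jacobian(x)
        d = cg(lambda v: J.T @ (J @ v) + mu * v, -J.T @ r)
        x = x + d
    print(x, residual(x))                              # residual near zero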

How Difficult is Nonlinear Optimization? A Practical Solver Tuning Approach, with Illustrative Results

Nonlinear optimization (NLO) by definition covers a vast range of problems, from trivial to practically intractable. For this reason, it is impossible to offer “guaranteed” advice to NLO software users. This fact becomes especially obvious when facing unusually hard and/or previously unexplored NLO challenges. In the present study we offer some related practical observations, propose …

On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions and Algorithms

We consider the problem of minimizing a general continuously differentiable function over symmetric sets under sparsity constraints. These types of problems are generally hard to solve, as the sparsity constraint introduces a combinatorial element into the problem and renders the feasible set nonconvex. We begin with a study of the properties of the orthogonal …
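
The simplest building block here is the projection onto the plain sparsity set $\{x : \|x\|_0 \le s\}$, i.e. keeping the $s$ largest-magnitude entries; the Python sketch below shows that case only. The symmetric-set projections studied in the paper build on, but go beyond, this simple case.

    import numpy as np

    def project_sparse(x, s):
        keep = np.argsort(np.abs(x))[-s:]              # indices of the s largest |x_i|
        out = np.zeros_like(x)
        out[keep] = x[keep]
        return out

    print(project_sparse(np.array([0.3, -2.0, 1.1, 0.0, 0.7]), 2))  # [0. -2. 1.1 0. 0.]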