An Augmented Lagrangian based Algorithm for Distributed Non-Convex Optimization

This paper is about distributed derivative-based algorithms for solving optimization problems with a separable (potentially nonconvex) objective function and coupled affine constraints. A parallelizable method is proposed that combines ideas from the fields of sequential quadratic programming and augmented Lagrangian algorithms. The method negotiates shared dual variables that may be interpreted as prices, a concept … Read more
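
As a rough illustration of the price-negotiation idea only (not of the specific SQP/augmented-Lagrangian method proposed in the paper), the sketch below applies plain dual ascent to a separable problem with coupled affine constraints; the functions f_i, matrices A_i, vector b, step size, and local solver are hypothetical placeholders.

```python
# Minimal sketch of price (dual) coordination for a separable problem
#   min  sum_i f_i(x_i)   s.t.   sum_i A_i x_i = b,
# using simple dual ascent on the shared multiplier lam ("price").
# Generic illustration only -- NOT the algorithm proposed in the paper.
import numpy as np
from scipy.optimize import minimize

def local_step(f_i, A_i, lam, x0):
    """Each agent minimizes its own objective plus the price term lam^T A_i x_i."""
    obj = lambda x: f_i(x) + lam @ (A_i @ x)
    return minimize(obj, x0).x

def dual_coordination(fs, As, b, x0s, steps=100, alpha=0.05):
    lam = np.zeros(b.shape)          # shared dual variable ("price")
    xs = [x0.copy() for x0 in x0s]
    for _ in range(steps):
        # agents solve their decoupled subproblems (parallelizable)
        xs = [local_step(f, A, lam, x) for f, A, x in zip(fs, As, xs)]
        # coordinator raises/lowers the price using the coupling residual
        residual = sum(A @ x for A, x in zip(As, xs)) - b
        lam = lam + alpha * residual
    return xs, lam
```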

On Global Optimization

This paper presents a relatively “unfettered” method for finding global optima of constrained nonlinear programs. The method reformulates the given program into a bi-objective mixed-integer program that is then solved for the Nash equilibrium. A numerical example (whose solution provides a new benchmark against which other algorithms may be assessed) is included to illustrate the … Read more

Constraint aggregation for rigorous global optimization

In rigorous constrained global optimization, upper bounds on the objective function help to reduce the search space. Their construction requires finding a narrow box around an approximately feasible solution, verified to contain a feasible point. Approximations are easily found by local optimization, but the verification often fails. In this paper we show that even if … Read more

Directed modified Cholesky factorizations and convex quadratic relaxations

A directed Cholesky factorization of a symmetric interval matrix $\mathbf{A}$ consists of a permuted upper triangular matrix $R$ such that, for all symmetric $A \in \mathbf{A}$, the residual matrix $A - R^T R$ is positive semidefinite with tiny entries. This must hold with full mathematical rigor, although the computations are done in floating-point arithmetic. Similarly, … Read more
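
The following is a non-rigorous floating-point sketch of the property being certified, for a single point matrix $A$ only; a genuinely directed factorization would use interval arithmetic and directed rounding over all of $\mathbf{A}$, and may also permute rows and columns. The shift parameter below is an illustrative assumption.

```python
# Floating-point sketch of what a directed Cholesky factorization certifies:
# an upper triangular R with A - R^T R positive semidefinite and tiny.
# Not a verified procedure -- a rigorous version would use interval arithmetic.
import numpy as np

def approximate_directed_cholesky(A, shift=1e-10):
    """Cholesky factor of a slightly shifted matrix, so that the residual
    A - R^T R is (numerically) positive semidefinite with tiny entries."""
    n = A.shape[0]
    R = np.linalg.cholesky(A - shift * np.eye(n)).T   # upper triangular factor
    return R

A = np.array([[4.0, 1.0], [1.0, 3.0]])
R = approximate_directed_cholesky(A)
E = A - R.T @ R
print("max |residual entry|:", np.abs(E).max())
print("min eigenvalue of residual:", np.linalg.eigvalsh(E).min())
```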

Local Cuts and Two-Period Convex Hull Closures for Big-Bucket Lot-Sizing Problems

Despite the significant attention they have drawn, big bucket lot-sizing problems remain notoriously difficult to solve. Previous work of Akartunali and Miller (2012) presented results (computational and theoretical) indicating that what makes these problems difficult are the embedded single-machine, single-level, multi-period submodels. We therefore consider the simplest such submodel, a multi-item, two-period capacitated relaxation that … Read more

Global convergence of splitting methods for nonconvex composite optimization

We consider the problem of minimizing the sum of a smooth function $h$ with a bounded Hessian, and a nonsmooth function. We assume that the latter function is a composition of a proper closed function $P$ and a surjective linear map $\mathcal{M}$, with the proximal mappings of $\tau P$, $\tau > 0$, simple to compute. … Read more
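
For illustration only, here is one of the simplest splitting schemes for this composite model, specialized to $\mathcal{M} = I$ and $P = \lambda\|\cdot\|_1$ (so the proximal mapping is soft thresholding). It is not the splitting method analyzed in the paper, and the toy data below are hypothetical.

```python
# Minimal proximal-gradient sketch for  min_x  h(x) + P(x),  i.e. the composite
# model specialized to M = identity and P = lambda*||.||_1.
# Illustrates "prox of tau*P simple to compute"; not the paper's method.
import numpy as np

def soft_threshold(v, tau):
    """Proximal mapping of tau*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(grad_h, x0, lam, L, iters=500):
    x = x0.copy()
    step = 1.0 / L                      # step size from the Hessian bound on h
    for _ in range(iters):
        x = soft_threshold(x - step * grad_h(x), step * lam)
    return x

# toy smooth term h(x) = 0.5*||Ax - b||^2, whose Hessian A^T A is bounded
A = np.array([[1.0, 2.0], [0.5, 1.5], [2.0, 0.1]])
b = np.array([1.0, 0.2, 0.7])
grad_h = lambda x: A.T @ (A @ x - b)
x_hat = proximal_gradient(grad_h, np.zeros(2), lam=0.1, L=np.linalg.norm(A, 2)**2)
```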

n-step cycle inequalities: facets for continuous n-mixing set and strong cuts for multi-module capacitated lot-sizing problem

In this paper, we introduce a generalization of the continuous mixing set (which we refer to as the continuous n-mixing set). This set is closely related to the feasible set of the multi-module capacitated lot-sizing (MML) problem with(out) backlogging. We develop new classes of valid inequalities for this set, referred to as n’-step cycle inequalities, … Read more

Local Convergence of an Algorithm for Subspace Identification from Partial Data

GROUSE (Grassmannian Rank-One Update Subspace Estimation) is an iterative algorithm for identifying a linear subspace of $\mathbb{R}^n$ from data consisting of partial observations of random vectors from that subspace. This paper examines local convergence properties of GROUSE, under assumptions on the randomness of the observed vectors, the randomness of the subset of elements observed at … Read more
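
The sketch below shows a single GROUSE-style rank-one update from a partially observed vector, following the usual description of the algorithm; the fixed step parameter eta and the simplified step rule are assumptions for illustration, not the paper's exact choices.

```python
# One GROUSE-style rank-one subspace update from partial observations (sketch).
import numpy as np

def grouse_step(U, omega, v_omega, eta=0.1):
    """U: n x d orthonormal basis estimate; omega: observed row indices;
    v_omega: observed entries of the incoming vector."""
    U_omega = U[omega, :]
    # best fit of the observed entries within the current subspace
    w, *_ = np.linalg.lstsq(U_omega, v_omega, rcond=None)
    p = U @ w                             # predicted full vector
    r = np.zeros(U.shape[0])              # residual, supported on omega
    r[omega] = v_omega - U_omega @ w
    norm_r, norm_p, norm_w = np.linalg.norm(r), np.linalg.norm(p), np.linalg.norm(w)
    if norm_r < 1e-12 or norm_w < 1e-12:
        return U                          # nothing to update
    theta = eta * norm_r * norm_p         # rotation angle (simplified step rule)
    direction = (np.cos(theta) - 1.0) * p / norm_p + np.sin(theta) * r / norm_r
    return U + np.outer(direction, w / norm_w)   # geodesic-style rank-one update
```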

A comparison of reduced and unreduced KKT systems arising from Interior Point methods

We address the iterative solution of symmetric KKT systems arising in the solution of convex quadratic programming problems. Two strictly related and well established formulations for such systems are studied with particular emphasis on the effect of preconditioning strategies on their relation. Constraint and augmented preconditioners are considered, and the choice of the augmentation matrix … Read more
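
For orientation, these are the two formulations in a standard setting, namely the convex QP $\min \tfrac12 x^T Q x + c^T x$ subject to $Ax = b$, $x \ge 0$, with one common sign convention; the paper works with symmetrized and preconditioned variants, and its exact scalings may differ. With $X = \mathrm{diag}(x)$, $Z = \mathrm{diag}(z)$, $r_d = Qx + c - A^T y - z$, $r_p = Ax - b$, and $r_c = XZe - \sigma\mu e$, the Newton step solves
\[
\underbrace{\begin{pmatrix} Q & -A^T & -I \\ A & 0 & 0 \\ Z & 0 & X \end{pmatrix}}_{\text{unreduced}}
\begin{pmatrix} \Delta x \\ \Delta y \\ \Delta z \end{pmatrix}
= -\begin{pmatrix} r_d \\ r_p \\ r_c \end{pmatrix},
\qquad
\underbrace{\begin{pmatrix} Q + X^{-1}Z & -A^T \\ A & 0 \end{pmatrix}}_{\text{reduced (augmented)}}
\begin{pmatrix} \Delta x \\ \Delta y \end{pmatrix}
= \begin{pmatrix} -r_d - X^{-1} r_c \\ -r_p \end{pmatrix},
\]
where the reduced system is obtained from the unreduced one by eliminating $\Delta z = -X^{-1}(r_c + Z\,\Delta x)$.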

A Globally Convergent Stabilized SQP Method: Superlinear Convergence

Regularized and stabilized sequential quadratic programming (SQP) methods are two classes of methods designed to resolve the numerical and theoretical difficulties associated with ill-posed or degenerate nonlinear optimization problems. Recently, a regularized SQP method has been proposed that allows convergence to points satisfying certain second-order KKT conditions (SIAM J. Optim., 23(4):1983–2010, 2013). The method is … Read more
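
For context, one common form of the stabilized SQP subproblem for $\min f(x)$ subject to $c(x) = 0$, written with the Lagrangian convention $L(x,y) = f(x) - y^T c(x)$ (and not necessarily the exact formulation of the cited method), is
\[
\min_{d,\,y} \;\; \nabla f(x_k)^T d + \tfrac12\, d^T H_k\, d + \tfrac{\mu_k}{2}\,\|y\|^2
\quad \text{subject to} \quad c(x_k) + J(x_k)\,d + \mu_k\,(y - y_k) = 0,
\]
which reduces to the usual SQP subproblem as the regularization parameter $\mu_k \to 0$, while remaining well posed when the constraint Jacobian $J(x_k)$ is rank deficient.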