Asynchronous parallel generating set search for linearly-constrained optimization

Generating set search (GSS) is a family of direct search methods that encompasses generalized pattern search and related methods. We describe an algorithm for asynchronous linearly-constrained GSS, which presents complications absent from both the asynchronous bound-constrained case and the synchronous linearly-constrained case. The algorithm has been implemented in the …
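
The paper's asynchronous, linearly constrained algorithm is not spelled out in the excerpt; as a point of reference, here is a minimal sketch of the simplest synchronous, unconstrained member of the GSS family (compass search), with all names and parameters chosen for illustration.

```python
import numpy as np

def compass_search(f, x, step=1.0, tol=1e-8, max_iter=10_000):
    """Simplest GSS instance: poll the +/- coordinate directions (a
    generating set for R^n); accept simple decrease, contract on failure."""
    directions = np.vstack([np.eye(len(x)), -np.eye(len(x))])
    fx = f(x)
    for _ in range(max_iter):
        for d in directions:
            if f(x + step * d) < fx:          # successful poll step
                x = x + step * d
                fx = f(x)
                break
        else:
            step *= 0.5                       # unsuccessful poll: contract
            if step < tol:
                break
    return x, fx

# Usage: minimize a smooth quadratic from a remote start
x_opt, f_opt = compass_search(lambda x: (x[0] - 1.0) ** 2 + 4.0 * x[1] ** 2,
                              np.array([5.0, 5.0]))
```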

Alternating projections on manifolds

We prove that if two smooth manifolds intersect transversally, then the method of alternating projections converges locally at a linear rate. We bound the speed of convergence in terms of the angle between the manifolds, which in turn we relate to the modulus of metric regularity for the intersection problem, a natural measure of conditioning. …
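
The angle dependence is easy to see in the simplest transversal example; the following sketch (illustrative, not from the paper) alternates projections between two lines through the origin in $R^2$ and shows the per-cycle contraction approaching $\cos^2\theta$.

```python
import numpy as np

def proj_line(x, d):
    """Orthogonal projection of x onto the line spanned by d."""
    d = d / np.linalg.norm(d)
    return np.dot(x, d) * d

theta = 0.3                                  # angle between the two lines
d1 = np.array([1.0, 0.0])
d2 = np.array([np.cos(theta), np.sin(theta)])

x = np.array([2.0, 1.0])
prev = np.linalg.norm(x)
for k in range(10):
    x = proj_line(proj_line(x, d1), d2)      # one alternating-projection cycle
    err = np.linalg.norm(x)                  # distance to the intersection {0}
    print(k, err, err / prev)                # ratio tends to cos(theta)**2
    prev = err
print("cos(theta)^2 =", np.cos(theta) ** 2)
```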

Recognizing Underlying Sparsity in Optimization

Exploiting sparsity is essential to improve the efficiency of solving large optimization problems. We present a method for recognizing the underlying sparsity structure of a nonlinear partially separable problem, and show how the sparsity of the Hessian matrices of the problem’s functions can be improved by performing a nonsingular linear transformation in the space corresponding …
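
A toy instance (not the paper's detection method) shows why a nonsingular linear change of variables can improve Hessian sparsity:

```python
import numpy as np

# f(x) = (x1 + x2 + x3)^2 has a fully dense Hessian in the original
# variables, but under the nonsingular change of variables y = A x with
# y1 = x1 + x2 + x3 it becomes g(y) = y1^2, whose Hessian has a single
# nonzero entry.
A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])          # nonsingular transformation

H_x = 2.0 * np.ones((3, 3))              # Hessian of f: fully dense
H_y = np.zeros((3, 3))
H_y[0, 0] = 2.0                          # Hessian of g: one nonzero
assert np.allclose(H_x, A.T @ H_y @ A)   # chain rule: H_x = A^T H_y A
```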

Exploiting Equalities in Polynomial Programming

We propose a novel solution approach for polynomial programming problems with equality constraints. By means of a generic transformation, we show that solution schemes for the (typically simpler) problem without equalities can be used to address the problem with equalities. In particular, we propose new solution schemes for mixed binary programs, pure 0-1 quadratic programs, …
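
The excerpt does not specify the transformation; as background, two standard facts it builds on can be stated as follows (illustrative, not necessarily the paper's construction): binarity is itself a polynomial equality, and equalities can be moved into the objective by a quadratic penalty.

```latex
% x_i \in \{0,1\}  \iff  x_i^2 - x_i = 0   (0-1 variables as equalities)
%
% Quadratic-penalty transformation (illustrative; exact for 0-1 quadratic
% programs once M is large enough, and exact only in the limit
% M -> \infty for general polynomial equalities):
\min_x \; f(x) \ \ \text{s.t.}\ \ h_j(x) = 0,\ j = 1,\dots,m
\qquad\leadsto\qquad
\min_x \; f(x) + M \sum_{j=1}^{m} h_j(x)^2 .
```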

Steepest descent method for quasiconvex minimization on Riemannian manifolds

This paper extends the full convergence of the steepest descent algorithm with a generalized Armijo search and a proximal regularization to solve quasiconvex minimization problems defined on complete Riemannian manifolds. Previous convergence results are obtained as particular cases of our approach, and some examples in non-Euclidean spaces are given. Citation: J. Math. Anal. Appl. 341 …
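
As a point of reference, here is the Euclidean special case of the scheme ($R^n$ with the usual metric is a complete Riemannian manifold); the generalized Armijo rule, the proximal regularization, and the manifold machinery of the paper are not reproduced.

```python
import numpy as np

def armijo_steepest_descent(f, grad, x, beta=0.5, sigma=1e-4, tol=1e-8):
    """Steepest descent with classical Armijo backtracking in R^n."""
    while True:
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return x
        t = 1.0
        # Armijo rule: require sufficient decrease along -grad
        while f(x - t * g) > f(x) - sigma * t * np.dot(g, g):
            t *= beta
        x = x - t * g

# Quasiconvex (in fact convex) example: f(x) = sqrt(1 + x^2)
x_star = armijo_steepest_descent(
    lambda x: np.sqrt(1 + x[0] ** 2),
    lambda x: np.array([x[0] / np.sqrt(1 + x[0] ** 2)]),
    np.array([3.0]))
```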

Primal-dual interior point methods for PDE-constrained optimization

This paper provides a detailed analysis of a primal-dual interior-point method for PDE-constrained optimization. Considered are optimal control problems with control constraints in $L^p$. It is shown that the developed primal-dual interior-point method converges globally and locally superlinearly. Not only the easier $L^\infty$-setting is analyzed, but also a more involved $L^q$-analysis, $q < \infty$, is carried out. …
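
For orientation, the generic primal-dual interior-point treatment of a one-sided control bound $u \ge 0$ (standard, not specific to this paper) perturbs the pointwise complementarity condition:

```latex
% KKT complementarity for u >= 0 with multiplier z >= 0:
%   u(x)\,z(x) = 0, \quad u(x) \ge 0, \quad z(x) \ge 0 \quad \text{a.e.}
% Interior-point perturbation (central path, parameter \mu > 0):
u(x)\, z(x) = \mu, \qquad u(x) > 0, \quad z(x) > 0 \quad \text{a.e.},
% with the smoothed system solved by Newton steps while \mu \downarrow 0.
```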

A Proximal Method for Identifying Active Manifolds

The minimization of an objective function over a constraint set can often be simplified if the “active manifold” of the constraint set can be correctly identified. In this work we present a simple subproblem, which can be used inside any (convergent) optimization algorithm, that will identify the active manifold of a “prox-regular partly smooth” …
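
A one-line instance of the identification phenomenon (a toy, not the paper's prox-regular partly smooth setting): for $f(x) = |x_1| + (x_2 - 1)^2$ the active manifold at the minimizer is $\{x : x_1 = 0\}$, and a single proximal step from any nearby point lands on it exactly, because the prox of the absolute value is soft-thresholding.

```python
import numpy as np

def prox_f(y, t):
    """Prox of t*f for f(x) = |x1| + (x2 - 1)^2 (separable, closed form)."""
    x1 = np.sign(y[0]) * max(abs(y[0]) - t, 0.0)   # soft-thresholding
    x2 = (y[1] + 2 * t) / (1 + 2 * t)              # prox of t*(.-1)^2
    return np.array([x1, x2])

y = np.array([0.05, 0.9])      # near, but not on, the active manifold
x = prox_f(y, t=0.1)
print(x)                       # x[0] == 0.0 exactly: manifold identified
```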

Cubic regularization of Newton’s method for convex problems with constraints

In this paper we derive the efficiency estimates of the regularized Newton’s method as applied to constrained convex minimization problems and to variational inequalities. We study a one-step Newton’s method and its multistep accelerated version, which converges on smooth convex problems as $O({1 \over k^3})$, where $k$ is the iteration counter. We also derive the …
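
A one-dimensional, unconstrained sketch of the cubic-regularized Newton step (the paper's constrained analysis and multistep acceleration are not reflected): each step minimizes the cubic model $m(d) = f'(x)d + \tfrac12 f''(x)d^2 + \tfrac{M}{6}|d|^3$, where $M$ bounds the Lipschitz constant of $f''$.

```python
from scipy.optimize import minimize_scalar

def cubic_newton_step(f1, f2, M):
    """Minimize the cubic model built from f'(x)=f1, f''(x)=f2."""
    m = lambda d: f1 * d + 0.5 * f2 * d ** 2 + (M / 6.0) * abs(d) ** 3
    return minimize_scalar(m).x    # cubic term makes m coercive

# Minimize f(x) = x^4 (f' = 4x^3, f'' = 12x^2); M overestimates the
# Lipschitz constant of f'' on the region visited by the iterates.
x, M = 1.0, 50.0
for _ in range(30):
    x += cubic_newton_step(4 * x ** 3, 12 * x ** 2, M)
print(x)   # approaches the minimizer 0
```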

A local convergence property of primal-dual methods for nonlinear programming

We prove a new local convergence property of a primal-dual method for solving nonlinear optimization problems. Following a standard interior-point approach, the complementarity conditions of the original primal-dual system are perturbed by a parameter which is driven to zero during the iterations. The sequence of iterates is generated by a linearization of the perturbed …
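
In finite dimensions the construction the abstract describes takes the following standard template (notation ours, not the paper's):

```latex
% For \min_x f(x) s.t. c(x) \ge 0 with multipliers z \ge 0 and
% C(x) = \mathrm{diag}(c(x)), Z = \mathrm{diag}(z), the perturbed system is
\nabla f(x) - \nabla c(x)^{\top} z = 0, \qquad C(x)\,z = \mu e,
% and each iterate solves its Newton linearization
\begin{pmatrix} \nabla^2_{xx} L & -\nabla c(x)^{\top} \\ Z\,\nabla c(x) & C(x) \end{pmatrix}
\begin{pmatrix} \Delta x \\ \Delta z \end{pmatrix}
= - \begin{pmatrix} \nabla f(x) - \nabla c(x)^{\top} z \\ C(x)\,z - \mu e \end{pmatrix},
% with the perturbation \mu driven to zero during the iterations.
```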

A Note on Sparse SOS and SDP Relaxations for Polynomial Optimization Problems over Symmetric Cones

This short note extends the sparse SOS (sum of squares) and SDP (semidefinite programming) relaxation proposed by Waki, Kim, Kojima and Muramatsu for normal POPs (polynomial optimization problems) to POPs over symmetric cones, and establishes its theoretical convergence based on the recent convergence result by Lasserre on the sparse SOS and SDP relaxation for normal …
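
Schematically, the sparsity exploited in the normal-POP case replaces one large SOS certificate by small ones supported on overlapping variable blocks (the note carries the same scheme over to symmetric cones):

```latex
% If f(x) = \sum_{k=1}^{p} f_k(x_{I_k}) for small index sets I_k, the dense
% certificate
%   f - \lambda = \sigma_0, \qquad \sigma_0 \in \Sigma[x],
% is replaced by the far smaller sparse one
%   f - \lambda = \sum_{k=1}^{p} \sigma_k, \qquad \sigma_k \in \Sigma[x_{I_k}],
% where \Sigma[y] denotes the cone of SOS polynomials in the variables y.
```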