A trust-funnel method for nonlinear optimization problems with general nonlinear constraints and its application to derivative-free optimization

A trust-funnel method is proposed for solving nonlinear optimization problems with general nonlinear constraints. It extends the method presented by Gould and Toint (Math. Prog., 122(1):155-196, 2010), originally proposed for equality-constrained optimization problems only, to problems with both equality and inequality constraints as well as simple bounds. Like the original one, our method … Read more
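For reference, the problem class this extension targets (equality and inequality constraints together with simple bounds) can be written in the usual generic form, where $f$, $c_E$, $c_I$, $l$ and $u$ are placeholder symbols rather than the paper's own notation:

$$\min_{x \in \mathbb{R}^n} \; f(x) \quad \text{s.t.} \quad c_E(x) = 0, \;\; c_I(x) \le 0, \;\; l \le x \le u.$$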

Trust-region methods without using derivatives: Worst case complexity and the non-smooth case

Trust-region methods are a broad class of methods for continuous optimization that have found application in a variety of problems and contexts. In particular, they have been studied and applied to problems for which derivatives are not available. The analysis of trust-region derivative-free methods has focused on global convergence: they have been proved to generate a sequence of … Read more

Object-Parallel Infrastructure for Implementing First-Order Methods, with an Example Application to LASSO

We describe the design of a C++ vector-manipulation substrate that allows first-order optimization algorithms to be expressed concisely and readably while still achieving high performance in parallel computing environments. We use standard object-oriented techniques of encapsulation and operator overloading, combined with a novel “symbolic temporaries” delayed-evaluation system that greatly reduces the overhead … Read more
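The combination of operator overloading with delayed evaluation can be illustrated in a few lines. The sketch below is a generic Python analogue, not the paper's C++ API: the class names Vec and Expr are hypothetical, and the point is only that arithmetic builds a lightweight expression object which is evaluated in a single fused pass on assignment, avoiding intermediate temporaries.

```python
import numpy as np

class Expr:
    """Symbolic node: records an operation instead of computing it immediately."""
    def __init__(self, op, left, right):
        self.op, self.left, self.right = op, left, right

    def __add__(self, other):  return Expr('+', self, other)
    def __mul__(self, other):  return Expr('*', self, other)
    __radd__, __rmul__ = __add__, __mul__

    def eval_at(self, i):
        l = self.left.eval_at(i)  if isinstance(self.left,  Expr) else self.left
        r = self.right.eval_at(i) if isinstance(self.right, Expr) else self.right
        return l + r if self.op == '+' else l * r

class Vec(Expr):
    """Concrete vector; assigning an Expr to it evaluates element-wise in one pass."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)

    def eval_at(self, i):
        return self.data[i]

    def assign(self, expr):
        # one fused loop: no temporary vectors are materialized for sub-expressions
        for i in range(len(self.data)):
            self.data[i] = expr.eval_at(i)
        return self

# usage: y <- y + 2.0 * x without allocating an intermediate vector for 2.0 * x
x, y = Vec([1.0, 2.0, 3.0]), Vec([0.5, 0.5, 0.5])
y.assign(y + 2.0 * x)
print(y.data)   # [2.5 4.5 6.5]
```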

Convergence Conditions for Newton-type Methods Applied to Complementarity Systems with Nonisolated Solutions

We consider a class of Newton-type methods for constrained systems of equations that involve complementarity conditions. In particular, at issue are the constrained Levenberg–Marquardt method and the recently introduced Linear Programming-Newton method, designed for the difficult case when solutions need not be isolated, and the equation mapping need not be differentiable at the solutions. We … Read more
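For orientation, a constrained Levenberg–Marquardt step for a constrained system $F(z) = 0$, $z \in \Omega$, is typically computed from a subproblem of the following generic form (our notation, not necessarily the paper's), with the regularization parameter $\mu_k > 0$ usually tied to the current residual norm:

$$\min_{d} \; \|F(z^k) + G(z^k)\,d\|^2 + \mu_k \|d\|^2 \quad \text{s.t.} \quad z^k + d \in \Omega,$$

where $G(z^k)$ denotes the Jacobian of $F$ at $z^k$, or a suitable substitute when $F$ is not differentiable there.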

New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization

In this paper, we propose three new tensor decompositions for even-order tensors, corresponding respectively to the rank-one decompositions of some unfolded matrices. Consequently, these new decompositions lead to three new notions of (even-order) tensor rank, called in this paper the M-rank, the symmetric M-rank, and the strongly symmetric M-rank. We discuss the bounds … Read more
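To make the unfolding idea concrete, the toy sketch below computes the rank of one square matricization of a fourth-order tensor with NumPy. The choice of grouping the first two modes as rows and the last two as columns is an illustrative assumption that mirrors the general idea, not the paper's exact M-rank definitions.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, n3, n4 = 3, 4, 3, 4

# build a fourth-order tensor that is low rank under the square unfolding:
# an outer product of two matrices, T[i,j,k,l] = A[i,j] * B[k,l]
A = rng.standard_normal((n1, n2))
B = rng.standard_normal((n3, n4))
T = np.einsum('ij,kl->ijkl', A, B)

# square unfolding: group modes (1,2) as rows and modes (3,4) as columns
M = T.reshape(n1 * n2, n3 * n4)

# the rank of this unfolded matrix is the quantity a matrix-unfolding rank
# (such as the M-rank) is built on; here it equals 1 by construction
print(np.linalg.matrix_rank(M))   # 1
```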

On iteratively reweighted Algorithms for Non-smooth Non-convex Optimization in Computer Vision

Natural image statistics indicate that we should use non-convex norms for most regularization tasks in image processing and computer vision. Still, they are rarely used in practice due to the challenge of optimization. Recently, iteratively reweighted $\ell_1$ minimization (IRL1) has been proposed as a way to tackle a class of non-convex functions by solving a … Read more
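As a concrete instance of the IRL1 idea, the sketch below applies it to a separable 1-D denoising problem with a log penalty; the penalty, parameter values, and the closed-form soft-threshold subproblem solver are illustrative assumptions, not the paper's algorithm. Each outer iteration majorizes the concave penalty by its tangent at the current iterate, which yields a convex weighted $\ell_1$ problem.

```python
import numpy as np

def soft_threshold(b, w):
    """Closed-form solution of min_x 0.5*(x - b)^2 + w*|x| (element-wise)."""
    return np.sign(b) * np.maximum(np.abs(b) - w, 0.0)

def irl1_log_penalty(b, lam=0.5, eps=0.1, iters=20):
    """IRL1 for min_x 0.5*||x - b||^2 + lam * sum(log(1 + |x_i| / eps))."""
    x = b.copy()
    for _ in range(iters):
        w = lam / (eps + np.abs(x))      # weights from linearizing the log penalty
        x = soft_threshold(b, w)         # solve the convex weighted-l1 subproblem
    return x

b = np.array([2.0, 0.3, -1.5, 0.05, 0.8])
print(irl1_log_penalty(b))
```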

ADMM for Convex Quadratic Programs: Linear Convergence and Infeasibility Detection

In this paper, we analyze the convergence of the Alternating Direction Method of Multipliers (ADMM) on convex quadratic programs (QPs) with linear equality and bound constraints. The ADMM formulation alternates between an equality-constrained QP and a projection onto the bounds. Under the assumptions of (i) positive definiteness of the Hessian of the objective projected on … Read more
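The alternation the abstract describes, an equality-constrained QP solve followed by a projection onto the bounds, can be sketched as follows. This is a minimal dense-matrix illustration with generic names; details such as the fixed penalty rho and the stopping rule are assumptions, not the paper's exact formulation.

```python
import numpy as np

def admm_qp(Q, q, A, b, lo, hi, rho=1.0, iters=200):
    """ADMM for  min 0.5 x'Qx + q'x  s.t.  A x = b,  lo <= x <= hi.

    Splitting: x handles the equality-constrained QP, z handles the bounds,
    with the consensus constraint x = z enforced through the dual variable y.
    """
    n, m = Q.shape[0], A.shape[0]
    # KKT matrix of the x-subproblem (fixed across iterations, so it could be factorized once)
    K = np.block([[Q + rho * np.eye(n), A.T],
                  [A, np.zeros((m, m))]])
    z = np.zeros(n)
    y = np.zeros(n)
    for _ in range(iters):
        rhs = np.concatenate([-q + rho * (z - y), b])
        x = np.linalg.solve(K, rhs)[:n]        # equality-constrained QP step
        z = np.clip(x + y, lo, hi)             # projection onto the bounds
        y = y + (x - z)                        # scaled dual update
    return z

# tiny example: min 0.5*||x||^2 - x1  s.t.  x1 + x2 = 1,  0 <= x <= 0.8
Q = np.eye(2); q = np.array([-1.0, 0.0])
A = np.array([[1.0, 1.0]]); b = np.array([1.0])
print(admm_qp(Q, q, A, b, lo=0.0, hi=0.8))   # approaches (0.8, 0.2)
```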

A Preconditioner for a Primal-Dual Newton Conjugate Gradients Method for Compressed Sensing Problems

In this paper we are concerned with the solution of Compressed Sensing (CS) problems where the signals to be recovered are sparse in coherent and redundant dictionaries. We extend a primal-dual Newton Conjugate Gradients (pdNCG) method for CS problems. We provide an inexpensive and provably effective preconditioning technique for the linear systems arising in pdNCG. Numerical results … Read more
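The role of a preconditioner inside such a Newton–CG framework can be illustrated with a standard preconditioned conjugate gradient loop. The sketch below is a generic textbook PCG with a simple Jacobi (diagonal) preconditioner on a synthetic system, not the preconditioner proposed in the paper.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, max_iter=200):
    """Preconditioned conjugate gradients for A x = b, A symmetric positive
    definite; M_inv applies the inverse of the preconditioner to a vector."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# example: ill-conditioned diagonally dominant system, Jacobi preconditioner
n = 100
A = np.diag(np.linspace(1.0, 1e4, n)) + 0.1 * np.ones((n, n))
b = np.ones(n)
d = np.diag(A)
x = pcg(A, b, M_inv=lambda r: r / d)
print(np.linalg.norm(A @ x - b))   # small residual
```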

Global convergence of the Heavy-ball method for convex optimization

This paper establishes global convergence and provides global bounds on the convergence rate of the Heavy-ball method for convex optimization problems. When the objective function has a Lipschitz-continuous gradient, we show that the Cesàro average of the iterates converges to the optimum at a rate of $O(1/k)$, where $k$ is the number of iterations. When … Read more
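For reference, the heavy-ball iteration and the Cesàro averaging of its iterates look as follows. This is a minimal sketch on a smooth convex quadratic test function; the step-size and momentum values are illustrative choices, not the ones analyzed in the paper.

```python
import numpy as np

def heavy_ball(grad, x0, alpha=0.05, beta=0.8, iters=500):
    """Heavy-ball method: x_{k+1} = x_k - alpha*grad(x_k) + beta*(x_k - x_{k-1}).

    Returns the last iterate and the running Cesaro average of all iterates,
    the quantity whose O(1/k) rate the abstract refers to.
    """
    x_prev = x = np.asarray(x0, dtype=float)
    avg = x.copy()
    for k in range(1, iters + 1):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
        avg += (x - avg) / (k + 1)          # running Cesaro average of x_0..x_k
    return x, avg

# convex quadratic test problem: f(x) = 0.5 * x' A x, minimizer at the origin
A = np.array([[3.0, 0.5], [0.5, 1.0]])
grad = lambda x: A @ x
x_last, x_avg = heavy_ball(grad, x0=[5.0, -3.0])
print(x_last, x_avg)
```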

A Filter SQP Method: Local Convergence and Numerical Results

The work by Gould, Loh, and Robinson [“A filter method with unified step computation for nonlinear optimization”, SIAM J. Optim., 24 (2014), pp. 175–209] established global convergence of a new filter line search method for finding local first-order solutions to nonlinear and nonconvex constrained optimization problems. A key contribution of that work was that the … Read more
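As background on the filter mechanism itself, a trial point is typically accepted when its (constraint violation, objective) pair is not dominated by any pair already stored in the filter. The sketch below shows that acceptance check in generic form, with an illustrative envelope margin; it is not the specific test used in the cited paper.

```python
def acceptable(theta, f, filter_pairs, gamma=1e-5):
    """Accept the pair (theta, f) if it is not dominated by any filter entry,
    using the usual small envelope margins on violation and objective."""
    return all(theta <= (1.0 - gamma) * th or f <= fo - gamma * th
               for th, fo in filter_pairs)

# usage: a filter holding two previous (violation, objective) pairs
filt = [(0.5, 10.0), (0.1, 12.0)]
print(acceptable(0.05, 11.0, filt))   # True: better than each entry in some measure
print(acceptable(0.60, 11.0, filt))   # False: dominated by the entry (0.5, 10.0)
```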