Subsampled Inexact Newton methods for minimizing large sums of convex functions

This paper deals with the minimization of large sums of convex functions by Inexact Newton (IN) methods employing subsampled Hessian approximations. The Conjugate Gradient method is used to compute the inexact Newton step, and global convergence is enforced by a nonmonotone line search procedure. The aim is to obtain methods with affordable costs and fast … Read more
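
A minimal sketch of this kind of iteration, assuming a finite-sum objective supplied through callbacks f, grad, and hess_sample (the latter returning a Hessian built from a random subsample of terms); the batch size, CG tolerance, and the max-of-last-M nonmonotone rule below are illustrative choices, not the paper's specific algorithm or parameters.

import numpy as np

def cg_solve(hvp, g, tol, max_iter=50):
    # Conjugate Gradient for H d = -g, with H available only through
    # Hessian-vector products hvp(v).
    d = np.zeros_like(g)
    r = -g.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = hvp(p)
        alpha = rs / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(g):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

def subsampled_inexact_newton(f, grad, hess_sample, x0, n_terms,
                              batch=128, cg_tol=0.5, M=5, max_iter=100):
    # Hypothetical driver: inexact Newton steps from a subsampled Hessian,
    # globalized by a nonmonotone (max of the last M values) Armijo search.
    x = x0.copy()
    f_hist = [f(x)]
    rng = np.random.default_rng(0)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < 1e-8:
            break
        idx = rng.choice(n_terms, size=min(batch, n_terms), replace=False)
        H = hess_sample(x, idx)                 # subsampled Hessian approximation
        d = cg_solve(lambda v: H @ v, g, cg_tol)
        f_ref = max(f_hist[-M:])                # nonmonotone reference value
        t = 1.0
        for _ in range(30):                     # backtracking with a cap
            if f(x + t * d) <= f_ref + 1e-4 * t * (g @ d):
                break
            t *= 0.5
        x = x + t * d
        f_hist.append(f(x))
    return x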

Multipoint secant and interpolation methods with nonmonotone line search for solving systems of nonlinear equations

Multipoint secant and interpolation methods are effective tools for solving systems of nonlinear equations. They use quasi-Newton updates for approximating the Jacobian matrix. Owing to their ability to more completely utilize the information about the Jacobian matrix gathered at the previous iterations, these methods are especially efficient in the case of expensive functions. They are … Read more
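
For context, a sketch of the simpler single-point secant (Broyden) update for F(x) = 0; the multipoint secant and interpolation methods above refine the Jacobian approximation using several previous iterates rather than only the latest step, and omit the globalization shown in the paper. The toy system below is purely illustrative.

import numpy as np

def broyden(F, x0, max_iter=100, tol=1e-10):
    # Single-point secant (Broyden) method: B is a quasi-Newton
    # approximation of the Jacobian of F.
    x = x0.copy()
    B = np.eye(x.size)
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)             # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # enforces the secant condition B_new s = y
        x, Fx = x_new, F_new
    return x

# toy system: x0^2 + x1 = 2 and x0 = x1 (roots at (1, 1) and (-2, -2))
root = broyden(lambda z: np.array([z[0]**2 + z[1] - 2.0, z[0] - z[1]]),
               np.array([1.3, 0.9]))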

A One-Parameter Family of Middle Proximal ADMM for Constrained Separable Convex Optimization

This work is devoted to studying a family of Middle Proximal Alternating Direction Methods of Multipliers (MP-ADM) for solving multi-block constrained separable convex optimization. This one-parameter family of MP-ADM combines the Jacobian and Gauss-Seidel types of alternating direction methods, and proximal point techniques are applied only to the middle subproblems to promote convergence. We … Read more
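
For orientation, the augmented Lagrangian underlying such methods and one schematic "middle proximal" Gauss-Seidel sweep for three blocks are written out below; this is our paraphrase of the general structure, not the paper's exact one-parameter family, which also blends in Jacobian-type updates.

$$L_\beta(x_1,x_2,x_3,\lambda)=\sum_{i=1}^{3} f_i(x_i)-\lambda^{\top}\Big(\sum_{i=1}^{3}A_i x_i-b\Big)+\frac{\beta}{2}\Big\|\sum_{i=1}^{3}A_i x_i-b\Big\|^2,$$
$$x_1^{k+1}\in\arg\min_{x_1} L_\beta(x_1,x_2^k,x_3^k,\lambda^k),\qquad x_2^{k+1}\in\arg\min_{x_2} L_\beta(x_1^{k+1},x_2,x_3^k,\lambda^k)+\frac{\tau}{2}\|x_2-x_2^k\|^2,$$
$$x_3^{k+1}\in\arg\min_{x_3} L_\beta(x_1^{k+1},x_2^{k+1},x_3,\lambda^k),\qquad \lambda^{k+1}=\lambda^k-\beta\Big(\sum_{i=1}^{3}A_i x_i^{k+1}-b\Big).$$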

A derivative-free Gauss-Newton method

We present DFO-GN, a derivative-free version of the Gauss-Newton method for solving nonlinear least-squares problems. As is common in derivative-free optimization, DFO-GN uses interpolation of function values to build a model of the objective, which is then used within a trust-region framework to give a globally-convergent algorithm requiring $O(\epsilon^{-2})$ iterations to reach approximate first-order criticality … Read more
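
A rough sketch of the idea, assuming a residual function r returning a vector: a linear interpolation model of the residuals replaces the Jacobian, and a damped Gauss-Newton step is taken inside a crude trust region. DFO-GN maintains and updates its interpolation set across iterations and manages the trust region far more carefully; the coordinate-perturbation sampling below reduces to forward differences and is only illustrative.

import numpy as np

def model_jacobian(r, x, rx, h):
    # Linear model of the residuals built from n extra sample points; with
    # coordinate perturbations this is just forward differencing.
    cols = [(r(x + h * e) - rx) / h for e in np.eye(x.size)]
    return np.column_stack(cols)

def dfo_gauss_newton(r, x0, delta=0.1, max_iter=50, tol=1e-8):
    x = x0.copy()
    for _ in range(max_iter):
        rx = r(x)
        J = model_jacobian(r, x, rx, h=min(delta, 1e-3))
        g = J.T @ rx                            # model gradient of (1/2)||r||^2
        if np.linalg.norm(g) < tol:
            break
        s = np.linalg.solve(J.T @ J + 1e-8 * np.eye(x.size), -g)
        if np.linalg.norm(s) > delta:           # keep the step inside the trust region
            s *= delta / np.linalg.norm(s)
        if r(x + s) @ r(x + s) < rx @ rx:       # accept only if ||r||^2 decreases
            x, delta = x + s, 1.5 * delta
        else:
            delta *= 0.5
    return x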

A sequential optimality condition related to the quasinormality constraint qualification and its algorithmic consequences

In the present paper, we prove that the augmented Lagrangian method converges to KKT points under the quasinormality constraint qualification, which is associated with the external penalty theory. For this purpose, a new sequential optimality condition for smooth constrained optimization, called PAKKT, is defined. The new condition takes into account the sign of the dual … Read more
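
For reference, one standard (Powell-Hestenes-Rockafellar) form of the augmented Lagrangian method for $\min f(x)$ s.t. $h(x)=0$, $g(x)\le 0$ is written below in our notation; the paper's contribution concerns which sequential optimality condition (PAKKT) the limit points of such a method satisfy, not the update formulas themselves.

$$L_\rho(x,\lambda,\mu)=f(x)+\frac{\rho}{2}\left(\Big\|h(x)+\frac{\lambda}{\rho}\Big\|^2+\Big\|\max\Big\{0,\;g(x)+\frac{\mu}{\rho}\Big\}\Big\|^2\right),$$
$$x^{k+1}\approx\arg\min_x L_{\rho_k}(x,\lambda^k,\mu^k),\qquad \lambda^{k+1}=\lambda^k+\rho_k\,h(x^{k+1}),\qquad \mu^{k+1}=\max\{0,\;\mu^k+\rho_k\,g(x^{k+1})\}.$$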

Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization

A block version of the BFGS variable metric update formula and its modifications are investigated. Although this formula satisfies the quasi-Newton conditions with all of the difference vectors used and, for quadratic objective functions, yields the best improvement of convergence in a certain sense, for general functions it does not … Read more
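
For comparison with the discussion above, the classical BFGS update and one common block form are recalled below (our statement of the standard formulas; the paper's modifications are not reproduced). With $S=[s_1,\dots,s_m]$ and $Y=[y_1,\dots,y_m]$ collecting the difference vectors, the block update satisfies $B_+S=Y$, i.e. the quasi-Newton conditions for all stored pairs.

$$B_+ = B-\frac{Bss^{\top}B}{s^{\top}Bs}+\frac{yy^{\top}}{y^{\top}s},\qquad s=x_+-x,\quad y=\nabla f(x_+)-\nabla f(x),$$
$$B_+ = B-BS\,(S^{\top}BS)^{-1}S^{\top}B+Y\,(Y^{\top}S)^{-1}Y^{\top}.$$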

First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems

We focus on nonconvex and nonsmooth minimization problems with a composite objective, where the differentiable part of the objective is freed from the usual and restrictive global Lipschitz gradient continuity assumption. This longstanding smoothness restriction is pervasive in first order methods (FOM), and was recently circumvented for convex composite optimization by Bauschke, Bolte and Teboulle, … Read more
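
The first-order schemes in question are of Bregman proximal-gradient (NoLips) type; a worked statement of the step, in standard notation, is given below, where $h$ is a kernel generating the Bregman distance $D_h$ that replaces the usual quadratic proximity term tied to Lipschitz gradient continuity.

$$D_h(x,y)=h(x)-h(y)-\langle\nabla h(y),\,x-y\rangle,\qquad x^{k+1}\in\arg\min_{x}\Big\{g(x)+\langle\nabla f(x^k),\,x-x^k\rangle+\frac{1}{\lambda}\,D_h(x,x^k)\Big\},$$
with composite objective $f+g$, $f$ differentiable, and the Lipschitz condition on $\nabla f$ replaced by relative smoothness of $f$ with respect to $h$ (i.e. $Lh-f$ convex for some $L>0$).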

An extension of Yuan’s Lemma and its applications in optimization

We prove an extension of Yuan’s Lemma to more than two matrices, as long as the set of matrices has rank at most 2. This is used to generalize the main result of [A. Baccari and A. Trad. On the classical necessary second-order optimality conditions in the presence of equality and inequality constraints. SIAM J. … Read more
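
For reference, the classical two-matrix statement of Yuan's Lemma reads as follows; the extension above allows more than two matrices provided the set of matrices has rank at most 2.

$$\text{If } A,B \text{ are symmetric and } \max\{x^{\top}Ax,\;x^{\top}Bx\}\ge 0\ \text{ for all } x\in\mathbb{R}^n,\ \text{ then there exist } \alpha,\beta\ge 0,\ \alpha+\beta=1,\ \text{with } \alpha A+\beta B\succeq 0.$$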

Some theoretical limitations of second-order algorithms for smooth constrained optimization

In second-order algorithms, we investigate the relevance of the constant rank of the full set of active constraints in ensuring global convergence to a second-order stationary point. We show that second-order stationarity is not expected in the non-constant rank case if the growth of the so-called tangent multipliers, associated with a second-order complementarity measure, is … Read more
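
For context, the second-order stationarity notion usually targeted by such algorithms can be written as follows (our paraphrase in standard notation, with $L$ the Lagrangian and the condition imposed at a KKT point $x^*$ with multiplier $\lambda^*$):

$$d^{\top}\nabla^2_{xx}L(x^*,\lambda^*)\,d\ \ge\ 0\qquad\text{for all } d \text{ in the critical cone } C(x^*).$$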

Generalized Symmetric ADMM for Separable Convex Optimization

The Alternating Direction Method of Multipliers (ADMM) has proved effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a Generalized Symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. This GS-ADMM partitions the data into two … Read more
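
To illustrate the "updates the Lagrange multiplier twice" structure in the simplest setting, a two-block symmetric ADMM with dual stepsizes $\tau$ and $\sigma$ is written out below; GS-ADMM itself works with a multi-block partition into two grouped blocks and has its own stepsize conditions, which are not reproduced here.

$$x^{k+1}\in\arg\min_x L_\beta(x,y^k,\lambda^k),\qquad \lambda^{k+\frac12}=\lambda^k-\tau\beta\,(Ax^{k+1}+By^k-b),$$
$$y^{k+1}\in\arg\min_y L_\beta(x^{k+1},y,\lambda^{k+\frac12}),\qquad \lambda^{k+1}=\lambda^{k+\frac12}-\sigma\beta\,(Ax^{k+1}+By^{k+1}-b).$$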