A simple Newton method for local nonsmooth optimization

Superlinear convergence has been an elusive goal for black-box nonsmooth optimization. Even in the convex case, the subgradient method is very slow, and while some cutting plane algorithms, including traditional bundle methods, are popular in practice, local convergence is still sluggish. Faster variants depend either on problem structure or on analyses that elide sequences of … Read more
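For reference, the classical subgradient step whose slow rate is alluded to here can be written as follows (a standard textbook form, not taken from this paper):

```latex
% One subgradient step with step size t_k and any subgradient g_k of f at x_k
x_{k+1} = x_k - t_k\, g_k, \qquad g_k \in \partial f(x_k),
% with t_k \propto 1/\sqrt{k} the best-iterate gap decays only sublinearly:
\min_{i \le k} f(x_i) - f^\star = O\!\bigl(1/\sqrt{k}\bigr).
```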

Relations Between Abs-Normal NLPs and MPCCs Part 1: Strong Constraint Qualifications

This work is part of an ongoing effort to compare nonsmooth optimization problems in abs-normal form with MPCCs. We study the general abs-normal NLP with equality and inequality constraints in relation to an equivalent MPCC reformulation. We show that kink qualifications and MPCC constraint qualifications of linear independence type and Mangasarian-Fromovitz type are equivalent. Then … Read more
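As background (standard definitions, not quoted from the paper), an MPCC augments an NLP with complementarity constraints:

```latex
\min_{x}\; f(x)
\quad \text{s.t.} \quad
g(x) \le 0,\quad h(x) = 0,\quad
0 \le G(x) \perp H(x) \ge 0,
```

where the complementarity condition means $G_i(x) \ge 0$, $H_i(x) \ge 0$ and $G_i(x)\,H_i(x) = 0$ for every $i$. A common route from abs-normal form to this class is to split each switching value into a pair of nonnegative variables whose product must vanish; the precise reformulation used in the paper may differ.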

Stochastic algorithms with geometric step decay converge linearly on sharp functions

Stochastic (sub)gradient methods require step size schedule tuning to perform well in practice. Classical tuning strategies decay the step size polynomially and lead to optimal sublinear rates on (strongly) convex problems. An alternative schedule, popular in nonconvex optimization, is called geometric step decay and proceeds by halving the step size after every few epochs. In … Read more
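For illustration only, a minimal sketch of a geometric step-decay schedule as described (halving after a fixed number of epochs); the function name and constants below are placeholders, not taken from the paper:

```python
def geometric_step_decay(initial_step, epoch, epochs_per_stage=10, decay=0.5):
    """Step size for a given epoch: the initial step is multiplied by
    `decay` (here, halved) after every `epochs_per_stage` epochs."""
    stage = epoch // epochs_per_stage
    return initial_step * decay ** stage

# Example: 0.1 for epochs 0-9, 0.05 for 10-19, 0.025 for 20-29, ...
steps = [geometric_step_decay(0.1, e) for e in range(30)]
```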

Asynchronous Stochastic Subgradient Methods for General Nonsmooth Nonconvex Optimization

Asynchronous distributed methods are a popular way to reduce the communication and synchronization costs of large-scale optimization. Yet, for all their success, little is known about their convergence guarantees in the challenging case of general non-smooth, non-convex objectives, beyond cases where closed-form proximal operator solutions are available. This is all the more surprising since these … Read more
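To fix ideas (a toy simulation only, not the algorithm analyzed in the paper), asynchrony can be modeled by applying subgradients computed at stale iterates:

```python
import numpy as np

def delayed_subgradient(x0, subgrad, step, n_iter, max_delay, seed=0):
    """Toy model of asynchronous updates: each step applies a subgradient
    evaluated at an iterate that is up to `max_delay` iterations old."""
    rng = np.random.default_rng(seed)
    history = [np.asarray(x0, dtype=float)]
    x = history[0].copy()
    for k in range(n_iter):
        delay = rng.integers(0, min(max_delay, k) + 1)
        stale_x = history[k - delay]            # a worker read an old iterate
        x = x - step(k) * subgrad(stale_x)      # the update uses that stale subgradient
        history.append(x.copy())
    return x

# Example on the nonsmooth, separable f(x) = |x_1| + |x_2| with subgradient sign(x).
x_final = delayed_subgradient([5.0, -3.0], np.sign,
                              step=lambda k: 1.0 / np.sqrt(k + 1),
                              n_iter=200, max_delay=5)
```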

Numerical solution of generalized minimax problems

This contribution describes and investigates four numerical methods for solving generalized minimax problems, that is, the minimization of functions that are compositions of special smooth convex functions with maxima of smooth functions (the most important problem of this type is the sum of maxima of smooth functions). Section 1 is introductory. In … Read more
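In the notation suggested by the abstract (our transcription, not the authors'), the most important instance is the sum of maxima of smooth functions:

```latex
\min_{x \in \mathbb{R}^n} F(x), \qquad
F(x) = \sum_{i=1}^{m} \max_{1 \le j \le n_i} f_{ij}(x),
```

with each $f_{ij}$ smooth; the general problem replaces the outer sum by a composition with a special smooth convex function.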

New inertial factors of the Krasnoselskii-Mann iteration

In this article, we consider the Krasnoselskii-Mann iteration for approximating a fixed point of a given nonexpansive operator in a real Hilbert space, and we study an inertial version recently proposed by Maingé. We propose new conditions on the inertial factors that ensure weak convergence. These conditions are free of the iterates and depend on … Read more
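For orientation (standard textbook forms; the exact placement of the inertial term in Maingé's scheme may differ), the Krasnoselskii-Mann iteration and a typical inertial variant read

```latex
% Krasnoselskii-Mann step for a nonexpansive operator T with relaxation \lambda_k
x_{k+1} = (1 - \lambda_k)\, x_k + \lambda_k\, T(x_k),
% inertial variant: extrapolate with factor \alpha_k, then apply the KM step
y_k = x_k + \alpha_k (x_k - x_{k-1}), \qquad
x_{k+1} = (1 - \lambda_k)\, y_k + \lambda_k\, T(y_k),
```

where the $\alpha_k$ are the inertial factors whose admissible ranges are at issue.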

A Proximal Interior Point Algorithm with Applications to Image Processing

In this article, we introduce a new proximal interior point algorithm (PIPA). This algorithm handles convex optimization problems with various constraints in which the objective function is the sum of a Lipschitz-differentiable term and a possibly nonsmooth one. Each iteration of PIPA involves the minimization of a merit function evaluated for decaying … Read more
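The problem class described can be written, in generic notation that is not necessarily the authors', as

```latex
\min_{x \in C}\; f(x) + g(x),
```

with $f$ convex and Lipschitz-differentiable, $g$ convex and possibly nonsmooth, and $C$ a closed convex set collecting the constraints; the interior-point flavor presumably comes from treating $C$ through a barrier-type term whose parameter decays along the iterations.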

Trust-region methods for the derivative-free optimization of nonsmooth black-box functions

In this paper we study the minimization of a nonsmooth black-box function, without assuming any access to derivatives or generalized derivatives and without any knowledge of the analytical origin of the function's nonsmoothness. Directional methods have been derived for such problems, but to our knowledge no model-based method, such as a trust-region one, has yet … Read more
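As a generic illustration of the model-based idea in a derivative-free setting (a bare-bones sketch under standard trust-region conventions, not the method proposed in the paper):

```python
import numpy as np

def trust_region_dfo(f, x0, delta0=1.0, max_iter=100, eta=0.1):
    """Bare-bones derivative-free trust-region loop: fit a crude linear model
    from finite-difference samples of the black box, take the model step
    within the trust region, and accept or shrink via the usual ratio test."""
    x = np.asarray(x0, dtype=float)
    delta = delta0
    for _ in range(max_iter):
        fx = f(x)
        # Model gradient from forward differences at the trust-region scale.
        g = np.array([(f(x + delta * e) - fx) / delta for e in np.eye(x.size)])
        gnorm = np.linalg.norm(g)
        if gnorm == 0.0:
            break
        step = -delta * g / gnorm           # minimizer of the linear model on the ball
        predicted = delta * gnorm           # predicted decrease of the model
        actual = fx - f(x + step)           # actual decrease of the black box
        if actual / predicted >= eta:       # good agreement: accept and expand
            x = x + step
            delta *= 2.0
        else:                               # poor agreement: reject and shrink
            delta *= 0.5
    return x

# Example on the nonsmooth black box f(x) = ||x||_1.
x_star = trust_region_dfo(lambda x: np.abs(x).sum(), [2.0, -1.5])
```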

Potential-based analyses of first-order methods for constrained and composite optimization

We propose potential-based analyses for first-order algorithms applied to constrained and composite minimization problems. We first propose “idealized” frameworks for algorithms in the strongly and non-strongly convex cases and argue, using a potential function, that methods following the framework achieve the best possible rate. Then we show that the geometric descent (GD) algorithm by Bubeck … Read more
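A typical potential in such analyses (one common template, not necessarily the one used here) combines the objective gap with a distance term and is shown to be nonincreasing:

```latex
\Phi_k = A_k \bigl( f(x_k) - f^\star \bigr) + \tfrac{\mu}{2}\, \| z_k - x^\star \|^2,
\qquad \Phi_{k+1} \le \Phi_k,
```

which immediately gives $f(x_k) - f^\star \le \Phi_0 / A_k$, so the growth rate of the coefficients $A_k$ dictates the convergence rate.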

On First and Second Order Optimality Conditions for Abs-Normal NLP

Structured nonsmoothness is widely present in practical optimization. A particularly attractive class of nonsmooth problems, from both a theoretical and an algorithmic perspective, consists of optimization problems in so-called abs-normal form, as developed by Griewank and Walther. Here we generalize their theory for the unconstrained case to nonsmooth NLPs with equality and inequality constraints in … Read more
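For readers unfamiliar with the format, the standard abs-normal representation (our paraphrase of the Griewank-Walther definition) evaluates the function through a switching vector $z$:

```latex
z = F(x, |z|), \qquad y = f(x, |z|),
```

where $F$ and $f$ are smooth and the partial Jacobian of $F$ with respect to $|z|$ is strictly lower triangular, so $z$ can be computed component by component; all nonsmoothness enters through the absolute values $|z_i|$, whose zeros define the kinks of the problem.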