Partial Smoothness, Tilt Stability, and Generalized Hessians

We compare two recent variational-analytic approaches to second-order conditions and sensitivity analysis for nonsmooth optimization. We describe a broad setting where computing the generalized Hessian of Mordukhovich is easy. In this setting, the idea of tilt stability introduced by Poliquin and Rockafellar is equivalent to a classical smooth second-order condition. …
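For reference, tilt stability can be stated as follows; this is the standard Poliquin–Rockafellar definition, paraphrased here rather than quoted from the paper:

```latex
% Tilt stability (standard definition): a local minimizer \bar{x} of f is
% tilt-stable if, for some \gamma > 0, the tilted-minimizer map
\[
  M_\gamma(v) \;=\; \operatorname*{argmin}_{\|x - \bar{x}\| \le \gamma}
                    \bigl\{\, f(x) - \langle v, x \rangle \,\bigr\}
\]
% is single-valued and Lipschitz continuous on a neighborhood of v = 0,
% with M_\gamma(0) = \bar{x}.
\]
```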

An Adaptive Gradient Sampling Algorithm for Nonsmooth Optimization

We present an algorithm for the minimization of $f : \mathbb{R}^n \to \mathbb{R}$, assumed to be locally Lipschitz and continuously differentiable in an open dense subset $D$ of $\mathbb{R}^n$. The objective $f$ may be non-smooth and/or non-convex. The method is based on the gradient sampling (GS) algorithm of Burke et al. [A robust gradient sampling …
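As a rough illustration of the GS idea (the basic scheme, not the adaptive variant proposed here), a single iteration samples gradients near the current point, takes the minimum-norm element of their convex hull as a search direction, and backtracks. A minimal sketch, with all function names and tolerances illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def min_norm_in_hull(G):
    """Minimum-norm element of the convex hull of the rows of G,
    via the QP  min 0.5*||G^T w||^2  s.t.  w >= 0, sum(w) = 1."""
    m = G.shape[0]
    res = minimize(
        lambda w: 0.5 * (G.T @ w) @ (G.T @ w),
        np.full(m, 1.0 / m),
        jac=lambda w: G @ (G.T @ w),
        bounds=[(0.0, None)] * m,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    )
    return G.T @ res.x

def gs_step(f, grad, x, eps=1e-2, m=20, beta=1e-4):
    """One basic gradient-sampling iteration at x."""
    rng = np.random.default_rng()
    pts = x + eps * rng.standard_normal((m, x.size))   # sample inside eps-ball
    G = np.vstack([grad(x)] + [grad(p) for p in pts])  # sampled gradients
    g = min_norm_in_hull(G)
    if np.linalg.norm(g) <= 1e-8:                      # near eps-stationarity
        return x
    t = 1.0
    while f(x - t * g) > f(x) - beta * t * (g @ g) and t > 1e-12:
        t *= 0.5                                       # Armijo backtracking
    return x - t * g
```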

A smooth perceptron algorithm

The perceptron algorithm, introduced in the late fifties in the machine learning community, is a simple greedy algorithm for finding a solution to a finite set of linear inequalities. Its main advantages are simplicity and noise tolerance; its main disadvantage is a slow convergence rate. We propose a modified version of the …
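For contrast with the smoothed version, the classical greedy update the abstract alludes to looks roughly like this (a sketch for the homogeneous feasibility system $Ax > 0$; the row normalization and starting point are illustrative choices):

```python
import numpy as np

def perceptron(A, max_iter=100_000):
    """Classical perceptron for the feasibility system A x > 0:
    repeatedly add the row that the current iterate violates most."""
    A = A / np.linalg.norm(A, axis=1, keepdims=True)  # normalize rows
    x = A.mean(axis=0)                                # any nonzero start
    for _ in range(max_iter):
        margins = A @ x
        i = np.argmin(margins)                        # most violated row
        if margins[i] > 0:
            return x                                  # strictly feasible
        x = x + A[i]                                  # greedy correction
    return None                                       # budget exhausted
```

The classical bound needs on the order of $1/\rho^2$ updates, where $\rho$ is the margin of the feasible region; that slow rate is what the proposed modification targets.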

Generalized Forward-Backward Splitting

This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$’s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth …
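For the base case $n = 1$ that the generalization starts from, forward-backward splitting alternates a gradient step on $F$ with the proximity operator of $G$. A minimal sketch on a LASSO instance, where all names and constants are illustrative:

```python
import numpy as np

def forward_backward(grad_F, prox_G, x0, step, n_iter=200):
    """Baseline forward-backward splitting for min F(x) + G(x):
    gradient step on the smooth F, then the proximity operator of G."""
    x = x0
    for _ in range(n_iter):
        x = prox_G(x - step * grad_F(x), step)
    return x

# Example: F(x) = 0.5*||Ax - b||^2 (Lipschitz gradient), G = lam*||x||_1,
# whose Moreau proximity operator is soft-thresholding.
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((30, 50)), rng.standard_normal(30), 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad F
x = forward_backward(
    grad_F=lambda x: A.T @ (A @ x - b),
    prox_G=lambda v, t: soft_threshold(v, lam * t),
    x0=np.zeros(50),
    step=1.0 / L,
)
```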

Accelerated and inexact forward-backward algorithms

We propose a convergence analysis of accelerated forward-backward splitting methods for minimizing composite functions, when the proximity operator is not available in closed form, and is thus computed up to a certain precision. We prove that the $1/k^2$ convergence rate for the function values can be achieved if the admissible errors are of a certain …
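The accelerated scheme under analysis is of FISTA type; a minimal exact-prox sketch of that iteration follows (the paper's subject is precisely what happens when the prox step is computed only approximately; all names here are illustrative):

```python
import numpy as np

def accelerated_forward_backward(grad_F, prox_G, x0, step, n_iter=200):
    """FISTA-style accelerated forward-backward iteration; with exact
    prox evaluations it attains the O(1/k^2) rate in function values."""
    x_prev, y, t = x0, x0.copy(), 1.0
    for _ in range(n_iter):
        x = prox_G(y - step * grad_F(y), step)            # forward-backward step
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)       # momentum extrapolation
        x_prev, t = x, t_next
    return x_prev
```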

Inexact and accelerated proximal point algorithms

We present inexact accelerated proximal point algorithms for minimizing a proper lower semicontinuous and convex function. We carry out a convergence analysis under different types of errors in the evaluation of the proximity operator, and we provide corresponding convergence rates for the objective function values. The proof relies on a generalization of the strategy …
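A minimal sketch of the inexact proximal point idea in its plain, unaccelerated form, where each prox subproblem is handed to an approximate inner solver (the inner method and its stopping rule are illustrative, not the paper's error model):

```python
import numpy as np
from scipy.optimize import minimize

def inexact_proximal_point(f, x0, lam=1.0, n_iter=50):
    """Proximal point iteration
        x_{k+1} ~ argmin_z  f(z) + ||z - x_k||^2 / (2*lam),
    with each subproblem solved only approximately by an inner solver
    (here Nelder-Mead, which tolerates nonsmooth f); analyses of this
    type quantify how such errors affect the convergence rate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        res = minimize(
            lambda z: f(z) + np.sum((z - x) ** 2) / (2.0 * lam),
            x,
            method="Nelder-Mead",
        )
        x = res.x
    return x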

A proximal point algorithm for sequential feature extraction applications

We propose a proximal point algorithm to solve the LAROS problem, that is, the problem of finding a “large approximately rank-one submatrix”. The LAROS problem is used to sequentially extract features in data. We also develop a new stopping criterion for the proximal point algorithm, which is based on the duality conditions of $\epsilon$-optimal solutions of …

Solving Basis Pursuit: Heuristic Optimality Check and Solver Comparison

The problem of finding a minimum $\ell^1$-norm solution to an underdetermined linear system is an important problem in compressed sensing, where it is also known as basis pursuit. We propose a heuristic optimality check as a general tool for $\ell^1$-minimization, which often allows for early termination by “guessing” a primal-dual optimal pair based on an …
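One plausible shape of such a check, sketched under the assumption that the support and sign pattern of the current iterate are used to guess a dual certificate (the exact construction in the paper may differ):

```python
import numpy as np

def heuristic_optimality_check(A, b, x, tol=1e-6):
    """For basis pursuit  min ||x||_1  s.t.  Ax = b: guess the support
    of x, fit a dual vector y to the sign pattern by least squares, and
    test the optimality conditions (dual feasibility ||A^T y||_inf <= 1,
    sign match on the support, and primal feasibility)."""
    support = np.abs(x) > tol
    s = np.sign(x[support])
    y, *_ = np.linalg.lstsq(A[:, support].T, s, rcond=None)  # A_S^T y ~ sign(x_S)
    c = A.T @ y
    return (
        np.max(np.abs(c)) <= 1.0 + tol            # dual feasibility
        and np.allclose(c[support], s, atol=tol)  # subgradient condition
        and np.linalg.norm(A @ x - b) <= tol      # primal feasibility
    )
```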

Approximation of rank function and its application to the nearest low-rank correlation matrix

The rank function $\operatorname{rank}(\cdot)$ is neither continuous nor convex, which brings much difficulty to the solution of rank minimization problems. In this paper, we provide a unified framework to construct approximation functions of $\operatorname{rank}(\cdot)$ and study their favorable properties. Particularly, with two families of approximation functions, we propose a convex relaxation method for the …
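For concreteness, the nearest low-rank correlation matrix problem the title refers to is, in its standard form:

```latex
% Given a symmetric data matrix C and a target rank r, find the closest
% correlation matrix of rank at most r:
\[
  \min_{X \in \mathbb{S}^n} \ \tfrac{1}{2}\,\|X - C\|_F^2
  \quad \text{s.t.} \quad X \succeq 0, \quad X_{ii} = 1 \ (i = 1,\dots,n),
  \quad \operatorname{rank}(X) \le r,
\]
% where the nonconvex constraint rank(X) <= r is what the approximation
% functions are designed to replace.
```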

The mesh adaptive direct search algorithm with treed Gaussian process surrogates

This work introduces the use of the treed Gaussian process (TGP) as a surrogate model within the mesh adaptive direct search (MADS) framework for constrained blackbox optimization. It extends the surrogate management framework (SMF) to nonsmooth optimization under general constraints. MADS uses TGP in two ways: one, as a surrogate for blackbox evaluations; and two, …
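One of those uses, ordering candidate points by surrogate prediction so the expensive blackbox sees the most promising ones first, can be sketched as follows (with a plain Gaussian process standing in for the treed GP, and all names illustrative rather than the paper's interface):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def surrogate_ordered_poll(blackbox, X_hist, y_hist, candidates):
    """Fit a surrogate to past blackbox evaluations, rank the poll
    candidates by predicted value, and evaluate them in that order,
    stopping opportunistically at the first improvement."""
    gp = GaussianProcessRegressor().fit(np.asarray(X_hist), np.asarray(y_hist))
    order = np.argsort(gp.predict(np.asarray(candidates)))  # best guess first
    best = min(y_hist)
    for i in order:
        y = blackbox(candidates[i])        # expensive true evaluation
        if y < best:
            return candidates[i], y        # opportunistic success
    return None, None                      # poll failed; refine the mesh
```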