Customized proximal point algorithms for linearly constrained convex minimization and saddle-point problems: a uniform approach

This paper takes a uniform look at the customized applications of the proximal point algorithm (PPA) to two classes of problems: the linearly constrained convex minimization problem with a generic or separable objective function, and a saddle-point problem. We model these two classes of problems uniformly by a mixed variational inequality and show how PPA with …
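
For orientation, a minimal sketch in standard notation of the two objects the abstract pairs together; the customized metrics studied in the paper are not reproduced here. The mixed variational inequality asks for $w^* \in \Omega$ such that

$$\theta(u) - \theta(u^*) + (w - w^*)^\top F(w^*) \ge 0 \quad \forall\, w \in \Omega,$$

where $u$ collects the components of $w$ on which $\theta$ acts. One PPA step, given the iterate $w^k$ and a positive definite metric $G$, produces $w^{k+1} \in \Omega$ satisfying

$$\theta(u) - \theta(u^{k+1}) + (w - w^{k+1})^\top \big( F(w^{k+1}) + G\,(w^{k+1} - w^k) \big) \ge 0 \quad \forall\, w \in \Omega.$$

Choosing $G$ to match the problem structure, i.e., "customizing" the proximal metric, is what makes the resulting subproblems easy.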

A Simple Approach to Optimality Conditions in Minmax Programming

Considering the minmax programming problem, lower and upper subdifferential optimality conditions, in the sense of Mordukhovich, are derived. The approach here, based mainly on Mordukhovich's nonsmooth dual objects, is completely different from that of most previous works, where generalizations of the Farkas theorem of the alternative have been applied. The results obtained …
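
As a worked baseline only (the paper's lower and upper conditions, stated via Fréchet and limiting subdifferentials, are sharper than this), the minmax program and a KKT-type necessary condition can be written as

$$\min_{x \in \mathbb{R}^n} \ \varphi(x) := \max_{1 \le i \le p} f_i(x) \quad \text{s.t.} \quad g_j(x) \le 0, \ j = 1, \dots, q,$$

and, at a local minimizer $\bar{x}$ under a suitable qualification condition, there exist multipliers $\lambda_j \ge 0$ with $\lambda_j g_j(\bar{x}) = 0$ such that

$$0 \in \partial \varphi(\bar{x}) + \sum_{j=1}^{q} \lambda_j \, \partial g_j(\bar{x}),$$

where $\partial$ denotes the Mordukhovich (limiting) subdifferential.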

Partial Smoothness, Tilt Stability, and Generalized Hessians

We compare two recent variational-analytic approaches to second-order conditions and sensitivity analysis for nonsmooth optimization. We describe a broad setting where computing the generalized Hessian of Mordukhovich is easy. In this setting, the idea of tilt stability introduced by Poliquin and Rockafellar is equivalent to a classical smooth second-order condition.
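
For readers unfamiliar with the notion, tilt stability can be stated as follows (a paraphrase of the Poliquin–Rockafellar definition): a local minimizer $\bar{x}$ of $f$ is tilt-stable if, for some $\delta > 0$, the tilted argmin mapping

$$M(v) := \operatorname*{argmin}_{\|x - \bar{x}\| \le \delta} \big\{ f(x) - \langle v, x \rangle \big\}$$

is single-valued and Lipschitz continuous on a neighborhood of $v = 0$ with $M(0) = \bar{x}$. In the setting described in the abstract, checking this reduces to a classical smooth second-order test.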

Subdifferential of the conjugate function in general Banach spaces

We give explicit formulas for the subdifferential set of the conjugate of not necessarily convex functions defined on general Banach spaces. Even if such a subdifferential mapping takes its values in the bidual space, we show that, up to a weak** closure operation, it is still described using only elements of the initial space …
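
As a classical baseline (the paper's contribution is the nonconvex case, which this does not cover): for the Fenchel conjugate

$$f^*(x^*) = \sup_{x \in X} \big\{ \langle x^*, x \rangle - f(x) \big\},$$

with $f$ proper, convex, and lower semicontinuous on a Banach space $X$, one has the equivalences

$$x \in \partial f^*(x^*) \cap X \iff x^* \in \partial f(x) \iff f(x) + f^*(x^*) = \langle x^*, x \rangle.$$

The weak** closure mentioned in the abstract is what replaces this clean picture once convexity is dropped and $\partial f^*$ genuinely takes values in $X^{**}$.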

An Adaptive Gradient Sampling Algorithm for Nonsmooth Optimization

We present an algorithm for the minimization of f : R^n → R, assumed to be locally Lipschitz and continuously differentiable on an open dense subset D of R^n. The objective f may be nonsmooth and/or nonconvex. The method is based on the gradient sampling (GS) algorithm of Burke et al. [A robust gradient sampling …
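
To make the GS mechanics concrete, here is a simplified Python sketch of a basic gradient sampling iteration in the spirit of Burke et al.; all names and parameter choices are illustrative, and the adaptive sampling strategy that is this paper's contribution is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def min_norm_in_hull(G):
    """Minimum-norm element of conv{rows of G}: min ||G^T lam||^2 over the simplex."""
    m = G.shape[0]
    obj = lambda lam: 0.5 * np.sum((G.T @ lam) ** 2)
    cons = ({'type': 'eq', 'fun': lambda lam: np.sum(lam) - 1.0},)
    res = minimize(obj, np.full(m, 1.0 / m), bounds=[(0.0, 1.0)] * m, constraints=cons)
    return G.T @ res.x

def gradient_sampling(f, grad, x0, eps=0.1, max_iter=200, tol=1e-6,
                      beta=1e-4, gamma=0.5, seed=0):
    """Basic gradient sampling loop: sample gradients near x, take the
    min-norm element of their convex hull as a descent surrogate, then
    run an Armijo backtracking line search on f."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        # Gradients at x and at 2n random points in an eps-box around x
        # (a simplification of sampling uniformly from the eps-ball).
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(2 * n, n))
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        g = min_norm_in_hull(G)        # approximate steepest-descent direction
        if np.linalg.norm(g) <= tol:
            eps *= 0.1                 # near-stationary at this radius: shrink it
            continue
        d = -g
        t = 1.0
        for _ in range(50):            # capped Armijo backtracking
            if f(x + t * d) <= f(x) + beta * t * np.dot(g, d):
                break
            t *= gamma
        x = x + t * d
    return x
```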

Linearized Alternating Direction Method with Gaussian Back Substitution for Separable Convex Programming

Recently, we proposed combining the alternating direction method (ADM) with a Gaussian back substitution procedure for solving the convex minimization model with linear constraints and a general separable objective function, i.e., an objective that is the sum of many functions without coupled variables. In this paper, we study this topic further and show …
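
For context, the separable model and the Gauss–Seidel ADM-type prediction step look as follows in standard notation (the paper's scheme additionally linearizes each quadratic subproblem and corrects the prediction by a Gaussian back substitution, whose correction matrix is not reproduced here):

$$\min \ \sum_{i=1}^{m} \theta_i(x_i) \quad \text{s.t.} \quad \sum_{i=1}^{m} A_i x_i = b, \quad x_i \in \mathcal{X}_i,$$

with each block updated in turn while the others are held at their most recent values:

$$x_i^{k+1} \in \operatorname*{argmin}_{x_i \in \mathcal{X}_i} \Big\{ \theta_i(x_i) - (\lambda^k)^\top A_i x_i + \frac{\beta}{2} \Big\| \sum_{j < i} A_j x_j^{k+1} + A_i x_i + \sum_{j > i} A_j x_j^{k} - b \Big\|^2 \Big\}.$$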

A First Order Method for Finding Minimal Norm-Like Solutions of Convex Optimization Problems

We consider a general class of convex optimization problems in which one seeks to minimize a strongly convex function over a closed convex set that is itself the optimal set of another convex problem. We introduce a gradient-based method, called the minimal norm gradient method, for solving this class of problems, and establish …
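
The problem class, in symbols (a minimal sketch following the usual bilevel-selection setup): with $\omega$ strongly convex, solve

$$\min_{x} \ \omega(x) \quad \text{s.t.} \quad x \in X^* := \operatorname*{argmin}_{z \in C} f(z).$$

The method's name comes from the prototypical choice $\omega(x) = \tfrac{1}{2}\|x\|^2$, which selects the minimal norm solution of the inner problem.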

A smooth perceptron algorithm

The perceptron algorithm, introduced in the late fifties in the machine learning community, is a simple greedy algorithm for finding a solution to a finite set of linear inequalities. The algorithm’s main advantages are its simplicity and noise tolerance. The algorithm’s main disadvantage is its slow convergence rate. We propose a modified version of the …
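
For reference, a minimal Python sketch of the classical (un-smoothed) algorithm in its homogeneous feasibility form, which is the baseline whose slow convergence the abstract refers to; the formulation A @ x > 0 and all names are illustrative.

```python
import numpy as np

def perceptron(A, max_iter=100_000):
    """Classical perceptron: seek x with A @ x > 0 componentwise, i.e. a
    solution to a finite system of strict homogeneous linear inequalities.
    Each iteration greedily corrects one violated inequality."""
    _, n = A.shape
    x = np.zeros(n)
    for _ in range(max_iter):
        viol = A @ x <= 0
        if not viol.any():
            return x                 # all inequalities strictly satisfied
        i = int(np.argmax(viol))     # index of the first violated inequality
        x = x + A[i]                 # additive correction step
    return None                      # iteration budget exhausted
```

By the classical Block–Novikoff bound, the iteration count scales like $1/\rho^2$ (rows normalized, $\rho$ the margin of the feasible region), which is the slow rate that motivates a smoothed variant.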

Properties of a Cutting Plane Method for Semidefinite Programming

We analyze the properties of an interior point cutting plane algorithm that is based on a semi-infinite linear formulation of the dual semidefinite program. The cutting plane algorithm approximately solves a linear relaxation of the dual semidefinite program in every iteration and relies on a separation oracle that returns linear cutting planes. We show that …
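
The semi-infinite reformulation rests on the fact that $S \succeq 0$ iff $v^\top S v \ge 0$ for all $v$. Writing the dual SDP generically as

$$\max \ b^\top y \quad \text{s.t.} \quad C - \sum_{i} y_i A_i \succeq 0,$$

it becomes the semi-infinite linear program

$$\max \ b^\top y \quad \text{s.t.} \quad v^\top \Big( C - \sum_{i} y_i A_i \Big) v \ge 0 \quad \forall\, v \in \mathbb{R}^n,\ \|v\| = 1,$$

and the separation oracle, given an iterate $\hat{y}$, returns an eigenvector $v$ of $C - \sum_i \hat{y}_i A_i$ associated with a negative eigenvalue, which yields a violated linear cut. (This is the standard construction; the paper's exact normalization may differ.)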