Characterizing and testing subdifferential regularity for piecewise smooth objective functions

Functions defined by evaluation programs involving smooth elementals, absolute values, and the max and min operators are piecewise smooth. Using piecewise linearization, we derived in [7] first- and second-order conditions for local optimality (MIN) for this class of nonsmooth functions. They are necessary and sufficient, respectively. These generalizations of the classical KKT … Read more
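
To make the setting concrete, here is a minimal numerical sketch of Griewank-style piecewise linearization for a toy function built from smooth elementals, the absolute value, and the max operator: smooth elementals are replaced by their tangents while kinks are kept intact. The function f, the point x0, and all constants are illustrative assumptions, not taken from [7].

```python
import numpy as np

# Toy abs-factorable function: f(x) = |sin(x)| + max(x, x**2).
def f(x):
    return abs(np.sin(x)) + max(x, x**2)

def delta_f(x0, dx):
    """Increment of the piecewise linearization of f at x0 in direction dx."""
    # v1 = sin(x): smooth elemental, propagate the tangent dv1 = cos(x0)*dx
    v1, dv1 = np.sin(x0), np.cos(x0) * dx
    # v2 = |v1|: keep the kink, dv2 = |v1 + dv1| - |v1|
    dv2 = abs(v1 + dv1) - abs(v1)
    # v3 = x**2: smooth elemental, tangent dv3 = 2*x0*dx
    v3, dv3 = x0**2, 2 * x0 * dx
    # v4 = max(x, v3): keep the kink via max(a,b) = (a + b + |a - b|)/2
    u, du = x0, dx
    dv4 = 0.5 * (du + dv3 + abs(u - v3 + du - dv3)) - 0.5 * abs(u - v3)
    return dv2 + dv4

# The linearization matches f to second order:
# f(x0 + dx) - f(x0) - delta_f(x0, dx) shrinks like dx**2.
x0 = 0.9
for dx in (0.5, 0.05, 0.005):
    err = f(x0 + dx) - f(x0) - delta_f(x0, dx)
    print(f"dx={dx:7.3f}  linearization error = {err:.2e}")
```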

Exact worst-case convergence rates of the proximal gradient method for composite convex minimization

We study the worst-case convergence rates of the proximal gradient method for minimizing the sum of a smooth strongly convex function and a non-smooth convex function whose proximal operator is available. We establish the exact worst-case convergence rates of the proximal gradient method in this setting for any step size and for different standard performance … Read more
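
For context, a minimal sketch of the proximal gradient (forward-backward) iteration on a toy composite problem follows; the random data, the l1 regularizer, and the fixed step size 1/L are illustrative assumptions, not the paper's performance-estimation setup.

```python
import numpy as np

# Minimize F(x) = g(x) + h(x) with g(x) = 0.5*||Ax - b||^2 (smooth, strongly
# convex when A has full column rank) and h(x) = lam*||x||_1, whose proximal
# operator is soft-thresholding.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
lam = 0.1

L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad g
step = 1.0 / L                           # one standard step-size choice

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(10)
for k in range(500):
    grad = A.T @ (A @ x - b)             # gradient of the smooth part
    x = prox_l1(x - step * grad, step * lam)   # forward-backward step

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```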

A Self-Correcting Variable-Metric Algorithm Framework for Nonsmooth Optimization

An algorithm framework is proposed for minimizing nonsmooth functions. The framework is variable-metric in that, in each iteration, a step is computed using a symmetric positive definite matrix whose value is updated as in a quasi-Newton scheme. However, unlike previously proposed variable-metric algorithms for minimizing nonsmooth functions, the framework exploits self-correcting properties made possible through … Read more
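
As a rough illustration of a variable-metric step, here is a sketch that takes directions through an inverse-BFGS matrix and simply skips the update when the curvature condition fails; this is a crude stand-in for the self-correcting safeguards the abstract refers to, not the proposed framework itself, and the objective and step sizes are toy assumptions.

```python
import numpy as np

def f(x):
    return np.abs(x).sum() + 0.5 * (x @ x)   # toy nonsmooth objective

def subgrad(x):
    return np.sign(x) + x                    # one subdifferential element

n = 5
H = np.eye(n)                                # symmetric positive definite metric
x = np.full(n, 2.0)
for k in range(50):
    g = subgrad(x)
    d = -H @ g                               # variable-metric direction
    t = 1.0 / (k + 2)                        # diminishing step (no line search)
    x_new = x + t * d
    s, y = x_new - x, subgrad(x_new) - g
    if s @ y > 1e-10:                        # curvature condition keeps H > 0
        rho = 1.0 / (s @ y)
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # inverse BFGS update
    x = x_new

print("final f(x):", f(x))
```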

Resource Allocation for Contingency Planning: An Inexact Bundle Method for Stochastic Optimization

Resource allocation models in contingency planning aim to mitigate unexpected failures in supply chains caused by rare but disastrous disruptions. This paper formulates this problem as a two-stage stochastic optimization problem with a risk-averse recourse function, and proposes a novel computationally tractable solution approach. The method relies on an inexact bundle method and … Read more
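
To indicate what a bundle step looks like, below is a sketch of a proximal bundle iteration with an exact oracle on a toy piecewise linear function; the cutting-plane-plus-proximal subproblem is solved as an epigraph QP via SciPy's SLSQP. The inexact oracle and the risk-averse recourse of the paper are not modelled here.

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective: f(x) = max(x1 + x2, -x1, -x2), minimized at the origin.
def oracle(x):
    vals = np.array([x[0] + x[1], -x[0], -x[1]])
    grads = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
    i = np.argmax(vals)
    return vals[i], grads[i]                 # value and one subgradient

center = np.array([1.0, 1.0])
t = 1.0                                      # proximal parameter
bundle = [(center.copy(), *oracle(center))]  # cuts: (point, value, subgradient)

for it in range(10):
    # epigraph QP: min v + (1/2t)||x - center||^2  s.t.  v >= f_i + g_i.(x - z_i)
    def obj(z):
        x, v = z[:2], z[2]
        return v + np.sum((x - center) ** 2) / (2 * t)
    cons = [{"type": "ineq",
             "fun": (lambda z, zi=zi, fi=fi, gi=gi:
                     z[2] - fi - gi @ (z[:2] - zi))}
            for zi, fi, gi in bundle]
    res = minimize(obj, np.concatenate([center, [oracle(center)[0]]]),
                   constraints=cons, method="SLSQP")
    x_new = res.x[:2]
    f_new, g_new = oracle(x_new)
    bundle.append((x_new, f_new, g_new))     # enrich the cutting-plane model
    if f_new <= oracle(center)[0] - 1e-8:    # serious step, else null step
        center = x_new

print("center after 10 iterations:", center)
```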

“Relative-Continuity” for Non-Lipschitz Non-Smooth Convex Optimization using Stochastic (or Deterministic) Mirror Descent

The usual approach to developing and analyzing first-order methods for non-smooth (stochastic or deterministic) convex optimization assumes that the objective function is uniformly Lipschitz continuous with parameter $M_f$. However, in many settings the non-differentiable convex function $f(\cdot)$ is not uniformly Lipschitz continuous — for example (i) the classical support vector machine (SVM) problem, (ii) the … Read more
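
As a concrete instance of mirror descent, here is a sketch with the entropy mirror map on the probability simplex (exponentiated-gradient updates), the classical setting in which the relevant Lipschitz-type constant is measured relative to the Bregman distance rather than a norm. The toy objective and step-size schedule are assumptions, not the paper's SVM or other examples.

```python
import numpy as np

# Nonsmooth convex toy objective on the simplex: f(x) = max_i (Ax)_i.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))

def f_and_subgrad(x):
    v = A @ x
    i = np.argmax(v)
    return v[i], A[i]                  # f(x) and one subgradient

x = np.full(4, 0.25)                   # start at the simplex center
for k in range(200):
    fx, g = f_and_subgrad(x)
    step = 0.5 / np.sqrt(k + 1)        # diminishing step size
    x = x * np.exp(-step * g)          # mirror step in the dual (entropy map)
    x /= x.sum()                       # Bregman projection onto the simplex

print("f(x) =", f_and_subgrad(x)[0], " x =", np.round(x, 3))
```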

Derivative-Free Robust Optimization by Outer Approximations

We develop an algorithm for minimax problems that arise in robust optimization in the absence of objective function derivatives. The algorithm utilizes an extension of methods for inexact outer approximation in sampling a potentially infinite-cardinality uncertainty set. Clarke stationarity of the algorithm output is established alongside desirable features of the model-based trust-region subproblems encountered. We … Read more
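
A minimal sketch of the outer-approximation idea follows, assuming a discretized uncertainty set and SciPy's derivative-free Nelder-Mead as the inner solver: minimize over a finite, growing set of scenarios, then add the worst-case scenario of the new iterate. The trust-region models and Clarke-stationarity guarantees of the paper are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def f(x, u):
    return (x[0] - u) ** 2 + 0.1 * np.abs(x[1])   # toy robust objective

U = np.linspace(-1.0, 1.0, 201)        # discretized uncertainty set
samples = [U[0]]                        # initial finite subset of U
x = np.array([0.5, 0.5])

for outer in range(8):
    # (1) minimize the current outer approximation, derivative-free
    res = minimize(lambda x: max(f(x, u) for u in samples), x,
                   method="Nelder-Mead")
    x = res.x
    # (2) add a worst-case scenario for the new iterate
    u_worst = max(U, key=lambda u: f(x, u))
    if any(np.isclose(u_worst, u) for u in samples):
        break                           # no new scenario: approximation tight
    samples.append(u_worst)

print("robust solution:", np.round(x, 3), " scenarios used:", len(samples))
```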

Manifold Sampling for Optimization of Nonconvex Functions that are Piecewise Linear Compositions of Smooth Components

We develop a manifold sampling algorithm for the minimization of a nonsmooth composite function $f := \psi + h \circ F$ when $\psi$ is smooth with known derivatives, $h$ is a known, nonsmooth, piecewise linear function, and $F$ is smooth but expensive to evaluate. The trust-region algorithm classifies points in the domain of $h$ as … Read more
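
To illustrate the composite structure, the sketch below evaluates $f = \psi + h \circ F$ with $h = \|\cdot\|_1$ and forms one generalized-gradient element $\nabla\psi + J_F^\top v$ with $v \in \partial h(F(x))$ selected by the sign pattern ("manifold") of $F$. The sampling and trust-region machinery of the algorithm itself is omitted, and all functions are toy assumptions.

```python
import numpy as np

def psi(x):          return 0.5 * x @ x
def grad_psi(x):     return x
def F(x):            return np.array([x[0] - 1.0, x[0] * x[1]])
def J_F(x):          return np.array([[1.0, 0.0], [x[1], x[0]]])

def f(x):
    return psi(x) + np.abs(F(x)).sum()       # h = ||.||_1, piecewise linear

def gen_grad(x, tol=1e-12):
    Fx = F(x)
    # The sign pattern of F identifies the active linear piece (manifold) of h;
    # components with |F_i| <= tol sit on a kink and admit any v_i in [-1, 1].
    v = np.where(np.abs(Fx) <= tol, 0.0, np.sign(Fx))
    return grad_psi(x) + J_F(x).T @ v

x = np.array([2.0, -1.0])
print("f(x) =", f(x), " generalized gradient:", gen_grad(x))
```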

Clustering and Multifacility Location with Constraints via Distance Function Penalty Method and DC Programming

This paper is a continuation of our effort to use mathematical optimization, in particular DC programming, in clustering and multifacility location. We study a penalty method based on distance functions and apply it to a number of problems in clustering and multifacility location in which the centers to be found must lie in some given … Read more
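
For orientation, here is a minimal sketch of the DCA iteration underlying DC programming: for $f = g - h$ with $g, h$ convex, linearize the concave part $-h$ at the current point and solve the resulting convex subproblem. The toy decomposition below is an assumption for illustration and is not the clustering/location model of the paper.

```python
import numpy as np

# Toy DC function: f(x) = g(x) - h(x) with g(x) = 0.5*||x - a||^2 and
# h(x) = 1.5*||x||_1 (both convex).
def h_subgrad(x):
    return 1.5 * np.sign(x)            # a subgradient of h

a = np.array([1.0, -0.2])
x = np.array([3.0, 3.0])
for k in range(20):
    y = h_subgrad(x)                   # linearize the concave part -h
    x = a + y                          # argmin g(x) - <y, x>, in closed form
print("DCA limit point:", x)          # a critical point: x = a + 1.5*sign(x)
```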

Non-smooth Non-convex Bregman Minimization: Unification and new Algorithms

We propose a unifying algorithm for non-smooth non-convex optimization. The algorithm approximates the objective function by a convex model function and finds an approximate (Bregman) proximal point of the convex model. This approximate minimizer of the model function yields a descent direction, along which the next iterate is found. Complemented with an Armijo-like line search … Read more
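
Below is a sketch of the scheme's outer structure under simplifying assumptions: a convex model (linearized smooth part plus the nonsmooth term), a Euclidean Bregman proximal step on the model to get a descent direction, and an Armijo-like backtracking line search on the true objective. The functions and parameters are illustrative, not the paper's general setting.

```python
import numpy as np

# f = f1 + ||.||_1 with f1 smooth and nonconvex; the convex model at x is
# m(y) = f1(x) + <grad f1(x), y - x> + ||y||_1.
def f1(x):       return np.sum(np.cos(x)) + 0.5 * np.sum(x ** 2)
def grad_f1(x):  return -np.sin(x) + x
def f(x):        return f1(x) + np.abs(x).sum()

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.array([2.0, -3.0])
tau, gamma, delta = 1.0, 0.5, 1e-4
for k in range(50):
    # (Euclidean) Bregman proximal point of the convex model
    y = prox_l1(x - tau * grad_f1(x), tau)
    d = y - x                          # descent direction from the model
    # model decrease, used as the Armijo reference slope (negative if d != 0)
    pred = grad_f1(x) @ d + np.abs(y).sum() - np.abs(x).sum()
    t = 1.0
    while f(x + t * d) > f(x) + delta * t * pred and t > 1e-12:
        t *= gamma                     # Armijo-like backtracking
    x = x + t * d

print("f(x) =", f(x), " x =", np.round(x, 4))
```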

Chambolle-Pock and Tseng’s methods: relationship and extension to the bilevel optimization

In the first part of the paper we focus on two problems: (a) regularized least squares and (b) nonsmooth minimization over an affine subspace. For these problems we establish the connection between the primal-dual method of Chambolle-Pock and Tseng’s proximal gradient method. For problem (a) it allows us to derive a nonergodic $O(1/k^2)$ convergence rate … Read more
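
For reference, here is a sketch of the Chambolle-Pock primal-dual iteration on a toy instance of problem (a), regularized least squares with an l1 term, written as $\min_x f(Ax) + g(x)$ with $f = \frac{1}{2}\|\cdot - b\|^2$ and $g = \lambda\|\cdot\|_1$. The data and step sizes are assumptions, and the bilevel extension discussed in the paper is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 8))
b = rng.standard_normal(30)
lam = 0.1

Lk = np.linalg.norm(A, 2)              # operator norm of A
sigma = tau = 1.0 / Lk                 # so that sigma*tau*||A||^2 <= 1

def prox_l1(v, t):                     # prox of t*g
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_fstar(y, s):                  # prox of s*f*, with f = 0.5*||. - b||^2
    return (y - s * b) / (1.0 + s)

x = np.zeros(8)
xbar = x.copy()
y = np.zeros(30)
for k in range(500):
    y = prox_fstar(y + sigma * (A @ xbar), sigma)   # dual ascent step
    x_new = prox_l1(x - tau * (A.T @ y), tau * lam) # primal descent step
    xbar = 2 * x_new - x               # extrapolation with theta = 1
    x = x_new

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```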