A Semismooth Conjugate Gradients Method — Theoretical Analysis

In large-scale applications, deterministic and stochastic variants of Cauchy’s steepest descent method are widely used for the minimization of objectives that are only piecewise smooth. In this paper we analyse a deterministic descent method based on the generalization of rescaled conjugate gradients proposed by Philip Wolfe in 1975 for convex objectives. Without … Read more

An active signature method for constrained abs-linear minimization

In this paper we consider the solution of optimization tasks with a piecewise linear objective function and piecewise linear constraints. First, we state optimality conditions for that class of problems using the abs-linearization approach and prove that they can be verified in polynomial time. Subsequently, we propose an algorithm called Constrained Active Signature Method that … Read more
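
For readers new to the abs-linear setting, here is a minimal Python sketch (with toy data of my own; the names c, Z, M, b, J, Y are purely illustrative) that evaluates a piecewise linear function written in abs-linear form and reads off its signature vector:

```python
import numpy as np

# Minimal sketch (toy data, not from the paper): evaluate a piecewise linear
# function given in abs-linear form
#     z = c + Z x + M |z|,   y = b + J x + Y |z|,
# where M is strictly lower triangular, so the switching vector z can be
# computed component by component and sign(z) gives the signature.

def eval_abs_linear(x, c, Z, M, b, J, Y):
    s = len(c)
    z = np.zeros(s)
    for i in range(s):                          # forward substitution over switches
        z[i] = c[i] + Z[i] @ x + M[i, :i] @ np.abs(z[:i])
    y = b + J @ x + Y @ np.abs(z)
    return y, np.sign(z)

# toy instance: f(x) = |x1| + |x2 - |x1||  (two switching variables)
c = np.array([0.0, 0.0])
Z = np.array([[1.0, 0.0],
              [0.0, 1.0]])
M = np.array([[ 0.0, 0.0],
              [-1.0, 0.0]])                     # z2 = x2 - |z1|
b = np.array([0.0])
J = np.zeros((1, 2))
Y = np.array([[1.0, 1.0]])                      # y = |z1| + |z2|

print(eval_abs_linear(np.array([0.5, -2.0]), c, Z, M, b, J, Y))
```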

On the abs-polynomial expansion of piecewise smooth functions

Tom Streubel has observed that for functions in abs-normal form, generalized Taylor expansions of arbitrary order $\bar{d}-1$ can be generated by algorithmic piecewise differentiation. Abs-normal form means that the real- or vector-valued function is defined by an evaluation procedure that involves the absolute value function $|\cdot|$ apart from arithmetic operations and $\bar{d}$ … Read more
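
As a hand-rolled numerical illustration of the lowest-order case only (order 1, i.e. the familiar piecewise linearization, not the abs-polynomial machinery itself; the toy function and evaluation point are my own), linearizing the smooth argument inside the absolute value matches $f(x)=|\sin(x)|$ to second order in the increment, even across the kink:

```python
import numpy as np

# First-order generalized Taylor expansion of f(x) = |g(x)| with smooth g:
#     f(x0) + ( |g(x0) + g'(x0)*dx| - |g(x0)| )
# matches f(x0 + dx) up to O(dx^2), while keeping the kink of abs().

g  = np.sin
dg = np.cos

def f(x):
    return np.abs(g(x))

def f_pl(x0, dx):
    return f(x0) + (np.abs(g(x0) + dg(x0) * dx) - np.abs(g(x0)))

x0 = 3.1  # close to the kink of |sin| at pi
for dx in [1e-1, 1e-2, 1e-3]:
    err = abs(f(x0 + dx) - f_pl(x0, dx))
    print(f"dx = {dx:6.0e}   error = {err:.2e}   error/dx^2 = {err/dx**2:.3f}")
```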

An Algorithm for Piecewise Linear Optimization of Objective Functions in Abs-normal Form

In the paper [11] we derived first order (KKT) and second order (SSC) optimality conditions for functions defined by evaluation programs involving smooth elementals and absolute values. For this class of problems we showed in [12] that the natural algorithm of successive piecewise linear optimization with a proximal term (SPLOP) achieves a linear or even … Read more
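
The following rough 1-D sketch (a toy example of my own, not SPLOP itself, which exploits the kink structure of the subproblem rather than a grid search) shows the basic iteration: minimize the local piecewise linear model plus a quadratic proximal term, then update the iterate.

```python
import numpy as np

# Toy objective: f(x) = |x^2 - 1| + 0.1*x with a kink at x = 1.

def f(x):
    return abs(x**2 - 1.0) + 0.1 * x

def local_model(xk, dx, q=5.0):
    # piecewise linearization of f at xk plus a quadratic proximal term
    v, dv = xk**2 - 1.0, 2.0 * xk
    return f(xk) + (abs(v + dv * dx) - abs(v)) + 0.1 * dx + 0.5 * q * dx**2

xk = 3.0
for _ in range(15):
    grid = np.linspace(-1.0, 1.0, 20001)          # crude inner "solver"
    dx = grid[np.argmin([local_model(xk, d) for d in grid])]
    xk += dx

print("iterate:", xk, "   f(iterate):", f(xk))    # approaches the kink at x = 1
```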

Relaxing kink qualifications and proving convergence rates in piecewise smooth optimization

In the paper [9] we derived first order (KKT) and second order (SSC) optimality conditions for functions defined by evaluation programs involving smooth elementals and absolute values. In that analysis, a key assumption on the local piecewise linearization was the Linear Independence Kink Qualification (LIKQ), a generalization of the Linear Independence Constraint Qualification (LICQ) … Read more
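
To give a concrete feel for LIKQ (an illustration of my own, not taken from the paper): at a candidate point one collects the switching functions that vanish there, the "active kinks", and asks whether their gradients are linearly independent, which is a rank condition on the active rows of the switching Jacobian.

```python
import numpy as np

# Check LIKQ numerically: the active rows of dz/dx must have full row rank.

def likq_holds(z_values, dz_dx, tol=1e-10):
    active = np.abs(z_values) < tol              # kinks that are active here
    J_active = dz_dx[active]
    return np.linalg.matrix_rank(J_active) == J_active.shape[0]

# f(x) = |x1| + |x2| at the origin: switches z = (x1, x2), LIKQ holds.
print(likq_holds(np.zeros(2), np.array([[1.0, 0.0], [0.0, 1.0]])))   # True

# f(x) = |x1| + |2*x1| at the origin: parallel kinks, LIKQ fails.
print(likq_holds(np.zeros(2), np.array([[1.0, 0.0], [2.0, 0.0]])))   # False
```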

Characterizing and testing subdifferential regularity for piecewise smooth objective functions

Functions defined by evaluation programs involving smooth elementals and absolute values as well as the max- and min-operator are piecewise smooth. Using piecewise linearization, we derived in [7] first and second order conditions for local optimality (MIN) for this class of nonsmooth functions. They are necessary and sufficient, respectively. These generalizations of the classical KKT … Read more
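
As a naive numerical counterpart of the first-order condition (a crude sampling check of my own, not the structured test developed in this line of work): a piecewise linear local model can only attain a local minimum at a point where all one-sided directional derivatives are nonnegative.

```python
import numpy as np

# Sample directions and check the one-sided directional derivatives of a
# piecewise linear model at 0; for positively homogeneous models the finite
# quotient below equals the directional derivative exactly.

def min_directional_derivative(phi, n, samples=2000, t=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    worst = np.inf
    for _ in range(samples):
        d = rng.normal(size=n)
        d /= np.linalg.norm(d)
        worst = min(worst, (phi(t * d) - phi(np.zeros(n))) / t)
    return worst

phi1 = lambda x: abs(x[0]) + abs(x[1]) - 0.5 * (x[0] + x[1])   # 0 is a local min
phi2 = lambda x: abs(x[0]) - 2.0 * x[0] + abs(x[1])            # 0 is not

print(min_directional_derivative(phi1, 2) >= 0)   # True
print(min_directional_derivative(phi2, 2) >= 0)   # False
```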

Algorithmic Differentiation for Piecewise Smooth Functions: A Case Study for Robust Optimization

This paper presents a minimization method for Lipschitz continuous, piecewise smooth objective functions based on algorithmic differentiation (AD). We assume that all nondifferentiabilities are caused by abs(), min(), and max(). The optimization method successively generates piecewise linearizations in abs-normal form and solves these local subproblems by exploiting the resulting kink structure. Both the generation of … Read more
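
The following hand-rolled sketch (my own minimal implementation, not the AD tool used in the paper) shows how such a piecewise linearization can be propagated forward through an evaluation program: each quantity carries its value and the increment of its piecewise linearization, smooth operations are linearized, and abs() keeps its kink.

```python
# Forward propagation of a piecewise linearization: each quantity carries
# (v, dv), the value at x and the increment of its piecewise linearization
# for a given direction dx.  abs() uses  d|u| = |u + du| - |u|.

class PL:
    def __init__(self, v, dv=0.0):
        self.v, self.dv = v, dv
    def _wrap(self, o):
        return o if isinstance(o, PL) else PL(o)
    def __add__(self, o):
        o = self._wrap(o)
        return PL(self.v + o.v, self.dv + o.dv)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._wrap(o)
        return PL(self.v - o.v, self.dv - o.dv)
    def __mul__(self, o):
        o = self._wrap(o)
        return PL(self.v * o.v, self.v * o.dv + self.dv * o.v)   # product rule
    __rmul__ = __mul__
    def __abs__(self):
        return PL(abs(self.v), abs(self.v + self.dv) - abs(self.v))

def f(x):                        # toy objective: f(x) = |x*x - 1| + 0.5*x
    return abs(x * x - 1.0) + 0.5 * x

x0, dx = 0.9, 0.3
out = f(PL(x0, dx))
print("f(x0)                  =", out.v)
print("piecewise linear model =", out.v + out.dv)
print("f(x0 + dx)             =", f(PL(x0 + dx)).v)
```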

An Algorithm for Nonsmooth Optimization by Successive Piecewise Linearization

We present an optimization method for Lipschitz continuous, piecewise smooth (PS) objective functions based on successive piecewise linearization. Since, in many realistic cases, nondifferentiabilities are caused by the occurrence of abs(), max(), and min(), we concentrate on these nonsmooth elemental functions. The method’s idea is to locate an optimum of a PS objective function by … Read more
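
Since max() and min() can be reduced to the absolute value, a single nonsmooth elemental suffices; the standard identities are recalled below as a quick Python check (the check itself is my own addition).

```python
# Standard identities that reduce max() and min() to abs().

def max_via_abs(a, b):
    return 0.5 * (a + b + abs(a - b))

def min_via_abs(a, b):
    return 0.5 * (a + b - abs(a - b))

for a, b in [(3.0, -1.5), (-2.0, -2.0), (0.0, 7.25)]:
    assert max_via_abs(a, b) == max(a, b)
    assert min_via_abs(a, b) == min(a, b)
print("identities verified on sample pairs")
```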

First and second order optimality conditions for piecewise smooth objective functions

Any piecewise smooth function that is specified by an evaluation procedure involving smooth elemental functions and piecewise linear functions like min and max can be represented in the so-called abs-normal form. By an extension of algorithmic, or automatic, differentiation, one can then compute certain first and second order derivative vectors and matrices that represent a … Read more
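
For orientation, and stated here only as a reminder of the standard formulation used in this line of work: a function $y = \varphi(x)$ in abs-normal form is described by a smooth system
$$ z \;=\; F(x, |z|), \qquad y \;=\; f(x, |z|), $$
where the switching vector $z$ collects the arguments of all absolute value evaluations and $\partial F/\partial |z|$ is strictly lower triangular, so that $z$ is well defined by successive substitution; the signature vector $\sigma = \operatorname{sign}(z)$ identifies the smooth piece that is active at $x$.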

On Lipschitz optimization based on gray-box piecewise linearization

We address the problem of minimizing objectives from the class of piecewise differentiable functions whose nonsmoothness can be encapsulated in the absolute value function. They possess local piecewise linear approximations with a discrepancy that can be bounded by a quadratic proximal term. This overestimating local model is continuous but generally nonconvex. It can be generated … Read more
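
A quick numerical sanity check of the overestimation property (toy function and proximal coefficient chosen by hand, not taken from the paper): if $q$ dominates the curvature of the smooth part, the piecewise linear model plus $\tfrac{q}{2}\|\Delta x\|^2$ lies above the true function near the reference point.

```python
import numpy as np

# For f(x) = |x^2 - 1|, the piecewise linearization at x0 plus a proximal
# term (q/2)*dx^2 with q = 2 >= sup|d^2/dx^2 (x^2 - 1)| overestimates f.

def f(x):
    return np.abs(x**2 - 1.0)

def overestimating_model(x0, dx, q=2.0):
    v, dv = x0**2 - 1.0, 2.0 * x0
    return f(x0) + (np.abs(v + dv * dx) - np.abs(v)) + 0.5 * q * dx**2

x0 = 0.7
dxs = np.linspace(-2.0, 2.0, 4001)
gap = overestimating_model(x0, dxs) - f(x0 + dxs)
print("minimal gap (model - f):", gap.min())   # >= 0 up to rounding
```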