Algorithmic Differentiation for Piecewise Smooth Functions: A Case Study for Robust Optimization

This paper presents a minimization method for Lipschitz continuous, piecewise smooth objective functions based on algorithmic differentiation (AD). We assume that all nondifferentiabilities are caused by abs(), min(), and max(). The optimization method successively generates piecewise linearizations in abs-normal form and solves these local subproblems by exploiting the resulting kink structure. Both the generation of … Read more
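For reference, the abs-normal form mentioned here can be stated as follows (this is the standard formulation used in this line of work; the symbols are generic, not taken from the paper). All switching variables $z$ generated by the absolute values are collected into

$$
z \;=\; c + Z\,x + L\,|z|, \qquad y \;=\; b + a^{\top}x + d^{\top}|z|,
$$

where $L$ is strictly lower triangular, so the components of $z$, and hence the objective value $y$, can be evaluated in a single forward sweep; the sign vector $\sigma = \operatorname{sgn}(z)$ encodes the kink structure that the local subproblem solver exploits.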

An Algorithm for Nonsmooth Optimization by Successive Piecewise Linearization

We present an optimization method for Lipschitz continuous, piecewise smooth (PS) objective functions based on successive piecewise linearization. Since, in many realistic cases, nondifferentiabilities are caused by the occurrence of abs(), max(), and min(), we concentrate on these nonsmooth elemental functions. The method’s idea is to locate an optimum of a PS objective function by … Read more
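As a rough illustration of the piecewise-linearization idea (a minimal sketch under our own simplifications, not the authors' implementation): smooth elementals are replaced by their tangents, while abs() is kept exact on the linearized argument, so each intermediate carries its base-point value together with a piecewise-linear increment model.

```python
# Sketch (not the paper's code): forward propagation of piecewise-linear
# increments.  Each intermediate is a pair (value at the base point,
# callable giving its piecewise-linear increment as a function of dx).
def const(c):
    return (c, lambda dx: 0.0)

def var(x0):
    return (x0, lambda dx: dx)

def add(a, b):
    return (a[0] + b[0], lambda dx: a[1](dx) + b[1](dx))

def mul(a, b):
    # smooth elemental: tangent linearization (product rule, dx*dx dropped)
    return (a[0] * b[0], lambda dx: a[0] * b[1](dx) + b[0] * a[1](dx))

def pabs(a):
    # abs() is kept exact on the linearized argument:
    # increment = |a0 + da(dx)| - |a0|
    return (abs(a[0]), lambda dx: abs(a[0] + a[1](dx)) - abs(a[0]))

# f(x) = |x*x - 1| piecewise-linearized at x0 = 2
x = var(2.0)
value, model = pabs(add(mul(x, x), const(-1.0)))
```

Here `model(dx)` approximates the true increment `f(x0 + dx) - f(x0)` with a second-order error in `dx`, while reproducing the kink of abs() exactly.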

First and second order optimality conditions for piecewise smooth objective functions

Any piecewise smooth function that is specified by an evaluation procedure involving smooth elemental functions and piecewise linear functions like min and max can be represented in the so-called abs-normal form. By an extension of algorithmic, or automatic, differentiation, one can then compute certain first and second order derivative vectors and matrices that represent a … Read more
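The reduction of min and max to abs that underlies this representation is the classical identity, shown here as a quick sketch:

```python
# Standard rewriting used before forming the abs-normal form:
# max and min are expressed through abs, so all kinks come from abs alone.
def max_via_abs(a, b):
    return 0.5 * (a + b + abs(a - b))

def min_via_abs(a, b):
    return 0.5 * (a + b - abs(a - b))
```

After this rewriting, every nondifferentiability in the evaluation procedure is attached to an absolute-value node, which is exactly what the abs-normal form records.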

On Theoretical and Numerical Aspects of the Shape Sensitivity Analysis for the 3D Time-dependent Maxwell’s Equations

We propose a novel approach using shape derivatives to solve inverse optimization problems governed by Maxwell’s equations, focusing on identifying hidden geometric objects in a predefined domain. The target functional is of tracking type and determines the distance between the solution of a 3D time-dependent Maxwell problem and given measured data in an $L_2$-norm. Minimization … Read more

On an Extension of One-Shot Methods to Incorporate Additional Constraints

For design optimization tasks, quite often a so-called one-shot approach is used. It augments the solution of the state equation with a suitable adjoint solver, yielding approximate reduced derivatives that can be used in an optimization iteration to change the design. The coordination of these three iterative processes is well established when only the state … Read more

On an inexact trust-region SQP-filter method for constrained nonlinear optimization

A class of trust-region algorithms is developed and analyzed for the solution of optimization problems with nonlinear equality and inequality constraints. Based on composite-step trust-region methods and a filter approach, the resulting algorithm also does not require the computation of exact Jacobians; only Jacobian-vector products are used along with approximate Jacobian matrices. As … Read more

On Lipschitz optimization based on gray-box piecewise linearization

We address the problem of minimizing objectives from the class of piecewise differentiable functions whose nonsmoothness can be encapsulated in the absolute value function. They possess local piecewise linear approximations with a discrepancy that can be bounded by a quadratic proximal term. This overestimating local model is continuous but generally nonconvex. It can be generated … Read more
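A small numerical check of the quadratic-discrepancy property (our own toy example, not taken from the paper): for $f(x)=|\sin x|$, the piecewise-linear model keeps abs exact on the tangent of the inner smooth part, and its deviation from the true increment is bounded by $\tfrac{1}{2}\max|u''|\,dx^2$.

```python
import math

# Toy example (not from the paper): f(x) = |sin(x)| with the inner part
# u(x) = sin(x) linearized at x0, while abs() is kept exact.
x0 = 0.2
u0, du0 = math.sin(x0), math.cos(x0)

def true_incr(dx):
    return abs(math.sin(x0 + dx)) - abs(u0)

def pl_incr(dx):
    return abs(u0 + du0 * dx) - abs(u0)

# Discrepancy bounded by 0.5 * max|u''| * dx^2 = 0.5 * dx^2 here,
# since |sin''| <= 1; this is the quadratic proximal term in the model.
for dx in (0.5, 0.1, -0.3):
    assert abs(true_incr(dx) - pl_incr(dx)) <= 0.5 * dx * dx + 1e-12
```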

Adjoint Broyden à la GMRES

It is shown that a compact storage implementation of a quasi-Newton method based on the adjoint Broyden update reduces in the affine case exactly to the well-established GMRES procedure. Generally, storage and linear algebra effort per step are small multiples of $nk$, where $n$ is the number of variables and $k$ the number … Read more
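For context, this is the procedure the affine-case equivalence refers to: a textbook full-storage GMRES with Arnoldi orthogonalization (a generic NumPy sketch for reference, not the paper's compact-storage quasi-Newton code).

```python
import numpy as np

def gmres_full(A, b, x0=None, tol=1e-10, maxiter=None):
    """Textbook GMRES: minimize ||b - A x|| over the Krylov subspace."""
    n = b.size
    x0 = np.zeros(n) if x0 is None else x0
    maxiter = n if maxiter is None else maxiter
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta == 0.0:
        return x0
    Q = np.zeros((n, maxiter + 1))   # orthonormal Krylov basis
    H = np.zeros((maxiter + 1, maxiter))  # upper Hessenberg matrix
    Q[:, 0] = r0 / beta
    x = x0
    for k in range(maxiter):
        # Arnoldi step: orthogonalize A q_k against the current basis
        v = A @ Q[:, k]
        for i in range(k + 1):
            H[i, k] = Q[:, i] @ v
            v -= H[i, k] * Q[:, i]
        H[k + 1, k] = np.linalg.norm(v)
        # least-squares problem min_y || beta e_1 - H y ||
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        x = x0 + Q[:, :k + 1] @ y
        if np.linalg.norm(b - A @ x) < tol or H[k + 1, k] == 0.0:
            return x  # converged or happy breakdown
        Q[:, k + 1] = v / H[k + 1, k]
    return x

# small example system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = gmres_full(A, b)
```

In exact arithmetic this terminates in at most $n$ steps; the point of the paper's compact-storage scheme is to obtain the same iterates in the affine case from a quasi-Newton update.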