The structure of conservative gradient fields

The classical Clarke subdifferential alone is inadequate for understanding automatic differentiation in nonsmooth contexts. Instead, we can sometimes rely on enlarged generalized gradients called “conservative fields”, defined through the natural path-wise chain rule; one application is the convergence analysis of gradient-based deep learning algorithms. In the semi-algebraic case, we show that all conservative fields are …

Polyhedral Separation via Difference of Convex (DC) Programming

We consider polyhedral separation of sets as a possible tool in supervised classification. In particular we focus on the optimization model introduced by Astorino and Gaudioso and adopt its reformulation in Difference of Convex (DC) form. We tackle the problem by adapting the algorithm for DC programming known as DCA. We present the results of …
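As an illustration of the DCA machinery the abstract refers to (not the authors' polyhedral-separation model itself), here is a minimal sketch on the one-dimensional DC decomposition f(x) = x⁴ − x², with g(x) = x⁴ and h(x) = x² both convex; the function name and splitting are our own illustrative choices:

```python
import numpy as np

def dca(x0, iters=50):
    """Minimal DCA sketch on f(x) = x**4 - x**2, written as the DC
    decomposition g - h with g(x) = x**4 and h(x) = x**2 (both convex).
    Each step linearizes h at the current point and minimizes the
    resulting convex majorant of f."""
    x = x0
    for _ in range(iters):
        y = 2.0 * x            # y = h'(x), a subgradient of h at x
        # x_{k+1} = argmin_x g(x) - y*x, i.e. solve 4*x**3 = y
        x = np.cbrt(y / 4.0)
    return x
```

Starting from x0 = 1.0 the iterates converge to the stationary point 1/√2 of f, consistent with DCA's guarantee of convergence to critical points of the DC objective.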

A Structure Exploiting Algorithm for Non-Smooth Semi-Linear Elliptic Optimal Control Problems

We investigate optimization problems with a non-smooth partial differential equation as constraint, where the non-smoothness is assumed to be caused by Nemytzkii operators generated by the functions abs, min and max. For the efficient and robust solution of such problems, we propose a new optimization method based on abs-linearization, i.e., a special handling …
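The abs-linearization idea rests on the fact that min and max can be expressed exactly through abs, so all non-smoothness is concentrated in a single elementary operation; a small sketch of these identities (function names are ours):

```python
def min_via_abs(a, b):
    # min(a, b) = (a + b - |a - b|) / 2
    return 0.5 * (a + b - abs(a - b))

def max_via_abs(a, b):
    # max(a, b) = (a + b + |a - b|) / 2
    return 0.5 * (a + b + abs(a - b))
```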

Moreau envelope of supremum functions with applications to infinite and stochastic programming

In this paper, we investigate the Moreau envelope of the supremum of a family of convex, proper, and lower semicontinuous functions. Under mild assumptions, we prove that the Moreau envelope of a supremum is the supremum of Moreau envelopes, which allows us to approximate possibly nonsmooth supremum functions by smooth functions that are also the …
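To illustrate the envelope-of-a-supremum identity in the simplest possible setting (our own toy example, not the paper's general framework), take f(x) = |x| = sup over t in [-1, 1] of t·x. The Moreau envelope of each linear function t·x is available in closed form, and the supremum of these envelopes recovers the Huber function, the well-known Moreau envelope of |x|:

```python
import numpy as np

lam = 0.5  # envelope parameter (illustrative choice)

def env_linear(t, x, lam):
    # Moreau envelope of f_t(y) = t*y: the minimizer is y = x - lam*t,
    # giving the closed form t*x - lam*t**2/2
    return t * x - lam * t**2 / 2.0

def huber(x, lam):
    # Moreau envelope of |x|: the classical Huber function
    return np.where(np.abs(x) <= lam, x**2 / (2.0 * lam), np.abs(x) - lam / 2.0)

xs = np.linspace(-2.0, 2.0, 101)
ts = np.linspace(-1.0, 1.0, 2001)
# supremum over t of the envelopes, evaluated on a grid of x values
sup_of_envelopes = np.max(env_linear(ts[:, None], xs[None, :], lam), axis=0)
```

On this grid, sup_of_envelopes matches huber(xs, lam) to discretization accuracy, illustrating the exchange of the supremum and the envelope.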

A Primal-Dual Algorithm for Risk Minimization

In this paper, we develop an algorithm to efficiently solve risk-averse optimization problems posed in reflexive Banach space. Such problems often arise in practical applications such as optimization problems constrained by partial differential equations with uncertain inputs. Unfortunately, for many popular risk models, including the coherent risk measures, the resulting risk-averse objective function is …
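For the coherent risk measures mentioned above, the standard example is the conditional value-at-risk; here is a sample-based sketch of the Rockafellar–Uryasev variational formula CVaR_alpha(X) = min over t of t + E[(X - t)_+]/(1 - alpha) (our illustration, not the paper's primal-dual algorithm):

```python
import numpy as np

def cvar(samples, alpha):
    """Sample CVaR via the Rockafellar-Uryasev formula
    CVaR_alpha(X) = min_t  t + E[max(X - t, 0)] / (1 - alpha).
    For a discrete sample the minimum over t is attained at a sample
    point, so scanning the sorted samples suffices for this sketch."""
    ts = np.sort(samples)
    vals = [t + np.mean(np.maximum(samples - t, 0.0)) / (1.0 - alpha) for t in ts]
    return min(vals)
```

For four equally likely losses 1, 2, 3, 4 with alpha = 0.75, CVaR is the mean of the worst 25% of outcomes, i.e. 4.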

Exterior-point Optimization for Nonconvex Learning

In this paper we present the nonconvex exterior-point optimization solver (NExOS), a novel first-order algorithm tailored to constrained nonconvex learning problems. We consider the problem of minimizing a convex function over nonconvex constraints, where the projection onto the constraint set is single-valued around local minima. A wide range of nonconvex learning problems has this structure, including …
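A concrete instance of such a constraint set (our example; the paper treats a broader class) is the sparsity set {x : ||x||_0 <= k}, whose Euclidean projection is hard thresholding and is single-valued whenever the k-th and (k+1)-th largest magnitudes differ:

```python
import numpy as np

def proj_sparse(x, k):
    """Euclidean projection onto {x : ||x||_0 <= k}: keep the k entries
    of largest magnitude, zero out the rest. Single-valued away from
    ties in the k-th largest magnitude."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]   # indices of the k largest magnitudes
    out[idx] = x[idx]
    return out
```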

Faster Lagrangian-Based Methods in Convex Optimization

In this paper, we aim to unify, simplify, and improve the convergence rate analysis of Lagrangian-based methods for convex optimization problems. We first introduce the notion of a nice primal algorithmic map, which plays a central role in the unification and simplification of the analysis of all Lagrangian-based methods. Equipped with a nice primal …
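A minimal member of the Lagrangian-based family is the classical augmented Lagrangian (multiplier) method; here it is sketched on the toy problem min x² subject to x = 1 (our illustration, unrelated to the paper's abstract notion of a nice primal algorithmic map):

```python
def augmented_lagrangian(rho=1.0, iters=100):
    """Multiplier method for min x**2 subject to x = 1.
    The KKT conditions give x* = 1, y* = -2."""
    x, y = 0.0, 0.0
    for _ in range(iters):
        # primal step: argmin_x x**2 + y*(x - 1) + (rho/2)*(x - 1)**2
        x = (rho - y) / (2.0 + rho)
        # dual ascent step on the multiplier
        y = y + rho * (x - 1.0)
    return x, y
```

With rho = 1 the multiplier error contracts by a factor 2/3 per iteration, so the iterates approach the KKT pair (1, -2) linearly.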

Convergence of Proximal Gradient Algorithm in the Presence of Adjoint Mismatch

We consider the proximal gradient algorithm for solving penalized least-squares minimization problems arising in data science. This first-order algorithm is attractive due to its flexibility and minimal memory requirements, which allow it to tackle large-scale minimization problems involving non-smooth penalties. However, for problems such as X-ray computed tomography, the applicability of the algorithm is dominated by the …
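With an exact adjoint, the algorithm in question is the classical proximal gradient (ISTA) iteration for penalized least squares; here is a minimal l1-penalized sketch (the mismatch studied in the paper would replace A.T below by an approximate adjoint):

```python
import numpy as np

def soft_threshold(v, tau):
    # proximity operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad(A, b, lam, iters=500):
    """ISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1, using the exact
    adjoint A.T; an adjoint mismatch would substitute another matrix."""
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size <= 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # gradient of the smooth term
        x = soft_threshold(x - gamma * grad, gamma * lam)
    return x
```

For A equal to the identity the iteration reduces to soft thresholding of b, which makes the fixed point easy to check by hand.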

New efficient approach in finding a zero of a maximal monotone operator

In this paper, we provide a new efficient approach to finding a zero of a maximal monotone operator under very mild assumptions. Using a regularization technique and the proximal point algorithm, we construct a sequence that converges strongly to a solution with at least a linear convergence rate.
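The proximal point iteration the abstract builds on replaces the inclusion 0 ∈ T(x) by resolvent steps x⁺ = (I + λT)⁻¹x; here is a sketch on the affine monotone operator T(x) = x − b, whose resolvent has a closed form (our toy operator, not the paper's regularized scheme):

```python
def proximal_point(resolvent, x0, iters=100):
    """Generic proximal point iteration x <- (I + lam*T)^{-1} x."""
    x = x0
    for _ in range(iters):
        x = resolvent(x)
    return x

lam = 1.0
b = 3.0
# T(x) = x - b is maximal monotone with unique zero x* = b; its resolvent
# solves y + lam*(y - b) = x, i.e. y = (x + lam*b) / (1 + lam)
resolvent = lambda x: (x + lam * b) / (1.0 + lam)
x_star = proximal_point(resolvent, 0.0)
```

Each step contracts the error toward the zero x* = b by the factor 1/(1 + λ), matching the linear rate mentioned in the abstract.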

Finding the strongest stable massless column with a follower load and relocatable concentrated masses

We consider the problem of optimal placement of concentrated masses along a massless elastic column that is clamped at one end and loaded by a nonconservative follower force at the free end. The goal is to find the largest possible interval such that the variation in the loading parameter within this interval preserves stability of …