Are we there yet? Manifold identification of gradient-related proximal methods

In machine learning, models that generalize better often generate outputs that lie on a low-dimensional manifold. Recently, several works have separately shown finite-time manifold identification by some proximal methods. In this work we provide a unified view by giving a simple condition under which any proximal method using a constant step size can achieve finite-iteration … Read more
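For orientation, a minimal sketch of the kind of iteration the abstract refers to: a proximal gradient method with a constant step size. All names here (grad_f, prox_h, soft_threshold) are illustrative, not from the paper; manifold identification means the iterates eventually land on, and stay on, the low-dimensional manifold containing the solution, e.g. a fixed sparsity pattern when the nonsmooth term is the l1 norm.

    import numpy as np

    def proximal_gradient(grad_f, prox_h, x0, step, iters=100):
        # Constant-step proximal gradient:
        #   x+ = prox_{step*h}(x - step * grad_f(x))
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            x = prox_h(x - step * grad_f(x), step)
        return x

    # Example prox for h = lam * ||x||_1 (soft-thresholding):
    def soft_threshold(z, step, lam=0.1):
        return np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)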

On First and Second Order Optimality Conditions for Abs-Normal NLP

Structured nonsmoothness is widely present in practical optimization. A particularly attractive class of nonsmooth problems, both from a theoretical and from an algorithmic perspective, are optimization problems in so-called abs-normal form as developed by Griewank and Walther. Here we generalize their theory for the unconstrained case to nonsmooth NLPs with equality and inequality constraints in … Read more
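For readers unfamiliar with the terminology, a schematic sketch of the abs-normal form of Griewank and Walther in the unconstrained case: a piecewise smooth objective is rewritten with a switching vector $z$ so that the only remaining nonsmoothness is the componentwise absolute value,
\[ y = f(x, |z|), \qquad z = F(x, |z|), \]
where $f$ and $F$ are smooth and the Jacobian of $F$ with respect to $|z|$ is strictly lower triangular, so $z$ can be evaluated by forward substitution. The paper above extends the associated optimality theory to NLPs with equality and inequality constraints.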

Generalized conditional subgradient and generalized mirror descent: duality, convergence, and symmetry

We provide new insight into a generalized conditional subgradient algorithm and a generalized mirror descent algorithm for the convex minimization problem \[\min_x \; \{f(Ax) + h(x)\}.\] As Bach showed in [SIAM J. Optim., 25 (2015), pp. 115–129], applying either of these two algorithms to this problem is equivalent to applying the other one to its … Read more
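To make the duality concrete: under standard assumptions (stated schematically here; see Bach's paper for the precise conditions), the Fenchel dual of the problem above is
\[ \max_u \; \{-f^*(u) - h^*(-A^\top u)\}, \]
and the equivalence says that running one of the two algorithms on the primal problem generates the same iterates as running the other on the dual.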

Status Determination by Interior-Point Methods for Convex Optimization Problems in Domain-Driven Form

We study the geometry of convex optimization problems given in a Domain-Driven form and categorize possible statuses of these problems using duality theory. Our duality theory for the Domain-Driven form, which accepts both conic and non-conic constraints, lets us determine and certify statuses of a problem as rigorously as the best approaches for conic formulations … Read more
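For context, the Domain-Driven form of Karimi and Tunçel is, schematically,
\[ \inf_x \; \{\langle c, x \rangle \; : \; Ax \in D\}, \]
where $D$ is a closed convex set equipped with a self-concordant barrier. Conic formulations arise as the special case where $D$ is a shifted cone, but $D$ need not be a cone, which is what lets this form accept non-conic constraints directly.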

Minimization of nonsmooth nonconvex functions using inexact evaluations and its worst-case complexity

An adaptive regularization algorithm using inexact function and derivative evaluations is proposed for the solution of composite nonsmooth nonconvex optimization. It is shown that this algorithm needs at most $O(|\log\epsilon|\,\epsilon^{-2})$ evaluations of the problem’s functions and their derivatives to find an $\epsilon$-approximate first-order stationary point. This complexity bound therefore generalizes that provided by [Bellavia, Gurioli, … Read more
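In symbols, and hedging on the precise criticality measure the paper uses, the result bounds the number of (inexact) evaluations needed to reach an $\epsilon$-approximate first-order stationary point by
\[ O\!\left(|\log\epsilon|\,\epsilon^{-2}\right), \]
which can be read as the familiar $O(\epsilon^{-2})$ nonconvex first-order rate with a logarithmic factor paying for the progressive tightening of the evaluation accuracy.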

Fast and Faster Convergence of SGD for Over-Parameterized Models and an Accelerated Perceptron

Modern machine learning focuses on highly expressive models that are able to fit or interpolate the data completely, resulting in zero training loss. For such models, we show that the stochastic gradients of common loss functions satisfy a strong growth condition. Under this condition, we prove that constant step-size stochastic gradient descent (SGD) with Nesterov … Read more
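A minimal sketch of the interpolation setting (the quadratic model, parameter values, and all names are illustrative, not from the paper): constant-step SGD with Nesterov momentum on a realizable least-squares problem, where every per-sample gradient vanishes at the solution, which is the intuition behind the strong growth condition.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 50
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d)
    y = X @ w_true                      # realizable: zero training loss attainable

    def stoch_grad(w):
        i = rng.integers(n)             # sample one data point uniformly
        return (X[i] @ w - y[i]) * X[i]

    # Constant-step SGD with Nesterov momentum (illustrative step and beta).
    w = np.zeros(d); w_prev = w.copy()
    step, beta = 1e-3, 0.8
    for _ in range(50000):
        v = w + beta * (w - w_prev)     # look-ahead point
        w_prev = w
        w = v - step * stoch_grad(v)

    print("train loss:", 0.5 * np.mean((X @ w - y) ** 2))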

Weak subgradient algorithm for solving nonsmooth nonconvex unconstrained optimization problems

This paper presents a weak-subgradient-based method for solving nonconvex unconstrained optimization problems. At every iteration, the method uses a weak subgradient of the objective function at the current point to generate the next iterate. The concept of the weak subgradient is based on the idea of using supporting cones to the graph of … Read more
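For orientation (stated schematically; the paper's definition may differ in details), a pair $(v, c) \in \mathbb{R}^n \times \mathbb{R}_+$ is a weak subgradient of $f$ at $x$ if
\[ f(y) \;\ge\; f(x) + \langle v, \, y - x \rangle - c\,\|y - x\| \quad \text{for all } y, \]
so the graph of $f$ is supported from below by a concave cone rather than by a hyperplane, which is what makes the notion usable for nonconvex functions.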

An optimal control theory for accelerated optimization

Accelerated optimization algorithms can be generated using a double-integrator model for the search dynamics embedded in an optimal control problem.
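Schematically, the idea is to treat the iterate $x$ as the state of a double integrator,
\[ \dot{x} = v, \qquad \dot{v} = u, \]
and to choose the control $u$ by minimizing a cost functional tied to the objective being optimized; discretizing the resulting optimal trajectories yields accelerated methods. The specific cost functional and discretization are the paper's contribution and are not reproduced here.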

Exploiting Sparsity for Semi-Algebraic Set Volume Computation

We provide a systematic deterministic numerical scheme to approximate the volume (i.e. the Lebesgue measure) of a basic semi-algebraic set whose description follows a sparsity pattern. As in previous works (without sparsity), the underlying strategy is to consider an infinite-dimensional linear program on measures whose optimal value is the volume of the set. This is … Read more
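Schematically, following the earlier non-sparse works the abstract cites: with $K \subset B$ a basic semi-algebraic set inside a simple bounding box $B$ and $\lambda$ the Lebesgue measure on $B$, the infinite-dimensional LP reads
\[ \mathrm{vol}(K) \;=\; \sup_{\mu} \; \Big\{ \int d\mu \;:\; \mu \le \lambda, \;\; \mathrm{spt}\,\mu \subseteq K \Big\}, \]
whose optimal solution is the restriction of $\lambda$ to $K$. The moment-SOS hierarchy then produces converging numerical approximations, and the present paper exploits sparsity in the description of $K$ to make this hierarchy scale.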

Subdifferentials and SNC property of scalarization functionals with uniform level sets and applications

This paper deals with necessary conditions for minimal solutions of constrained and unconstrained optimization problems with respect to general domination sets by using a well-known nonlinear scalarization functional with uniform level sets (called Gerstewitz’ functional in the literature). The primary objective of this work is to establish revised formulas for basic and singular subdifferentials of … Read more
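For reference, the scalarization functional in question is, stated schematically for a set $C \subset Y$ and a direction $k$ with $C + [0, \infty)\,k \subseteq C$,
\[ \varphi_{C,k}(y) \;:=\; \inf\{\, t \in \mathbb{R} \;:\; y \in t k - C \,\}, \]
whose level sets are the uniform translates $tk - C$. The paper establishes formulas for the basic and singular (limiting) subdifferentials of this functional and applies them to derive the necessary optimality conditions described above.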