Fast and Faster Convergence of SGD for Over-Parameterized Models and an Accelerated Perceptron

Modern machine learning focuses on highly expressive models that are able to fit or interpolate the data completely, resulting in zero training loss. For such models, we show that the stochastic gradients of common loss functions satisfy a strong growth condition. Under this condition, we prove that constant step-size stochastic gradient descent (SGD) with Nesterov … Read more
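As a rough illustration of the setting, the sketch below runs constant step-size SGD with Nesterov momentum on an over-parameterized least-squares problem (more unknowns than samples, so an interpolating, zero-loss solution exists). The problem sizes, step size, and momentum value are illustrative choices, not constants from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Over-parameterized least squares: more unknowns (d) than samples (n),
# so an interpolating solution with zero training loss exists.
n, d = 20, 100
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

L = np.max(np.sum(A ** 2, axis=1))  # largest per-sample smoothness constant
eta, beta = 0.5 / L, 0.9            # constant step size and momentum (illustrative)

w = np.zeros(d)  # iterate
v = np.zeros(d)  # momentum buffer

for epoch in range(500):
    for i in rng.permutation(n):
        # Stochastic gradient of 0.5 * (a_i . w - b_i)^2 at the look-ahead point
        w_look = w + beta * v
        g = (A[i] @ w_look - b[i]) * A[i]
        v = beta * v - eta * g
        w = w + v

print("final training loss:", 0.5 * np.mean((A @ w - b) ** 2))
```

Under interpolation the per-sample gradient noise vanishes at the solution, which is what lets a constant step size drive the loss to zero rather than to a noise floor.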

Basis Pursuit Denoise with Nonsmooth Constraints

Level-set optimization formulations with data-driven constraints minimize a regularization functional subject to matching observations to a given error level. These formulations are widely used, particularly for matrix completion and sparsity promotion in data interpolation and denoising. The misfit level is typically measured in the $\ell_2$ norm or other smooth metrics. In this paper, we present … Read more
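For concreteness, here is a minimal sketch (using cvxpy, not necessarily the authors' tooling) of the classical basis pursuit denoise level-set formulation with a smooth $\ell_2$ misfit, followed by a nonsmooth misfit variant in the spirit of the paper; all problem data are made up.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

# Sparse signal observed through a random matrix with small noise.
m, n, k = 50, 200, 5
A = rng.normal(size=(m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
b = A @ x_true + 0.01 * rng.normal(size=m)

# Classical BPDN level-set formulation with a smooth (ell_2) misfit:
#   minimize ||x||_1  subject to  ||A x - b||_2 <= sigma
sigma = 0.02 * np.sqrt(m)  # assumed misfit level
x = cp.Variable(n)
cp.Problem(cp.Minimize(cp.norm1(x)), [cp.norm2(A @ x - b) <= sigma]).solve()
print("ell_2 misfit recovery error:", np.linalg.norm(x.value - x_true))

# A nonsmooth misfit constraint instead: per-entry (ell_inf) error level.
sigma_inf = 0.05  # assumed per-entry error level
cp.Problem(cp.Minimize(cp.norm1(x)), [cp.norm_inf(A @ x - b) <= sigma_inf]).solve()
print("ell_inf misfit recovery error:", np.linalg.norm(x.value - x_true))
```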

Optimization of univariate functions on bounded intervals by interpolation and semidefinite programming

We consider the problem of minimizing a univariate, real-valued function f on an interval [a,b]. When f is a polynomial, we review how this problem may be reformulated as a semidefinite programming (SDP) problem, and how to extract all global minimizers from the solution of the SDP problem. For general f, we approximate the … Read more
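A minimal sketch of the SDP reformulation in the even-degree polynomial case, via the Markov–Lukács representation $f(x) - t = \sigma_0(x) + (x-a)(b-x)\,\sigma_1(x)$ with $\sigma_0, \sigma_1$ sums of squares. The cvxpy modeling and the sample quartic are illustrative assumptions, and the minimizer-extraction step is omitted.

```python
import numpy as np
import cvxpy as cp

# Maximize t subject to f(x) - t = s0(x) + (x - a)(b - x) s1(x),
# with s0, s1 sums of squares; coefficients are in increasing degree order.
f = np.array([2.0, -3.0, 0.0, 1.0, 0.5])  # hypothetical quartic, degree 2d = 4
a_end, b_end = -1.0, 2.0
d = (len(f) - 1) // 2

Q0 = cp.Variable((d + 1, d + 1), PSD=True)  # Gram matrix of s0, degree <= 2d
Q1 = cp.Variable((d, d), PSD=True)          # Gram matrix of s1, degree <= 2d - 2
t = cp.Variable()

# (x - a)(b - x) = -a*b + (a + b) x - x^2, coefficients in increasing degree
w = np.array([-a_end * b_end, a_end + b_end, -1.0])

constraints = []
for k in range(2 * d + 1):
    # Match the coefficient of x^k on both sides of the identity.
    rhs = 0
    for i in range(d + 1):
        for j in range(d + 1):
            if i + j == k:
                rhs = rhs + Q0[i, j]
    for i in range(d):
        for j in range(d):
            for l in range(3):
                if i + j + l == k:
                    rhs = rhs + w[l] * Q1[i, j]
    lhs = f[k] - t if k == 0 else f[k]
    constraints.append(lhs == rhs)

cp.Problem(cp.Maximize(t), constraints).solve()
print("minimum value on [a, b]:", t.value)
```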

Non-Convergence Result for Conformal Approximation of Variational Problems Subject to a Convexity Constraint

In this article, we are interested in the minimization of functionals over the set of convex functions. We investigate the discretization of the convexity constraint through various numerical methods and find a geometrical obstruction, confirmed by numerical simulations. We prove that there exist some convex functions that cannot be the limit of any conformal $P_1$ Finite … Read more
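The paper's obstruction concerns conformal $P_1$ approximation in higher dimensions. In one dimension, by contrast, a $P_1$ (piecewise linear) function on a uniform grid is convex exactly when its second differences are nonnegative, so discretizing the convexity constraint is straightforward, as the hypothetical sketch below shows; the target function and grid are made up.

```python
import numpy as np
import cvxpy as cp

# 1D illustration of discretizing a convexity constraint with P1 functions:
# on a uniform grid, u is convex iff its second differences are nonnegative.
n = 101
x = np.linspace(0.0, 1.0, n)
data = np.abs(x - 0.4) + 0.05 * np.sin(20 * x)  # hypothetical nonconvex target

u = cp.Variable(n)
convexity = [u[:-2] - 2 * u[1:-1] + u[2:] >= 0]  # discrete convexity constraint
cp.Problem(cp.Minimize(cp.sum_squares(u - data)), convexity).solve()
print("convex P1 fit at the first nodes:", u.value[:5])
```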