On proximal point-type algorithms for weakly convex functions and their connection to the backward Euler method

In this article we study the connection between proximal point methods for nonconvex optimization and the backward Euler method from the numerical solution of ordinary differential equations (ODEs). We establish conditions under which these methods are equivalent. For weakly convex functions and a sufficiently small parameter, each implicit step can be computed by minimizing a strongly convex objective function, and in practice the resulting method can be faster than gradient descent. We also determine the optimal value of the regularization parameter.
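As a minimal sketch of this connection (in our own notation, which may differ from the article's: $\lambda > 0$ for the step size and $\rho$ for the weak-convexity modulus), the proximal point step

\[
x_{k+1} = \operatorname*{arg\,min}_{x} \; \Big\{ f(x) + \tfrac{1}{2\lambda}\,\|x - x_k\|^2 \Big\}
\]

has, when $f$ is differentiable, the first-order optimality condition $0 = \nabla f(x_{k+1}) + \tfrac{1}{\lambda}(x_{k+1} - x_k)$, that is,

\[
x_{k+1} = x_k - \lambda \nabla f(x_{k+1}),
\]

which is exactly the backward Euler discretization of the gradient flow $\dot{x}(t) = -\nabla f(x(t))$ with step size $\lambda$. If $f$ is $\rho$-weakly convex, meaning $f + \tfrac{\rho}{2}\|\cdot\|^2$ is convex, then for $\lambda < 1/\rho$ the subproblem objective is $\big(\tfrac{1}{\lambda} - \rho\big)$-strongly convex, so the implicit step has a unique minimizer that can be computed efficiently.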
