Minimizing the difference of convex and weakly convex functions via bundle method

We consider optimization problems whose objective and constraint are each the difference of a convex and a weakly convex function. This framework covers a broad family of nonsmooth and nonconvex optimization problems, including Difference-of-Convex (DC) functions with a known DC decomposition, functions with Lipschitz continuous gradients, and certain classes of composite functions and nonconvex value functions. We investigate several stationarity conditions and present a proximal bundle method to compute critical points for problems of this class. The algorithm employs an improvement function and an original rule for updating the proximal parameter to ensure convergence; it relies on cutting-plane approximations of the convex functions and linearizations of the weakly convex ones to construct a strictly convex quadratic master subproblem that yields new iterates. Numerical experiments on nonconvex stochastic problems illustrate the practical performance of the method.
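To make the algorithmic outline concrete, the LaTeX sketch below spells out one way to write the problem template and the master subproblem the abstract describes. The notation (f_1, f_2, c_1, c_2, g_1^j, g_2^k, mu_k, rho) is an assumption of this sketch, not the paper's own, and the constraint, which the paper handles through the improvement function, is omitted from the master subproblem for brevity.

% Assumed problem template: objective and constraint are each the
% difference of a convex function and a rho-weakly convex function.
\[
  \min_{x \in \mathbb{R}^n} \; f_1(x) - f_2(x)
  \quad \text{s.t.} \quad c_1(x) - c_2(x) \le 0,
\]
% where $f_1, c_1$ are convex and $f_2, c_2$ are $\rho$-weakly convex,
% i.e.\ $f_2 + \tfrac{\rho}{2}\|\cdot\|^2$ is convex for some $\rho \ge 0$.
%
% Master subproblem at iterate $x^k$ (constraint omitted in this sketch):
% $f_1$ is replaced by a cutting-plane model $\check f_1^k$ built from
% past subgradients, $f_2$ by its linearization at $x^k$, and a proximal
% term keeps the subproblem strictly convex:
\[
  x^{k+1} \in \operatorname*{arg\,min}_{x} \;
  \check f_1^k(x)
  - \big( f_2(x^k) + \langle g_2^k,\, x - x^k \rangle \big)
  + \tfrac{\mu_k}{2}\,\|x - x^k\|^2,
\]
% with cutting-plane model
% $\check f_1^k(x) = \max_{j \in J_k} \{ f_1(x^j) + \langle g_1^j, x - x^j \rangle \}$,
% subgradient $g_2^k \in \partial f_2(x^k)$, and proximal parameter
% $\mu_k > 0$ updated by the method's rule.

Because the cutting-plane model is a pointwise maximum of affine functions and the linearization of f_2 is affine, the quadratic proximal term makes the master subproblem strictly convex, so each iterate is uniquely defined.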
