Minimizing the difference of convex and weakly convex functions via bundle method

We consider optimization problems whose objective and constraint functions are differences of convex and weakly convex functions. This framework covers a vast family of nonsmooth and nonconvex optimization problems, particularly those involving certain classes of composite and nonconvex value functions. We investigate several stationarity conditions and extend the proximal bundle algorithm of [van Ackooij et al., Comput. Optim. Appl., 78 (2021), pp. 451–490] to compute critical points for problems in this class. Our modifications to that algorithm boil down to a different master program and an original rule for updating the proximal parameter that ensures convergence in this more general setting. Thanks to this new rule, no pre-estimation of the underlying weak-convexity moduli is needed, opening the way to dealing with optimization problems for which no practical and mathematically sound algorithms exist. Numerical experiments on some nonconvex stochastic problems illustrate the practical performance of the method.
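For concreteness, the following display is a minimal sketch of the problem class described above; the notation ($f_1$, $f_2$, $c_1$, $c_2$, $\rho$) is chosen here for illustration and is not taken from the paper.

\begin{align*}
  \min_{x \in \mathbb{R}^n} \quad & f_1(x) - f_2(x) \\
  \text{s.t.} \quad & c_1(x) - c_2(x) \le 0,
\end{align*}

where $f_1$ and $c_1$ are convex, and $f_2$ and $c_2$ are $\rho$-weakly convex, i.e., $f_2 + \tfrac{\rho}{2}\|\cdot\|^2$ and $c_2 + \tfrac{\rho}{2}\|\cdot\|^2$ are convex for some (possibly unknown) modulus $\rho \ge 0$. As the abstract notes, the proposed proximal-parameter update rule does not require an a priori estimate of such moduli.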
