A Quasi-Newton Algorithm for Nonconvex, Nonsmooth Optimization with Global Convergence Guarantees

A line search algorithm for minimizing nonconvex and/or nonsmooth objective functions is presented. The algorithm is a hybrid between a standard Broyden–Fletcher–Goldfarb–Shanno (BFGS) method and an adaptive gradient sampling (GS) method. The BFGS strategy is employed because it typically yields fast convergence to the vicinity of a stationary point, and together with the adaptive GS strategy …
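
As a rough illustration of the BFGS half of this hybrid, the sketch below performs one quasi-Newton line-search iteration in Python/NumPy and signals failure so that a caller could fall back to a gradient sampling step (a GS direction is sketched under the second abstract below). The function name bfgs_step, the Armijo backtracking, and all tolerances are assumptions made for illustration; this is not the algorithm from the paper.

    import numpy as np

    def bfgs_step(f, grad_f, x, H):
        """One BFGS line-search iteration (illustrative sketch, not the paper's method).

        Returns (new iterate, updated inverse-Hessian approximation, success flag);
        a hybrid scheme could fall back to a GS step when the flag is False.
        """
        g = grad_f(x)
        d = -H @ g                               # quasi-Newton direction
        fx, t = f(x), 1.0
        for _ in range(30):                      # simple Armijo backtracking (illustrative)
            if f(x + t * d) <= fx + 1e-4 * t * (g @ d):
                break
            t *= 0.5
        else:
            return x, H, False                   # line search failed: signal a GS fallback
        s = t * d
        y = grad_f(x + s) - g
        if y @ s > 1e-10:                        # update only when the curvature condition holds
            rho = 1.0 / (y @ s)
            V = np.eye(x.size) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        return x + s, H, True

The curvature safeguard (y @ s > 0) simply skips the inverse-Hessian update when the standard secant condition would be violated, a common practical choice in BFGS implementations.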

An Adaptive Gradient Sampling Algorithm for Nonsmooth Optimization

We present an algorithm for the minimization of f : R^n → R, assumed to be locally Lipschitz and continuously differentiable in an open dense subset D of R^n. The objective f may be nonsmooth and/or nonconvex. The method is based on the gradient sampling (GS) algorithm of Burke et al. [A robust gradient sampling …
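
To make the GS building block concrete, here is a minimal Python/NumPy sketch of one gradient sampling direction computation, assuming SciPy is available: gradients are evaluated at the current point and at points drawn from an eps-ball around it, and the search direction is the negative of the minimum-norm element of the convex hull of these sampled gradients. The name gs_direction, the sample size, and the use of SciPy's SLSQP solver for the small simplex-constrained QP are illustrative assumptions, not the method of Burke et al. or the adaptive variant proposed here.

    import numpy as np
    from scipy.optimize import minimize

    def gs_direction(x, grad_f, eps=0.1, m=None, rng=None):
        """Negative of the minimum-norm element of the convex hull of gradients
        sampled in an eps-ball around x (illustrative GS search direction)."""
        rng = np.random.default_rng() if rng is None else rng
        n = x.size
        m = 2 * n if m is None else m            # number of sampled points (illustrative choice)
        # x itself plus m points drawn uniformly from the closed eps-ball around x.
        samples = [x]
        for u in rng.standard_normal((m, n)):
            r = eps * rng.random() ** (1.0 / n)  # radius for a uniform draw from the ball
            samples.append(x + r * u / max(np.linalg.norm(u), 1e-16))
        G = np.array([grad_f(p) for p in samples])        # stacked sampled gradients, shape (m+1, n)
        k = G.shape[0]
        # Min-norm point of conv{rows of G}: minimize ||G^T lam||^2 over the unit simplex.
        obj = lambda lam: 0.5 * np.dot(G.T @ lam, G.T @ lam)
        res = minimize(obj, np.full(k, 1.0 / k),
                       bounds=[(0.0, None)] * k,
                       constraints=({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},),
                       method='SLSQP')
        return -(G.T @ res.x)                             # descent direction (up to tolerance)

In the classical GS method of Burke et al., such a direction is followed by a backtracking line search, and the sampling radius eps is reduced when the minimum-norm element of the convex hull is sufficiently small.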