# A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization

Let $f$ be a continuous function on $\mathbb{R}^n$, and suppose $f$ is continuously differentiable on an open dense subset. Such functions arise in many applications, and very often minimizers are points at which $f$ is not differentiable. Of particular interest is the case where $f$ is not convex, and perhaps not even locally Lipschitz, but its gradient is easily computed where it is defined. We present a practical, robust algorithm to locally minimize such functions, based on *gradient sampling*. No subgradient information is required by the algorithm. When $f$ is locally Lipschitz and has bounded level sets, and the sampling radius $\varepsilon$ is fixed, we show that, with probability one, the algorithm generates a sequence with a cluster point that is Clarke $\varepsilon$-stationary. Furthermore, we show that if $f$ has a unique Clarke stationary point $\bar x$, then the set of all cluster points generated by the algorithm converges to $\bar x$ as $\varepsilon$ is reduced to zero. Numerical results are presented demonstrating the robustness of the algorithm and its applicability in a wide variety of contexts, including cases where $f$ is not locally Lipschitz at minimizers. We report approximate local minimizers for functions in the applications literature which have not, to our knowledge, been obtained previously. When the termination criteria of the algorithm are satisfied, a precise statement about nearness to Clarke $\varepsilon$-stationarity is available. A MATLAB implementation of the algorithm is posted on the Web.
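The gradient sampling idea described above can be sketched in a few lines: at each iterate, sample gradients at nearby random points within the sampling radius $\varepsilon$, take the minimum-norm element of the convex hull of those gradients as a (negated) search direction, and backtrack until sufficient decrease. The sketch below is a simplified NumPy illustration, not the authors' MATLAB implementation; the sample count, the Frank-Wolfe subroutine for the min-norm point, the box (rather than ball) sampling, and all tolerances are assumptions made for brevity.

```python
import numpy as np

def gradient_sampling(f, grad, x0, eps=0.1, m=None, iters=200,
                      beta=0.5, gamma=1e-4, seed=0):
    """Simplified gradient sampling sketch (not the paper's exact method).

    f    : objective, differentiable on an open dense set
    grad : gradient of f where it is defined
    eps  : fixed sampling radius
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    m = m or 2 * n + 1  # illustrative sample size; the analysis needs >= n + 1
    for _ in range(iters):
        # Gradients at x and at m random points near x (box sample, for
        # simplicity, standing in for uniform sampling in the eps-ball).
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, n))
        G = np.vstack([grad(x)] + [grad(p) for p in pts])  # shape (m+1, n)

        # Min-norm point of conv{rows of G}: Frank-Wolfe on the simplex,
        # minimizing ||G^T lam||^2 over convex weights lam.
        lam = np.full(len(G), 1.0 / len(G))
        for k in range(200):
            g = G.T @ lam                      # current hull point
            i = int(np.argmin(G @ g))          # best vertex for a linear step
            step = 2.0 / (k + 2.0)
            lam = (1.0 - step) * lam
            lam[i] += step
        g = G.T @ lam
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-8:                       # eps-stationarity surrogate
            break

        # Armijo backtracking along the normalized descent direction.
        d = -g / gnorm
        t = 1.0
        while f(x + t * d) > f(x) - gamma * t * gnorm and t > 1e-12:
            t *= beta
        x = x + t * d
    return x
```

For example, on the nonsmooth convex function $f(x) = |x_1| + |x_2|$, whose minimizer sits at the kink $x = 0$, the sketch drives the iterate to within roughly $\varepsilon$ of the origin even though the gradient is undefined there.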

## Citation

Technical Report, Courant Institute of Mathematical Sciences, Oct 2003. Submitted to SIAM J. Optim.