Nonsmooth Optimization via BFGS

We investigate the BFGS algorithm with an inexact line search when applied to nonsmooth functions, not necessarily convex. We define a suitable line search and show that it generates a sequence of nested intervals containing points satisfying the Armijo and weak Wolfe conditions, assuming only absolute continuity. We also prove that the line search terminates for all semi-algebraic functions. The analysis of the convergence of BFGS using this line search seems very challenging; our theoretical results are limited to the univariate case. However, we systematically investigate the numerical behavior of BFGS with the inexact line search on various classes of examples. The method consistently converges to local minimizers on all but the most difficult class of examples, and even in that case, the method converges to points that are apparently Clarke stationary. Furthermore, the convergence rate is observed to be linear with respect to the number of function evaluations, with a rate of convergence that varies in an unexpectedly consistent way with the problem parameters. When the problem is sufficiently difficult, convergence may not be observed, but this seems to be due to rounding error caused by ill-conditioning. We try to give insight into \emph{why} BFGS works as well as it does, and we conclude with a bold conjecture.
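The abstract refers to the Armijo and weak Wolfe conditions and to a line search that generates nested intervals. The following is a minimal Python sketch of one such bracketing/bisection scheme, given only as an illustration of the general idea and not as the authors' exact procedure; the names f, grad, c1, and c2 are illustrative assumptions.

```python
# Sketch (not the paper's exact line search) of a bracketing/bisection
# search for a step t satisfying
#   Armijo:      f(x + t*d) <= f(x) + c1 * t * grad(x)'d
#   weak Wolfe:  grad(x + t*d)'d >= c2 * grad(x)'d
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.5, max_iter=50):
    """Return a step t satisfying the Armijo and weak Wolfe conditions,
    maintaining a nested bracket [alpha, beta]."""
    f0 = f(x)
    g0 = grad(x) @ d            # directional derivative at t = 0 (assumed < 0)
    alpha, beta = 0.0, np.inf   # current bracket; shrinks at each iteration
    t = 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:
            beta = t            # Armijo fails: step too long, cut from above
        elif grad(x + t * d) @ d < c2 * g0:
            alpha = t           # weak Wolfe fails: step too short, raise from below
        else:
            return t            # both conditions hold
        t = 2.0 * alpha if np.isinf(beta) else 0.5 * (alpha + beta)
    return t                    # fallback after max_iter bisections
```

In this sketch the step doubles while the upper end of the bracket is infinite; once both a failed Armijo test and a failed weak Wolfe test have been seen, the bracket [alpha, beta] is finite and each bisection produces a nested subinterval, which is the behavior the abstract alludes to.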

Citation

Submitted to SIAM Journal on Optimization.
