We investigate the behavior of the BFGS algorithm with an exact line search on nonsmooth functions. We show that it may fail on a simple polyhedral example, but that it apparently always succeeds on the Euclidean norm function, spiraling into the origin with a Q-linear rate of convergence; we prove this in the case of two variables. Dixon’s theorem implies that the result for the norm holds for all methods in the Broyden class of variable metric methods; we investigate how the limiting behavior of the steplengths depends on the Broyden parameter. Numerical experiments indicate that the convergence properties for $\|x\|$ extend to $\|Ax\|$, where $A$ is an $n \times n$ nonsingular matrix, and that the rate of convergence is independent of $A$ for fixed $n$. Finally, we show that steepest descent with an exact line search converges linearly for any positively homogeneous function that is $C^2$ everywhere except at the origin, but its rate of convergence for $\|Ax\|$ depends on the condition number of $A$, in contrast to BFGS.
Submitted to SIAM J. Optimization
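To make the setup concrete, the following is a minimal numerical sketch (not the authors' code) of BFGS with an exact line search applied to $f(x)=\|x\|$. The starting point, the initial inverse Hessian approximation, and the iteration count are illustrative choices; on the Euclidean norm the exact line search has a closed form, and the printed ratios of successive function values settle toward a constant less than one, consistent with the Q-linear convergence described above.

```python
import numpy as np

def f(x):
    return np.linalg.norm(x)

def grad(x):
    return x / np.linalg.norm(x)            # gradient of the Euclidean norm (x != 0)

def exact_step(x, d):
    # Exact line search for the norm: the minimizer of t -> ||x + t d|| over t >= 0
    # coincides with that of the quadratic ||x + t d||^2, i.e. t = -x.d / ||d||^2.
    return max(0.0, -np.dot(x, d) / np.dot(d, d))

def bfgs_exact_norm(x0, H0, iters=30):
    x, H = np.asarray(x0, float), np.asarray(H0, float)
    g = grad(x)
    values = [f(x)]
    for _ in range(iters):
        d = -H @ g                           # BFGS search direction
        t = exact_step(x, d)
        x_new = x + t * d
        values.append(f(x_new))
        if values[-1] < 1e-14:               # stop if we (numerically) hit the origin
            break
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        rho = 1.0 / np.dot(y, s)             # exact line search keeps y.s > 0 here
        V = np.eye(len(x)) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # standard BFGS update of the inverse Hessian
        x, g = x_new, g_new
    return np.array(values)

# With H0 = I the first exact step lands on the origin in one move, so start
# from a different positive definite metric to observe the spiraling behavior.
vals = bfgs_exact_norm([1.0, 0.7], np.diag([2.0, 0.5]))
print(vals[1:] / vals[:-1])                  # ratios approach a constant < 1 (Q-linear)
```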