The first part of this paper discusses convergence properties of a new line search method for the optimization of continuously differentiable functions with Lipschitz continuous gradient. Apart from the gradient at the current best point, the line search uses function values only. After establishing properties of the new, in general curved, line search, global convergence conditions for an unconstrained optimization algorithm are derived and applied to prove the global convergence of a new nonlinear conjugate gradient (CG) method. Unlike traditional nonlinear CG methods, which require line searches satisfying the Wolfe conditions, this method works with the new, gradient-free line search. In the second part, a class of algorithms is developed for bound-constrained optimization. The new scheme uses the gradient-free line search along bent search paths. Unlike traditional algorithms for bound-constrained optimization, our algorithm ensures that the reduced gradient becomes arbitrarily small. It is also proved that all strongly active variables are identified and fixed after finitely many iterations. A Matlab implementation of a bound-constrained solver based on the new theory is discussed in the companion paper "LMBOPT - a limited memory program for bound-constrained optimization" by M. Kimiaei and the present authors.
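To make the flavor of the two ingredients concrete, the following is a minimal Python sketch of a backtracking line search along a bent (projected) search path for box constraints, in which, apart from the gradient already available at the current point, only function values are evaluated. It is an illustration under simplifying assumptions, not the paper's curved line search or its acceptance rule; the name bent_line_search, the Armijo-style test, and all parameter values are hypothetical.

import numpy as np

def bent_line_search(f, x, g, p, lo, hi,
                     alpha0=1.0, beta=0.5, sigma=1e-4, max_iter=50):
    """Backtrack along the bent path x(alpha) = clip(x + alpha*p, lo, hi).
    Apart from g, the gradient already known at x, only f-values are used.
    Hypothetical sketch; not the acceptance rule analyzed in the paper."""
    fx = f(x)
    alpha = alpha0
    x_new = x
    for _ in range(max_iter):
        x_new = np.clip(x + alpha * p, lo, hi)   # bend the ray at the bounds
        # Armijo-style sufficient decrease, measured along the bent step
        if f(x_new) <= fx + sigma * (g @ (x_new - x)):
            return x_new
        alpha *= beta                            # shrink step; no new gradients
    return x_new

# Usage on a convex quadratic with box constraints:
f = lambda x: 0.5 * (x @ x)
x = np.array([2.0, -3.0])
g = x                                            # gradient of f at x
lo, hi = np.array([0.5, -5.0]), np.array([5.0, 5.0])
print(bent_line_search(f, x, g, -g, lo, hi))     # prints the accepted trial point

The projection np.clip is what bends the straight ray x + alpha*p at the boundary of the box, and the sufficient-decrease test is evaluated on the actual bent step x_new - x, so no gradient other than the one at x is needed inside the loop.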