An active set method for bound-constrained optimization

In this paper, a class of algorithms is developed for bound-constrained optimization. The new scheme uses a gradient-free line search along bent search paths. Unlike traditional algorithms for bound-constrained optimization, our algorithm ensures that the reduced gradient becomes arbitrarily small. It is also proved that all strongly active variables are found and fixed after finitely …
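As a rough illustration of the "bent search path" idea, the sketch below shows a straight ray bending when it is projected onto box bounds, combined with a simple-decrease acceptance test that needs no gradient along the path. The toy objective, bounds, and backtracking rule are illustrative stand-ins, not the paper's actual method.

```python
import numpy as np

def bent_path_point(x, d, alpha, lo, hi):
    """Point on the bent (projected) search path: project x + alpha*d onto
    the box [lo, hi]. The projection bends the straight ray at the bounds,
    giving a piecewise linear path."""
    return np.clip(x + alpha * d, lo, hi)

# Hypothetical illustration: one step on a toy quadratic over a box,
# accepting the first trial point that decreases f (gradient-free test).
def f(x):
    return 0.5 * np.dot(x, x) - np.sum(x)   # toy objective, minimizer at x = 1

x  = np.array([0.9, 0.1])                    # feasible starting point
lo = np.array([0.0, 0.0])
hi = np.array([1.0, 1.0])
d  = 1.0 - x                                 # steepest descent for the toy f

alpha, fx = 1.0, f(x)
for _ in range(30):                          # backtracking along the bent path
    trial = bent_path_point(x, d, alpha, lo, hi)
    if f(trial) < fx:                        # accept on simple decrease
        x = trial
        break
    alpha *= 0.5
```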

Globally linearly convergent nonlinear conjugate gradients without Wolfe line search

This paper introduces a new nonlinear conjugate gradient (CG) method based on an efficient gradient-free line search. Unless function values diverge to $-\infty$, global convergence to a stationary point is proved for continuously differentiable objective functions with Lipschitz continuous gradient, and global linear convergence if this stationary point is a strong local minimizer. The $n$-iterations …
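To make the setting concrete, here is a minimal sketch of a nonlinear CG iteration paired with a line search that checks only for a simple decrease in function values, with no Wolfe conditions. The PR+ update and the plain backtracking rule are generic choices for illustration, not the line search developed in the paper.

```python
import numpy as np

def cg_gradient_free_ls(f, grad, x, iters=2000, tol=1e-8):
    """Nonlinear CG (PR+ as an illustrative choice) with a line search that
    uses function values only; no Wolfe conditions are checked."""
    g, fx = grad(x), f(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = 1.0                          # accept the first simple decrease
        while f(x + alpha * d) >= fx and alpha > 1e-16:
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ formula
        d = -g_new + beta * d
        g, fx = g_new, f(x)
    return x

# Toy usage: 2D Rosenbrock function
ros  = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosg = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_gradient_free_ls(ros, rosg, np.array([-1.2, 1.0])))
```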

LMBOPT — a limited memory method for bound-constrained optimization

Recently, Neumaier and Azmi gave a comprehensive convergence theory for a generic algorithm for bound-constrained optimization problems with a continuously differentiable objective function. The algorithm combines an active set strategy with a gradient-free line search (CLS) along a piecewise linear search path defined by directions chosen to reduce zigzagging. This paper describes LMBOPT, an …
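For flavor, the following sketch shows a standard limited-memory (L-BFGS) two-loop recursion restricted to the free variables, with active components held at zero so the corresponding bounds stay fixed. This is a generic construction under assumed conventions, not LMBOPT's actual direction computation.

```python
import numpy as np

def lbfgs_direction_on_free(g, s_list, y_list, free):
    """Two-loop L-BFGS recursion applied to the free (inactive) components
    of the gradient g; s_list/y_list hold recent step and gradient-difference
    pairs, oldest first, and `free` is a boolean mask of inactive variables."""
    q = np.where(free, -g, 0.0)
    alphas = []
    for s, y in reversed(list(zip(s_list, y_list))):   # newest pair first
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q = q - a * y
    if s_list:                                         # scale by latest pair
        s, y = s_list[-1], y_list[-1]
        q = q * (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = (y @ q) / (y @ s)
        q = q + s * (a - b)
    return np.where(free, q, 0.0)                      # zero on active set
```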

Line search and convergence in bound-constrained optimization

The first part of this paper discusses convergence properties of a new line search method for the optimization of continuously differentiable functions with Lipschitz continuous gradient. The line search uses (apart from the gradient at the current best point) function values only. After properties of the new, in general curved, line search are derived, global convergence …
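As a hedged sketch of what a function-value-based line search can look like, the code below expands the step while the objective keeps decreasing and then refines by quadratic interpolation, using the gradient only through the directional derivative at the current best point. The expansion factor, interpolation rule, and backtracking fallback are illustrative assumptions, not the paper's actual line search.

```python
import numpy as np

def fv_line_search(f, x, d, g0, amax=1e2, q=2.0):
    """Line search using function values only, plus g0 = grad(x) @ d at the
    current best point. Expands while f decreases, then refines with the
    quadratic through (0, f0) with slope g0 and the best trial (a, fa)."""
    f0 = f(x)
    alpha, best_a, best_f = 1.0, 0.0, f0
    while alpha <= amax:                     # extrapolate while improving
        fa = f(x + alpha * d)
        if fa < best_f:
            best_a, best_f = alpha, fa
            alpha *= q
        else:
            break
    if best_a == 0.0:                        # no decrease yet: backtrack
        alpha = 0.5
        while alpha > 1e-16:
            fa = f(x + alpha * d)
            if fa < f0:
                return alpha, fa
            alpha *= 0.5
        return 0.0, f0
    a, fa = best_a, best_f                   # quadratic interpolation step
    denom = 2.0 * (fa - f0 - g0 * a)
    if denom > 0:                            # model is convex
        a_star = -g0 * a * a / denom
        if 0 < a_star < amax and f(x + a_star * d) < best_f:
            best_a, best_f = a_star, f(x + a_star * d)
    return best_a, best_f
```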