Sequential Subspace Optimization Method for Large-Scale Unconstrained Problems

We present the Sequential Subspace Optimization (SESOP) method for large-scale smooth unconstrained problems. At each iteration the method minimizes the objective function over a subspace spanned by the current gradient and by the directions of a few previous steps. Following [Nemirovski-1982], we also include in this subspace the direction from the starting point to the current point and a weighted sum of all previous gradients. This safeguard provides an optimal worst-case convergence rate of order 1/N^2 for convex problems, where N is the iteration count. For a quadratic objective, the method is equivalent to the conjugate gradient method. We identify an important class of problems where subspace optimization can be implemented extremely fast: problems whose objective function combines expensive linear mappings with computationally cheap non-linear functions. This situation is typical in many applications, such as tomography, signal and image denoising with Basis Pursuit, and pattern recognition with Support Vector Machines. We demonstrate highly competitive numerical results on examples from these areas.
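
Below is a minimal sketch of the iteration described in the abstract, written in Python/NumPy. It is illustrative only, not the authors' reference implementation: the subspace is built from the current gradient, a few previous steps, the direction x - x0, and a weighted sum of past gradients, as stated above; the weight recursion shown is one Nemirovski-type choice (an assumption here), and the inner subspace minimization is delegated to a generic solver for brevity. All names and parameters are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def sesop(f, grad, x0, n_iter=100, n_prev=2):
    """Illustrative SESOP-style loop (a sketch, not the paper's code).

    f      -- smooth objective, f(x) -> float
    grad   -- gradient of f, grad(x) -> ndarray
    x0     -- starting point
    n_prev -- number of previous step directions kept in the subspace
    """
    x = x0.copy()
    prev_steps = []                # directions of the last few steps
    g_sum = np.zeros_like(x0)      # weighted sum of past gradients
    w = 1.0                        # assumed Nemirovski-type weight, w_0 = 1

    for _ in range(n_iter):
        g = grad(x)
        g_sum = g_sum + w * g
        w = 0.5 + np.sqrt(0.25 + w**2)  # w_k = 1/2 + sqrt(1/4 + w_{k-1}^2)

        # Subspace columns: current gradient, the two safeguard
        # directions, and a few previous steps.
        D = np.stack([g, x - x0, g_sum] + prev_steps, axis=1)

        # Inner problem over a handful of variables: min_alpha f(x + D @ alpha).
        # A generic solver stands in for the paper's subspace optimization.
        res = minimize(lambda a: f(x + D @ a), np.zeros(D.shape[1]))
        step = D @ res.x

        prev_steps = ([step] + prev_steps)[:n_prev]
        x = x + step
    return x
```

When the objective has the form f(x) = phi(Ax) with an expensive linear mapping A and a cheap non-linear phi, as in the applications mentioned above, the inner problem becomes very cheap: the expensive matrix products can be done once per outer iteration, after which every trial point in the subspace costs only an evaluation of phi. A sketch under the same assumptions:

```python
def subspace_objective(phi, A, x, D):
    """For f(x) = phi(A @ x): do the expensive products once, so each
    inner evaluation over alpha costs only the cheap non-linearity phi."""
    Ax = A @ x   # one expensive product per outer iteration
    AD = A @ D   # one expensive product per subspace direction
    return lambda alpha: phi(Ax + AD @ alpha)
```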

Citation

Tech. Report CCIT No. 559, EE Dept., Technion, Haifa, Israel, September 2005
