SESOP-TN is a method for very large scale unconstrained optimization of smooth functions. It combines ideas of Sequential Subspace Optimization (SESOP) [Narkiss-Zibulevsky-2005] with those of the Truncated Newton (TN) method. By replacing the TN line search with subspace optimization, we allow the Conjugate Gradient (CG) iterations to stay matched across consecutive TN steps. This resolves the problem of TN's sensitivity to early termination of the CG process. In particular, when the objective function is quadratic, the SESOP-TN trajectory coincides with the trajectory of CG applied directly to the objective; standard TN lacks this property and converges more slowly. Numerical experiments illustrate the effectiveness of the method. Matlab code is available at http://ie.technion.ac.il/~mcib/sesoptn.html
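The sketch below illustrates the sequential subspace optimization idea underlying the quadratic-case claim: minimizing at each iteration over a small subspace spanned by the current gradient and the previous step reproduces CG iterates on a quadratic. This is a minimal illustrative sketch in Python, not the authors' SESOP-TN algorithm or their Matlab implementation; the function names and the toy quadratic are assumptions made for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

def subspace_step(f, grad, x, prev_dirs):
    """Minimize f over the affine subspace x + span(directions).

    The subspace contains the current gradient and (optionally) the
    previous step; alpha are the subspace coordinates.
    """
    D = np.column_stack([grad(x)] + prev_dirs)        # subspace basis, n x m
    phi = lambda alpha: f(x + D @ alpha)              # reduced m-dimensional objective
    res = minimize(phi, np.zeros(D.shape[1]), method="BFGS")
    step = D @ res.x
    return x + step, step

# Toy quadratic f(x) = 0.5 x^T A x - b^T x; on quadratics, exact
# minimization over {current gradient, previous step} matches CG iterates.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 20))
A = M.T @ M + np.eye(20)
b = rng.standard_normal(20)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(20)
prev = []
for _ in range(15):
    x, step = subspace_step(f, grad, x, prev)
    prev = [step]                                     # keep only the last step
print("residual norm:", np.linalg.norm(A @ x - b))
```

SESOP-TN enlarges this subspace with directions produced by the truncated CG solve of the Newton system, which is what allows CG information to carry over between outer TN iterations.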
Citation
Technical Report, CS Department, Technion, Israel, 29.09.2008