SESOP-TN: Combining Sequential Subspace Optimization with Truncated Newton method

SESOP-TN is a method for very large-scale unconstrained optimization of smooth functions. It combines ideas of Sequential Subspace Optimization (SESOP) [Narkiss-Zibulevsky-2005] with those of the Truncated Newton (TN) method. Replacing the TN line search with subspace optimization allows the Conjugate Gradient (CG) iterations to stay matched through consecutive TN steps. This resolves the problem …
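
In symbols, one natural reading of this scheme (our reconstruction from the abstract, not a formula quoted from the paper) is: at step $k$, run a truncated CG solve of the Newton system $\nabla^2 f(x_k)\, d = -\nabla f(x_k)$ to obtain a direction $d_k$, then set

$$x_{k+1} \;=\; \operatorname*{argmin}_{x \,\in\, x_k + \operatorname{span}\{d_k,\; -\nabla f(x_k),\; x_k - x_{k-1}\}} f(x),$$

so that the step in the span of $d_k$ replaces the usual TN line search, and information gathered by the CG run carries over to the next outer step.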

Coordinate and Subspace Optimization Methods for Linear Least Squares with Non-Quadratic Regularization

This work addresses the problem of regularized linear least squares (RLS) with non-quadratic separable regularization. Although it arises frequently in many applications, the RLS problem is often hard to solve using standard iterative methods. In a recent work [10], a new iterative method called Parallel Coordinate Descent (PCD) was devised. We provide herein a convergence …
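
For concreteness, the problem class in question can be written (in our notation; the exact weighting and regularizer vary by application) as

$$\min_{x}\; \tfrac{1}{2}\,\|Ax - b\|_2^2 \;+\; \sum_{i} \rho(x_i),$$

where $\rho$ is a non-quadratic scalar penalty (for example, a smoothed $l_1$-type term promoting sparsity); the separability of the second term is what coordinate-descent methods such as PCD exploit.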

Sequential Subspace Optimization Method for Large-Scale Unconstrained Problems

We present the Sequential Subspace Optimization (SESOP) method for large-scale smooth unconstrained problems. At each iteration we search for a minimum of the objective function over a subspace spanned by the current gradient and by the directions of a few previous steps. We also include in this subspace the direction from the starting point to the …
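
As a rough illustration of this iteration, here is a minimal sketch in Python (our reconstruction from the abstract, not the authors' code; the inner solver and the number of retained steps are arbitrary choices):

```python
import numpy as np
from scipy.optimize import minimize

def sesop_sketch(f, grad, x0, n_iter=100, n_prev=2):
    """Illustrative SESOP-style loop: minimize f over a small subspace
    spanned by the current gradient, a few previous steps, and the
    direction from the starting point."""
    x = x0.astype(float)
    prev_steps = []
    for _ in range(n_iter):
        g = grad(x)
        dirs = [g] + prev_steps + [x - x0]          # spanning directions
        D = np.column_stack(dirs)
        # Inner low-dimensional problem in the subspace coefficients alpha.
        inner = minimize(lambda a: f(x + D @ a),
                         np.zeros(D.shape[1]),
                         jac=lambda a: D.T @ grad(x + D @ a))
        step = D @ inner.x
        x = x + step
        prev_steps = ([step] + prev_steps)[:n_prev]  # keep a few last steps
    return x
```

The point of the construction is that the inner problem has only a handful of variables, so each iteration costs little beyond the gradient evaluations.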

Support Vector Machine via Sequential Subspace Optimization

We present an optimization engine for large-scale pattern recognition using the Support Vector Machine (SVM). Our treatment is based on converting the soft-margin SVM constrained optimization problem to an unconstrained form and solving it with the newly developed Sequential Subspace Optimization (SESOP) method. SESOP is a general tool for large-scale smooth unconstrained optimization. At each iteration …
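
For reference, the standard unconstrained reformulation of the soft-margin SVM replaces the slack variables with a hinge loss; since SESOP requires smoothness, the paper presumably works with a smoothed variant of

$$\min_{w,\,b}\;\; \tfrac{1}{2}\,\|w\|_2^2 \;+\; C \sum_{i=1}^{m} \max\bigl(0,\; 1 - y_i\,(w^\top x_i + b)\bigr),$$

where $(x_i, y_i)$ with $y_i \in \{-1, +1\}$ are the training examples and $C > 0$ trades margin against misclassification.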

Blind Source Separation using Relative Newton Method combined with Smoothing Method of Multipliers

We study a relative optimization framework for quasi-maximum-likelihood blind source separation, with the relative Newton method as a particular instance. The structure of the Hessian allows its fast approximate inversion. In the second part we present the Smoothing Method of Multipliers (SMOM) for minimizing a sum of pairwise maxima of smooth functions, in particular a sum of …
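
To sketch the relative idea (our paraphrase of the usual quasi-ML setup, not a quote from the paper): for a demixing matrix $W$ and data $X \in \mathbb{R}^{n \times T}$, the quasi-maximum-likelihood objective is commonly written as

$$L(W) \;=\; -T \log\,\lvert\det W\rvert \;+\; \sum_{i,t} h\bigl((WX)_{it}\bigr),$$

and relative optimization updates the estimate multiplicatively, $W_{k+1} = (I + V_k)\,W_k$, choosing $V_k$ by a Newton-type step around $V = 0$, where the Hessian has the special structure that permits fast approximate inversion.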

Smoothing Method of Multipliers for Sum-Max Problems

We study a nonsmooth unconstrained optimization problem that includes a sum of pairwise maxima of smooth functions. Minimum $l_1$-norm approximation is a particular case of this problem. Combining the ideas of Lagrange multipliers with a smooth approximation of the max-type function, we obtain a new kind of nonquadratic augmented Lagrangian. Our approach does not require artificial variables, and preserves the sparse structure …
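
For concreteness (in our notation), the sum-max problem and its minimum $l_1$-norm special case read

$$\min_{x}\; \sum_{j} \max\bigl\{f_j(x),\, g_j(x)\bigr\}, \qquad \|Ax - b\|_1 \;=\; \sum_{j} \max\bigl\{a_j^\top x - b_j,\;\, b_j - a_j^\top x\bigr\},$$

so each $l_1$ residual term is exactly a pairwise maximum of two smooth (affine) functions.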