A Block Coordinate Descent Method for Regularized Multi-Convex Optimization with Applications to Nonnegative Tensor Factorization and Completion

This paper considers regularized block multi-convex optimization, where the feasible set and objective function are generally non-convex but convex in each block of variables. We review some interesting examples and propose a generalized block coordinate descent method. Using proximal updates, we further allow non-convexity over some blocks. Under certain conditions, we show that …
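Below is a minimal sketch of block coordinate descent with proximal (here, projected-gradient) updates, illustrated on nonnegative matrix factorization, a standard block multi-convex example; the function name, step sizes, and problem setup are illustrative assumptions, not taken from the paper.

```python
# Sketch: block coordinate descent with prox-linear updates for
# min_{X, Y >= 0} 0.5 * ||M - X @ Y||_F^2 (block multi-convex: convex in
# X for fixed Y and vice versa). Names and parameters are hypothetical.
import numpy as np

def bcd_nmf(M, r, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.random((m, r))
    Y = rng.random((r, n))
    for _ in range(iters):
        # Block 1: gradient step in X, then prox of the indicator of X >= 0
        G = (X @ Y - M) @ Y.T
        L = np.linalg.norm(Y @ Y.T, 2) + 1e-12   # Lipschitz constant of grad_X
        X = np.maximum(X - G / L, 0.0)
        # Block 2: symmetric update of Y
        G = X.T @ (X @ Y - M)
        L = np.linalg.norm(X.T @ X, 2) + 1e-12
        Y = np.maximum(Y - G / L, 0.0)
    return X, Y

M = np.abs(np.random.default_rng(1).random((30, 20)))
X, Y = bcd_nmf(M, r=5)
print("relative error:", np.linalg.norm(M - X @ Y) / np.linalg.norm(M))
```

The projection onto the nonnegative orthant is exactly the proximal operator of its indicator function, so each block update is a prox-linear step in the sense the abstract describes.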

A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem

We consider solving the $\ell_1$-regularized least-squares ($\ell_1$-LS) problem in the context of sparse recovery, for applications such as compressed sensing. The standard proximal gradient method, also known as iterative soft-thresholding when applied to this problem, has low computational cost per iteration but a rather slow convergence rate. Nevertheless, when the solution is sparse, it often …
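For reference, here is a minimal sketch of the plain proximal gradient (iterative soft-thresholding, ISTA) iteration for $\min_x \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda\|x\|_1$; the homotopy strategy of the paper would wrap such an inner solver in an outer loop that gradually decreases $\lambda$, which is not shown here. All names and parameter values are illustrative assumptions.

```python
# Sketch: ISTA for the l1-regularized least-squares problem
# min_x 0.5 * ||A x - b||_2^2 + lam * ||x||_1.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 300))
x_true = np.zeros(300)
x_true[:5] = 1.0                           # sparse ground truth
b = A @ x_true
print("estimated support:", np.nonzero(ista(A, b, lam=0.1))[0])
```

Each iteration costs two matrix-vector products plus a componentwise shrinkage, which is the "low computational cost per iteration" the abstract refers to.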

Trace Norm Regularization: Reformulations, Algorithms, and Multi-task Learning

We consider a recently proposed optimization formulation of multi-task learning based on trace norm regularized least squares. While this problem may be formulated as a semidefinite program (SDP), its size is beyond the reach of general SDP solvers. Previous solution approaches apply proximal gradient methods to solve the primal problem. We derive new primal and dual reformulations of …
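The primal proximal gradient baseline the abstract mentions can be sketched as follows for $\min_W \tfrac{1}{2}\|XW - Y\|_F^2 + \mu\|W\|_*$, where the prox of the trace norm is singular value soft-thresholding; this is the prior approach, not the paper's new reformulations, and the names and parameters below are illustrative assumptions.

```python
# Sketch: proximal gradient for trace norm regularized least squares
# min_W 0.5 * ||X @ W - Y||_F^2 + mu * ||W||_*.
import numpy as np

def svt(W, t):
    # Singular value soft-thresholding: prox of t * ||.||_* (trace norm)
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def prox_grad_trace(X, Y, mu, iters=300):
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    W = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(iters):
        grad = X.T @ (X @ W - Y)
        W = svt(W - grad / L, mu / L)
    return W

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
W_true = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 8))  # low rank
Y = X @ W_true                             # tasks share a low-rank structure
W = prox_grad_trace(X, Y, mu=1.0)
print("rank of estimate:", np.linalg.matrix_rank(W, tol=1e-3))
```

The per-iteration cost is dominated by the SVD in the prox step, which is one motivation for seeking alternative reformulations at scale.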