An Alternating Manifold Proximal Gradient Method for Sparse PCA and Sparse CCA

Sparse principal component analysis (PCA) and sparse canonical correlation analysis (CCA) are two essential techniques from high-dimensional statistics and machine learning for analyzing large-scale data. Both problems can be formulated as an optimization problem with a nonsmooth objective and nonconvex constraints. Since nonsmoothness and nonconvexity bring numerical difficulties, most algorithms suggested in the literature either solve … Read more
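To make the iteration concrete, here is a minimal Python sketch of one simplified step for the sparse PCA subproblem $\min -\mathrm{tr}(X^\top A^\top A X) + \mu\|X\|_1$ subject to $X^\top X = I$: a Euclidean gradient step, elementwise soft-thresholding, then a polar retraction onto the Stiefel manifold. This prox-then-retract heuristic only illustrates the ingredients; the paper's alternating manifold proximal gradient method instead solves a proximal subproblem on the tangent space, and all names, step sizes, and data below are illustrative.

```python
import numpy as np

def soft_threshold(X, t):
    # Proximal operator of t*||X||_1 (elementwise soft-thresholding).
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def polar_retraction(Y):
    # Map an n-by-k matrix to the Stiefel manifold via its polar factor.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def sparse_pca_step(A, X, mu, eta):
    # One simplified step for  min -tr(X^T A^T A X) + mu*||X||_1,  X^T X = I.
    G = -2.0 * (A.T @ (A @ X))     # Euclidean gradient of the smooth term
    return polar_retraction(soft_threshold(X - eta * G, eta * mu))

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
X = polar_retraction(rng.standard_normal((20, 3)))
for _ in range(200):
    X = sparse_pca_step(A, X, mu=0.5, eta=1e-3)
print(-np.trace(X.T @ A.T @ A @ X) + 0.5 * np.abs(X).sum())
```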

Proximal Gradient Method for Nonsmooth Optimization over the Stiefel Manifold

We consider optimization problems over the Stiefel manifold whose objective function is the sum of a smooth function and a nonsmooth function. Existing methods for solving problems of this kind can be classified into three classes. Algorithms in the first class rely on subgradient information of the objective function and thus tend to … Read more
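For readers who want to see the geometry, the sketch below shows the two building blocks any method of this type needs on the Stiefel manifold: projection onto the tangent space and a retraction (here the QR retraction), followed by one Riemannian gradient step on a toy smooth objective. This is only the smooth scaffolding, with illustrative names and step size; the paper's proximal gradient method additionally handles the nonsmooth term through a proximal subproblem on the tangent space.

```python
import numpy as np

def proj_tangent(X, G):
    # Tangent-space projection on the Stiefel manifold {X : X^T X = I}:
    # P_X(G) = G - X * sym(X^T G).
    XtG = X.T @ G
    return G - X @ ((XtG + XtG.T) / 2.0)

def qr_retraction(X, V):
    # First-order retraction: the Q-factor of X + V, with signs fixed
    # so that R has a positive diagonal.
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.diag(R))

# One Riemannian gradient step for the smooth term f(X) = -tr(X^T B X).
rng = np.random.default_rng(0)
B = rng.standard_normal((10, 10)); B = B + B.T
X, _ = np.linalg.qr(rng.standard_normal((10, 3)))
G = -2.0 * B @ X                      # Euclidean gradient of f
X = qr_retraction(X, -0.01 * proj_tangent(X, G))
```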

Stochastic Primal-Dual Method for Empirical Risk Minimization with O(1) Per-Iteration Complexity

The regularized empirical risk minimization problem with a linear predictor appears frequently in machine learning. In this paper, we propose a new stochastic primal-dual method to solve this class of problems. Unlike existing methods, our proposed method requires only O(1) operations per iteration. We also develop a variance-reduction variant of the algorithm that converges linearly. … Read more
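As a concrete (hypothetical) instance of such stochastic primal-dual updates, here is an SPDC-style sketch for ridge regression that maintains $u = A^\top y$ incrementally, so each iteration touches only one row $a_i$. For dense features this still costs $O(d)$ per iteration; the $O(1)$ claim of the paper rests on further algorithmic ideas not reproduced here, and the step sizes below are untuned placeholders.

```python
import numpy as np

def spdc_ridge(A, b, lam=0.1, sigma=1.0, tau=0.1, theta=0.9, iters=20000):
    # SPDC-style stochastic primal-dual iteration for ridge regression:
    #   min_x (1/2n) * sum_i (a_i^T x - b_i)^2 + (lam/2) * ||x||^2,
    # via its saddle-point form; u = A^T y is maintained incrementally.
    rng = np.random.default_rng(0)
    n, d = A.shape
    x = np.zeros(d); xbar = x.copy(); y = np.zeros(n); u = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)
        # Dual prox for phi_i*(y) = y^2/2 + b_i*y (closed form).
        yi = (y[i] + sigma * (A[i] @ xbar - b[i])) / (1.0 + sigma)
        u += (yi - y[i]) * A[i]              # touch only row a_i
        y[i] = yi
        # Primal prox for the l2 regularizer (closed form).
        x_new = (x - tau * u / n) / (1.0 + tau * lam)
        xbar = x_new + theta * (x_new - x)   # extrapolation
        x = x_new
    return x
```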

Low-M-Rank Tensor Completion and Robust Tensor PCA

In this paper, we propose a new approach to solving low-rank tensor completion and robust tensor PCA. Our approach is based on novel notions of (even-order) tensor rank, called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank. We discuss the connections between these new tensor ranks and the CP-rank and … Read more
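The M-rank itself is defined through a more refined decomposition than we can reproduce here, but the following toy sketch shows the basic device behind such matrix-rank-based surrogates for even-order tensors: the square unfolding, under which a rank-one CP term becomes a rank-one matrix.

```python
import numpy as np

def square_unfolding(T):
    # Flatten an even-order tensor into a matrix by grouping the first
    # half of its modes as rows and the second half as columns.
    dims = T.shape
    assert len(dims) % 2 == 0, "even-order tensor expected"
    half = len(dims) // 2
    return T.reshape(int(np.prod(dims[:half])), -1)

# A rank-one CP term a (x) b (x) c (x) d unfolds to the rank-one
# matrix vec(a b^T) vec(c d^T)^T.
rng = np.random.default_rng(0)
a, b, c, d = (rng.standard_normal(4) for _ in range(4))
T = np.einsum('i,j,k,l->ijkl', a, b, c, d)
print(np.linalg.matrix_rank(square_unfolding(T)))   # 1
```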

Efficient Optimization Algorithms for Robust Principal Component Analysis and Its Variants

Robust PCA has drawn significant attention in the last decade due to its success in numerous application domains, ranging from bioinformatics, statistics, and machine learning to image and video processing in computer vision. Robust PCA and its variants such as sparse PCA and stable PCA can be formulated as optimization problems with exploitable special structures. … Read more
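One such exploitable structure is visible in the classic convex formulation of robust PCA, principal component pursuit: $\min \|L\|_* + \lambda\|S\|_1$ subject to $L + S = M$, whose ADMM solver needs only singular value thresholding and soft-thresholding per iteration. The sketch below is this textbook ADMM, not the paper's (typically faster) variants; parameters are illustrative.

```python
import numpy as np

def svt(X, t):
    # Singular value thresholding: the prox of t*||.||_*.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - t, 0.0)) @ Vt

def soft(X, t):
    # Elementwise soft-thresholding: the prox of t*||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def rpca_pcp(M, lam=None, rho=1.0, iters=300):
    # ADMM for  min ||L||_* + lam*||S||_1  s.t.  L + S = M.
    if lam is None:
        lam = 1.0 / np.sqrt(max(M.shape))
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S + Y / rho, 1.0 / rho)
        S = soft(M - L + Y / rho, lam / rho)
        Y = Y + rho * (M - L - S)     # multiplier update
    return L, S
```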

An ADMM-Based Interior-Point Method for Large-Scale Linear Programming

In this paper, we propose a new framework for implementing an interior-point method (IPM) to solve some very large-scale linear programs (LPs). Traditional IPMs typically use Newton’s method to approximately solve a subproblem that aims to minimize a log-barrier penalty function at each iteration. Due to its connection to Newton’s method, the IPM is … Read more
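A toy version of the idea, assuming a small number of constraints and illustrative parameters: at a fixed barrier parameter $\mu$, the subproblem $\min\, c^\top x - \mu \sum_i \log x_i$ subject to $Ax = b$ splits as $x = z$ and is approximately solved by ADMM, whose $x$-update is coordinatewise (a positive quadratic root) and whose $z$-update is a projection onto $\{Az = b\}$. An outer loop would then shrink $\mu$; none of this reproduces the paper's actual framework.

```python
import numpy as np

def barrier_subproblem_admm(A, b, c, mu, rho=1.0, iters=300):
    # Approximately solve  min c^T x - mu * sum(log x)  s.t.  Ax = b
    # by ADMM on the split x = z, with z constrained to {Az = b}.
    n = A.shape[1]
    x = np.ones(n); z = np.ones(n); u = np.zeros(n)
    AAt_inv = np.linalg.inv(A @ A.T)   # few constraints assumed; factor once
    for _ in range(iters):
        # x-update is separable: the positive root of
        # rho*x^2 + (c - rho*v)*x - mu = 0 in each coordinate.
        v = z - u
        q = c - rho * v
        x = (-q + np.sqrt(q**2 + 4.0 * rho * mu)) / (2.0 * rho)
        # z-update: Euclidean projection of x + u onto {Az = b}.
        w = x + u
        z = w - A.T @ (AAt_inv @ (A @ w - b))
        u = u + x - z                  # multiplier update
    return x
```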

Robust Principal Component Analysis using Facial Reduction

We study algorithms for robust principal component analysis (RPCA) for a partially observed data matrix. The aim is to recover the data matrix as a sum of a low-rank matrix and a sparse matrix so as to eliminate erratic noise (outliers). This problem is known to be NP-hard in general. A classical way to solve … Read more
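For the partially observed setting, a standard convex starting point (distinct from the facial-reduction approach studied in the paper) introduces a slack variable supported off the observation set $\Omega$, so that $L + S + E = M$ with $E = 0$ on $\Omega$; each ADMM block then has a closed-form update. A minimal sketch under these assumptions, with illustrative parameters:

```python
import numpy as np

def svt(X, t):
    # Singular value thresholding: the prox of t*||.||_*.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - t, 0.0)) @ Vt

def soft(X, t):
    # Elementwise soft-thresholding: the prox of t*||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def rpca_partial(M, mask, lam=None, rho=1.0, iters=300):
    # min ||L||_* + lam*||S||_1  s.t.  P_Omega(L + S) = P_Omega(M),
    # via the split L + S + E = M with E supported off Omega
    # (mask is 1 on observed entries, 0 elsewhere).
    if lam is None:
        lam = 1.0 / np.sqrt(max(M.shape))
    L = np.zeros_like(M); S = np.zeros_like(M)
    E = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S - E + Y / rho, 1.0 / rho)
        S = soft(M - L - E + Y / rho, lam / rho)
        E = (1 - mask) * (M - L - S + Y / rho)   # free off Omega, zero on it
        Y = Y + rho * (M - L - S - E)
    return L, S
```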

Primal-Dual Optimization Algorithms over Riemannian Manifolds: an Iteration Complexity Analysis

In this paper we study nonconvex and nonsmooth multi-block optimization over Riemannian manifolds with coupled linear constraints. Such optimization problems naturally arise from machine learning, statistical learning, compressive sensing, image processing, and tensor PCA, among others. We develop an ADMM-like primal-dual approach based on decoupled, solvable subroutines such as linearized proximal mappings. First, we introduce … Read more
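To fix ideas on the smallest possible instance, here is a hypothetical two-block sketch in the same spirit: $\min f(x) + \lambda\|z\|_1$ subject to $x = z$ with $x$ on the unit sphere, where the $x$-update is a single linearized proximal (projected gradient) step followed by a retraction and the $z$-update is an exact soft-thresholding. It is a heuristic illustration of the decoupling, not the paper's multi-block method or its analysis.

```python
import numpy as np

def soft(v, t):
    # Prox of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sphere_retract(x, v):
    # Normalization retraction on the unit sphere.
    y = x + v
    return y / np.linalg.norm(y)

def manifold_admm(gradf, x0, lam=0.1, rho=1.0, eta=0.1, iters=500):
    # ADMM-like splitting for  min f(x) + lam*||z||_1  s.t.  x = z,  ||x|| = 1.
    x, z, u = x0.copy(), x0.copy(), np.zeros_like(x0)
    for _ in range(iters):
        g = gradf(x) + rho * (x - z + u)                 # augmented gradient
        x = sphere_retract(x, -eta * (g - (x @ g) * x))  # linearized prox step
        z = soft(x + u, lam / rho)                       # exact l1 prox
        u = u + x - z                                    # multiplier update
    return x, z

# Toy instance: a sparse leading-eigenvector-type problem f(x) = -x^T B x.
rng = np.random.default_rng(0)
B = rng.standard_normal((20, 20)); B = B + B.T
x, z = manifold_admm(lambda x: -2.0 * B @ x, x0=np.ones(20) / np.sqrt(20))
```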

Vector Transport-Free SVRG with General Retraction for Riemannian Optimization: Complexity Analysis and Practical Implementation

In this paper, we propose a vector transport-free stochastic variance reduced gradient (SVRG) method with general retraction for empirical risk minimization over a Riemannian manifold. Existing SVRG methods on manifolds usually consider a specific retraction operation and involve additional computational costs such as parallel transport or vector transport. The vector transport-free SVRG with general retraction we … Read more
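The vector-transport-free idea can be illustrated on the unit sphere: the variance-reduced direction is formed from ambient (Euclidean) gradients at the current point and the snapshot, projected onto the tangent space only once, and the update uses a simple normalization retraction. A minimal sketch under these assumptions, with illustrative names, step size, and a leading-eigenvector toy problem:

```python
import numpy as np

def proj(x, g):
    # Tangent-space projection on the unit sphere at x.
    return g - (x @ g) * x

def retract(x, v):
    # Normalization retraction on the sphere.
    y = x + v
    return y / np.linalg.norm(y)

def rsvrg_sphere(grads, x0, eta=0.05, epochs=30, m=100, seed=0):
    # grads: list of per-sample Euclidean gradient functions g_i(x).
    # Vector-transport-free SVRG: ambient gradients at the current point
    # and the snapshot are combined first, then projected once at x.
    rng = np.random.default_rng(seed)
    n, x = len(grads), x0.copy()
    for _ in range(epochs):
        snap = x.copy()
        full = sum(g(snap) for g in grads) / n   # full gradient at snapshot
        for _ in range(m):
            i = rng.integers(n)
            v = grads[i](x) - grads[i](snap) + full  # variance-reduced direction
            x = retract(x, -eta * proj(x, v))
    return x

# Toy demo: leading eigenvector of (1/n) * sum a_i a_i^T,
# from f_i(x) = -(a_i^T x)^2 / 2 with gradient -(a_i^T x) * a_i.
rng = np.random.default_rng(1)
data = rng.standard_normal((200, 10))
grads = [lambda x, a=a: -(a @ x) * a for a in data]
x = rsvrg_sphere(grads, np.ones(10) / np.sqrt(10))
```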

Geometric descent method for convex composite minimization

In this paper, we extend the geometric descent method recently proposed by Bubeck, Lee and Singh to tackle nonsmooth and strongly convex composite problems. We prove that our proposed algorithm, dubbed geometric proximal gradient method (GeoPG), converges with a linear rate $(1-1/\sqrt{\kappa})$ and thus achieves the optimal rate among first-order methods, where $\kappa$ is the … Read more
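For reference, the composite setting is $\min_x f(x) + h(x)$ with $f$ smooth and strongly convex and $h$ nonsmooth. The plain proximal gradient baseline below converges at the slower linear rate $(1-1/\kappa)$, which is exactly what GeoPG's geometric (ball-shrinking) step improves to $(1-1/\sqrt{\kappa})$; the sketch shows only this baseline on an elastic-net-style toy problem, not GeoPG itself.

```python
import numpy as np

def soft(v, t):
    # Prox of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad(A, b, lam=0.1, sigma=1.0, iters=500):
    # Plain proximal gradient for the composite problem
    #   min_x f(x) + h(x),  f(x) = 0.5*||Ax - b||^2 + (sigma/2)*||x||^2,
    #   h(x) = lam*||x||_1,
    # where f is sigma-strongly convex with L-Lipschitz gradient.
    # Converges at rate (1 - 1/kappa), kappa = L/sigma; GeoPG attains
    # the optimal (1 - 1/sqrt(kappa)) among first-order methods.
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2 + sigma   # Lipschitz constant of grad f
    x = np.zeros(n)
    for _ in range(iters):
        g = A.T @ (A @ x - b) + sigma * x   # gradient of f
        x = soft(x - g / L, lam / L)        # prox step on h
    return x
```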