Low-M-Rank Tensor Completion and Robust Tensor PCA

In this paper, we propose a new approach to solving low-rank tensor completion and robust tensor PCA. Our approach is based on novel notions of (even-order) tensor rank, to be called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank. We discuss the connections between these new tensor ranks and the CP-rank and …
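
For intuition, one tractable surrogate for even-order tensor rank is the ordinary matrix rank of a square unfolding, in which the first half of the indices is grouped into rows and the second half into columns. The Python sketch below illustrates this for a 4th-order tensor; the index grouping is our assumption, and this is an illustration of the general matricization idea rather than the paper's exact definition of the M-rank.

    import numpy as np

    # Square unfolding of an n x n x n x n tensor into an n^2 x n^2 matrix
    # by grouping indices (i, j) into rows and (k, l) into columns.
    # NOTE: illustrative convention, not necessarily the paper's M-rank.
    def square_unfolding(T):
        n = T.shape[0]
        return T.reshape(n * n, n * n)

    n = 4
    A = np.random.randn(n, n)
    B = np.random.randn(n, n)
    T = np.einsum('ij,kl->ijkl', A, B)   # T[i,j,k,l] = A[i,j] * B[k,l]

    M = square_unfolding(T)
    print(np.linalg.matrix_rank(M))      # prints 1: the unfolding is rank one

Completion or robust PCA can then be attempted on the unfolded matrix with standard low-rank matrix machinery, which is the computational appeal of matricization-based rank notions.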

An ADMM-Based Interior-Point Method for Large-Scale Linear Programming

In this paper, we propose a new framework for implementing the interior-point method (IPM) to solve very large-scale linear programs (LPs). Traditional IPMs typically use Newton’s method to approximately solve a subproblem that aims to minimize a log-barrier penalty function at each iteration. Due to its connection to Newton’s method, IPM is …
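
For reference, the log-barrier subproblem mentioned here is the textbook one: for an LP in standard form $\min\{c^{\top}x : Ax=b,\ x\geq 0\}$ with barrier parameter $\mu>0$,

\[
\min_{x}\ c^{\top}x - \mu \sum_{i=1}^{n} \log x_i \quad \text{subject to} \quad Ax = b,
\]

whose minimizers trace the central path as $\mu \downarrow 0$. As we read the abstract, the proposed framework replaces the per-iteration Newton solve of this subproblem with an ADMM-based solver better suited to very large scale.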

Tightness of a new and enhanced semidefinite relaxation for MIMO detection

In this paper, we consider a fundamental problem in modern digital communications known as multiple-input multiple-output (MIMO) detection, which can be formulated as a complex quadratic programming problem subject to unit-modulus and discrete argument constraints. Various semidefinite relaxation (SDR) based algorithms have been proposed in the literature to solve this problem. In this paper, we …
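
To fix ideas, the detection problem and the basic SDR can be written as follows (our notation; the paper's enhanced relaxation adds constraints beyond this baseline). With channel matrix $H$, received signal $r$, and an $M$-ary phase constellation, maximum-likelihood detection reads

\[
\min_{x \in \mathbb{C}^n}\ \|Hx - r\|_2^2 \quad \text{s.t.}\quad |x_i| = 1,\ \ \arg(x_i) \in \{2\pi k/M : k = 0,\dots,M-1\}.
\]

Homogenizing with $y = (x^{\top}, 1)^{\top}$ gives $\|Hx-r\|_2^2 = y^{\mathsf H} Q y$ with $Q = \begin{pmatrix} H^{\mathsf H}H & -H^{\mathsf H}r \\ -r^{\mathsf H}H & r^{\mathsf H}r \end{pmatrix}$, and lifting $Y = yy^{\mathsf H}$ while dropping the rank-one constraint yields the basic SDR

\[
\min_{Y \succeq 0}\ \operatorname{tr}(QY) \quad \text{s.t.}\quad Y_{ii} = 1,\ i = 1,\dots,n+1.
\]

The relaxation is tight exactly when an optimal $Y$ is rank one, which is the kind of condition such tightness analyses establish; the discrete-argument constraints call for additional valid constraints, which is where an "enhanced" relaxation comes in.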

Primal-Dual Optimization Algorithms over Riemannian Manifolds: an Iteration Complexity Analysis

In this paper we study nonconvex and nonsmooth multi-block optimization over Riemannian manifolds with coupled linear constraints. Such optimization problems arise naturally in machine learning, statistical learning, compressive sensing, image processing, and tensor PCA, among other areas. We develop an ADMM-like primal-dual approach based on decoupled, solvable subroutines such as linearized proximal mappings. First, we introduce …
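
As one schematic reading of "decoupled solvable subroutines": with $\mathcal{L}_{\gamma}$ the augmented Lagrangian of the linearly coupled constraint, each block can take a linearized proximal step that stays on its manifold via a retraction, followed by a dual update (our generic template, not the paper's exact scheme):

\[
x_i^{k+1} = \operatorname{Retr}_{x_i^k}\!\big(-\eta\, \operatorname{grad}_{x_i} \mathcal{L}_{\gamma}(x^k;\lambda^k)\big), \qquad \lambda^{k+1} = \lambda^k - \gamma\Big(\sum_i A_i x_i^{k+1} - b\Big),
\]

where $\operatorname{grad}$ denotes the Riemannian gradient and $\operatorname{Retr}$ a retraction on the block's manifold $\mathcal{M}_i$. Each block update is then a cheap, decoupled computation even though the constraint couples all blocks.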

Vector Transport-Free SVRG with General Retraction for Riemannian Optimization: Complexity Analysis and Practical Implementation

In this paper, we propose a vector transport-free stochastic variance reduced gradient (SVRG) method with general retraction for empirical risk minimization over a Riemannian manifold. Existing SVRG methods on manifolds usually consider a specific retraction operation and involve additional computational costs such as parallel transport or vector transport. The vector transport-free SVRG with general retraction we …
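
To make "vector transport-free" concrete: for a manifold embedded in Euclidean space, one can form the usual SVRG difference with ambient (Euclidean) gradients and project it once onto the tangent space at the current iterate, so no transport of stored tangent vectors is needed. The toy sketch below (our construction: least squares on the unit sphere with the normalization retraction) illustrates the pattern and is not the paper's algorithm verbatim.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 10
    A = rng.standard_normal((n, d))
    x_true = rng.standard_normal(d)
    x_true /= np.linalg.norm(x_true)
    b = A @ x_true + 0.01 * rng.standard_normal(n)

    # f(x) = (1/2n) ||Ax - b||^2 restricted to the unit sphere ||x|| = 1.
    def egrad(x, idx):                      # Euclidean gradient on a minibatch
        r = A[idx] @ x - b[idx]
        return A[idx].T @ r / len(idx)

    def proj(x, g):                         # tangent-space projection at x
        return g - (x @ g) * x

    def retract(x, v):                      # normalization retraction on sphere
        y = x + v
        return y / np.linalg.norm(y)

    x = retract(rng.standard_normal(d), 0.0)  # random point on the sphere
    eta, epochs, m = 0.1, 30, n
    full = np.arange(n)
    for _ in range(epochs):
        x_snap = x.copy()
        g_snap = egrad(x_snap, full)        # full gradient at the snapshot
        for _ in range(m):
            i = rng.integers(n, size=1)
            # Transport-free SVRG direction: combine ambient gradients,
            # then project once at the current point (our reading).
            v = proj(x, egrad(x, i) - egrad(x_snap, i) + g_snap)
            x = retract(x, -eta * v)

    print(0.5 * np.mean((A @ x - b) ** 2))  # final objective value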

Randomized Primal-Dual Proximal Block Coordinate Updates

In this paper we propose a randomized primal-dual proximal block coordinate updating framework for a general multi-block convex optimization model with coupled objective function and linear constraints. Assuming mere convexity, we establish its $O(1/t)$ convergence rate in terms of the objective value and feasibility measure. The framework includes several existing algorithms as special cases such …
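
Schematically, one randomized primal-dual proximal block update of this type draws a block $i_t$ uniformly at random, takes a proximal step on the augmented Lagrangian $\mathcal{L}_{\gamma}$ in that block only, and then updates the multiplier (our generic template; the paper's framework is more general):

\[
x_{i_t}^{t+1} = \arg\min_{x_{i_t}}\ \mathcal{L}_{\gamma}\big(x_{i_t}, x_{\neq i_t}^{t}; \lambda^{t}\big) + \tfrac{1}{2}\big\|x_{i_t} - x_{i_t}^{t}\big\|_{P_{i_t}}^{2}, \qquad \lambda^{t+1} = \lambda^{t} - \rho\Big(\sum_i A_i x_i^{t+1} - b\Big),
\]

with the other blocks left unchanged and $P_{i} \succeq 0$ a proximal weight that can absorb a linearization of the coupled objective term. The $O(1/t)$ rate stated in the abstract is measured in objective value and a feasibility measure.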

Structured Nonconvex and Nonsmooth Optimization: Algorithms and Iteration Complexity Analysis

Nonconvex optimization problems are frequently encountered in much of statistics, business, science, and engineering, but they are not yet widely recognized as a technology. A reason for this relatively low degree of popularity is the lack of a well-developed system of theory and algorithms to support the applications, as is the case for its …

Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems

The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. Specifically, it has been proven that the 2-block ADMM globally converges for any penalty parameter $\gamma>0$. In this …
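
For reference, for $\min\{f_1(x_1)+f_2(x_2)+f_3(x_3) : A_1x_1+A_2x_2+A_3x_3 = b\}$ the unmodified 3-block ADMM performs a Gauss-Seidel sweep on the augmented Lagrangian

\[
\mathcal{L}_{\gamma}(x_1,x_2,x_3;\lambda) = \sum_{i=1}^{3} f_i(x_i) - \lambda^{\top}\Big(\sum_{i=1}^{3} A_i x_i - b\Big) + \frac{\gamma}{2}\Big\|\sum_{i=1}^{3} A_i x_i - b\Big\|^{2},
\]

minimizing over $x_1$, then $x_2$, then $x_3$, each with the other two blocks fixed at their latest values, and then sets $\lambda^{k+1} = \lambda^{k} - \gamma\big(\sum_{i} A_i x_i^{k+1} - b\big)$. As we read the title, "unmodified" indicates that no correction step or proximal perturbation is added to this plain sweep.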

First-Order Algorithms for Convex Optimization with Nonseparable Objective and Coupled Constraints

In this paper we consider a block-structured convex optimization model in which the block variables are nonseparable in the objective and are further linearly coupled in the constraints. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it …

Iteration Complexity Analysis of Multi-Block ADMM for a Family of Convex Minimization without Strong Convexity

The alternating direction method of multipliers (ADMM) is widely used for solving structured convex optimization problems due to its superior practical performance. On the theoretical side, however, a counterexample was given in [7] showing that the multi-block ADMM for minimizing the sum of $N$ $(N\geq 3)$ convex functions with $N$ block variables linked by linear …