A Generalized Formulation for Group Selection via ADMM

This paper studies a statistical learning model in which the model coefficients have a pre-determined, non-overlapping group sparsity structure. We consider a combination of a loss function and a regularizer to recover the desired group sparsity patterns, a formulation that subsumes many existing works. We analyze the directional stationary solution of the proposed formulation, obtaining a sufficient …
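
The excerpt does not reproduce the formulation itself. As a generic, hedged illustration, a loss-plus-group-regularizer problem of the kind described can be written as follows (the symbols below are placeholders, not the paper's notation):

$$\min_{x \in \mathbb{R}^n} \; \ell(x) + \lambda \sum_{g \in \mathcal{G}} \phi\big(\|x_g\|_2\big),$$

where $\ell$ is the loss, $\mathcal{G}$ partitions the coefficients into non-overlapping groups, $x_g$ is the sub-vector indexed by group $g$, $\phi$ is a sparsity-inducing scalar function (possibly nonconvex), and $\lambda > 0$ trades data fit against group sparsity.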

Iteratively Reweighted Group Lasso based on Log-composite Regularization

This paper addresses supervised learning problems with structured sparsity, where subsets of the model coefficients form distinct groups. We introduce a novel log-composite regularizer, combined with a loss function (e.g., least squares) in a bi-criteria optimization problem, to reconstruct the desired group sparsity structure. We develop an iteratively reweighted algorithm that solves the group LASSO …
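
As a rough illustration of the iteratively reweighted idea, the sketch below alternates between solving a weighted group Lasso subproblem (here by proximal gradient) and refreshing per-group weights from the current iterate. The weight rule $w_g = 1/(\|x_g\|_2 + \epsilon)$, the inner solver, and all parameter choices are assumptions for illustration, not the paper's log-composite scheme.

```python
import numpy as np

def group_shrink(v, t):
    """Prox of t * ||v||_2: shrink the whole group toward zero."""
    nrm = np.linalg.norm(v)
    return np.zeros_like(v) if nrm <= t else (1.0 - t / nrm) * v

def weighted_group_lasso(A, b, groups, weights, lam, step, n_iter=200):
    """Proximal gradient for 0.5*||Ax - b||^2 + lam * sum_g w_g * ||x_g||_2."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - b))             # gradient step on the loss
        for g, w in zip(groups, weights):
            x[g] = group_shrink(z[g], step * lam * w)  # group-wise prox step
    return x

def reweighted_group_lasso(A, b, groups, lam, eps=1e-3, n_outer=10):
    """Outer loop: refresh per-group weights from the current iterate
    (an illustrative 1/(norm + eps) rule, not the paper's update)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2             # 1/L for the least-squares loss
    weights = np.ones(len(groups))
    x = np.zeros(A.shape[1])
    for _ in range(n_outer):
        x = weighted_group_lasso(A, b, groups, weights, lam, step)
        weights = np.array([1.0 / (np.linalg.norm(x[g]) + eps) for g in groups])
    return x
```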

High-Order Evaluation Complexity for Convexly-Constrained Optimization with Non-Lipschitzian Group Sparsity Terms

This paper studies high-order evaluation complexity for partially separable convexly-constrained optimization involving non-Lipschitzian group sparsity terms in a nonconvex objective function. We propose a partially separable adaptive regularization algorithm using a $p$-th order Taylor model and show that the algorithm can produce an $(\epsilon,\delta)$-approximate $q$-th-order stationary point in at most $O(\epsilon^{-(p+1)/(p-q+1)})$ evaluations of the objective …

Group Sparse Optimization by Alternating Direction Method

This paper proposes efficient algorithms for group sparse optimization with mixed $\ell_{2,1}$-regularization, which arises in the reconstruction of group sparse signals in compressive sensing and in the group Lasso problem in statistics and machine learning. It is known that encoding group information in addition to sparsity leads to better signal recovery and feature selection. The $\ell_{2,1}$-regularization …
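
As a hedged sketch of how an alternating direction method applies to the $\ell_{2,1}$-regularized least-squares problem: introduce a copy $z$ of $x$, then alternate a quadratic $x$-update, a group-soft-thresholding $z$-update, and a dual update. The split, the penalty parameter $\rho$, and the solver details below are standard textbook choices, not necessarily those of the paper.

```python
import numpy as np

def prox_l21(v, t, groups):
    """Proximal operator of t * sum_g ||v_g||_2 (group soft-thresholding)."""
    out = v.copy()
    for g in groups:
        nrm = np.linalg.norm(v[g])
        out[g] = 0.0 if nrm <= t else (1.0 - t / nrm) * v[g]
    return out

def admm_group_lasso(A, b, groups, lam, rho=1.0, n_iter=300):
    """ADMM for min_x 0.5*||Ax - b||^2 + lam * sum_g ||x_g||_2
    via the split x = z (scaled-dual form)."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u: scaled dual variable
    Atb = A.T @ b
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))         # cache the x-update solve
    for _ in range(n_iter):
        x = M @ (Atb + rho * (z - u))                    # quadratic x-update
        z = prox_l21(x + u, lam / rho, groups)           # group shrinkage z-update
        u = u + x - z                                    # dual ascent
    return z
```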

Efficient Block-coordinate Descent Algorithms for the Group Lasso

We present two algorithms for solving the Group Lasso problem [Yuan & Lin]. First, we propose a general version of the Block Coordinate Descent (BCD) algorithm for the Group Lasso that employs an efficient approach for optimizing each subproblem. We show that it exhibits excellent performance when the groups are of moderate size. For large …
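
For concreteness, here is a minimal sketch of cyclic block-coordinate descent on the Group Lasso objective, under the assumption that the columns within each group have been orthonormalized so that every block subproblem has a closed-form group-shrinkage solution. The group encoding, sweep count, and stopping rule are illustrative, not the paper's algorithms.

```python
import numpy as np

def bcd_group_lasso(A, b, groups, lam, n_sweeps=100):
    """Cyclic BCD for min_x 0.5*||Ax - b||^2 + lam * sum_g ||x_g||_2,
    assuming A_g^T A_g = I for every group g (columns orthonormalized),
    so each block update reduces to an exact group soft-threshold."""
    x = np.zeros(A.shape[1])
    r = b - A @ x                      # running residual
    for _ in range(n_sweeps):
        for g in groups:
            Ag = A[:, g]
            r += Ag @ x[g]             # remove group g's contribution
            c = Ag.T @ r               # minimizer of the smooth part over x_g
            nrm = np.linalg.norm(c)
            x[g] = 0.0 if nrm <= lam else (1.0 - lam / nrm) * c
            r -= Ag @ x[g]             # restore residual with the new x_g
    return x
```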

Sparse optimization with least-squares constraints

The use of convex optimization for the recovery of sparse signals from incomplete or compressed data is now common practice. Motivated by the success of basis pursuit in recovering sparse vectors, new formulations have been proposed that take advantage of different types of sparsity. In this paper we propose an efficient algorithm for solving a …
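
The excerpt breaks off before stating the formulation. For orientation, a common sparse-recovery problem with a least-squares constraint in this line of work is the basis pursuit denoise problem (the symbols $A$, $b$, $\sigma$ below are generic placeholders, not the paper's notation):

$$\min_{x} \; \|x\|_1 \quad \text{subject to} \quad \|Ax - b\|_2 \le \sigma,$$

and formulations for other types of sparsity replace $\|x\|_1$ with a suitable gauge, for example the group norm $\sum_{g} \|x_g\|_2$.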