A Generalized Formulation for Group Selection via ADMM

This paper studies a statistical learning model in which the model coefficients have a pre-determined, non-overlapping group sparsity structure. We consider a combination of a loss function and a regularizer to recover the desired group sparsity patterns, a formulation that encompasses many existing works as special cases. We analyze the directional stationary solutions of the proposed formulation, obtaining a sufficient …
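As a concrete illustration of this class of formulations, the special case of a least-squares loss with the convex ℓ2,1 (group lasso) regularizer can be solved by ADMM with a closed-form block soft-thresholding step. The sketch below is illustrative only; the function names, the fixed penalty parameter `rho`, and the stopping rule are our own choices, not the paper's.

```python
import numpy as np

def group_soft_threshold(v, thresh):
    """Block soft-thresholding: proximal operator of thresh * ||v||_2."""
    norm = np.linalg.norm(v)
    if norm <= thresh:
        return np.zeros_like(v)
    return (1.0 - thresh / norm) * v

def admm_group_lasso(A, b, groups, lam=0.1, rho=1.0, n_iter=500):
    """ADMM for min_x 0.5 ||Ax - b||^2 + lam * sum_g ||x_g||_2.

    `groups` is a list of index arrays partitioning the coefficients
    (the pre-determined non-overlapping group structure).
    """
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    # Cache the Cholesky factor for the x-update's linear system.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u).
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: groupwise block soft-thresholding.
        for g in groups:
            z[g] = group_soft_threshold(x[g] + u[g], lam / rho)
        # Dual (scaled) update.
        u += x - z
    return z
```

The z-iterate is returned because the thresholding step makes it exactly group-sparse, whereas the x-iterate is only approximately so.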

Iteratively Reweighted Group Lasso based on Log-composite Regularization

This paper addresses supervised learning problems with structured sparsity, where subsets of the model coefficients form distinct groups. We introduce a novel log-composite regularizer, paired with a loss function (e.g., least squares) in a bi-criteria optimization problem, in order to reconstruct the desired group sparsity structure. We develop an iteratively reweighted algorithm that solves the group LASSO …
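The general shape of such a scheme, alternating between a weighted group lasso subproblem and a weight update, can be sketched as follows. Since the paper's log-composite regularizer is not reproduced here, the sketch substitutes the classical log surrogate log(ε + ‖x_g‖), whose linearization yields the familiar weights w_g = 1/(ε + ‖x_g‖); all names and parameter values are illustrative.

```python
import numpy as np

def prox_group(v, t):
    """Block soft-thresholding: proximal operator of t * ||v||_2."""
    nrm = np.linalg.norm(v)
    return np.zeros_like(v) if nrm <= t else (1.0 - t / nrm) * v

def weighted_group_lasso(A, b, groups, weights, lam, n_iter=300):
    """Proximal gradient for min_x 0.5 ||Ax - b||^2 + lam * sum_g w_g ||x_g||_2."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const. of the gradient
    for _ in range(n_iter):
        v = x - step * (A.T @ (A @ x - b))
        for g, w in zip(groups, weights):
            x[g] = prox_group(v[g], step * lam * w)
    return x

def reweighted_group_lasso(A, b, groups, lam=0.5, eps=1e-2, n_outer=10):
    """Each outer pass solves a weighted group lasso whose weights
    w_g = 1/(eps + ||x_g||) linearize a log(eps + ||x_g||) penalty
    (an illustrative surrogate, not the paper's log-composite regularizer)."""
    weights = np.ones(len(groups))
    x = np.zeros(A.shape[1])
    for _ in range(n_outer):
        x = weighted_group_lasso(A, b, groups, weights, lam)
        weights = np.array([1.0 / (eps + np.linalg.norm(x[g])) for g in groups])
    return x
```

Groups whose norm shrinks toward zero receive ever larger weights and are pushed to exact zero, while well-supported groups see their weights, and hence their shrinkage bias, decrease across outer passes.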

Consistency Bounds and Support Recovery of D-stationary Solutions of Sparse Sample Average Approximations

This paper studies properties of the d(irectional)-stationary solutions of sparse sample average approximation (SAA) problems involving difference-of-convex (dc) sparsity functions in a deterministic setting. These properties are investigated with respect to a vector satisfying a verifiable assumption that relates the empirical SAA problem to the expectation minimization problem defined by an underlying data distribution. …
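As one concrete instance of this setup (the paper treats a general class of dc sparsity functions), the capped-ℓ1 penalty admits an explicit difference-of-convex decomposition inside the SAA objective:

```latex
\min_{x \in \mathbb{R}^n} \;
  \frac{1}{N}\sum_{i=1}^{N} \ell(x;\xi_i)
  \;+\; \sum_{j=1}^{n} p\bigl(|x_j|\bigr),
\qquad
p(t) \;=\; \lambda \min(t,\delta)
     \;=\; \underbrace{\lambda\, t}_{\text{convex}}
     \;-\; \underbrace{\lambda \max(t-\delta,\,0)}_{\text{convex}},
\quad t \ge 0,
```

where the ξ_i are the observed samples, λ > 0 is the sparsity parameter, and δ > 0 caps the penalty so that large coefficients incur no additional bias; the dc structure is what makes directional stationarity the natural solution concept here.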