This paper addresses supervised learning problems with structured sparsity, in which subsets of model coefficients form distinct groups. We introduce a novel log-composite regularizer, combined with a loss function (e.g., least squares) in a bi-criteria optimization problem, in order to recover the desired group sparsity structure. We develop an iteratively reweighted algorithm that repeatedly solves a group LASSO subproblem (a classic method that sums the l2 norms of the groups) until convergence, updating the group weights at each iteration. Compared with the traditional group LASSO, our approach makes a group whose coefficients were small at the previous iterate more likely to be set to zero. Theoretical results include a minimizing property of the proposed model as well as the convergence of the iterative algorithm to a (directional) stationary solution under mild conditions. Extensive experiments on synthetic and real datasets show that our method outperforms state-of-the-art methods in linear regression and binary classification.
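The iteratively reweighted scheme described above can be sketched as follows. This is a minimal illustration, not the paper's method: the inner weighted group LASSO subproblem is solved here by proximal gradient descent with block soft-thresholding, and the weight update `1 / (||b_g|| + eps)` is a common reweighting heuristic standing in for the update induced by the paper's log-composite regularizer. All function names and parameters are illustrative assumptions.

```python
import numpy as np

def prox_group(v, t):
    """Block soft-thresholding: proximal operator of t * ||v||_2."""
    nrm = np.linalg.norm(v)
    return np.zeros_like(v) if nrm <= t else (1.0 - t / nrm) * v

def weighted_group_lasso(X, y, groups, w, lam, iters=500):
    """Proximal gradient for 0.5*||y - X b||^2 + lam * sum_g w_g ||b_g||_2."""
    b = np.zeros(X.shape[1])
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    for _ in range(iters):
        z = b - X.T @ (X @ b - y) / L      # gradient step
        for g, idx in enumerate(groups):   # blockwise proximal step
            b[idx] = prox_group(z[idx], lam * w[g] / L)
    return b

def irw_group_lasso(X, y, groups, lam=0.1, eps=1e-3, outer=10):
    """Iteratively reweighted group LASSO (sketch): groups that were small
    at the previous iterate get larger weights, so they are more likely
    to be zeroed out in the next subproblem."""
    w = np.ones(len(groups))
    for _ in range(outer):
        b = weighted_group_lasso(X, y, groups, w, lam)
        w = 1.0 / (np.array([np.linalg.norm(b[idx]) for idx in groups]) + eps)
    return b
```

The outer loop mirrors the abstract's description: each iteration is a (weighted) group LASSO solve, and the weight update penalizes small groups more heavily than the unweighted group LASSO would.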