Composite optimization models via proximal gradient method with increasing adaptive stepsizes

We first consider convex composite optimization models without a global Lipschitz condition imposed on the gradient of the differentiable term. We study the classical proximal gradient method with a new strategy for stepsize selection. The idea for constructing such a stepsize is motivated by the one in \cite{hoai} used for the gradient …
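Since the abstract is truncated before the stepsize rule is stated, the following is only a generic sketch of the classical proximal gradient method on a lasso instance, $\min_x \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$. It uses a standard backtracking stepsize in place of the paper's increasing adaptive rule, precisely because no global Lipschitz constant is assumed known; all function and variable names are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, x0, step0=1.0, iters=200):
    """Classical proximal gradient for 0.5||Ax-b||^2 + lam*||x||_1.

    The stepsize is chosen by backtracking on the smooth part only,
    so no global Lipschitz constant for the gradient is needed.
    This is an illustrative stand-in for the paper's adaptive rule.
    """
    x = x0.astype(float).copy()
    t = step0
    f = lambda z: 0.5 * np.sum((A @ z - b) ** 2)  # smooth part
    for _ in range(iters):
        g = A.T @ (A @ x - b)  # gradient of the smooth part
        while True:
            x_new = soft_threshold(x - t * g, t * lam)
            d = x_new - x
            # Sufficient-decrease test against the quadratic upper model.
            if f(x_new) <= f(x) + g @ d + (d @ d) / (2 * t) + 1e-12:
                break
            t *= 0.5  # shrink the stepsize until the model holds locally
        x = x_new
    return x
```

On a small noiseless problem this drives the composite objective down and recovers the sparse signal up to the $\lambda$-induced bias; the backtracking loop probes only local curvature, which is what permits dropping the global Lipschitz assumption.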