Composite optimization models via proximal gradient method with increasing adaptive stepsizes

We first consider convex composite optimization models in which a local Lipschitz condition is imposed on the gradient of the differentiable term. The classical proximal gradient method is studied with our new stepsize selection strategy. The proposed stepsizes can be computed conveniently in explicit form. The sequence of our stepsizes is proved to …
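To make the setting concrete, the following is a minimal sketch of the proximal gradient iteration for the composite model min f(x) + g(x) with f smooth and g = λ‖·‖₁, i.e. the lasso problem. Since the excerpt does not give the authors' explicit adaptive stepsize rule, the sketch uses the standard constant stepsize 1/L (L the Lipschitz constant of ∇f) as a placeholder; the function names are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, steps=500):
    # Solves min_x 0.5*||A x - b||^2 + lam*||x||_1 by proximal gradient.
    # Placeholder stepsize 1/L with L = ||A||_2^2, the Lipschitz constant
    # of the gradient A^T (A x - b); the paper replaces this with an
    # explicitly computable increasing adaptive stepsize.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)          # gradient step on the smooth term
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the nonsmooth term
    return x
```

Each iteration costs one gradient evaluation and one closed-form prox; the stepsize rule is the only ingredient that changes in the method studied here.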