Composite optimization models via proximal gradient method with a novel enhanced adaptive stepsize

We first consider convex composite optimization models in which the gradient of the differentiable term is only locally Lipschitz continuous. The classical proximal gradient method is studied with our novel enhanced adaptive stepsize selection. To establish convergence of the proposed algorithm, we prove a sufficient-decrease-type inequality associated with our new …
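The setting above is the classical proximal gradient iteration x_{k+1} = prox_{t_k g}(x_k − t_k ∇f(x_k)) for minimizing f + g. The sketch below illustrates it on a lasso problem; the stepsize shown is a standard Barzilai–Borwein local-curvature estimate, NOT the paper's enhanced adaptive rule, which is not reproduced here.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, iters=500):
    """Proximal gradient for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Stepsize: a Barzilai-Borwein-style local curvature estimate with a
    safeguard -- a common adaptive choice used here purely for
    illustration; it is not the enhanced stepsize of the paper.
    """
    x = np.zeros(A.shape[1])
    grad = A.T @ (A @ x - b)
    step = 1.0 / np.linalg.norm(A.T @ A, 2)  # safe initial step: 1/L
    for _ in range(iters):
        # Forward (gradient) step followed by backward (proximal) step.
        x_new = soft_threshold(x - step * grad, step * lam)
        grad_new = A.T @ (A @ x_new - b)
        dx, dg = x_new - x, grad_new - grad
        denom = dx @ dg
        if denom > 1e-12:
            step = (dx @ dx) / denom  # BB1 estimate of 1/curvature
        x, grad = x_new, grad_new
    return x
```

With A the identity, the minimizer is the soft-thresholded data, which the iteration recovers quickly; for general A the BB estimate adapts the step to the local curvature along the trajectory.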

A new proximal gradient algorithm for solving mixed variational inequality problems with a novel explicit stepsize and applications

In this paper, we propose a new algorithm for solving monotone mixed variational inequality problems in real Hilbert spaces, based on the proximal gradient method. Our algorithm uses a novel explicit stepsize that provably increases to a positive limit; this property plays an important role in improving the speed of the algorithm. …
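For a mixed variational inequality — find x* such that ⟨F(x*), y − x*⟩ + g(y) − g(x*) ≥ 0 for all y — the basic proximal-gradient (forward–backward) iteration is x_{k+1} = prox_{t g}(x_k − t F(x_k)). A minimal sketch with g the indicator of a box (so the prox is a projection) and a constant stepsize follows; the paper's novel explicit increasing stepsize rule is not reproduced, and the operator, bounds, and test problem are illustrative assumptions.

```python
import numpy as np

def project_box(v, lo, hi):
    # prox of the indicator of the box [lo, hi]: Euclidean projection.
    return np.clip(v, lo, hi)

def mvi_forward_backward(F, prox, x0, step=0.1, iters=2000):
    """Forward-backward iteration x_{k+1} = prox(x_k - step * F(x_k))
    for the mixed variational inequality
        <F(x*), y - x*> + g(y) - g(x*) >= 0  for all y.

    A constant stepsize is used purely for illustration; it is not the
    explicit increasing stepsize proposed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = prox(x - step * F(x))
    return x
```

For a strongly monotone affine operator F(x) = Mx + q with M symmetric positive definite and a box that contains the unconstrained solution, the iteration is a contraction for a small enough constant step, so it converges linearly to the solution of the MVI.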