Lyapunov-based Analysis on First Order Method for Composite Strong-Weak Convex Functions

Nesterov’s accelerated gradient (NAG) method generalizes the classical gradient descent algorithm by improving the convergence rate from $\mathcal{O}\left(\frac{1}{t}\right)$ to $\mathcal{O}\left(\frac{1}{t^2}\right)$ in convex optimization. This study examines the proximal gradient framework for additively separable composite functions with smooth and non-smooth components. We demonstrate that Nesterov’s accelerated proximal gradient (NAPG$_\alpha$) method attains a convergence rate of …
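As context for the composite setting described above, the following is a minimal sketch of a generic FISTA-style accelerated proximal gradient iteration for $\min_x f(x) + g(x)$ with smooth $f$ and non-smooth $g$. The function names (`accelerated_proximal_gradient`, `grad_f`, `prox_g`) and the LASSO test problem are illustrative assumptions, not the paper's NAPG$_\alpha$ scheme or its analysis.

```python
import numpy as np

def accelerated_proximal_gradient(grad_f, prox_g, x0, step, n_iters=500):
    """FISTA-style accelerated proximal gradient for min_x f(x) + g(x),
    with f smooth (gradient grad_f) and g non-smooth (proximal map prox_g).
    A standard textbook sketch, not the paper's NAPG_alpha variant."""
    x = x_prev = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # extrapolation (momentum) step
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        # forward (gradient) step on f, backward (prox) step on g
        x_prev, x = x, prox_g(y - step * grad_f(y), step)
        t = t_next
    return x

if __name__ == "__main__":
    # Illustrative composite instance: LASSO, f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((40, 20)), rng.standard_normal(40), 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)  # soft-thresholding
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of grad_f
    x_hat = accelerated_proximal_gradient(grad_f, prox_g, np.zeros(20), step)
    print(x_hat[:5])
```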

Douglas-Rachford method for the feasibility problem involving a circle and a disc

The Douglas-Rachford algorithm is a classical and successful method for solving feasibility problems. Here, we provide a region of global convergence of the algorithm for the feasibility problem involving a disc and a circle in two-dimensional Euclidean space. Citation: 1. Borwein, J.M., Sims, B.: The Douglas-Rachford algorithm in the absence of …
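To make the iteration concrete, here is a small sketch of the plain Douglas-Rachford scheme applied to projections onto a circle and a disc. The particular centres, radii, starting point, and helper names (`proj_circle`, `proj_disc`, `douglas_rachford`) are hypothetical and do not reproduce the convergence region established in the paper.

```python
import numpy as np

def proj_circle(x, center, r):
    """Project onto the circle {z : ||z - center|| = r} (nearest boundary point).
    At x == center the projection is not unique; an arbitrary point is returned."""
    d = x - center
    n = np.linalg.norm(d)
    if n == 0.0:
        return center + np.array([r, 0.0])
    return center + r * d / n

def proj_disc(x, center, r):
    """Project onto the closed disc {z : ||z - center|| <= r}."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= r else center + r * d / n

def douglas_rachford(x0, proj_A, proj_B, n_iters=200):
    """Plain Douglas-Rachford iteration x <- x + P_B(2 P_A(x) - x) - P_A(x)
    for the feasibility problem: find z in A ∩ B."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        pa = proj_A(x)
        x = x + proj_B(2.0 * pa - x) - pa
    return proj_A(x)  # shadow point, candidate feasible point

if __name__ == "__main__":
    # Hypothetical instance: unit circle at the origin and a unit disc centred at (1, 0),
    # which intersect, so the shadow sequence should approach a common point.
    circle = lambda x: proj_circle(x, np.array([0.0, 0.0]), 1.0)
    disc = lambda x: proj_disc(x, np.array([1.0, 0.0]), 1.0)
    z = douglas_rachford(np.array([2.0, 2.0]), circle, disc)
    print(z, np.linalg.norm(z), np.linalg.norm(z - np.array([1.0, 0.0])))
```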

An Approximate Lagrange Multiplier Rule

In this paper, we show that for a large class of optimization problems, the Lagrange multiplier rule can be derived from the so-called approximate multiplier rule. In establishing the link between the approximate and the exact multiplier rules, we first derive an approximate multiplier rule for a very general class of optimization problems using the …
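For reference, the classical (exact) Fritz John form of the multiplier rule for a smooth constrained problem is recalled below; the paper's approximate rule and its precise assumptions are not reproduced here.

```latex
% Exact multiplier rule for
%   min f(x)  s.t.  g_i(x) <= 0  (i = 1,...,m),   h_j(x) = 0  (j = 1,...,p),
% at a local minimizer x^{*} (Fritz John conditions):
\[
  \exists\, (\lambda_0, \lambda, \mu) \neq 0,\ \lambda_0 \ge 0,\ \lambda_i \ge 0:
  \quad
  \lambda_0 \nabla f(x^{*}) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^{*})
  + \sum_{j=1}^{p} \mu_j \nabla h_j(x^{*}) = 0,
  \qquad
  \lambda_i\, g_i(x^{*}) = 0 \ \ (i = 1,\dots,m).
\]
```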