Convergence Analysis of an Inertial Dynamical System with Hessian-Driven Damping under θ-Parametrized Implicit–Explicit Discretization

In this paper, we consider an unconstrained composite convex optimisation problem. We propose an inertial forward–backward algorithm derived from an implicit–explicit discretisation of a second-order dynamical system with Hessian-driven damping. For α ≥ 3, we establish an O(1/t^2) convergence rate for the objective value gap. Furthermore, when α > 3, we prove that the …
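To make the abstract concrete, a minimal sketch of a generic inertial forward–backward (FISTA-style) iteration with momentum coefficient (k − 1)/(k + α − 1), α ≥ 3, is given below on a standard lasso test problem. This is only an illustration of the algorithm family; the paper's θ-parametrized scheme and Hessian-driven damping terms are assumptions not reproduced here.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (the backward/implicit step)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_forward_backward(A, b, lam, alpha=3.0, iters=500):
    """Illustrative iteration for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Generic sketch: forward (explicit) gradient step on the smooth part,
    backward (proximal) step on the nonsmooth part, plus Nesterov-type
    vanishing inertia with coefficient (k - 1)/(k + alpha - 1).
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    step = 1.0 / L
    x = x_prev = np.zeros(A.shape[1])
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + alpha - 1)   # inertia, requires alpha >= 3
        y = x + beta * (x - x_prev)        # extrapolation point
        grad = A.T @ (A @ y - b)           # forward (explicit) step
        x_prev, x = x, soft_threshold(y - step * grad, step * lam)
    return x
```

With A the identity the prox step is exact, so the iterates reach the soft-thresholded solution immediately; on general A the objective gap decays at the O(1/k^2) rate the abstract refers to.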

Lyapunov-based Analysis on First Order Method for Composite Strong-Weak Convex Functions

Nesterov’s accelerated gradient (NAG) method improves on the classical gradient descent algorithm, raising the convergence rate from $\mathcal{O}\left(\frac{1}{t}\right)$ to $\mathcal{O}\left(\frac{1}{t^2}\right)$ in convex optimization. This study examines the proximal gradient framework for additively separable composite functions with smooth and non-smooth components. We demonstrate that Nesterov’s accelerated proximal gradient (NAPG$_\alpha$) method attains a convergence rate of …
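The $\mathcal{O}(1/t)$ versus $\mathcal{O}(1/t^2)$ gap can be seen numerically. The sketch below runs plain gradient descent and the standard NAG recursion on a smooth convex quadratic and compares the remaining objective gaps; the test problem and iteration counts are illustrative choices, and the code uses the textbook NAG scheme, not the paper's NAPG$_\alpha$ variant.

```python
import numpy as np

def run(A, b, iters=200, accelerate=False):
    """Minimize f(x) = 0.5*x'Ax - b'x (A symmetric positive definite).

    With accelerate=False this is gradient descent with step 1/L;
    with accelerate=True it adds Nesterov momentum (k - 1)/(k + 2).
    Returns the final objective value.
    """
    L = np.linalg.norm(A, 2)               # Lipschitz constant of grad f
    x = y = np.zeros(len(b))
    for k in range(1, iters + 1):
        grad = A @ y - b
        x_new = y - grad / L               # gradient step at extrapolated point
        beta = (k - 1) / (k + 2) if accelerate else 0.0
        y = x_new + beta * (x_new - x)     # Nesterov extrapolation
        x = x_new
    return 0.5 * x @ A @ x - b @ x

# Ill-conditioned quadratic: the slow eigendirection exposes the rate gap.
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)
f_star = 0.5 * x_star @ A @ x_star - b @ x_star
gap_gd = run(A, b) - f_star
gap_nag = run(A, b, accelerate=True) - f_star
```

After the same budget of iterations, the accelerated run leaves a strictly smaller objective gap, consistent with the quadratic improvement in the worst-case rate.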