We consider the minimization of a function $G$ defined on $\mathbb{R}^N$, which is the sum of a (not necessarily convex) differentiable function and a (not necessarily differentiable) convex function. Moreover, we assume that $G$ satisfies the Kurdyka-Łojasiewicz property. Such a problem can be solved with the Forward-Backward algorithm. However, this algorithm may suffer from slow convergence. We propose an acceleration strategy based on the use of variable metrics and on the Majorize-Minimize principle. We give conditions under which the sequence generated by the resulting Variable Metric Forward-Backward algorithm converges to a critical point of $G$. Numerical results illustrate the performance of the proposed algorithm in an image reconstruction application.
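As a point of reference for the scheme being accelerated, here is a minimal sketch of the standard Forward-Backward (proximal gradient) iteration on the model problem $G(x) = \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$, where the smooth term is handled by a gradient (forward) step and the convex nonsmooth term by a proximal (backward) step. The problem instance and function names are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, step, n_iter=500):
    # Minimize G(x) = 0.5 * ||A x - b||^2 + lam * ||x||_1.
    # Forward step: gradient descent on the differentiable term.
    # Backward step: prox of the convex term (here, soft-thresholding).
    # Requires step < 2 / L, with L the Lipschitz constant of the gradient
    # (largest eigenvalue of A^T A).
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                 # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step
    return x
```

The variable-metric variant studied in the paper replaces the scalar `step` by a sequence of positive-definite matrices, built here via the Majorize-Minimize principle, which preconditions the gradient step and can substantially speed up convergence; this sketch only shows the baseline fixed-metric iteration.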
Variable Metric Forward-Backward algorithm for minimizing the sum of a differentiable function and a convex function