Unmatched Preconditioning of the Proximal Gradient Algorithm

This work addresses the resolution of penalized least-squares problems using the proximal gradient algorithm (PGA). It is known that PGA can be accelerated by preconditioning strategies. However, effective choices of preconditioner typically correspond to intricate matrices that are not easily inverted, and they increase the complexity of the proximity step. …
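The idea can be sketched as follows: apply a preconditioner to the gradient step only, while keeping the cheap, unpreconditioned proximity operator. The snippet below is a minimal illustration for an l1-penalized least-squares problem; the function and parameter names (unmatched_precond_pga, P_inv, gamma, lam) are ours, not the paper's, and the scheme is shown for intuition rather than as the authors' exact algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (componentwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def unmatched_precond_pga(A, b, lam, P_inv, gamma, n_iter=200):
    """Minimal sketch: proximal gradient for 0.5||Ax-b||^2 + lam*||x||_1,
    with a preconditioner applied to the gradient step only, so the cheap
    l1 prox is kept instead of being recomputed in the new metric."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                 # gradient of the smooth term
        x = soft_threshold(x - gamma * (P_inv @ grad), gamma * lam)
        # standard prox above; the preconditioner only rescales the gradient
    return x
```

Because the prox is not adapted to the preconditioned metric, the fixed points of this iteration need not be the original minimizers; analyzing that gap is precisely the kind of question the title refers to.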

SABRINA: A Stochastic Subspace Majorization-Minimization Algorithm

A wide class of problems involves the minimization of a coercive and differentiable function $F$ on $\mathbb{R}^N$ whose gradient cannot be evaluated exactly. In such a context, many existing convergence results from the standard gradient-based optimization literature cannot be directly applied, and robustness to errors in the gradient is not necessarily guaranteed. This work …
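As background, a (deterministic) subspace majorization-minimization step can be sketched for a simple least-squares $F$: at each iteration the quadratic majorant is minimized only over a small subspace, here spanned by the current gradient and the previous displacement. This is an illustrative sketch under our own simplifications; it omits SABRINA's stochastic, inexact-gradient machinery.

```python
import numpy as np

def subspace_mm_least_squares(A, b, n_iter=50):
    """Sketch of a majorization-minimization step restricted to a
    low-dimensional subspace (gradient + previous displacement), for
    F(x) = 0.5||Ax - b||^2, where the quadratic majorant is exact."""
    x = np.zeros(A.shape[1])
    x_prev = x.copy()
    for k in range(n_iter):
        grad = A.T @ (A @ x - b)
        # subspace spanned by the current gradient and the last displacement
        D = np.column_stack([grad, x - x_prev]) if k > 0 else grad[:, None]
        AD = A @ D
        # minimize the majorant over the subspace:
        # u* solves (D^T A^T A D) u = -D^T grad
        u, *_ = np.linalg.lstsq(AD.T @ AD, -D.T @ grad, rcond=None)
        x_prev, x = x, x + D @ u
    return x
```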

Sparse Approximations with Interior Point Methods

Large-scale optimization problems that seek sparse solutions have become ubiquitous. They are routinely solved with various specialized first-order methods. Although such methods are often fast, they usually struggle with not-so-well-conditioned problems. In this paper, specialized variants of an interior point-proximal method of multipliers are proposed and analyzed for problems of this class. Computational experience …
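For context, the kind of specialized first-order baseline such methods are compared against can be sketched with ISTA for the LASSO problem; its step size is tied to the squared spectral norm of A, which is exactly what degrades on poorly conditioned instances. Names below are illustrative, not the paper's.

```python
import numpy as np

def ista_lasso(A, b, lam, n_iter=500):
    """ISTA: a typical specialized first-order method for the LASSO
    problem min_x 0.5||Ax-b||^2 + lam*||x||_1. The step size 1/L shrinks
    as conditioning worsens, which is where interior-point approaches
    are claimed to help."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)
        v = x - g / L
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # l1 prox
    return x
```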

Convergence of Proximal Gradient Algorithm in the Presence of Adjoint Mismatch

We consider the proximal gradient algorithm for solving penalized least-squares minimization problems arising in data science. This first-order algorithm is attractive due to its flexibility and minimal memory requirements, which allow it to tackle large-scale minimization problems involving non-smooth penalties. However, for problems such as X-ray computed tomography, the applicability of the algorithm is dominated by the …
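The mismatch can be sketched as follows: a surrogate backprojector B replaces the exact adjoint A^T in the gradient step, so the iteration no longer follows the true least-squares gradient. The sketch below uses a nonnegativity constraint as the proximity step, a common choice in tomography; B, gamma and the function name are our illustrative assumptions.

```python
import numpy as np

def mismatched_pga(A, B, b, gamma, n_iter=200):
    """Sketch of the proximal gradient iteration when the exact adjoint
    A^T is replaced by an unmatched backprojector B (as in X-ray CT,
    where A^T is often approximated for speed). With B != A^T the
    'gradient' step no longer derives from the least-squares objective."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        pseudo_grad = B @ (A @ x - b)        # B stands in for A^T
        x = np.maximum(x - gamma * pseudo_grad, 0.0)  # prox: project onto x >= 0
    return x
```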

New convergence results for the inexact variable metric forward-backward method

Forward–backward methods are valid tools for solving a variety of optimization problems where the objective function is the sum of a smooth, possibly nonconvex term and a convex, possibly nonsmooth function. The corresponding iteration is built on two main ingredients: the computation of the gradient of the smooth part and the evaluation of the proximity …
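One common instance of the iteration can be sketched with a diagonal metric D, for which the scaled proximity operator of the l1 norm stays componentwise. The sketch below is our own minimal illustration, not the paper's inexact scheme; names such as diag_metric and g_weight are assumptions.

```python
import numpy as np

def variable_metric_fb(grad_f, g_weight, x0, diag_metric, step, n_iter=100):
    """Sketch of a variable-metric forward-backward iteration
    x+ = prox^D_{step*g}(x - step * D^{-1} grad_f(x)) with diagonal metric
    D and g = g_weight * ||.||_1. With D diagonal the scaled prox is
    componentwise soft thresholding with thresholds step*g_weight/d_i."""
    x = x0.copy()
    d = diag_metric                           # positive diagonal entries of D
    for _ in range(n_iter):
        v = x - step * grad_f(x) / d          # forward (scaled gradient) step
        t = step * g_weight / d               # componentwise thresholds
        x = np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # backward step
    return x
```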

Convergence of Inexact Forward–Backward Algorithms Using the Forward–Backward Envelope

This paper deals with a general framework for inexact forward–backward algorithms aimed at minimizing the sum of an analytic function and a lower semicontinuous, subanalytic, convex term. Such a framework relies on an implementable inexactness condition for the computation of the proximal operator, and on a linesearch procedure which is possibly performed whenever a variable metric is …
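For intuition, the forward-backward envelope associated with f + lam*||.||_1 can be evaluated in closed form from one forward-backward step, and a linesearch can monitor its decrease. The helper below is a hedged sketch with our own naming; the paper's inexactness condition is not reproduced.

```python
import numpy as np

def fbe_value(f, grad_f, lam, x, gamma):
    """Sketch: forward-backward envelope of f + lam*||.||_1,
    FBE_gamma(x) = f(x) + <grad_f(x), T(x)-x> + ||T(x)-x||^2/(2*gamma)
                   + lam*||T(x)||_1,
    where T(x) = prox_{gamma*lam*||.||_1}(x - gamma*grad_f(x))."""
    g = grad_f(x)
    v = x - gamma * g
    Tx = np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)
    d = Tx - x
    return f(x) + g @ d + (d @ d) / (2 * gamma) + lam * np.abs(Tx).sum()
```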

A search direction inspired primal-dual method for saddle point problems

The primal-dual hybrid gradient algorithm (PDHG), which is in fact the Arrow-Hurwicz method, has been widely used in image processing. However, the convergence of PDHG has been established in the literature only under some restrictive conditions, and it is still missing for the case without extra constraints. In this paper, from the perspective of the variational …
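A minimal PDHG sketch for a concrete saddle-point instance is given below: min over x of 0.5||x-c||^2 + ||Kx||_1, dualized through the l-infinity ball. Setting theta=0 recovers the Arrow-Hurwicz iteration mentioned above; the step-size condition and all names are standard textbook choices, not the paper's new method.

```python
import numpy as np

def pdhg(K, c, tau, sigma, theta=1.0, n_iter=300):
    """Sketch of PDHG for min_x 0.5||x-c||^2 + ||Kx||_1, written as the
    saddle problem min_x max_{||y||_inf <= 1} 0.5||x-c||^2 + <Kx, y>.
    theta=0 is the Arrow-Hurwicz method; theta=1 is the extrapolated
    variant. Requires tau * sigma * ||K||^2 < 1."""
    x = np.zeros(K.shape[1]); y = np.zeros(K.shape[0]); x_bar = x.copy()
    for _ in range(n_iter):
        y = np.clip(y + sigma * (K @ x_bar), -1.0, 1.0)   # dual prox: box projection
        x_new = (x - tau * (K.T @ y) + tau * c) / (1.0 + tau)  # primal prox
        x_bar = x_new + theta * (x_new - x)               # extrapolation step
        x = x_new
    return x
```

For a given matrix K, choosing tau = sigma = 0.9 / np.linalg.norm(K, 2) satisfies the step-size condition above.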

Deep Unfolding of a Proximal Interior Point Method for Image Restoration

Variational methods are widely applied to ill-posed inverse problems because they can embed prior knowledge about the solution. However, the performance of these methods depends significantly on a set of parameters, which can be estimated through computationally expensive and time-consuming procedures. In contrast, deep learning offers very generic and efficient …
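The unfolding principle itself can be sketched quickly: a fixed number of proximal-gradient iterations is rewritten as network layers whose step sizes and thresholds become trainable parameters. The PyTorch module below illustrates this generic idea on an l1-penalized least-squares model; it is not the paper's proximal interior point unrolling, and all names are ours.

```python
import torch
import torch.nn as nn

class UnrolledProxGrad(nn.Module):
    """Sketch of 'deep unfolding': K proximal-gradient iterations for
    0.5||Ax-b||^2 + lam*||x||_1 written as network layers whose step
    sizes and thresholds are learned from data instead of hand-tuned."""
    def __init__(self, A, n_layers=10):
        super().__init__()
        self.register_buffer("A", A)
        self.steps = nn.Parameter(0.1 * torch.ones(n_layers))        # learnable step sizes
        self.thresholds = nn.Parameter(0.01 * torch.ones(n_layers))  # learnable thresholds

    def forward(self, b):
        x = torch.zeros(self.A.shape[1], device=b.device)
        for gamma, t in zip(self.steps, self.thresholds):
            grad = self.A.T @ (self.A @ x - b)
            v = x - gamma * grad
            x = torch.sign(v) * torch.relu(torch.abs(v) - t)  # soft threshold
        return x
```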

ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration

We propose a method, called ACQUIRE, for the solution of constrained optimization problems modeling the restoration of images corrupted by Poisson noise. The objective function is the sum of a generalized Kullback-Leibler divergence term and a TV regularizer, subject to nonnegativity and possibly other constraints, such as flux conservation. ACQUIRE is a line-search method that …
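The iteratively reweighted norm mechanism can be sketched on a simpler model than ACQUIRE's (a quadratic data term instead of the Kullback-Leibler divergence, 1-D total variation instead of 2-D): each absolute value is majorized by a weighted quadratic, so every outer iteration reduces to a linear solve. All names and simplifications below are ours.

```python
import numpy as np

def irn_tv_denoise_1d(y, lam, n_iter=30, eps=1e-6):
    """Sketch of the iteratively reweighted norm (IRN) idea:
    min_x 0.5||x-y||^2 + lam*||Dx||_1 with D the 1-D finite-difference
    operator. Each |.| is majorized by a weighted quadratic, so every
    iteration solves the linear system (I + lam * D^T W D) x = y."""
    n = y.size
    D = np.diff(np.eye(n), axis=0)                 # finite-difference matrix
    x = y.copy()
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(D @ x), eps)   # reweighting from current x
        x = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), y)
    return x
```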

Derivative-Free Superiorization With Component-Wise Perturbations

Superiorization reduces, not necessarily minimizes, the value of a target function while seeking constraints-compatibility. This is done by taking a solely feasibility-seeking algorithm, analyzing its perturbation resilience, and proactively perturbing its iterates accordingly to steer them toward a feasible point with a reduced value of the target function. When the perturbation steps are computationally efficient, this …
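A hedged sketch of the scheme's two ingredients: cyclic projections onto half-spaces as the feasibility-seeking algorithm, interleaved with shrinking component-wise trial perturbations that are kept only when they decrease the target function, so no derivatives are needed. The setup (linear inequalities, geometric shrinking factor, function names) is an illustrative assumption, not the paper's exact algorithm.

```python
import numpy as np

def superiorize(A, b, target, x0, n_sweeps=50, delta0=1.0):
    """Sketch of derivative-free superiorization: cyclic projections onto
    the half-spaces a_i^T x <= b_i (feasibility-seeking), preceded in each
    sweep by component-wise trial perturbations of shrinking size delta
    that are accepted only when they reduce the target function."""
    x = x0.astype(float).copy()
    for k in range(n_sweeps):
        delta = delta0 * 0.9 ** k                # shrinking perturbation size
        for j in range(x.size):                  # component-wise, derivative-free
            for step in (+delta, -delta):
                trial = x.copy(); trial[j] += step
                if target(trial) < target(x):    # nonascent check by evaluation only
                    x = trial
                    break
        for a_i, b_i in zip(A, b):               # feasibility sweep
            r = a_i @ x - b_i
            if r > 0:
                x -= (r / (a_i @ a_i)) * a_i     # project onto a_i^T x <= b_i
    return x
```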