LSOS: Line-search Second-Order Stochastic optimization methods for nonconvex finite sums

We develop a line-search second-order algorithmic framework for minimizing finite sums. We do not make any convexity assumptions, but require the terms of the sum to be continuously differentiable and have Lipschitz-continuous gradients. The methods fitting into this framework combine line searches and suitably decaying step lengths. A key issue is a two-step sampling at …
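As a rough illustration of the general pattern behind methods of this kind (not the LSOS framework itself), the sketch below takes a damped Newton-type step built from a sampled mini-batch, safeguards it with a backtracking Armijo line search, and caps the initial trial step with a decaying factor; the batch size, damping, decay schedule, and function names are assumptions made purely for illustration.

import numpy as np

# Illustrative sketch only: a generic subsampled Newton-type step with an
# Armijo backtracking line search and a decaying initial trial steplength.
# Batch size, damping, decay rule and Armijo parameters are assumptions.
def stochastic_newton_step(x, k, f_i, grad_i, hess_i, n_terms, rng,
                           batch=32, c1=1e-4, rho=0.5):
    idx = rng.choice(n_terms, size=batch, replace=False)   # sample a mini-batch
    g = np.mean([grad_i(i, x) for i in idx], axis=0)       # subsampled gradient
    H = np.mean([hess_i(i, x) for i in idx], axis=0)       # subsampled Hessian
    H = H + 1e-3 * np.eye(x.size)                          # damping (no convexity assumed)
    d = -np.linalg.solve(H, g)                             # Newton-type direction
    if g @ d >= 0.0:
        d = -g                                             # fall back to steepest descent
    t = 1.0 / np.sqrt(k + 1)                               # suitably decaying initial step
    fx = np.mean([f_i(i, x) for i in idx])
    while np.mean([f_i(i, x + t * d) for i in idx]) > fx + c1 * t * (g @ d):
        t *= rho                                           # backtrack until Armijo holds
    return x + t * d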

Directional TGV-based image restoration under Poisson noise

We are interested in the restoration of noisy and blurry images whose texture mainly follows a single direction (i.e., directional images). Problems of this type arise, for example, in microscopy or computed tomography for carbon or glass fibres. To deal with these problems, the Directional Total Generalized Variation (DTGV) was developed by …
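For readers unfamiliar with directional regularizers, the sketch below evaluates a first-order directional total-variation term: the finite-difference gradient is rotated so that one axis follows the main texture direction theta, and the component across that direction is down-weighted. This is only a simplified first-order stand-in, not the second-order DTGV used in the paper, and the weighting factor is an illustrative assumption.

import numpy as np

# Sketch of a first-order directional TV term (not the full DTGV): rotate the
# finite-difference gradient so one axis follows the main direction theta and
# scale the component across it by a factor a in (0, 1].
def directional_tv(u, theta, a=0.25):
    ux = np.diff(u, axis=1, append=u[:, -1:])     # horizontal forward differences
    uy = np.diff(u, axis=0, append=u[-1:, :])     # vertical forward differences
    c, s = np.cos(theta), np.sin(theta)
    v1 = c * ux + s * uy                          # derivative along the main direction
    v2 = a * (-s * ux + c * uy)                   # scaled derivative across it
    return np.sum(np.sqrt(v1**2 + v2**2))         # isotropic norm of the rotated gradient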

Sparse Approximations with Interior Point Methods

Large-scale optimization problems that seek sparse solutions have become ubiquitous. They are routinely solved with various specialized first-order methods. Although such methods are often fast, they usually struggle with not-so-well-conditioned problems. In this paper, specialized variants of an interior point-proximal method of multipliers are proposed and analyzed for problems of this class. Computational experience …
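As background for why interior point methods apply here, the sketch below performs the standard splitting x = u - v with u, v >= 0, which turns l1-regularized least squares into a convex quadratic program with only nonnegativity constraints; it is not the IP-PMM variants proposed in the paper, and the helper name is hypothetical.

import numpy as np

# Background sketch: rewrite min 0.5*||Ax - b||^2 + tau*||x||_1 as a convex QP
# in z = [u; v] >= 0 with x = u - v, so that interior point solvers apply.
def l1_ls_as_nonneg_qp(A, b, tau):
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    Q = np.block([[AtA, -AtA], [-AtA, AtA]])                 # quadratic term in z
    c = np.concatenate([tau * np.ones(n) - Atb,
                        tau * np.ones(n) + Atb])             # linear term in z
    return Q, c   # feed 0.5*z'Qz + c'z, z >= 0 to any QP / interior point solver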

Spatially Adaptive Regularization in Image Segmentation

We modify the total-variation-regularized image segmentation model proposed by Chan, Esedoglu and Nikolova [SIAM Journal on Applied Mathematics 66, 2006] by introducing a local regularization that takes spatial image information into account. We propose several techniques for defining local regularization parameters, based on the cartoon-texture decomposition of the given image, on the mean and median filters, …
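One simple way to obtain such a spatially varying parameter (a hypothetical illustration, not the specific rules proposed in the paper) is sketched below: a median filter provides a rough cartoon component, the residual serves as a texture indicator, and the local regularization weight is reduced where the texture measure is large.

import numpy as np
from scipy.ndimage import median_filter

# Illustrative sketch: build a spatially varying regularization parameter from a
# median-filter cartoon-texture split. Filter size and the final formula are
# assumptions for illustration only.
def local_regularization_map(image, size=5, lam_max=1.0, eps=1e-6):
    cartoon = median_filter(image, size=size)
    texture = np.abs(image - cartoon)              # rough texture indicator
    texture = texture / (texture.max() + eps)      # normalize to [0, 1]
    return lam_max / (1.0 + 10.0 * texture)        # smaller lambda on textured regions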

Using gradient directions to get global convergence of Newton-type methods

The renewed interest in Steepest Descent (SD) methods following the work of Barzilai and Borwein [IMA Journal of Numerical Analysis, 8 (1988)] has driven us to consider a globalization strategy based on SD, applicable to any line-search method. In particular, we combine Newton-type directions with scaled SD steps to obtain suitable descent directions. …
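A minimal sketch of this globalization idea follows, under the assumption that a Barzilai-Borwein-type scaling is used for the SD step and that an Armijo backtracking line search safeguards the iteration; the thresholds and names below are illustrative, not the exact strategy of the paper.

import numpy as np

# Sketch: use the Newton-type direction when it is a good descent direction,
# otherwise fall back to a scaled steepest-descent step, and safeguard the step
# with Armijo backtracking. Scaling and thresholds are illustrative assumptions.
def globalized_newton_step(f, grad, hess, x, x_prev, g_prev, c1=1e-4, rho=0.5):
    g = grad(x)
    try:
        d = -np.linalg.solve(hess(x), g)              # Newton-type direction
    except np.linalg.LinAlgError:
        d = None
    if d is None or g @ d > -1e-8 * np.linalg.norm(g) * np.linalg.norm(d):
        s, y = x - x_prev, g - g_prev                 # use a scaled SD step instead
        alpha = (s @ s) / (s @ y) if s @ y > 0 else 1.0
        d = -alpha * g
    t = 1.0
    while f(x + t * d) > f(x) + c1 * t * (g @ d):     # Armijo backtracking
        t *= rho
    return x + t * d, g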

A subspace-accelerated split Bregman method for sparse data recovery with joint l1-type regularizers

We propose a subspace-accelerated Bregman method for the linearly constrained minimization of functions of the form f(u) + tau_1 ||u||_1 + tau_2 ||D u||_1, where f is a smooth convex function and D represents a linear operator, e.g., a finite-difference operator, as in anisotropic Total Variation and fused-lasso regularizations. Problems of this type arise in a wide …
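For reference, the sketch below implements the standard (non-accelerated) split Bregman iteration for the special case f(u) = 0.5 ||Au - b||^2, with the two shrinkage subproblems induced by the joint l1 terms; the subspace acceleration of the paper is not reproduced, and the penalty parameters are illustrative assumptions.

import numpy as np

def shrink(z, gamma):
    # soft-thresholding: closed-form solution of the l1 subproblems
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

# Baseline split Bregman for 0.5*||Au - b||^2 + tau1*||u||_1 + tau2*||Du||_1,
# with splitting variables d1 ~ u and d2 ~ Du. Penalty parameters lam1, lam2
# and the fixed iteration count are illustrative assumptions.
def split_bregman(A, b, D, tau1, tau2, lam1=1.0, lam2=1.0, iters=100):
    n = A.shape[1]
    u = np.zeros(n)
    d1, b1 = np.zeros(n), np.zeros(n)
    d2, b2 = np.zeros(D.shape[0]), np.zeros(D.shape[0])
    M = A.T @ A + lam1 * np.eye(n) + lam2 * (D.T @ D)      # constant system matrix
    for _ in range(iters):
        rhs = A.T @ b + lam1 * (d1 - b1) + lam2 * D.T @ (d2 - b2)
        u = np.linalg.solve(M, rhs)                        # quadratic subproblem
        d1 = shrink(u + b1, tau1 / lam1)                   # shrinkage in u
        d2 = shrink(D @ u + b2, tau2 / lam2)               # shrinkage in Du
        b1 = b1 + u - d1                                   # Bregman updates
        b2 = b2 + D @ u - d2
    return u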

Constraint-Preconditioned Krylov Solvers for Regularized Saddle-Point Systems

We consider the iterative solution of regularized saddle-point systems. When the leading block is symmetric and positive semi-definite on an appropriate subspace, Dollar, Gould, Schilders, and Wathen [SIAM J. Matrix Anal. Appl., 28(1), 2006] describe how to apply the conjugate gradient (CG) method coupled with a constraint preconditioner, a choice that has proved to be …
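A small dense sketch of the setting is given below: the regularized saddle-point matrix K = [[A, B'], [B, -C]] is paired with a constraint preconditioner P = [[G, B'], [B, -C]] that keeps the constraint blocks exact while replacing A by a cheap approximation G. Using GMRES with a dense LU factorization of P is purely illustrative; it is not the constraint-preconditioned Krylov solvers developed in the paper.

import numpy as np
from scipy.linalg import lu_factor, lu_solve
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n, m = 8, 3
A = rng.standard_normal((n, n))
A = A @ A.T + np.eye(n)                                     # SPD leading block
B = rng.standard_normal((m, n))
C = 1e-2 * np.eye(m)                                        # regularization block
K = np.block([[A, B.T], [B, -C]])                           # regularized saddle-point matrix
rhs = rng.standard_normal(n + m)

G = np.diag(np.diag(A))                                     # cheap approximation of A
P = np.block([[G, B.T], [B, -C]])                           # constraint preconditioner
P_lu = lu_factor(P)
apply_P_inv = LinearOperator(K.shape, matvec=lambda v: lu_solve(P_lu, v))

x, info = gmres(K, rhs, M=apply_P_inv)                      # preconditioned Krylov solve
print(info, np.linalg.norm(K @ x - rhs))                    # 0 and a small residual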

ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration

We propose a method, called ACQUIRE, for the solution of constrained optimization problems modeling the restoration of images corrupted by Poisson noise. The objective function is the sum of a generalized Kullback-Leibler divergence term and a TV regularizer, and the minimization is subject to nonnegativity and possibly other constraints, such as flux conservation. ACQUIRE is a line-search method that …
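For concreteness, the sketch below evaluates an objective of this form on a nonnegative image: a generalized Kullback-Leibler data-fidelity term for Poisson noise plus a smoothed isotropic TV regularizer. The function and parameter names (blur operator H, background bg, smoothing delta) are illustrative assumptions; the ACQUIRE algorithm itself is not reproduced here.

import numpy as np

# Sketch of the objective: generalized KL divergence between the observed counts
# y and the blurred image H(x) + bg, plus a smoothed isotropic TV term.
def kl_tv_objective(x, H, y, bg, tau, delta=1e-8):
    Hx = H(x) + bg
    kl = np.sum(y * np.log((y + delta) / (Hx + delta)) + Hx - y)   # generalized KL
    gx = np.diff(x, axis=1, append=x[:, -1:])
    gy = np.diff(x, axis=0, append=x[-1:, :])
    tv = np.sum(np.sqrt(gx**2 + gy**2 + delta))                    # smoothed isotropic TV
    return kl + tau * tv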

A two-phase gradient method for quadratic programming problems with a single linear constraint and bounds on the variables

We propose a gradient-based method for quadratic programming problems with a single linear constraint and bounds on the variables. Inspired by the GPCG algorithm for bound-constrained convex quadratic programming [J.J. Moré and G. Toraldo, SIAM J. Optim. 1, 1991], our approach alternates between two phases until convergence: an identification phase, which performs gradient projection iterations …
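As a hypothetical illustration of what an identification phase does, the sketch below performs one projected-gradient iteration for a purely bound-constrained convex quadratic (the setting of GPCG) and reads a tentative active set off the bounds hit by the new iterate; the single linear constraint and the second phase of the paper's method are not handled here.

import numpy as np

# One projected-gradient iteration for 0.5*x'Qx + c'x with lower <= x <= upper
# (bound constraints only). Steplength choice and tolerances are illustrative.
def projected_gradient_step(Q, c, x, lower, upper):
    g = Q @ x + c                                    # gradient of the quadratic
    d = np.clip(x - g, lower, upper) - x             # projected-gradient direction
    if np.allclose(d, 0.0):
        return x, np.where((x <= lower) | (x >= upper))[0]       # already stationary
    curv = d @ Q @ d
    alpha = -(g @ d) / curv if curv > 0 else 1.0     # exact steplength along d
    alpha = min(max(alpha, 0.0), 1.0)                # stay on the feasible segment
    x_new = x + alpha * d
    active = np.where((x_new <= lower) | (x_new >= upper))[0]    # tentative active set
    return x_new, active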

On the steplength selection in gradient methods for unconstrained optimization

The seminal paper by Barzilai and Borwein [IMA J. Numer. Anal. 8 (1988)] has given rise to an extensive investigation aimed at developing effective gradient methods able to deal with large-scale optimization problems. Several steplength rules were first designed for unconstrained quadratic problems and then extended to general nonlinear problems; these rules share the …
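For reference, the two Barzilai-Borwein steplengths mentioned above are sketched below, together with a bare-bones gradient method that uses the first of them with simple safeguards; the safeguard bounds and the initial fixed step are illustrative assumptions.

import numpy as np

def bb_steplengths(x, x_prev, g, g_prev):
    s, y = x - x_prev, g - g_prev
    alpha_bb1 = (s @ s) / (s @ y)     # BB1: minimizes ||s/alpha - y||
    alpha_bb2 = (s @ y) / (y @ y)     # BB2: minimizes ||s - alpha*y||
    return alpha_bb1, alpha_bb2

# Bare-bones gradient method using the (safeguarded) BB1 steplength.
def bb_gradient_method(grad, x0, iters=100, alpha0=1e-3, lo=1e-10, hi=1e10):
    x_prev, g_prev = x0, grad(x0)
    x = x_prev - alpha0 * g_prev      # first step with a small fixed steplength
    for _ in range(iters):
        g = grad(x)
        alpha, _ = bb_steplengths(x, x_prev, g, g_prev)
        alpha = np.clip(alpha, lo, hi) if np.isfinite(alpha) and alpha > 0 else alpha0
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x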