When Does the Primal Interior Point Method Beat Primal-Dual in Linear Optimization?

The primal-dual interior point method (IPM) is widely regarded as the most efficient IPM variant for linear optimization. In this paper, we demonstrate that the improved stability of the pure primal IPM can yield speedups relative to a primal-dual solver, particularly as the IPM approaches convergence. The stability of the primal scaling matrix makes it … Read more
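
As a rough point of reference (standard IPM notation, not taken from the paper itself): the normal-equations matrices of the two variants differ only in their diagonal scaling,

\[
\underbrace{A X^{2} A^{\top}}_{\text{pure primal}}
\qquad \text{vs.} \qquad
\underbrace{A X S^{-1} A^{\top}}_{\text{primal-dual}},
\qquad X = \operatorname{Diag}(x),\ S = \operatorname{Diag}(s).
\]

The pure primal scaling depends on the primal iterate alone and stays bounded, while the entries of the primal-dual scaling X S^{-1} diverge or vanish near optimality, which is consistent with the stability claim above.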

Gradient Methods with Online Scaling

We introduce a framework to accelerate the convergence of gradient-based methods with online learning. The framework learns to scale the gradient at each iteration through an online learning algorithm and provably accelerates gradient-based methods asymptotically. In contrast to previous literature, where convergence is established based on worst-case analysis, our framework provides a strong convergence guarantee … Read more
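
A minimal sketch of the general idea, assuming a hypergradient-style online update of a diagonal scaling; the function name, update rule, and step sizes are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def gd_with_online_scaling(grad, x0, lr=0.1, meta_lr=1e-3, iters=100):
    """Gradient descent with a per-coordinate scaling of the gradient
    that is itself adapted online (hypergradient-style update).

    Generic sketch only: the update rule and all step sizes are
    illustrative assumptions, not the algorithm of the paper.
    """
    x = x0.astype(float).copy()
    d = np.ones_like(x)            # diagonal scaling, learned online
    g_prev = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        # Online update: grow d_i when consecutive gradients agree in
        # sign (steady progress), shrink it when they disagree (overshoot).
        d = np.maximum(d + meta_lr * g * g_prev, 1e-8)
        x -= lr * d * g            # scaled gradient step
        g_prev = g
    return x
```

For example, `gd_with_online_scaling(lambda x: 2 * x, np.array([3.0, -4.0]))` (the gradient of ‖x‖²) drives the iterates toward the origin while the scaling adapts per coordinate.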

Preconditioned Barzilai-Borwein Methods for Multiobjective Optimization Problems

Preconditioning is a powerful approach for solving ill-conditioned problems in optimization, where a preconditioning matrix is used to reduce the condition number and speed up the convergence of first-order methods. Unfortunately, it is impossible to capture the curvature of all objective functions with a single preconditioning matrix in multiobjective optimization. Instead, second-order methods for multiobjective … Read more
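
For context, the classical single-objective Barzilai-Borwein step that this line of work builds on; the sketch below is generic and omits the preconditioning and multiobjective machinery of the paper:

```python
import numpy as np

def bb_gradient_method(grad, x0, alpha0=1.0, iters=50):
    """Classical single-objective Barzilai-Borwein (BB) gradient method.

    The BB step alpha = (s^T s) / (s^T y), with s = x_k - x_{k-1} and
    y = g_k - g_{k-1}, cheaply mimics second-order scaling.
    """
    x = x0.astype(float).copy()
    g = grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sty = s @ y
        alpha = (s @ s) / sty if sty > 1e-12 else alpha0   # BB1 step
        x, g = x_new, g_new
    return x
```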

Preconditioning for Generalized Jacobians with the ω-Condition Number

Preconditioning is essential in iterative methods for solving linear systems of equations. We study a nonclassical matrix condition number, the ω-condition number, in the context of optimal conditioning for low-rank updating of positive definite matrices. For a positive definite matrix, this condition measure is the ratio of the arithmetic and geometric means of the … Read more
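
Spelled out (assuming, as is standard for the ω-condition number, that the means are taken over the eigenvalues λ₁, …, λₙ of A):

\[
\omega(A) \;=\; \frac{\frac{1}{n}\sum_{i=1}^{n}\lambda_i}
                     {\left(\prod_{i=1}^{n}\lambda_i\right)^{1/n}}
          \;=\; \frac{\operatorname{trace}(A)/n}{\det(A)^{1/n}},
\qquad\text{compared with the classical}\qquad
\kappa(A) \;=\; \frac{\lambda_{\max}(A)}{\lambda_{\min}(A)}.
\]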

Acceleration of Primal-Dual Methods by Preconditioning and Simple Subproblem Procedures

Primal-Dual Hybrid Gradient (PDHG) and the Alternating Direction Method of Multipliers (ADMM) are two widely used first-order optimization methods. They reduce a difficult problem to simple subproblems, so they are easy to implement and have many applications. As first-order methods, however, they are sensitive to problem conditioning and can struggle to reach the desired accuracy. To improve … Read more
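
To make the PDHG side concrete, here is a sketch of a diagonally preconditioned PDHG iteration in the style of Pock and Chambolle; the step-size choice and prox interfaces are illustrative assumptions, not necessarily the acceleration proposed in the paper:

```python
import numpy as np

def preconditioned_pdhg(prox_f, prox_g_conj, A, x0, y0, iters=200):
    """Diagonally preconditioned PDHG for min_x f(x) + g(Ax), i.e. the
    saddle point problem min_x max_y <Ax, y> + f(x) - g*(y).

    prox_f(v, tau) and prox_g_conj(w, sigma) are user-supplied proximal
    operators. The diagonal steps below are one standard choice
    (tau_j = 1 / sum_i |A_ij|, sigma_i = 1 / sum_j |A_ij|).
    """
    absA = np.abs(A)
    tau = 1.0 / np.maximum(absA.sum(axis=0), 1e-12)    # primal steps (per column)
    sigma = 1.0 / np.maximum(absA.sum(axis=1), 1e-12)  # dual steps (per row)
    x, y = x0.astype(float).copy(), y0.astype(float).copy()
    x_bar = x.copy()
    for _ in range(iters):
        y = prox_g_conj(y + sigma * (A @ x_bar), sigma)   # dual step
        x_new = prox_f(x - tau * (A.T @ y), tau)          # primal step
        x_bar = 2.0 * x_new - x                           # extrapolation
        x = x_new
    return x, y
```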

Interior Point Methods and Preconditioning for PDE-Constrained Optimization Problems Involving Sparsity Terms

PDE-constrained optimization problems with control or state constraints are challenging from both an analytical and a numerical perspective. The combination of these constraints with a sparsity-promoting L1 term in the objective function requires sophisticated optimization methods. We propose the use of an interior point scheme applied to a smoothed reformulation of the discretized problem, and … Read more
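
A typical model problem of this type, written here in a generic form that may differ in detail from the paper's setup:

\[
\min_{y,\,u}\;\; \tfrac{1}{2}\,\|y - y_d\|_{L^2(\Omega)}^2
 \;+\; \tfrac{\alpha}{2}\,\|u\|_{L^2(\Omega)}^2
 \;+\; \beta\,\|u\|_{L^1(\Omega)}
\quad\text{s.t.}\quad
-\Delta y = u \ \text{in } \Omega, \qquad
a \le u \le b \ \text{a.e. in } \Omega,
\]

where the L¹ term promotes sparsity of the control u and the bounds a ≤ u ≤ b give the control constraints.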

A note on preconditioning weighted linear least squares, with consequences for weakly-constrained variational data assimilation

The effect of preconditioning weighted linear least-squares problems using an approximation of the model matrix is analyzed, showing the interplay between the eigenstructures of the model and weighting matrices. A small example illustrates the resulting potential inefficiency of such preconditioners. Consequences of these results in the context of the weakly-constrained 4D-Var data assimilation … Read more
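
In standard notation (an illustration, not quoted from the note): with model matrix A and symmetric positive definite weighting W, the problem and its normal equations are

\[
\min_{x}\;(Ax-b)^{\top} W\,(Ax-b)
\qquad\Longleftrightarrow\qquad
A^{\top} W A\; x \;=\; A^{\top} W b,
\]

so a preconditioner built from an approximation of A acts on A^{\top} W A, and its effectiveness depends on the eigenstructures of A and W jointly.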

Preconditioning PDE-constrained optimization with L^1-sparsity and control constraints

PDE-constrained optimization aims to find optimal setups for partial differential equations so that relevant quantities are minimized. Including sparsity-promoting terms in the formulation of such problems results in more practically relevant computed controls but makes the numerical solution of these problems more challenging. The needed L^1-terms, as well as the additional inclusion of box … Read more

A basis-free null space method for solving generalized saddle point problems

Using an augmented Lagrangian matrix approach, in this paper we analytically solve a broad class of linear systems that includes symmetric and nonsymmetric problems in saddle point form. To this end, some mild assumptions are made, and a preconditioner is specially designed to reduce the sensitivity of the systems before the calculation of their solutions. … Read more
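
For reference, generalized saddle point systems are usually written in the 2×2 block form (standard notation; the paper's exact assumptions on the blocks are not reproduced here):

\[
\begin{pmatrix} A & B_1^{\top} \\ B_2 & -C \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
\;=\;
\begin{pmatrix} f \\ g \end{pmatrix},
\]

with the symmetric case corresponding to B₁ = B₂ and C = C^{\top} positive semidefinite.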

A Class of Randomized Primal-Dual Algorithms for Distributed Optimization

Based on a preconditioned version of the randomized block-coordinate forward-backward algorithm recently proposed in [Combettes, Pesquet, 2014], several variants of block-coordinate primal-dual algorithms are designed to solve a wide array of monotone inclusion problems. These methods rely on a sweep of blocks of variables that are activated at each iteration according to a random rule, … Read more
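
A stripped-down sketch of the random block-activation idea in a plain forward-backward setting; the activation rule, step size, and prox below are generic placeholders, while the cited paper works in the more general primal-dual / monotone-inclusion setting:

```python
import numpy as np

def random_block_forward_backward(grad, prox, x0, blocks, step=0.5,
                                  probs=None, iters=500, rng=None):
    """Randomized block-coordinate forward-backward iteration (sketch).

    Each iteration activates a random subset of coordinate blocks; only
    those blocks take a forward (gradient) step followed by a backward
    (proximal) step.
    """
    rng = np.random.default_rng() if rng is None else rng
    probs = np.full(len(blocks), 1.0 / len(blocks)) if probs is None else probs
    x = x0.astype(float).copy()
    for _ in range(iters):
        g = grad(x)                                 # full gradient, for clarity
        for j, blk in enumerate(blocks):
            if rng.random() < probs[j]:             # block j activated
                x[blk] = prox(x[blk] - step * g[blk], step)
    return x
```

For instance, with `blocks = [slice(0, 5), slice(5, 10)]` and a soft-thresholding prox, this reduces to a randomized proximal-gradient method for an ℓ1-regularized problem.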