An algorithm to compute the Hoffman constant of a system of linear constraints

We propose a combinatorial algorithm to compute the Hoffman constant of a system of linear equations and inequalities. The algorithm is based on a characterization of the Hoffman constant as the largest of a finite canonical collection of easy-to-compute Hoffman constants. Our algorithm and characterization extend to the more general context where some of the … Read more
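For orientation, the Hoffman constant in question is the smallest constant H(A) in the classical error bound below, stated here for an inequality system Ax ≤ b; the notation is ours, not the paper's.

```latex
% Hoffman error bound: the distance from any point u to the (nonempty)
% polyhedron P_A(b) = { x : Ax <= b } is controlled by the constraint residual,
% uniformly in the right-hand side b.
\[
  \operatorname{dist}\bigl(u,\,P_A(b)\bigr) \;\le\; H(A)\,\bigl\|(Au-b)_+\bigr\|
  \qquad \text{for all } u \in \mathbb{R}^n \text{ and all } b \text{ with } P_A(b)\neq\emptyset,
\]
where $(Au-b)_+$ denotes the componentwise positive part of the residual.
```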

Fast Multilevel Algorithms for Compressive Principal Component Pursuit

Recovering a low-rank matrix from highly corrupted measurements arises in compressed sensing of structured high-dimensional signals (e.g., videos and hyperspectral images). Robust principal component analysis (RPCA), solved via principal component pursuit (PCP), recovers a low-rank matrix from sparse corruptions of unknown value and support by decomposing the observation matrix into two … Read more
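For context, the PCP model referred to above decomposes the observation M into a low-rank part L and a sparse part S by solving min ||L||_* + λ||S||_1 subject to L + S = M. The sketch below is a plain ADMM baseline for this model, not the multilevel algorithm of the paper; the helper names, the penalty parameter mu, and the fixed iteration count are our illustrative choices.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise shrinkage: proximal operator of tau * ||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    # Singular value thresholding: proximal operator of tau * ||.||_* (nuclear norm).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def pcp_admm(M, lam, mu=1.0, iters=200):
    # Minimal ADMM sketch for  min ||L||_* + lam * ||S||_1  s.t.  L + S = M.
    L = np.zeros_like(M, dtype=float)
    S = np.zeros_like(M, dtype=float)
    Y = np.zeros_like(M, dtype=float)  # scaled dual variable
    for _ in range(iters):
        L = svt(M - S + Y / mu, 1.0 / mu)            # low-rank update
        S = soft_threshold(M - L + Y / mu, lam / mu)  # sparse update
        Y = Y + mu * (M - L - S)                      # dual ascent on L + S = M
    return L, S
```

A common default from the RPCA literature is lam = 1 / sqrt(max(M.shape)).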

Primal-Dual Interior-Point Methods for Domain-Driven Formulations: Algorithms

We study infeasible-start primal-dual interior-point methods for convex optimization problems given in a form, typically natural for the problem, that we denote as a Domain-Driven formulation. Our algorithms extend many advantages of primal-dual interior-point techniques available for conic formulations, such as the current best complexity bounds and more robust certificates of approximate optimality, unboundedness, and infeasibility, to Domain-Driven formulations. The … Read more
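As a hedged reading of the setup (the precise definition and assumptions are in the paper), a Domain-Driven formulation minimizes a linear function over an affine preimage of a closed convex set D, with D accessed through a self-concordant barrier; in symbols (our rendering):

```latex
% Domain-Driven form (our rendering): the constraint is membership of an affine
% image in a closed convex set D, rather than in a closed convex cone.
\[
  \inf_{x}\ \langle c, x\rangle
  \quad\text{subject to}\quad Ax \in D .
\]
```

This contrasts with the conic form, where the constraint reads $Ax - b \in K$ for a closed convex cone $K$.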

Complexity of gradient descent for multiobjective optimization

A number of first-order methods have been proposed for smooth multiobjective optimization, for which some form of convergence to first-order criticality has been proved. Such convergence is global in the sense of being independent of the starting point. In this paper we analyze the rate of convergence of gradient descent for smooth unconstrained multiobjective … Read more
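For reference, the gradient (steepest-descent) step in this setting is usually built from the common subproblem below for objectives f_1, …, f_m; the notation is ours.

```latex
% Multiobjective steepest-descent direction (Fliege--Svaiter type subproblem):
\[
  d(x) \;=\; \operatorname*{arg\,min}_{d\in\mathbb{R}^n}\;
  \max_{1\le i\le m}\bigl\{\nabla f_i(x)^{\mathsf T} d\bigr\} \;+\; \tfrac12\,\|d\|^2 ,
\]
% and x is first-order (Pareto) critical exactly when d(x) = 0.
```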

New inertial factors of a splitting method for monotone inclusions

In this article, we consider monotone inclusions involving two operators in real Hilbert spaces, one of which is further assumed to be Lipschitz continuous, and we suggest adding an inertial term to a splitting method at each iteration. The associated weak convergence is analyzed under standard assumptions. The steplength is chosen in a self-adaptive way via … Read more
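One common template for inertial splitting schemes of this kind, for the inclusion 0 ∈ Ax + Bx with B Lipschitz and resolvent J_{γA} = (I + γA)^{-1}, is the inertial Tseng-type iteration below. This is an illustrative template rather than the paper's exact scheme; a self-adaptive steplength rule would replace γ_k by a locally estimated value instead of using the Lipschitz constant.

```latex
\begin{align*}
  w^{k}   &= x^{k} + \alpha_k\,(x^{k} - x^{k-1})                  && \text{(inertial extrapolation)}\\
  y^{k}   &= J_{\gamma_k A}\!\bigl(w^{k} - \gamma_k B w^{k}\bigr) && \text{(forward-backward step)}\\
  x^{k+1} &= y^{k} + \gamma_k\,\bigl(B w^{k} - B y^{k}\bigr)      && \text{(correction using the Lipschitz operator)}
\end{align*}
```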

Block Coordinate Proximal Gradient Method for Nonconvex Optimization Problems: Convergence Analysis

We propose a block coordinate proximal gradient method for a composite minimization problem whose objective contains two nonconvex function components, only one of which is assumed to be differentiable. Under per-block Lipschitz-like conditions based on a Bregman distance, but without global Lipschitz continuity of the gradient of the differentiable component, we prove … Read more
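To make the block update concrete, here is a minimal cyclic block proximal gradient sketch for an objective f(x_1, …, x_p) + Σ_i g_i(x_i), written with Euclidean distances (the simplest instance of a Bregman-distance setting). The interfaces, fixed stepsizes, and cyclic order are our illustrative assumptions, not the paper's method.

```python
import numpy as np

def block_prox_grad(x_blocks, grad_blocks, prox_blocks, steps, iters=100):
    # Cyclic block proximal gradient sketch.
    #   grad_blocks[i](x) -> gradient of the smooth part w.r.t. block i,
    #                        given the full list of blocks x
    #   prox_blocks[i](v, t) -> proximal map of the nonsmooth part g_i at v
    #                           with step t
    #   steps[i] -> stepsize used for block i
    x = [np.array(b, dtype=float) for b in x_blocks]
    for _ in range(iters):
        for i in range(len(x)):
            g = grad_blocks[i](x)                              # partial gradient
            x[i] = prox_blocks[i](x[i] - steps[i] * g, steps[i])  # prox-grad step on block i
    return x
```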

An algorithm for solving infinite horizon Markov dynamic programmes

We consider a general class of infinite horizon dynamic programmes where state and control sets are convex and compact subsets of Euclidean spaces and (convex) costs are discounted geometrically. The aim of this work is to provide a convergence result for these problems under as few restrictions as possible. Under certain assumptions on the cost … Read more
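For contrast, the sketch below is plain value iteration on a finite state/action discretization; it shows the role of the geometric discount factor and a standard stopping rule, and is only a finite surrogate for the convex, continuous-state setting treated in the paper. All names and interfaces are ours.

```python
import numpy as np

def value_iteration(P, c, beta, tol=1e-8, max_iter=10_000):
    # P[a] is an (S, S) transition matrix for action a, c[a] an (S,) cost vector,
    # and beta in (0, 1) is the geometric discount factor.
    S = c[0].shape[0]
    V = np.zeros(S)
    for _ in range(max_iter):
        # Bellman update: Q[a, s] = c[a][s] + beta * sum_s' P[a][s, s'] V[s'].
        Q = np.stack([c[a] + beta * P[a] @ V for a in range(len(P))])
        V_new = Q.min(axis=0)
        # Standard epsilon-accuracy stopping rule based on successive differences.
        if np.max(np.abs(V_new - V)) < tol * (1 - beta) / (2 * beta):
            return V_new
        V = V_new
    return V
```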

BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints

The software package BBCPOP is a MATLAB implementation of a hierarchy of sparse doubly nonnegative (DNN) relaxations of a class of polynomial optimization (minimization) problems (POPs) with binary, box and complementarity (BBC) constraints. Given a POP in the class and a relaxation order, BBCPOP constructs a simple conic optimization problem (COP), which serves as a … Read more
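As we read the class (the package documentation is authoritative), the POPs handled are roughly of the following form, with our index-set notation:

```latex
% Polynomial objective with binary, box, and complementarity (BBC) constraints:
\[
  \min_{x\in\mathbb{R}^n}\ f(x)
  \quad\text{s.t.}\quad
  x_i \in [0,1]\ (i \in I_{\mathrm{box}}),\qquad
  x_j \in \{0,1\}\ (j \in I_{\mathrm{bin}}),\qquad
  x_j x_k = 0\ \bigl((j,k) \in E_{\mathrm{comp}}\bigr),
\]
where $f$ is a polynomial.
```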

User Manual for BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints

BBCPOP, proposed in [4], is a MATLAB implementation of a hierarchy of sparse doubly nonnegative (DNN) relaxations of a class of polynomial optimization (minimization) problems (POPs) with binary, box and complementarity constraints. Given a POP in the class and a relaxation order (or a hierarchy level), BBCPOP constructs a simple conic optimization problem (COP), which … Read more

Derivative-Free Superiorization With Component-Wise Perturbations

Superiorization reduces, but does not necessarily minimize, the value of a target function while seeking constraints-compatibility. This is done by taking a solely feasibility-seeking algorithm, analyzing its perturbation resilience, and proactively perturbing its iterates accordingly to steer them toward a feasible point with a reduced value of the target function. When the perturbation steps are computationally efficient, this … Read more
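Below is a generic, hedged sketch of a superiorization loop with derivative-free, component-wise perturbations: coordinate-wise trial moves are kept only if they reduce the target function, perturbation sizes shrink over the iterations, and a solely feasibility-seeking step is applied afterwards. The acceptance rule, decay schedule, and interfaces are our illustrative choices, not the paper's scheme.

```python
import numpy as np

def superiorize(x0, feasibility_step, target, beta0=1.0, decay=0.99,
                delta=1e-3, iters=100):
    # Generic superiorization loop sketch (illustrative, not the paper's algorithm):
    #   feasibility_step(x) -> one sweep of a solely feasibility-seeking algorithm
    #   target(x)           -> target function value to be reduced (not minimized)
    x = np.array(x0, dtype=float)
    beta = beta0
    for _ in range(iters):
        # Derivative-free, component-wise perturbations: try a small move in each
        # coordinate and keep it only if the target function decreases.
        for i in range(x.size):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * beta * delta
                if target(trial) < target(x):
                    x = trial
                    break
        beta *= decay            # geometrically shrinking perturbation sizes
        x = feasibility_step(x)  # feasibility-seeking step on the perturbed iterate
    return x
```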