Asynchronous Sequential Inertial Iterations for Common Fixed Points Problems with an Application to Linear Systems

The common fixed point problem requires finding a point in the intersection of the fixed point sets of a finite collection of operators. Quickly solving problems of this sort is of great practical importance for engineering and scientific tasks (e.g., for computed tomography). Iterative methods for solving these problems often employ a Krasnosel’skii-Mann type iteration. We … Read more
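
For readers who want the building block spelled out, here is a minimal sketch of a plain cyclic Krasnosel’skii-Mann iteration for a finite family of nonexpansive operators; it is not the asynchronous inertial scheme of the paper, and the relaxation parameter, projection operators, and toy constraint sets below are purely illustrative.

```python
import numpy as np

def km_cycle(operators, x0, lam=0.5, iters=500):
    """Plain cyclic Krasnosel'skii-Mann iteration:
    x <- (1 - lam) * x + lam * T_i(x), cycling over the operators T_i."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        T = operators[k % len(operators)]
        x = (1.0 - lam) * x + lam * T(x)
    return x

# Toy common fixed point problem: the fixed points of a projection are the set
# itself, so a common fixed point of two projections lies in the intersection.
def proj_halfspace(a, b):
    a = np.asarray(a, dtype=float)
    def P(x):
        viol = a @ x - b
        return x - max(viol, 0.0) * a / (a @ a)
    return P

def proj_ball(center, radius):
    c = np.asarray(center, dtype=float)
    def P(x):
        d = x - c
        n = np.linalg.norm(d)
        return x if n <= radius else c + radius * d / n
    return P

ops = [proj_halfspace([1.0, 1.0], 1.0), proj_ball([0.0, 0.0], 2.0)]
x_star = km_cycle(ops, x0=[3.0, -4.0])
print(x_star)  # approximately satisfies both constraints
```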

Accelerated Bregman Proximal Gradient Methods for Relatively Smooth Convex Optimization

We consider the problem of minimizing the sum of two convex functions: one is differentiable and relatively smooth with respect to a reference convex function, and the other can be nondifferentiable but simple to optimize. The relatively smooth condition is much weaker than the standard assumption of uniform Lipschitz continuity of the gradients, thus significantly … Read more
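
A minimal, non-accelerated sketch of one Bregman proximal gradient setting, assuming the entropy kernel h(x) = sum_j x_j log x_j over the probability simplex (where the Bregman step has a closed-form exponentiated-gradient update); the toy problem, the constant L, and the function names are illustrative, and this is not the accelerated method of the paper.

```python
import numpy as np

def bregman_prox_grad(grad_f, x0, L, iters=300):
    """Non-accelerated Bregman proximal gradient with the entropy kernel on the
    simplex: the step argmin_x <grad_f(x_k), x> + L * D_h(x, x_k) reduces to an
    exponentiated-gradient update followed by renormalization."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_f(x)
        x = x * np.exp(-g / L)
        x /= x.sum()                      # back onto the simplex
    return x

# Toy problem: minimize 0.5 * ||A x - b||^2 over the probability simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2             # crude relative-smoothness constant
x = bregman_prox_grad(grad_f, np.ones(5) / 5, L)
print(x, x.sum())
```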

Positive semidefinite matrix approximation with a trace constraint

We propose an efficient algorithm to solve a positive semidefinite matrix approximation problem with a trace constraint. Without constraints, it is well known that the positive semidefinite matrix approximation problem can be easily solved by a single eigendecomposition of a symmetric matrix. In this paper, we confirm that a single eigendecomposition is also sufficient even if a trace … Read more
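
Under the assumptions that the approximation is measured in the Frobenius norm and the trace constraint is an equality, the single-eigendecomposition idea can be sketched as follows: eigendecompose once, project the eigenvalue vector onto the scaled simplex, and reassemble. The helper names are illustrative, and this sketch need not coincide with the paper's exact algorithm.

```python
import numpy as np

def project_simplex(v, t=1.0):
    """Euclidean projection of v onto {x >= 0, sum(x) = t} (sort-based)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - t
    rho = np.nonzero(u - css / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = css[rho] / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def nearest_psd_with_trace(A, t):
    """Nearest symmetric PSD matrix to A (Frobenius norm) with trace exactly t:
    a single eigendecomposition plus a projection of the eigenvalues."""
    A = 0.5 * (A + A.T)                       # symmetrize
    w, Q = np.linalg.eigh(A)
    return (Q * project_simplex(w, t)) @ Q.T  # Q diag(projected eigenvalues) Q^T

rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))
X = nearest_psd_with_trace(M, t=3.0)
print(np.trace(X), np.linalg.eigvalsh(X).min())  # trace ~ 3, smallest eigenvalue >= 0
```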

Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization

Block-coordinate descent (BCD) is a popular framework for large-scale regularized optimization problems with block-separable structure. Existing methods have several limitations. They often assume that subproblems can be solved exactly at each iteration, which in practical terms usually restricts the quadratic term in the subproblem to be diagonal, thus losing most of the benefits of higher-order … Read more
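
As a point of reference for the framework being generalized, here is a minimal cyclic proximal BCD sketch for an l1-regularized least-squares problem in which each block subproblem uses a diagonal metric and is solved exactly by soft-thresholding; the block size, data, and names are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_bcd_lasso(A, b, lam, block_size=10, iters=100):
    """Cyclic proximal block-coordinate descent for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
    with an exactly solvable diagonal-metric subproblem per block."""
    n = A.shape[1]
    x = np.zeros(n)
    r = A @ x - b                              # maintained residual A x - b
    blocks = [np.arange(i, min(i + block_size, n)) for i in range(0, n, block_size)]
    for _ in range(iters):
        for idx in blocks:
            Ab = A[:, idx]
            Lb = np.linalg.norm(Ab, 2) ** 2    # block Lipschitz constant
            g = Ab.T @ r                       # block gradient
            x_new = soft_threshold(x[idx] - g / Lb, lam / Lb)
            r += Ab @ (x_new - x[idx])
            x[idx] = x_new
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
print(prox_bcd_lasso(A, b, lam=0.1)[:8])
```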

Generalized Stochastic Frank-Wolfe Algorithm with Stochastic “Substitute” Gradient for Structured Convex Optimization

The stochastic Frank-Wolfe method has recently attracted much interest in the context of optimization for statistical and machine learning due to its ability to work with general feasible regions. However, there has been a complexity gap in the guaranteed convergence rate of stochastic Frank-Wolfe compared to its deterministic counterpart. In this work, … Read more
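
For comparison with the deterministic method, here is a sketch of the plain stochastic Frank-Wolfe iteration (not the "substitute gradient" variant of the paper) on an l1-ball-constrained least-squares problem; the minibatch size, the open-loop step size, and the feasible region are illustrative choices.

```python
import numpy as np

def stochastic_frank_wolfe(A, b, radius=1.0, iters=500, batch=10, seed=0):
    """Plain stochastic Frank-Wolfe for min_x (1/2m)*||Ax - b||^2 over the l1 ball:
    minibatch gradient estimate, linear minimization oracle, convex combination step."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for k in range(iters):
        idx = rng.integers(0, m, size=batch)            # minibatch
        g = A[idx].T @ (A[idx] @ x - b[idx]) / batch    # stochastic gradient
        j = np.argmax(np.abs(g))                        # LMO over the l1 ball
        s = np.zeros(n)
        s[j] = -radius * np.sign(g[j])
        gamma = 2.0 / (k + 2.0)                         # open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((200, 30))
x_true = np.zeros(30); x_true[0] = 0.5
b = A @ x_true + 0.01 * rng.standard_normal(200)
print(stochastic_frank_wolfe(A, b)[:5])
```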

Minimizing convex quadratics with variable precision Krylov methods

Iterative algorithms for the solution of convex quadratic optimization problems that exploit inexact matrix-vector products are investigated. Theoretical bounds on the performance of Conjugate Gradient and Full-Orthogonalization methods are derived, the necessary quantities occurring in these bounds are estimated, and new practical algorithms are derived. Numerical experiments suggest that the new methods have significant … Read more
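
A sketch of a conjugate gradient loop in which the matrix-vector product is supplied by a user routine whose accuracy may vary with the iteration; the decaying-noise "precision schedule" in the toy matvec below is purely illustrative and is not the accuracy criterion derived in the paper.

```python
import numpy as np

def cg_inexact(matvec, b, tol=1e-8, maxit=200):
    """Standard CG recurrence where the operator is applied through matvec(p, k),
    an (optionally inexact) matrix-vector product that may loosen its accuracy
    as the iteration count k grows."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for k in range(maxit):
        Ap = matvec(p, k)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Toy inexact product: exact product plus a perturbation that shrinks with k.
rng = np.random.default_rng(4)
Q = rng.standard_normal((50, 50))
A = Q @ Q.T + 50 * np.eye(50)                 # symmetric positive definite
b = rng.standard_normal(50)
matvec = lambda p, k: A @ p + (1e-6 / (k + 1)) * np.linalg.norm(p) * rng.standard_normal(50)
print(np.linalg.norm(A @ cg_inexact(matvec, b) - b))
```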

The Cyclic Douglas-Rachford Algorithm with r-sets-Douglas-Rachford Operators

The Douglas-Rachford (DR) algorithm is an iterative procedure that uses sequential reflections onto convex sets and has become popular for convex feasibility problems. In this paper we propose a structural generalization that allows the use of r-sets-DR operators in a cyclic fashion. We prove convergence and present numerical illustrations of the potential advantage of such … Read more
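
A sketch of the classical ingredients the paper builds on: the two-set Douglas-Rachford operator and its cyclic application over consecutive pairs of sets, which is the scheme that r-sets-DR operators generalize; the toy ball-feasibility problem and helper names are illustrative.

```python
import numpy as np

def dr_operator(PA, PB):
    """Two-set Douglas-Rachford operator T(x) = x + P_B(2 P_A(x) - x) - P_A(x)."""
    def T(x):
        a = PA(x)
        return x + PB(2 * a - x) - a
    return T

def cyclic_dr(projections, x0, sweeps=200):
    """Cyclic DR: each sweep applies the two-set operators T_{1,2}, T_{2,3}, ..., T_{r,1}."""
    x = np.asarray(x0, dtype=float)
    r = len(projections)
    for _ in range(sweeps):
        for i in range(r):
            x = dr_operator(projections[i], projections[(i + 1) % r])(x)
    return x

# Toy feasibility problem: three discs in the plane with a nonempty intersection.
def proj_ball(center, radius):
    c = np.asarray(center, dtype=float)
    def P(x):
        d = x - c
        n = np.linalg.norm(d)
        return x if n <= radius else c + radius * d / n
    return P

P1, P2, P3 = proj_ball([1, 0], 1.2), proj_ball([-1, 0], 1.2), proj_ball([0, 1], 1.2)
x = cyclic_dr([P1, P2, P3], x0=[5.0, 5.0])
shadow = P1(x)   # for DR-type methods the projected (shadow) point is the feasibility candidate
print(shadow, [np.linalg.norm(shadow - np.array(c)) for c in ([1, 0], [-1, 0], [0, 1])])
```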

Efficient Solution of Maximum-Entropy Sampling Problems

We consider a new approach for the maximum-entropy sampling problem (MESP) that is based on bounds obtained by maximizing a function of the form ldet M(x) over linear constraints, where M(x) is linear in the n-vector x. These bounds can be computed very efficiently and are superior to all previously known bounds for MESP on most … Read more
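
For context, a compact statement of MESP and the general shape of the bound described above, written in LaTeX; the cardinality constraint shown is the usual one for MESP, and the precise linear constraint set used in the paper may differ.

```latex
% MESP: given an n x n covariance matrix C and a sample size s, choose the
% s-subset of variables of maximum differential entropy:
\[
  z(C,s) \;=\; \max\bigl\{\, \ln\det C[S,S] \;:\; S \subseteq \{1,\dots,n\},\ |S| = s \,\bigr\}.
\]
% The bounds above relax this to a continuous problem of the form
\[
  z(C,s) \;\le\; \max\bigl\{\, \mathrm{ldet}\, M(x) \;:\; x \in [0,1]^n,\ \text{linear constraints on } x \,\bigr\},
\]
% where M(x) is linear (affine) in the n-vector x.
```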

On positive duality gaps in semidefinite programming

We study semidefinite programs (SDPs) with positive duality gaps, i.e., SDPs in which the optimal values of the primal and dual problems differ. These SDPs are considered extremely pathological; they are often unsolvable, and they also serve as models of more general pathological convex programs. We first fully characterize two-variable SDPs with … Read more
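
To make "positive duality gap" concrete, here is a standard small two-variable instance (not necessarily one of the instances characterized in the paper), written in LaTeX:

```latex
% A classical two-variable SDP with a finite positive duality gap:
\[
  p^{\ast} \;=\; \inf_{x \in \mathbb{R}^{2}} \; x_{1}
  \quad \text{s.t.} \quad
  \begin{pmatrix} 0 & x_{1} & 0 \\ x_{1} & x_{2} & 0 \\ 0 & 0 & x_{1}+1 \end{pmatrix} \succeq 0 .
\]
% A zero diagonal entry forces its row and column to vanish, so feasibility
% requires x_1 = 0 and hence p* = 0, while the dual SDP has optimal value
% d* = -1 (attained at Z = e_3 e_3^T), giving a duality gap of 1.
```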

On the Complexity of Detecting Convexity over a Box

It has recently been shown that the problem of testing global convexity of polynomials of degree four is strongly NP-hard, answering an open question of N.Z. Shor. This result is minimal in the degree of the polynomial when global convexity is of concern. In a number of applications, however, one is interested in testing convexity … Read more
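
As a purely illustrative aside, and consistent with the hardness result (which rules out an efficient exact test in general), a common cheap heuristic is to sample the box and check the Hessian for positive semidefiniteness: a failed sample certifies non-convexity over the box, while passing every sample proves nothing. The quartic and helper names below are made up for the demo.

```python
import numpy as np

def hessian_psd_on_samples(hess, lo, hi, samples=2000, seed=0):
    """Sampling check of convexity over the box [lo, hi]: evaluate the Hessian at
    random points and test positive semidefiniteness. Returns (False, witness)
    if a point with an indefinite Hessian is found, else (True, None)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    for _ in range(samples):
        x = lo + (hi - lo) * rng.random(lo.shape)
        if np.linalg.eigvalsh(hess(x)).min() < -1e-10:
            return False, x          # witness of non-convexity
    return True, None

# Quartic p(x, y) = x^4 + y^4 - 3 x^2 y^2 with its Hessian written out by hand.
hess = lambda v: np.array([[12 * v[0]**2 - 6 * v[1]**2, -12 * v[0] * v[1]],
                           [-12 * v[0] * v[1],          12 * v[1]**2 - 6 * v[0]**2]])
print(hessian_psd_on_samples(hess, lo=np.zeros(2), hi=np.ones(2)))
```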