Regularized Interior Proximal Alternating Direction Method for Separable Convex Optimization Problems

In this article we present a version of the proximal alternating direction method for a convex problem with linear constraints and a separable objective function, in which the standard quadratic regularizing term is replaced with an interior proximal metric for those variables that are required to satisfy some additional convex constraints. Moreover, the proposed method … Read more
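
As a schematic of the substitution described above (a sketch only: the generic splitting $\min\{f(x)+g(z) : Ax + Bz = b,\ x \in \mathcal C\}$, multiplier $\lambda^k$, penalty $\beta$, and interior distance $d$ are placeholders, not the paper's notation), a proximal alternating direction $x$-update of the form \[ x^{k+1} \in \arg\min_{x \in \mathcal C}\Big\{ f(x) + \langle \lambda^k, Ax + Bz^k - b\rangle + \tfrac{\beta}{2}\|Ax + Bz^k - b\|^2 + \tfrac{1}{2}\|x - x^k\|^2 \Big\} \] has its quadratic term $\tfrac12\|x - x^k\|^2$ replaced by an interior proximal distance $d(x, x^k)$ associated with the additional constraint set $\mathcal C$, so that the subproblem iterates stay in the interior of $\mathcal C$ without an explicit projection.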

Accelerated First-Order Methods for Hyperbolic Programming

A framework is developed for applying accelerated methods to general hyperbolic programming, including linear, second-order cone, and semidefinite programming as special cases. The approach replaces a hyperbolic program with a convex optimization problem whose smooth objective function is explicit, and for which the only constraints are linear equations (one more linear equation than for the … Read more

Polytope conditioning and linear convergence of the Frank-Wolfe algorithm

It is known that the gradient descent algorithm converges linearly when applied to a strongly convex function with Lipschitz gradient. In this case the algorithm’s rate of convergence is determined by the condition number of the function. In a similar vein, it has been shown that a variant of the Frank-Wolfe algorithm with away steps converges … Read more
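
As a worked statement of the baseline fact referenced above: for a $\mu$-strongly convex $f$ with $L$-Lipschitz gradient, gradient descent with step size $1/L$ satisfies \[ f(x_{k+1}) - f^\ast \le \Big(1 - \frac{\mu}{L}\Big)\big(f(x_k) - f^\ast\big), \] so the linear rate is governed by the condition number $\kappa = L/\mu$ of the function.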

Solving Conic Systems via Projection and Rescaling

We propose a simple {\em projection and rescaling algorithm} to solve the feasibility problem \[ \text{ find } x \in L \cap \Omega, \] where $L$ and $\Omega$ are respectively a linear subspace and the interior of a symmetric cone in a finite-dimensional vector space $V$. This projection and rescaling algorithm is inspired by previous work … Read more
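
As a minimal illustration of the setting (not the algorithm proposed above), the following sketch instantiates the feasibility problem with $L$ the null space of a matrix $A$ and $\Omega$ the interior of the nonnegative orthant, and shows the orthogonal projection onto $L$ together with the interior-membership test that a projection and rescaling scheme repeatedly uses; the function names and example data are hypothetical.

import numpy as np

def projector_onto_nullspace(A):
    # Orthogonal projector onto L = null(A): P_L = I - A^T (A A^T)^{-1} A,
    # assuming A has full row rank.
    n = A.shape[1]
    return np.eye(n) - A.T @ np.linalg.inv(A @ A.T) @ A

def in_interior_of_orthant(x, tol=1e-12):
    # Membership test for Omega = int(R^n_+), the symmetric cone chosen here.
    return bool(np.all(x > tol))

A = np.array([[1.0, -1.0, 0.0]])          # L = {x : x_1 = x_2}
P_L = projector_onto_nullspace(A)
x = P_L @ np.array([2.0, 1.0, 3.0])       # project an arbitrary point onto L
print(x, in_interior_of_orthant(x))       # x lies in L; test whether x is in Omega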

New Douglas-Rachford algorithmic structures and their convergence analyses

In this paper we study new algorithmic structures with Douglas-Rachford (DR) operators to solve convex feasibility problems. We propose to embed the basic two-set-DR algorithmic operator into the String-Averaging Projections (SAP) and into the Block-Iterative Projection (BIP) algorithmic structures, thereby creating new DR algorithmic schemes that include the recently proposed cyclic Douglas-Rachford … Read more
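
For reference, a minimal sketch of the basic two-set DR operator that the structures above embed (the SAP and BIP schemes themselves are not reproduced); the two example sets and all names here are hypothetical.

import numpy as np

def dr_step(project_A, project_B, x):
    # One application of the Douglas-Rachford operator
    # T = (I + R_B R_A)/2 with reflectors R_C = 2 P_C - I,
    # written equivalently as T(x) = x + P_B(2 P_A(x) - x) - P_A(x).
    pa = project_A(x)
    return x + project_B(2.0 * pa - x) - pa

# Example sets: A = unit ball, B = halfspace {x : x[0] >= 0.5}.
def proj_ball(x):
    return x / max(1.0, np.linalg.norm(x))

def proj_halfspace(x):
    y = x.copy()
    y[0] = max(y[0], 0.5)
    return y

x = np.array([2.0, -1.0])
for _ in range(200):
    x = dr_step(proj_ball, proj_halfspace, x)
print(proj_ball(x))   # the shadow sequence P_A(x_k) approaches a point of A ∩ B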

Generalized Conjugate Gradient Methods for $\ell_1$ Regularized Convex Quadratic Programming with Finite Convergence

The conjugate gradient (CG) method is an efficient iterative method for solving large-scale strongly convex quadratic programming (QP). In this paper we propose some generalized CG (GCG) methods for solving the $\ell_1$-regularized (possibly not strongly) convex QP that terminate at an optimal solution in a finite number of iterations. At each iteration, our methods first … Read more
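
For orientation, a minimal sketch of the standard CG iteration for the strongly convex QP baseline mentioned above, i.e. minimizing $\tfrac12 x^TQx + c^Tx$ by solving $Qx = -c$; the proposed generalized CG methods for the $\ell_1$-regularized problem are not reproduced, and the example data are hypothetical.

import numpy as np

def cg_qp(Q, c, tol=1e-10, max_iter=None):
    # Conjugate gradient applied to the optimality system Qx = -c
    # of min 0.5*x'Qx + c'x with Q symmetric positive definite.
    n = len(c)
    x = np.zeros(n)
    r = -c - Q @ x            # residual = negative gradient
    p = r.copy()
    for _ in range(max_iter or n):
        if np.linalg.norm(r) < tol:
            break
        Qp = Q @ p
        alpha = (r @ r) / (p @ Qp)
        x = x + alpha * p
        r_new = r - alpha * Qp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x

Q = np.array([[4.0, 1.0], [1.0, 3.0]])
c = np.array([-1.0, -2.0])
print(cg_qp(Q, c))            # minimizer of 0.5*x'Qx + c'x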

Acceleration of the PDHGM on strongly convex subspaces

We propose several variants of the primal-dual method due to Chambolle and Pock. Without requiring full strong convexity of the objective functions, our methods are accelerated on subspaces with strong convexity. This yields mixed rates, $O(1/N^2)$ with respect to initialisation and $O(1/N)$ with respect to the dual sequence, and the residual part of the primal … Read more
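
For context, a minimal sketch of the unaccelerated Chambolle-Pock iteration that the variants above modify, on a hypothetical 1D total-variation denoising instance $\min_x \tfrac12\|x-b\|^2 + \lambda\|Kx\|_1$; the subspace-accelerated step-size rules of the paper are not reproduced.

import numpy as np

rng = np.random.default_rng(0)
n, lam = 50, 0.5
b = np.concatenate([np.zeros(25), np.ones(25)]) + 0.1 * rng.standard_normal(n)
K = np.diff(np.eye(n), axis=0)          # forward differences, shape (n-1, n)
Lnorm = np.linalg.norm(K, 2)            # operator norm of K
tau = sigma = 0.9 / Lnorm               # so that tau*sigma*||K||^2 < 1
theta = 1.0

x = np.zeros(n)
x_bar = x.copy()
y = np.zeros(n - 1)
for _ in range(500):
    # dual step: prox of sigma*F* is projection onto [-lam, lam]
    y = np.clip(y + sigma * (K @ x_bar), -lam, lam)
    # primal step: prox of tau*G for G = 0.5*||x - b||^2
    x_new = (x - tau * (K.T @ y) + tau * b) / (1.0 + tau)
    # extrapolation with parameter theta
    x_bar = x_new + theta * (x_new - x)
    x = x_new
print(x[:3], x[-3:])                     # piecewise-constant estimate of b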

An Extended Frank-Wolfe Method with “In-Face” Directions, and its Application to Low-Rank Matrix Completion

We present an extension of the Frank-Wolfe method that is designed to induce near-optimal solutions on low-dimensional faces of the feasible region. We present computational guarantees for the method that trade off efficiency in computing near-optimal solutions with upper bounds on the dimension of minimal faces of iterates. We apply our method to the low-rank … Read more
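
For reference, a minimal sketch of the standard Frank-Wolfe step that the extension above augments with in-face directions, shown on a hypothetical quadratic over the unit simplex rather than the nuclear-norm setting of the paper.

import numpy as np

def frank_wolfe_simplex(grad, x0, n_iter=200):
    # Standard Frank-Wolfe over the unit simplex: the linear minimization
    # oracle returns the vertex e_i with the most negative gradient entry.
    x = x0.copy()
    for k in range(n_iter):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0           # LMO solution (a vertex)
        gamma = 2.0 / (k + 2.0)         # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x

Q = np.array([[2.0, 0.5], [0.5, 1.0]])
c = np.array([-1.0, -0.5])
x_star = frank_wolfe_simplex(lambda x: Q @ x + c, np.array([0.5, 0.5]))
print(x_star)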

The rate of convergence of Nesterov’s accelerated forward-backward method is actually $o(k^{-2})$

The {\it forward-backward algorithm} is a powerful tool for solving optimization problems with an {\it additively separable} and {\it smooth} + {\it nonsmooth} structure. In the convex setting, a simple but ingenious acceleration scheme developed by Nesterov has been proved useful to improve the theoretical rate of convergence for the function values from the standard … Read more
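
For concreteness, a minimal sketch of the Nesterov-accelerated forward-backward scheme (FISTA-type) discussed above, on a hypothetical lasso instance $\min_x \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$; the refined parameter choices behind the improved rate are not reproduced.

import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_forward_backward(A, b, lam, n_iter=300):
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                         # forward (gradient) step
        x_new = soft_threshold(y - grad / L, lam / L)    # backward (prox) step on the l1 term
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # Nesterov extrapolation
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = accelerated_forward_backward(A, b, lam=0.1)
print(np.flatnonzero(np.abs(x_hat) > 1e-3))              # recovered support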

Fast convergence of inertial dynamics and algorithms with asymptotic vanishing damping

In a Hilbert space setting $\mathcal H$, we study the fast convergence properties as $t \to + \infty$ of the trajectories of the second-order differential equation \begin{equation*} \ddot{x}(t) + \frac{\alpha}{t} \dot{x}(t) + \nabla \Phi (x(t)) = g(t), \end{equation*} where $\nabla\Phi$ is the gradient of a convex continuously differentiable function $\Phi: \mathcal H \to \mathbb R$, … Read more
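
For context, a worked statement of the baseline fast-convergence result in the unperturbed case ($g \equiv 0$), which the setting above generalizes: for $\alpha \ge 3$ and $\Phi$ admitting a minimizer, the trajectories satisfy \[ \Phi(x(t)) - \min_{\mathcal H} \Phi \;=\; O\!\left(\frac{1}{t^{2}}\right) \quad \text{as } t \to +\infty, \] and the choice $\alpha = 3$ corresponds, under a suitable discretization, to Nesterov's accelerated gradient scheme.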