ADMM for Convex Quadratic Programs: Linear Convergence and Infeasibility Detection

In this paper, we analyze the convergence of the Alternating Direction Method of Multipliers (ADMM) on convex quadratic programs (QPs) with linear equality and bound constraints. The ADMM formulation alternates between solving an equality-constrained QP and projecting onto the bounds. Under the assumptions of: (i) positive definiteness of the Hessian of the objective projected on …
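The abstract does not spell out the iteration, but the alternation it describes has the familiar ADMM shape. A minimal sketch under assumed names (Q, q for the objective, A, b for the equalities, lo, hi for the bounds), not the paper's exact formulation:

```python
import numpy as np

def admm_qp(Q, q, A, b, lo, hi, rho=1.0, iters=200):
    """Hypothetical ADMM sketch for min 0.5 x'Qx + q'x  s.t. Ax = b, lo <= x <= hi.

    x-update: equality-constrained QP (a KKT solve);
    z-update: projection onto the box [lo, hi];
    y: scaled dual variable.
    """
    n, m = Q.shape[0], A.shape[0]
    # KKT matrix of the equality-constrained x-update. It is fixed across
    # iterations, so a real implementation would factor it once and reuse it.
    K = np.block([[Q + rho * np.eye(n), A.T],
                  [A, np.zeros((m, m))]])
    z = np.clip(np.zeros(n), lo, hi)
    y = np.zeros(n)
    for _ in range(iters):
        rhs = np.concatenate([-q + rho * (z - y), b])
        x = np.linalg.solve(K, rhs)[:n]      # equality-constrained QP step
        z = np.clip(x + y, lo, hi)           # projection onto the bounds
        y = y + x - z                        # scaled dual update
    return z
```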

Behavior of BFGS with an Exact Line Search on Nonsmooth Examples

We investigate the behavior of the BFGS algorithm with an exact line search on nonsmooth functions. We show that it may fail on a simple polyhedral example, but that it apparently always succeeds on the Euclidean norm function, spiraling into the origin with a Q-linear rate of convergence; we prove this in the case of …
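To see the spiraling numerically, here is a toy construction of ours (not the paper's experiments) applying BFGS with an exact line search to f(x) = ||x||, whose gradient is x / ||x||. Along the line x + t d, ||x + t d|| is minimized at t = -(x.d)/(d.d); with H0 = I that first exact step lands exactly at the origin, so a random SPD H0 is used to expose the generic behavior:

```python
import numpy as np

rng = np.random.default_rng(0)

def bfgs_exact_norm(x, iters=25):
    """Toy sketch: BFGS with an exact line search on f(x) = ||x||."""
    n = len(x)
    M = rng.standard_normal((n, n))
    H = M @ M.T + np.eye(n)          # random SPD initial inverse Hessian
    g = x / np.linalg.norm(x)
    norms = [np.linalg.norm(x)]
    for _ in range(iters):
        d = -H @ g
        t = -(x @ d) / (d @ d)       # exact line search for ||x + t d||
        x_new = x + t * d
        g_new = x_new / np.linalg.norm(x_new)
        s, yv = x_new - x, g_new - g
        rho = 1.0 / (s @ yv)         # curvature s.y > 0 holds here
        I = np.eye(n)
        H = (I - rho * np.outer(s, yv)) @ H @ (I - rho * np.outer(yv, s)) \
            + rho * np.outer(s, s)   # standard BFGS inverse update
        x, g = x_new, g_new
        norms.append(np.linalg.norm(x))
    return np.array(norms)

# Successive ratios norms[k+1] / norms[k] settling near a constant < 1
# would be consistent with the Q-linear rate described above.
norms = bfgs_exact_norm(rng.standard_normal(2))
print(norms[1:] / norms[:-1])
```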

A Coordinate Gradient Descent Method for L_1-regularized Convex Minimization

In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., as L_1-regularized linear least-squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated as CGD) …
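As a point of reference for the coordinate-wise structure (a simpler single-coordinate relative, not the paper's block CGD method), each coordinate subproblem of the L_1-regularized least-squares objective 0.5 ||Ax - b||^2 + lam ||x||_1 has a closed-form soft-thresholding solution:

```python
import numpy as np

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def cd_lasso(A, b, lam, sweeps=100):
    """Cyclic coordinate descent for 0.5*||Ax - b||^2 + lam*||x||_1."""
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                      # running residual
    col_sq = (A ** 2).sum(axis=0)      # ||a_j||^2 for each column
    for _ in range(sweeps):
        for j in range(n):
            aj = A[:, j]
            rho = aj @ r + col_sq[j] * x[j]       # partial-residual correlation
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += aj * (x[j] - x_new)              # update residual incrementally
            x[j] = x_new
    return x
```

Updating the residual incrementally keeps each coordinate step at O(m) cost, which is what makes coordinate-wise methods attractive for these large, sparse problems.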

On the convergence rate of the Cauchy algorithm in the l2 norm

This paper presents a convergence rate for the sequence generated by the Cauchy algorithm. The method is applied to a convex quadratic function with exact line search. Instead of using the norm induced by the Hessian matrix, Q-linear convergence is shown for the l2 (or Euclidean) norm.
Citation: Technical Report, Dep. Mathematics, Federal University of …
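For the quadratic f(x) = 0.5 x'Qx - b'x, the exact line search along the steepest-descent direction -g_k has the closed form alpha_k = (g_k' g_k) / (g_k' Q g_k). A small sketch (assuming this standard setup, not the report's code) that prints successive error ratios measured in the l2 norm:

```python
import numpy as np

def cauchy(Q, b, x0, iters=50):
    """Cauchy (steepest-descent) iteration with exact line search."""
    x = np.asarray(x0, dtype=float)
    xs = [x.copy()]
    for _ in range(iters):
        g = Q @ x - b                     # gradient of the quadratic
        alpha = (g @ g) / (g @ (Q @ g))   # exact minimizer along -g
        x = x - alpha * g
        xs.append(x.copy())
    return np.array(xs)

# Distances to the minimizer x* = Q^{-1} b in the l2 norm; successive
# ratios illustrate the Q-linear behavior studied above.
Q = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
xs = cauchy(Q, b, np.array([5.0, 5.0]))
err = np.linalg.norm(xs - np.linalg.solve(Q, b), axis=1)
print(err[1:] / err[:-1])
```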