A Flexible ADMM Algorithm for Big Data Applications

We present a flexible Alternating Direction Method of Multipliers (F-ADMM) algorithm for solving optimization problems involving a strongly convex objective function that is separable into $n \geq 2$ blocks, subject to (non-separable) linear equality constraints. The F-ADMM algorithm uses a \emph{Gauss-Seidel} scheme to update blocks of variables, and a regularization term is added to each … Read more
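To illustrate the general pattern the abstract describes (this is only a minimal sketch on an assumed toy problem, not the authors' F-ADMM): a Gauss-Seidel sweep over blocks, each block minimizing the augmented Lagrangian plus a proximal regularization term, followed by a dual update. The data `c`, `b` and the weights `rho`, `mu` below are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy instance: minimize sum_i 0.5*||x_i - c_i||^2 (strongly
# convex, separable) subject to the coupling constraint x_1 + x_2 + x_3 = b.
c = [np.array([1.0]), np.array([2.0]), np.array([3.0])]
b = np.array([3.0])
n = len(c)

rho, mu = 0.5, 1.0          # penalty and regularization weights (assumed values)
x = [np.zeros(1) for _ in range(n)]
y = np.zeros(1)             # dual multiplier

for _ in range(2000):
    for i in range(n):      # Gauss-Seidel sweep: block i sees the freshest x_j
        r = sum(x[j] for j in range(n) if j != i)
        # Closed-form minimizer of the regularized subproblem
        # 0.5*||x_i - c_i||^2 + y'x_i + rho/2*||x_i + r - b||^2 + mu/2*||x_i - x_i_old||^2
        x[i] = (c[i] - y + rho * (b - r) + mu * x[i]) / (1.0 + rho + mu)
    y = y + rho * (sum(x) - b)

# KKT solution of the toy instance: x_i = c_i - nu with nu = (sum c_i - b)/n = 1
print([float(xi) for xi in x])   # ≈ [0.0, 1.0, 2.0]
```

The proximal term `mu/2*||x_i - x_i_old||^2` is what distinguishes this regularized scheme from a directly extended Gauss-Seidel ADMM.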

A weighted Mirror Descent algorithm for nonsmooth convex optimization problem

Large-scale nonsmooth convex optimization is a common problem for a range of computational areas including machine learning and computer vision. Problems in these areas contain special domain structures and characteristics. Special treatment of such problem domains, exploiting their structures, can significantly reduce the computational burden. We present a weighted Mirror Descent method to … Read more
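As background for the method class, here is a minimal sketch of plain (unweighted) Mirror Descent with the entropy mirror map on the probability simplex; the weighted variant of the abstract is not reproduced here, and the toy objective and step size are assumptions.

```python
import numpy as np

# Toy nonsmooth problem on the probability simplex: minimize f(x) = ||x - t||_1
# with target t = e_1; the minimizer over the simplex is x* = e_1.
t = np.array([1.0, 0.0, 0.0])
x = np.ones(3) / 3.0        # uniform starting point in the simplex interior
eta = 0.1                   # fixed step size (assumed)

for _ in range(300):
    g = np.sign(x - t)           # a subgradient of the l1 objective
    x = x * np.exp(-eta * g)     # multiplicative (entropy mirror map) update
    x = x / x.sum()              # Bregman projection back onto the simplex

print(x[0])   # ≈ 1.0
```

The multiplicative update keeps iterates strictly inside the simplex automatically, which is the structural advantage mirror-map methods exploit on such domains.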

A Framework for Applying Subgradient Methods to Conic Optimization Problems (version 2)

A framework is presented whereby a general convex conic optimization problem is transformed into an equivalent convex optimization problem whose only constraints are linear equations and whose objective function is Lipschitz continuous. Virtually any subgradient method can be applied to solve the equivalent problem. Two methods are analyzed. (In version 2, the development of algorithms … Read more
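The transformed problem class (Lipschitz objective, only linear equality constraints) can be attacked by an off-the-shelf projected subgradient method. A minimal sketch on an assumed toy instance, with `||x||_1` standing in for the Lipschitz objective:

```python
import numpy as np

# Assumed toy instance: minimize the Lipschitz objective ||x||_1
# subject only to the linear equations Ax = b.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([2.0, 1.0])

def project_affine(x):
    # Euclidean projection onto {x : Ax = b}
    return x - A.T @ np.linalg.solve(A @ A.T, A @ x - b)

x = project_affine(np.zeros(3))
best = np.inf
for k in range(5000):
    best = min(best, np.abs(x).sum())            # track the best feasible value
    g = np.sign(x)                               # subgradient of ||x||_1
    x = project_affine(x - 0.5 / np.sqrt(k + 1) * g)

# For this instance the optimum is x* = (0, 1, 0) with value 1
# (check feasibility: x1 + 2*x2 = 2 and x2 + x3 = 1).
print(best)
```

Because every iterate is re-projected, the method stays feasible throughout, and the usual $O(1/\sqrt{k})$ best-iterate guarantee for subgradient schemes applies.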

Second order forward-backward dynamical systems for monotone inclusion problems

We begin by considering second order dynamical systems of the form $\ddot x(t) + \Gamma (\dot x(t)) + \lambda(t)B(x(t))=0$, where $\Gamma: {\cal H}\rightarrow{\cal H}$ is an elliptic bounded self-adjoint linear operator defined on a real Hilbert space ${\cal H}$, $B: {\cal H}\rightarrow{\cal H}$ is a cocoercive operator and $\lambda:[0,+\infty)\rightarrow [0,+\infty)$ is a relaxation function depending … Read more
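A simple way to see the trajectory behavior is to discretize the system numerically. The sketch below uses semi-implicit Euler with assumed parameters, $\Gamma = \gamma I$, constant $\lambda$, and the cocoercive operator $B(x) = x - c$, whose zero $c$ the trajectory should approach.

```python
import numpy as np

# Discretization of  x''(t) + gamma*x'(t) + lam*B(x(t)) = 0
# with B(x) = x - c (the gradient of 0.5*||x - c||^2, a cocoercive operator).
gamma, lam, h = 2.0, 1.0, 0.01   # assumed damping, relaxation, step size
c = np.array([1.0, -2.0])        # the zero of B
x = np.zeros(2)                  # x(0)
v = np.zeros(2)                  # x'(0)

for _ in range(5000):            # integrate up to t = 50
    v = v + h * (-gamma * v - lam * (x - c))   # velocity update
    x = x + h * v                              # semi-implicit Euler position step

print(np.linalg.norm(x - c))   # ≈ 0: the trajectory settles at the zero of B
```

With these parameters the scalar dynamics are a critically damped oscillator, so the iterates approach $c$ without sustained oscillation.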

Convergence rate of a proximal multiplier algorithm for separable convex minimization

The proximal multiplier method with proximal distances (PMAPD), proposed by O. Sarmiento C., E. A. Papa Quiroz and P. R. Oliveira and applied to solve convex programs with separable structure, unified the works of Chen and Teboulle (PCPM method), Kyono and Fukushima (NPCPMM), and Auslender and Teboulle (EPDM), and extended the convergence properties for the … Read more

Smooth Strongly Convex Interpolation and Exact Worst-case Performance of First-order Methods

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex functions and initial conditions. We develop … Read more

A corrected semi-proximal ADMM for multi-block convex optimization and its application to DNN-SDPs

In this paper we propose a corrected semi-proximal ADMM (alternating direction method of multipliers) for general $p$-block $(p\!\ge 3)$ convex optimization problems with linear constraints, aiming to resolve the dilemma that almost all the existing modified versions of the directly extended ADMM, although equipped with convergence guarantees, often perform substantially worse than the directly extended … Read more

A note on the ergodic convergence of symmetric alternating proximal gradient method

We consider the alternating proximal gradient method (APGM) proposed to solve a convex minimization model with linear constraints and separable objective function which is the sum of two functions without coupled variables. Inspired by the Peaceman-Rachford splitting method (PRSM), a natural idea is to extend APGM to the symmetric alternating proximal gradient method (SAPGM), which can … Read more
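The "symmetric" idea inherited from PRSM is that the multiplier is updated after *each* primal block rather than once per sweep. A minimal sketch of that update pattern on an assumed two-block consensus problem, using exact block minimizations in place of proximal gradient steps:

```python
# Assumed toy problem: minimize 0.5*(x - a)^2 + 0.5*(y - b)^2  s.t.  x - y = 0,
# whose solution is the consensus point x = y = (a + b)/2.
a, b = 0.0, 2.0
beta, s = 1.0, 0.9          # penalty and under-relaxation factor (assumed)
x = y = lam = 0.0

for _ in range(300):
    x = (a - lam + beta * y) / (1.0 + beta)   # x-block of the augmented Lagrangian
    lam += s * beta * (x - y)                 # first (intermediate) dual update
    y = (b + lam + beta * x) / (1.0 + beta)   # y-block
    lam += s * beta * (x - y)                 # second dual update

print(x, y)   # ≈ 1.0 1.0
```

Dropping the intermediate dual update recovers the plain alternating (ADMM-style) scheme; the relaxation factor `s` in $(0,1)$ is what typically secures convergence of the symmetric variant.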

ADMM for Convex Quadratic Programs: Linear Convergence and Infeasibility Detection

In this paper, we analyze the convergence of Alternating Direction Method of Multipliers (ADMM) on convex quadratic programs (QPs) with linear equality and bound constraints. The ADMM formulation alternates between an equality constrained QP and a projection on the bounds. Under the assumptions of: (i) positive definiteness of the Hessian of the objective projected on … Read more
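The alternation described above can be sketched on a simplified instance (assumed data, bound constraints only, no equality constraints): an x-step solving a linear system, a z-step projecting onto the box, and a scaled dual update.

```python
import numpy as np

# Assumed toy QP: minimize 0.5*x'Qx + q'x  subject to  l <= x <= u.
Q = np.array([[4.0, 1.0], [1.0, 2.0]])
q = np.array([1.0, 1.0])
l, u = 0.0, 10.0
rho = 1.0

x = z = w = np.zeros(2)      # w is the scaled dual variable
M = Q + rho * np.eye(2)      # system matrix, reused every iteration
for _ in range(500):
    x = np.linalg.solve(M, rho * (z - w) - q)   # unconstrained QP step
    z = np.clip(x + w, l, u)                    # projection onto the bounds
    w = w + x - z                               # dual update

# KKT check for this instance: at x = 0 the gradient q = (1, 1) >= 0 pushes
# against the active lower bounds, so x* = (0, 0).
print(z)   # ≈ [0, 0]
```

Since the positive definite matrix `M` is fixed, a practical implementation would factor it once; the per-iteration cost is then one triangular solve plus a clip, which is what makes this splitting attractive for QPs.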

Looking for strong polynomiality in Linear Programming: Arguments, conjectures, experiments, findings, and conclusion.

Until now it has been an open question whether the Linear Programming (LP) problem can be solved in strong polynomial time. The simplex algorithm with its combinatorial nature does not even offer a polynomial bound, whereas the complexity of the polynomial algorithms by Khachiyan and Karmarkar is based on the number of variables n, and … Read more