On full Jacobian decomposition of the augmented Lagrangian method for separable convex programming

The augmented Lagrangian method (ALM) is a benchmark for solving the convex minimization problem with linear constraints. We consider the special case where the objective function is the sum of m functions without coupled variables. For solving this separable convex programming model, it is usually required to decompose the ALM subproblem at each …
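
For orientation, the model referred to here and the coupling that forces a decomposition can be sketched as follows (generic notation, not taken from the paper). The separable problem is
\[
\min_{x_1,\dots,x_m}\ \sum_{i=1}^{m}\theta_i(x_i)\quad\text{s.t.}\quad \sum_{i=1}^{m}A_i x_i=b,\qquad x_i\in\mathcal{X}_i,
\]
and the ALM subproblem at iteration $k$ minimizes the augmented Lagrangian
\[
\sum_{i=1}^{m}\theta_i(x_i)-(\lambda^{k})^{\top}\Big(\sum_{i=1}^{m}A_i x_i-b\Big)+\frac{\beta}{2}\Big\|\sum_{i=1}^{m}A_i x_i-b\Big\|^{2},
\]
whose quadratic penalty couples all blocks $x_1,\dots,x_m$; this coupling is what a (Jacobian-style) decomposition has to break so that the subproblem can be solved block by block.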

Robust convex relaxation for the planted clique and densest k-subgraph problems

We consider the problem of identifying the densest k-node subgraph in a given graph. We write this problem as an instance of rank-constrained cardinality minimization and then relax it using the nuclear and l1 norms. Although the original combinatorial problem is NP-hard, we show that the densest k-subgraph can be recovered from the solution of our …
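
One generic template for this kind of relaxation (illustrative only; the paper's exact formulation may differ) replaces the rank constraint by the nuclear norm and the cardinality constraint by the elementwise $\ell_1$ norm:
\[
\min_{X}\ \|X\|_{*}+\gamma\,\|X\|_{1}\quad\text{s.t.}\quad X\in\mathcal{C},
\]
where $\|X\|_{*}$ is the sum of singular values, $\|X\|_{1}$ is the sum of absolute entries, $\gamma>0$ balances the two surrogates, and $\mathcal{C}$ collects linear constraints tying $X$ to the graph (for instance, fixing the total sum of the entries of $X$ according to the target size $k$).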

Lagrangian Transformation and Interior Ellipsoid Methods in Convex Optimization

The rediscovery of the affine scaling method in the late 80s was one of the turning points which led to a new chapter in Modern Optimization – the interior point methods (IPMs). Simultaneously and independently the theory of exterior point methods for convex optimization arose. The two seemingly unconnected fields turned out to be intrinsically …

An interior proximal point method with phi-divergence for Equilibrium Problems

In this paper, we consider the problem of general equilibrium in a finite-dimensional space on a closed convex set. For solving this problem, we develop an interior proximal point algorithm with phi-divergence. Under reasonable assumptions, we prove that the sequence generated by the algorithm converges to a solution of the Equilibrium Problem when the regularization …
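
To make the type of iteration concrete, one common template for an interior proximal step on an equilibrium problem (find $x\in C$ with $f(x,y)\ge 0$ for all $y\in C$) reads, in notation assumed here rather than taken from the paper:
\[
\text{find } x^{k+1}\in C \ \text{ such that }\ f(x^{k+1},y)+\frac{1}{\lambda_k}\,\big\langle \nabla_{1}D_{\varphi}(x^{k+1},x^{k}),\,y-x^{k+1}\big\rangle\ \ge\ 0\quad\text{for all } y\in C,
\]
where $D_{\varphi}$ is the $\varphi$-divergence and $\lambda_k>0$ is the regularization parameter; the divergence plays the role of an interior, barrier-like distance that keeps the iterates inside $C$.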

Minimal Residual Methods for Complex Symmetric, Skew Symmetric, and Skew Hermitian Systems

While there is no lack of efficient Krylov subspace solvers for Hermitian systems, there are few for complex symmetric, skew symmetric, or skew Hermitian systems, which are increasingly important in modern applications including quantum dynamics, electromagnetics, and power systems. For a large consistent complex symmetric system, one may apply a non-Hermitian Krylov subspace method disregarding …
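
As a small illustration of the route mentioned above, applying a general non-Hermitian Krylov solver and disregarding the structure (this is the baseline the paper improves on, not its method), a complex symmetric system can simply be handed to GMRES:

    import numpy as np
    from scipy.sparse.linalg import gmres

    # Well-conditioned complex symmetric test matrix: A equals its transpose
    # but not its conjugate transpose, so A is symmetric yet not Hermitian.
    rng = np.random.default_rng(0)
    n = 40
    B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A = np.eye(n) + 0.1 * (B + B.T)
    assert np.allclose(A, A.T) and not np.allclose(A, A.conj().T)
    b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

    # A general non-Hermitian solver works, but it ignores the symmetry that
    # structure-exploiting minimal residual methods are designed to use.
    x, info = gmres(A, b)
    print(info, np.linalg.norm(A @ x - b))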

Optimal Primal-Dual Methods for a Class of Saddle Point Problems

We present a novel accelerated primal-dual (APD) method for solving a class of deterministic and stochastic saddle point problems (SPP). The basic idea of this algorithm is to incorporate a multi-step acceleration scheme into the primal-dual method without smoothing the objective function. For deterministic SPP, the APD method achieves the same optimal rate of convergence …
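
For readers unfamiliar with the setting, a standard bilinear saddle point template and one basic (non-accelerated) primal-dual step are shown below as generic background, not as the APD scheme itself:
\[
\min_{x\in X}\max_{y\in Y}\ f(x)+\langle Kx,\,y\rangle-g(y),
\]
\[
x^{k+1}=\operatorname{prox}_{\tau f}\big(x^{k}-\tau K^{\top}y^{k}\big),\qquad
y^{k+1}=\operatorname{prox}_{\sigma g}\big(y^{k}+\sigma K(2x^{k+1}-x^{k})\big).
\]
The APD method keeps this non-smooth structure (no smoothing of the objective) and layers a multi-step acceleration on top of updates of this kind.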

Faster, but Weaker, Relaxations for Quadratically Constrained Quadratic Programs

We introduce a new relaxation framework for nonconvex quadratically constrained quadratic programs (QCQPs). In contrast to existing relaxations based on semidefinite programming (SDP), our relaxations incorporate features of both SDP and second order cone programming (SOCP) and, as a result, solve more quickly than SDP. A downside is that the calculated bounds are weaker than …
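
As background for the trade-off described here, the classical Shor SDP relaxation of a QCQP (generic notation, not the paper's) lifts the products of variables into a matrix: the problem
\[
\min_{x}\ x^{\top}Q_{0}x+c_{0}^{\top}x\quad\text{s.t.}\quad x^{\top}Q_{i}x+c_{i}^{\top}x\le b_{i},\ \ i=1,\dots,p,
\]
is relaxed by introducing $X\approx xx^{\top}$, replacing each $x^{\top}Q_{i}x$ with $\langle Q_{i},X\rangle$, and requiring
\[
\begin{pmatrix}1 & x^{\top}\\ x & X\end{pmatrix}\succeq 0 .
\]
SOCP-style relaxations keep the lifted variable but replace the full semidefinite constraint with cheaper second-order cone conditions, which is the source of the speed-versus-tightness trade-off the abstract refers to.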

Tail bounds for stochastic approximation

Stochastic-approximation gradient methods are attractive for large-scale convex optimization because they offer inexpensive iterations. They are especially popular in data-fitting and machine-learning applications where the data arrives in a continuous stream or where it is necessary to minimize large sums of functions. It is known that by appropriately decreasing the variance of the error at each …
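
A minimal sketch of the kind of method meant here, on a toy least-squares problem (an assumed example, not the paper's setting): each iteration uses a single sampled gradient, so it is cheap, and the diminishing step size is what damps the error variance injected at every step.

    import numpy as np

    # Stochastic-approximation gradient method for min_x ||A x - b||^2:
    # each iteration touches one row of the data, and the step size
    # shrinks roughly like 1/k to control the noise in the estimate.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 5))
    x_true = rng.standard_normal(5)
    b = A @ x_true

    x = np.zeros(5)
    for k in range(1, 5001):
        i = rng.integers(len(b))              # sample one data point
        g = (A[i] @ x - b[i]) * A[i]          # unbiased stochastic gradient
        x -= g / (5 + k)                      # diminishing step size ~ 1/k
    print(np.linalg.norm(x - x_true))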

On the use of semi-closed sets and functions in convex analysis

The main aim of this short note is to show that the subdifferentiability and duality results established by Laghdir (2005), Laghdir and Benabbou (2007), and Alimohammady et al. (2011), stated in Fréchet spaces, are consequences of the corresponding known results using Moreau–Rockafellar-type conditions.
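
For context, the Moreau–Rockafellar-type conditions in question guarantee calculus rules such as the subdifferential sum rule: if $f,g$ are proper convex functions and $f$ is finite and continuous at some point of $\operatorname{dom} g$, then
\[
\partial (f+g)(x)=\partial f(x)+\partial g(x)\qquad\text{for all } x;
\]
the note's claim is that the cited Fréchet-space results follow from known results of this type.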

A splitting minimization method on geodesic spaces

We present in this paper the alternating minimization method on CAT(0) spaces for solving unconstrained convex optimization problems. Under the assumption that the function being minimized is convex, we prove that the sequence generated by our algorithm converges to a minimizer. The results presented in this paper are the first ones of this type …
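
Only as a flat-space illustration of the scheme (Euclidean space is the simplest CAT(0) space; the paper works with geodesics), alternating minimization over two blocks can be sketched as:

    import numpy as np

    # Alternating minimization of f(x, y) = ||A x + B y - c||^2: each sweep
    # minimizes exactly over x with y fixed, then over y with x fixed.
    rng = np.random.default_rng(2)
    A = rng.standard_normal((30, 4))
    B = rng.standard_normal((30, 3))
    c = rng.standard_normal(30)

    x, y = np.zeros(4), np.zeros(3)
    for _ in range(50):
        x = np.linalg.lstsq(A, c - B @ y, rcond=None)[0]   # exact step in x
        y = np.linalg.lstsq(B, c - A @ x, rcond=None)[0]   # exact step in y
    print(np.linalg.norm(A @ x + B @ y - c))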