Convergence rate and iteration complexity on the alternating direction method of multipliers with a substitution procedure for separable convex programming

Recently, in [17] we showed for the first time the possibility of combining the Douglas-Rachford alternating direction method of multipliers (ADMM) with a Gaussian back substitution procedure for solving a convex minimization model with a general separable structure. This paper is a further study of the theoretical aspects of this theme. We first derive a general algorithmic framework … Read more
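
For orientation, the classical two-block ADMM iteration that multi-block schemes of this kind extend can be stated as follows (a standard textbook form, not the multi-block scheme with Gaussian back substitution analyzed in the paper): for minimizing $f(x) + g(y)$ subject to $Ax + By = b$, with penalty $\beta > 0$ and multiplier $\lambda$,
\[
% classical two-block ADMM, stated for orientation; not the paper's multi-block scheme
\begin{aligned}
x^{k+1} &= \operatorname*{argmin}_x\ f(x) + \tfrac{\beta}{2}\,\| Ax + By^k - b - \lambda^k/\beta \|^2,\\
y^{k+1} &= \operatorname*{argmin}_y\ g(y) + \tfrac{\beta}{2}\,\| Ax^{k+1} + By - b - \lambda^k/\beta \|^2,\\
\lambda^{k+1} &= \lambda^k - \beta\,( Ax^{k+1} + By^{k+1} - b ).
\end{aligned}
\]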

On RIC bounds of Compressed Sensing Matrices for Approximating Sparse Solutions Using Lq Quasi Norms

This paper follows the recent discussion on sparse solution recovery with the quasi-norms $L_q$, $q\in(0,1)$, when the sensing matrix possesses a Restricted Isometry Constant (RIC) $\delta_{2k}$. Our key tool is an improvement on a version of “the converse of a generalized Cauchy-Schwarz inequality” extended to the setting of quasi-norms. We show that, if $\delta_{2k}\le 1/2$, … Read more
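
For context, the standard objects involved are as follows (generic definitions, not the paper's specific improved bound). A matrix $A$ satisfies the restricted isometry property of order $2k$ with constant $\delta_{2k}$ if
\[
% standard RIP definition; stated for context only
(1-\delta_{2k})\,\|x\|_2^2 \;\le\; \|Ax\|_2^2 \;\le\; (1+\delta_{2k})\,\|x\|_2^2 \quad\text{for all } 2k\text{-sparse } x,
\]
and the $L_q$ recovery program replaces the usual $\ell_1$ objective by
\[
\min_x\ \|x\|_q^q = \sum_i |x_i|^q \quad\text{s.t.}\quad Ax = b, \qquad q\in(0,1).
\]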

Alternating Proximal Gradient Method for Convex Minimization

In this paper, we propose an alternating proximal gradient method that solves convex minimization problems with three or more separable blocks in the objective function. Our method is based on the framework of the alternating direction method of multipliers. The main computational effort in each iteration of the proposed method is to compute the proximal mappings … Read more
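
As a concrete illustration of the kind of proximal mapping such iterations rely on (a generic example, not the paper's specific subproblems), the proximal mapping of $\lambda\|\cdot\|_1$ is the soft-thresholding operator, which can be sketched in a few lines:

    import numpy as np

    def prox_l1(v, lam):
        # Proximal mapping of lam * ||.||_1 evaluated at v:
        # argmin_x lam*||x||_1 + 0.5*||x - v||^2, i.e. soft-thresholding.
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    # hypothetical usage: shrink a vector towards sparsity
    v = np.array([1.5, -0.2, 0.7, -3.0])
    print(prox_l1(v, lam=0.5))   # prints approximately [ 1.  -0.   0.2 -2.5]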

Compressed Sensing Off the Grid

We consider the problem of estimating the frequency components of a mixture of s complex sinusoids from a random subset of n regularly spaced samples. Unlike previous work in compressed sensing, the frequencies are not assumed to lie on a grid, but can assume any values in the normalized frequency domain [0, 1]. We propose … Read more
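
The observation model can be illustrated with a short sketch (hypothetical parameter values, for illustration only): s complex sinusoids with off-grid frequencies in [0, 1], observed at a random subset of the n regularly spaced sample locations.

    import numpy as np

    rng = np.random.default_rng(0)
    n, s, m = 256, 5, 64                      # hypothetical sizes: grid length, sinusoids, observed samples
    freqs = rng.uniform(0.0, 1.0, size=s)     # off-grid frequencies anywhere in [0, 1]
    amps = rng.standard_normal(s) + 1j * rng.standard_normal(s)

    t = np.arange(n)
    signal = np.exp(2j * np.pi * np.outer(t, freqs)) @ amps   # full mixture of sinusoids
    observed = rng.choice(n, size=m, replace=False)           # random subset of sample indices
    y = signal[observed]                                      # the data available to the estimator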

Gradient consistency for integral-convolution smoothing functions

Chen and Mangasarian (1995) developed smoothing approximations to the plus function built on integral convolution with density functions. X. Chen (2012) has recently picked up this idea, constructing a large class of smoothing functions for nonsmooth minimization through composition with smooth mappings. In this paper, we generalize this idea by substituting the plus function for an … Read more
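
To recall the construction being generalized (the classical Chen-Mangasarian form, stated here for orientation): the plus function $(t)_+ = \max(t,0)$ is smoothed by integral convolution with a density $\rho$,
\[
% classical Chen-Mangasarian smoothing of the plus function; stated for orientation
\phi(t,\mu) \;=\; \int_{-\infty}^{\infty} (t - \mu s)_+\, \rho(s)\, ds, \qquad \mu > 0,
\]
a familiar instance being the “neural network” smoothing $\phi(t,\mu) = \mu\ln\!\big(1 + e^{t/\mu}\big)$, obtained from the logistic density $\rho(s) = e^{-s}/(1+e^{-s})^2$.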

On Stable Piecewise Linearization and Generalized Algorithmic Differentiation

It is shown how functions that are defined by evaluation programs involving the absolute value function (besides smooth elementals) can be approximated locally by piecewise-linear models in the style of algorithmic, or automatic, differentiation (AD). The model can be generated by a minor modification of standard AD tools and it is Lipschitz continuous with respect … Read more
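
A minimal illustration of the idea (a made-up one-dimensional example, assuming the incremental form of the model): for $f(x) = |x^2 - 1|$, the piecewise-linear model at a reference point $\bar x$ keeps the absolute value exactly and replaces the smooth inner function by its tangent,
\[
% hypothetical one-dimensional example of an incremental piecewise-linear model
\Delta f(\bar x; \Delta x) \;=\; \big| \bar x^2 - 1 + 2\bar x\,\Delta x \big| \;-\; \big| \bar x^2 - 1 \big| \;\approx\; f(\bar x + \Delta x) - f(\bar x),
\]
which is piecewise linear in $\Delta x$ and approximates the true increment with an error of order $\Delta x^2$, including at the kink where $\bar x^2 = 1$.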

Second-order variational analysis and characterizations of tilt-stable optimal solutions in finite and infinite dimensions

The paper is devoted to developing second-order tools of variational analysis and their applications to characterizing tilt-stable local minimizers of constrained optimization problems in finite-dimensional and infinite-dimensional spaces. The importance of tilt stability has been well recognized from both theoretical and numerical aspects of optimization. Based on second-order generalized differentiation, we obtain qualitative and quantitative … Read more
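
For reference, tilt stability in the sense of Poliquin and Rockafellar (the notion characterized in the paper) can be stated as follows: a point $\bar x$ is a tilt-stable local minimizer of $f$ if there is $\gamma>0$ such that the localized argmin mapping
\[
% Poliquin-Rockafellar definition of tilt stability, stated for reference
M_\gamma(v) \;=\; \operatorname*{argmin}\big\{\, f(x) - \langle v, x\rangle \;\big|\; \|x - \bar x\| \le \gamma \,\big\}
\]
is single-valued and Lipschitz continuous on a neighborhood of $v = 0$ with $M_\gamma(0) = \bar x$.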

Primal-dual relationship between Levenberg-Marquardt and central trajectories for linearly constrained convex optimization

We consider the minimization of a convex function on a compact polyhedron defined by linear equality constraints and nonnegative variables. We define the Levenberg-Marquardt (L-M) and central trajectories starting at the analytic center and using the same parameter, and show that they satisfy a primal-dual relationship, being close to each other for large values of … Read more
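
For orientation, the central trajectory referred to here is presumably the standard log-barrier path (stated generically; the paper's precise parametrization of both trajectories may differ):
\[
% generic log-barrier central path; the paper's exact definitions may differ
x_c(\mu) \;=\; \operatorname*{argmin}\Big\{\, f(x) - \mu \sum_i \ln x_i \;\Big|\; Ax = b \,\Big\}, \qquad \mu > 0,
\]
with $x_c(\mu)$ approaching the analytic center of the feasible set as $\mu \to \infty$.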

Epi-convergent Smoothing with Applications to Convex Composite Functions

Smoothing methods have become part of the standard tool set for the study and solution of nondifferentiable and constrained optimization problems as well as a range of other variational and equilibrium problems. In this note we synthesize and extend recent results due to Beck and Teboulle on infimal convolution smoothing for convex functions with those … Read more
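
A canonical instance of the infimal-convolution smoothing in question (stated generically for orientation, not this paper's general framework) is the Moreau envelope: for a convex function $f$ and $\mu>0$,
\[
% Moreau envelope as a canonical infimal-convolution smoothing; generic example
e_\mu f(x) \;=\; \inf_{y}\Big\{\, f(y) + \tfrac{1}{2\mu}\|x - y\|^2 \,\Big\} \;=\; \Big(f \,\square\, \tfrac{1}{2\mu}\|\cdot\|^2\Big)(x),
\]
which is differentiable with $\nabla e_\mu f(x) = \mu^{-1}\big(x - \operatorname{prox}_{\mu f}(x)\big)$ and epi-converges to $f$ as $\mu \downarrow 0$.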

Primal-dual subgradient method for Huge-Scale Linear Conic Problems

In this paper we develop a primal-dual subgradient method for solving huge-scale linear conic optimization problems. Our main assumption is that the primal cone is formed as a direct product of many small-dimensional convex cones, and that the matrix $A$ of the corresponding linear operator is uniformly sparse. In this case, our method can … Read more
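
The computational pattern such methods exploit can be sketched as follows (a hypothetical illustration of sparse incremental updates, not the algorithm of the paper): when a step changes only one coordinate of $x$, the maintained product $Ax$ can be refreshed at the cost of one sparse column rather than a full matrix-vector multiplication.

    import numpy as np
    from scipy.sparse import random as sparse_random

    rng = np.random.default_rng(0)
    # hypothetical uniformly sparse matrix in compressed-sparse-column format
    A = sparse_random(100_000, 100_000, density=1e-5, format="csc", random_state=0)

    x = np.zeros(A.shape[1])
    y = A @ x                            # maintained product y = A x

    # a subgradient-style step that touches a single coordinate j
    j, delta = 42, 0.5
    x[j] += delta
    col = A.getcol(j)                    # sparse column j of A
    y[col.indices] += delta * col.data   # O(nnz of column) update instead of a full A @ x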