Efficiency of coordinate descent methods on huge-scale optimization problems

In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates of the rate of convergence. …
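The idea of random partial updates can be illustrated with a minimal sketch (my own illustration under simple assumptions, not the paper's actual method): randomized coordinate descent on a least-squares objective, where each step touches only one column of the matrix instead of performing a full-dimensional gradient step.

```python
import numpy as np

def random_coordinate_descent(A, b, iters=5000, seed=0):
    """Minimize f(x) = 0.5 * ||Ax - b||^2 by exactly minimizing over one
    randomly chosen coordinate per iteration (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                         # residual, kept up to date incrementally
    col_sq = (A ** 2).sum(axis=0)         # coordinate-wise curvature ||A_i||^2
    for _ in range(iters):
        i = rng.integers(n)
        step = (A[:, i] @ r) / col_sq[i]  # exact minimizer along coordinate i
        x[i] -= step
        r -= step * A[:, i]               # O(m) residual update instead of O(mn)
    return x
```

Each iteration costs O(m) rather than the O(mn) of a full gradient step, which is precisely why partial updates pay off at huge scale.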

$L^p$-norms, log-barriers and Cramer transform in optimization

We show that the Laplace approximation of a supremum by $L^p$-norms has interesting consequences in optimization. For instance, the logarithmic barrier functions (LBF) of a primal convex problem $P$ and its dual $P^*$ appear naturally when using this simple approximation technique for the value function $g$ of $P$ or its Legendre-Fenchel conjugate $g^*$. In addition, …
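The approximation at work can be checked numerically: for any vector, $\|x\|_p \to \|x\|_\infty = \max_i |x_i|$ as $p \to \infty$, with the elementary bounds $\|x\|_\infty \le \|x\|_p \le n^{1/p}\,\|x\|_\infty$. A small sketch (my own illustration, not from the paper):

```python
import numpy as np

def lp_norm(x, p):
    """L^p norm of a vector."""
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

x = np.array([0.5, 1.2, 3.0, 2.9])
# As p grows, the L^p norm squeezes down onto the supremum max|x_i| = 3.0:
for p in (1, 2, 10, 100):
    print(p, lp_norm(x, p))
```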

Stability of error bounds for convex constraint systems in Banach spaces

This paper studies the stability of error bounds for convex constraint systems in Banach spaces. We show that certain known sufficient conditions for local and global error bounds actually ensure error bounds for a family of functions that are, in a sense, small perturbations of the given one. A single inequality as well as semi-infinite constraint systems …

A Fast Algorithm for Total Variation Image Reconstruction from Random Projections

Total variation (TV) regularization is popular in image restoration and reconstruction due to its ability to preserve image edges. To date, most research activity on TV models has concentrated on image restoration from blurry and noisy observations, while discussions of image reconstruction from random projections are relatively scarce. In this paper, we propose, analyze, and test …
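As a concrete anchor for what TV regularization measures, here is the standard anisotropic discrete total variation (a generic textbook definition, not code from the paper): the sum of absolute forward differences, which is small for piecewise-constant images with sharp edges but large for noisy ones.

```python
import numpy as np

def tv_aniso(u):
    """Anisotropic discrete total variation: sum of |forward differences|
    along both image axes (standard definition, not the paper's code)."""
    return np.abs(np.diff(u, axis=0)).sum() + np.abs(np.diff(u, axis=1)).sum()

# A sharp edge is cheap under TV, while additive noise is expensive:
edge = np.zeros((8, 8))
edge[:, 4:] = 1.0
noisy = edge + 0.1 * np.random.default_rng(0).standard_normal((8, 8))
print(tv_aniso(edge), tv_aniso(noisy))
```

Penalizing this quantity therefore discourages oscillations while leaving sharp jumps affordable, which is why TV preserves edges.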

Convergence to the optimal value for barrier methods combined with Hessian Riemannian gradient flows and generalized proximal algorithms

We consider the problem $\min_{x\in\R^n}\{f(x)\mid Ax=b, \ x\in\overline{C},\ g_j(x)\le0,\ j=1,\ldots,s\}$, where $b\in\R^m$, $A\in\R^{m\times n}$ is a full rank matrix, $\overline{C}$ is the closure of a nonempty, open and convex subset $C$ of $\R^n$, and $g_j(\cdot)$, $j=1,\ldots,s$, are nonlinear convex functions. Our strategy consists, first, in introducing a barrier-type penalty for the constraints $g_j(x)\le0$, …
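The effect of a barrier-type penalty can be seen on a toy one-dimensional instance (my own example, not the paper's scheme): minimize $x$ subject to $1-x\le 0$, penalized as $\varphi_\mu(x) = x - \mu\log(x-1)$. The penalized minimizer is $x_\mu = 1+\mu$, which tends to the constrained optimum $x^*=1$ as $\mu\downarrow 0$.

```python
def minimize_barrier(mu, x=2.0, step=1e-3, iters=50000):
    """Gradient descent on phi(x) = x - mu*log(x - 1), the log-barrier
    penalization of the toy problem: minimize x subject to 1 - x <= 0."""
    for _ in range(iters):
        x -= step * (1.0 - mu / (x - 1.0))   # phi'(x) = 1 - mu/(x - 1)
    return x

for mu in (0.5, 0.1, 0.01):
    print(mu, minimize_barrier(mu))   # approaches 1 + mu, hence 1 as mu -> 0
```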

Recovering low-rank and sparse components of matrices from incomplete and noisy observations

Many applications arising in a variety of fields can be well described as the task of recovering the low-rank and sparse components of a given matrix. Recently, it was discovered that this NP-hard task can be accomplished, both theoretically and numerically, by heuristically solving a convex relaxation problem in which the widely acknowledged nuclear norm and …
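The nuclear-norm part of such relaxations has a closed-form proximal map, singular value thresholding, which is the workhorse of most first-order schemes for this task (a standard operator, shown here as an illustration rather than the authors' algorithm):

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal map of tau * (nuclear norm),
    i.e. soft-threshold the singular values of M by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

M = np.diag([3.0, 1.0])
print(svt(M, 2.0))   # singular values shrink from (3, 1) to (1, 0): rank drops to 1
```

Because thresholding zeroes out small singular values, iterating this map drives the low-rank component toward genuinely low rank.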

Hedge Algorithm and Subgradient Methods

We show that the Hedge Algorithm, a method that is widely used in Machine Learning, can be interpreted as a particular subgradient algorithm for minimizing a well-chosen convex function, namely as a Mirror Descent scheme. Using this reformulation, we establish three modifications and extensions of the Hedge Algorithm that are better or at least as …
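For reference, the Hedge Algorithm itself is only a few lines: maintain a distribution over experts and reweight multiplicatively by exponentiated losses (standard textbook form; the learning rate `eta` and the toy losses below are my own choices, not the paper's):

```python
import numpy as np

def hedge(loss_matrix, eta=0.5):
    """Hedge / multiplicative weights over the columns ("experts") of
    loss_matrix (shape (T, n), losses in [0, 1]).  Returns the final
    weight vector and the algorithm's cumulative expected loss."""
    T, n = loss_matrix.shape
    w = np.full(n, 1.0 / n)
    total = 0.0
    for losses in loss_matrix:
        total += w @ losses         # expected loss under current weights
        w *= np.exp(-eta * losses)  # multiplicative reweighting
        w /= w.sum()                # renormalize to a distribution
    return w, total

# Expert 0 is always right, expert 1 always wrong: weight piles onto expert 0.
losses = np.tile([0.0, 1.0], (50, 1))
w, total = hedge(losses)
print(w, total)
```

The exponential reweighting step is exactly where the mirror-descent interpretation enters: it is a subgradient step in the dual (entropy) geometry.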

Alternating Direction Algorithms for $\ell_1$-Problems in Compressive Sensing

In this paper, we propose and study the use of alternating direction algorithms for several $\ell_1$-norm minimization problems arising from sparse solution recovery in compressive sensing, including the basis pursuit problem, the basis-pursuit denoising problems of both unconstrained and constrained forms, as well as others. We present and investigate two classes of algorithms derived from …
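A key reason alternating direction methods suit these $\ell_1$ problems is that the $\ell_1$ subproblem has a closed-form solution: the soft-thresholding (shrinkage) operator, i.e. the proximal map of $t\|\cdot\|_1$ (shown here generically; this is not the paper's full algorithm):

```python
import numpy as np

def shrink(x, t):
    """Soft-thresholding: componentwise prox of t * ||.||_1,
    shrink(x, t)_i = sign(x_i) * max(|x_i| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

print(shrink(np.array([3.0, -0.5, 1.0]), 1.0))
```

Entries smaller than the threshold are set exactly to zero, which is how each alternating-direction sweep produces sparse iterates at negligible cost.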

Extension of the semidefinite characterization of sum of squares functional systems to algebraic structures

We extend Nesterov’s semidefinite programming (SDP) characterization of the cone of functions that can be expressed as sums of squares (SOS) of functions in finite dimensional linear functional spaces. Our extension is to algebraic systems endowed with a binary operation which maps two elements of a finite dimensional vector space to another vector …

Alternating directions based contraction method for generally separable linearly constrained convex programming problems

The classical alternating direction method (ADM) has been well studied in the context of linearly constrained convex programming problems and variational inequalities where both the involved operators and constraints are separable into two parts. In particular, recent years have witnessed a number of novel applications arising in diverse areas (e.g., image processing and statistics), for which …