Improved RIP-Based Bounds for Guaranteed Performance of Two Compressed Sensing Algorithms

Iterative hard thresholding (IHT) and compressive sampling matching pursuit (CoSaMP) are two mainstream compressed sensing algorithms based on the hard thresholding operator. The guaranteed performance of the two algorithms for signal recovery has mainly been analyzed in terms of the restricted isometry property (RIP) of sensing matrices. At present, the best known bound using RIP of order … Read more
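For context, the hard thresholding operator $H_s$ keeps the $s$ largest-magnitude entries of a vector and zeroes the rest. A minimal IHT sketch, the textbook iteration rather than anything specific to the paper's analysis, looks like:

```python
import numpy as np

def hard_threshold(x, s):
    """H_s: keep the s largest-magnitude entries of x, zero the rest."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    out[keep] = x[keep]
    return out

def iht(A, y, s, iters=1000, step=None):
    """Textbook IHT iteration: x <- H_s(x + step * A^T (y - A x)).
    The step size 1/||A||_2^2 is a conservative default that keeps
    the gradient step nonexpansive."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + step * A.T @ (y - A @ x), s)
    return x
```

With a well-conditioned random sensing matrix and exactly sparse data, this iteration typically recovers both the support and the values of the signal.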

A generalized block-iterative projection method for the common fixed point problem induced by cutters

The block-iterative projections (BIP) method of Aharoni and Censor [Block-iterative projection methods for parallel computation of solutions to convex feasibility problems, Linear Algebra and its Applications 120 (1989), 165-175] is an iterative process for asymptotically finding a point in the nonempty intersection of a family of closed convex subsets. It employs orthogonal projections onto the … Read more
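As a toy illustration (our simplification; the paper's setting with general cutters and dynamic weights is far broader), a block-iterative step for halfspace constraints takes a weighted average of the individual orthogonal projections:

```python
import numpy as np

def project_halfspace(x, a, b):
    """Orthogonal projection of x onto the halfspace {z : a.z <= b}."""
    viol = a @ x - b
    if viol <= 0:
        return x  # already feasible for this constraint
    return x - (viol / (a @ a)) * a

def bip_step(x, halfspaces, weights):
    """One block-iterative step: a convex combination of the
    projections onto each halfspace in the block."""
    return sum(w * project_halfspace(x, a, b)
               for w, (a, b) in zip(weights, halfspaces))
```

Iterating `bip_step` drives the point toward the intersection of the halfspaces whenever that intersection is nonempty.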

Convexity and continuity of specific set-valued maps and their extremal value functions

In this paper, we study several classes of set-valued maps, which can be used in set-valued optimization and its applications, and their respective maximum and minimum value functions. The definitions of these maps are based on scalar-valued, vector-valued, and cone-valued maps. Moreover, we consider those extremal value functions which are obtained when optimizing linear functionals … Read more

Convergence Results for Primal-Dual Algorithms in the Presence of Adjoint Mismatch

Most optimization problems arising in imaging science involve high-dimensional linear operators and their adjoints. In the implementations of these operators, approximations may be introduced for various practical considerations (e.g., memory limitation, computational cost, convergence speed), leading to an adjoint mismatch. This occurs for the X-ray tomographic inverse problems found in Computed Tomography (CT), where the … Read more
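To make the notion concrete: in a gradient-type scheme for $\tfrac{1}{2}\|Ax - y\|^2$, an adjoint mismatch means iterating with some operator $B$ in place of the true adjoint $A^\top$. A minimal sketch (our illustration, not the paper's primal-dual setting):

```python
import numpy as np

def mismatched_gradient_iterations(A, B, y, step, iters):
    """Iterate x <- x - step * B (A x - y), where B is intended to
    approximate A.T; with B = A.T this is plain gradient descent,
    and any discrepancy B != A.T is the 'adjoint mismatch'."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x - step * (B @ (A @ x - y))
    return x
```

With the exact adjoint this converges to a least-squares solution; with a mismatched $B$, the fixed point solves $B(Ax - y) = 0$ instead of the normal equations, which is precisely the perturbation such convergence analyses must account for.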

Convergence analysis of an inexact relaxed augmented Lagrangian method

In this paper, we develop an Inexact Relaxed Augmented Lagrangian Method (IR-ALM) for solving a class of convex optimization problems. Flexible relative error criteria are designed for approximately solving the resulting subproblem, and a relaxation step is exploited to accelerate its convergence numerically. By a unified variational analysis, we establish the global convergence of this … Read more

A new sufficient condition for non-convex sparse recovery via weighted $\ell_r\!-\!\ell_1$ minimization

In this letter, we discuss the reconstruction of sparse signals from undersampled data, a core problem in compressed sensing. A new sufficient condition in terms of the restricted isometry constant (RIC) and restricted orthogonality constant (ROC) is first established for the performance guarantee of the recently proposed non-convex weighted $\ell_r-\ell_1$ minimization in recovering … Read more

A Decomposition Algorithm for Two-Stage Stochastic Programs with Nonconvex Recourse

In this paper, we study a decomposition method for solving a class of nonconvex two-stage stochastic programs, where both the objective and the constraints of the second-stage problem are nonlinearly parameterized by the first-stage variable. Because the resulting nonconvex recourse function fails to be Clarke regular, classical decomposition approaches such as Benders … Read more

The Null Space Property of the Weighted $\ell_r-\ell_1$ Minimization

The null space property (NSP), which depends solely on the null space of the sensing matrix, has attracted considerable interest in sparse signal recovery. This article studies the NSP of the weighted $\ell_r-\ell_1$ minimization. Several versions of the NSP of the weighted $\ell_r-\ell_1$ minimization, including the weighted $\ell_r-\ell_1$ NSP, the weighted $\ell_r-\ell_1$ stable NSP, the … Read more

An Asynchronous Proximal Bundle Method

We develop a fully asynchronous proximal bundle method for solving non-smooth, convex optimization problems. The algorithm can be used as a drop-in replacement for classic bundle methods, i.e., the function must be given by a first-order oracle for computing function values and subgradients. The algorithm allows for an arbitrary number of master problem processes computing … Read more

Spectral Projected Subgradient Method for Nonsmooth Convex Optimization Problems

We consider constrained optimization problems with a nonsmooth objective function in the form of a mathematical expectation. The Sample Average Approximation (SAA) is used to estimate the objective function, and a variable sample size strategy is employed. The proposed algorithm combines an SAA subgradient with a spectral coefficient in order to provide a suitable direction which improves … Read more
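A bare-bones sketch of the ingredients (our simplification: a deterministic subgradient oracle stands in for the SAA estimate, and the spectral coefficient is a Barzilai-Borwein-type ratio):

```python
import numpy as np

def spectral_projected_subgradient(subgrad, project, x0, iters=100):
    """Projected subgradient method with a spectral (Barzilai-Borwein-type)
    step length. subgrad(x) returns a subgradient (in the SAA setting this
    would be a sample-average estimate over a growing sample);
    project(x) is the orthogonal projection onto the feasible set."""
    x = x0.copy()
    g = subgrad(x)
    step = 1.0
    for _ in range(iters):
        x_new = project(x - step * g)
        g_new = subgrad(x_new)
        s, dg = x_new - x, g_new - g
        denom = s @ dg
        # Spectral coefficient: BB ratio when it is safely positive,
        # otherwise fall back to a unit step.
        step = (s @ s) / denom if denom > 1e-12 else 1.0
        x, g = x_new, g_new
    return x
```

On a smooth test instance, projecting $0.5\|x - c\|^2$ onto a box, the iteration lands on the projection of $c$ onto the feasible set.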