Constant Depth Decision Rules for Multistage Optimization under Uncertainty

In this paper, we introduce a new class of decision rules, referred to as Constant Depth Decision Rules (CDDRs), for multistage optimization under linear constraints with uncertainty-affected right-hand sides. We consider two uncertainty classes: discrete uncertainties, which at each stage take at most a fixed number d of different values, and polytopic uncertainties which, … Read more

Non-asymptotic confidence bounds for the optimal value of a stochastic program

We discuss a general approach to building non-asymptotic confidence bounds for stochastic optimization problems. Our principal contribution is the observation that a Sample Average Approximation of a problem supplies upper and lower bounds for the optimal value of the problem which are essentially better than the quality of the corresponding optimal solutions. At the same … Read more
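The excerpt is truncated, so the paper's own bound construction is not shown; the standard way such lower confidence bounds are built from Sample Average Approximations (the Mak–Morton–Wood batch-means construction) can be sketched on a toy problem. The objective, distribution, and sample sizes below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic program (illustrative, not from the paper):
#   min_x E[(x - xi)^2],  xi ~ N(1, 1).
# True optimal value is Var(xi) = 1, attained at x* = E[xi] = 1.
def saa_optimal_value(n_samples):
    """Solve one Sample Average Approximation exactly: the sample
    mean minimizes the empirical average of (x - xi)^2."""
    xi = rng.normal(1.0, 1.0, n_samples)
    x_hat = xi.mean()
    return np.mean((x_hat - xi) ** 2)

# Since E[SAA optimum] <= true optimum, averaging M independent SAA
# optima and subtracting a normal-quantile margin yields a
# statistical lower confidence bound on the true optimal value.
M, N = 50, 200
vals = np.array([saa_optimal_value(N) for _ in range(M)])
lower = vals.mean() - 1.96 * vals.std(ddof=1) / np.sqrt(M)
print(f"lower 95% confidence bound: {lower:.3f} (true value 1.0)")
```

An analogous upper bound comes from evaluating any fixed feasible solution on a fresh sample; the gap between the two bounds is what the abstract's comparison is about.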

Accuracy guarantees for ℓ1-recovery

We discuss two new methods for the recovery of sparse signals from noisy observations based on ℓ1-minimization. They are closely related to well-known techniques such as the Lasso and the Dantzig Selector. However, these estimators come with efficiently verifiable guarantees of performance. By optimizing these bounds with respect to the method parameters we are able to … Read more
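The estimators proposed in the paper are not spelled out in this excerpt; as generic context, the Lasso problem it refers to can be solved by iterative soft-thresholding (ISTA). The problem sizes, regularization weight, and noise level below are illustrative assumptions.

```python
import numpy as np

def ista_lasso(A, y, lam, steps=500):
    """Iterative soft-thresholding for the Lasso problem
        min_x 0.5 * ||A x - y||_2^2 + lam * ||x||_1,
    with step size 1/L, L = largest eigenvalue of A^T A."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = x - A.T @ (A @ x - y) / L                          # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return x

rng = np.random.default_rng(1)
n, m, s = 100, 40, 3                  # ambient dim, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:s] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.01 * rng.normal(size=m)
x_hat = ista_lasso(A, y, lam=0.02)
err = np.linalg.norm(x_hat - x_true)
print("recovery error:", err)
```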

L1 Minimization via Randomized First Order Algorithms

In this paper we propose randomized first-order algorithms for solving bilinear saddle point problems. Our developments are motivated by the need for sublinear-time algorithms to solve large-scale parametric bilinear saddle point problems where cheap online assessment of solution quality is crucial. We present theoretical efficiency estimates of our algorithms and discuss a number … Read more

On Low Rank Matrix Approximations with Applications to Synthesis Problem in Compressed Sensing

We consider the synthesis problem of Compressed Sensing: given $s$ and an $M\times n$ matrix $A$, extract from $A$ an $m\times n$ submatrix $A_m$, with $m$ as small as possible, which is $s$-good, that is, every signal $x$ with at most $s$ nonzero entries can be recovered from observation $A_m x$ by $\ell_1$ minimization: $x … Read more
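The submatrix-extraction procedure itself is not shown in this excerpt, but the ℓ1-minimization recovery step it relies on is a linear program: minimize ‖x‖₁ subject to A_m x = y. A minimal sketch with `scipy.optimize.linprog` follows; the dimensions and sparsity level are illustrative assumptions, and `A` here stands in for an already-extracted submatrix A_m.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n, m, s = 60, 30, 2                   # illustrative sizes
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = [1.0, -2.0]
y = A @ x_true                        # noiseless observations

# l1 minimization as an LP over z = [x; u]:
#   min sum(u)  s.t.  -u <= x <= u,  A x = y.
I = np.eye(n)
c = np.concatenate([np.zeros(n), np.ones(n)])
A_ub = np.block([[I, -I], [-I, -I]])  # encodes x <= u and -x <= u
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * n + [(0, None)] * n)
x_hat = res.x[:n]
err = np.linalg.norm(x_hat - x_true)
print("exact recovery error:", err)
```

For a random Gaussian matrix of these proportions, s-sparse signals are recovered exactly; the paper's synthesis question is how small m can be made while preserving this property.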

Verifiable conditions of $\ell_1$-recovery for sparse signals with sign restrictions

We propose necessary and sufficient conditions for a sensing matrix to be “$s$-semigood” — to allow for exact $\ell_1$-recovery of sparse signals with at most $s$ nonzero entries under sign restrictions on part of the entries. We express error bounds for imperfect $\ell_1$-recovery in terms of the characteristics underlying these conditions. These characteristics, although difficult … Read more

On Verifiable Sufficient Conditions for Sparse Signal Recovery via L1 Minimization

We propose novel necessary and sufficient conditions for a sensing matrix to be “s-good” — to allow for exact L1-recovery of sparse signals with s nonzero entries when no measurement noise is present. Then we express the error bounds for imperfect L1-recovery (nonzero measurement noise, nearly s-sparse signal, near-optimal solution of the optimization problem yielding … Read more
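The paper's own verifiable conditions are not reproduced in this excerpt. As a point of comparison, the simplest classical verifiable sufficient condition is based on mutual coherence (Donoho–Elad): every signal with fewer than (1 + 1/μ(A))/2 nonzero entries is the unique ℓ1-minimizer. The matrix size below is an illustrative assumption.

```python
import numpy as np

def mutual_coherence(A):
    """mu(A) = max over i != j of |<a_i, a_j>| for unit-normalized columns."""
    Anorm = A / np.linalg.norm(A, axis=0)
    G = np.abs(Anorm.T @ Anorm)
    np.fill_diagonal(G, 0.0)
    return G.max()

rng = np.random.default_rng(3)
A = rng.normal(size=(64, 128))
mu = mutual_coherence(A)
# Largest s with s < (1 + 1/mu)/2, the classical s-goodness certificate.
s_guaranteed = int(np.ceil((1 + 1 / mu) / 2)) - 1
print(f"mu = {mu:.3f}, certified sparsity level s = {s_guaranteed}")
```

Coherence-based certificates are cheap but typically certify only small s; the conditions proposed in the paper are designed to be both efficiently computable and less conservative.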

Large Deviations of Vector-valued Martingales in 2-Smooth Normed Spaces

In this paper, we derive exponential bounds on probabilities of large deviations for “light tail” martingales taking values in finite-dimensional normed spaces. Our primary emphasis is on the case where the bounds are dimension-independent or nearly so. We demonstrate that this is the case when the norm on the space can be approximated, within an … Read more
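The dimension-independent vector bounds are not stated in this excerpt; for orientation, the scalar special case of such light-tail deviation bounds is the classical Azuma–Hoeffding inequality:

```latex
% Classical scalar case: if (S_k) is a martingale with bounded
% increments |S_k - S_{k-1}| <= c_k, then for all t >= 0
\Pr\bigl\{\,|S_N - S_0| \ge t\,\bigr\}
  \;\le\; 2\exp\!\Bigl(-\frac{t^{2}}{2\sum_{k=1}^{N} c_k^{2}}\Bigr).
```

The paper's contribution concerns when bounds of this shape survive for vector-valued martingales, with the norm's 2-smoothness constant replacing the role the absolute value plays here.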

Stochastic Approximation approach to Stochastic Programming

In this paper we consider optimization problems where the objective function is given in the form of an expectation. A basic difficulty in solving such stochastic optimization problems is that the multidimensional integrals (expectations) involved cannot be computed with high accuracy. The aim of this paper is to compare two computational approaches based on Monte … Read more
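The excerpt cuts off before naming the two approaches; one of them, robust stochastic approximation with iterate averaging, can be sketched on a toy expectation-minimization problem. The objective, distribution, step-size rule, and horizon below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy problem (illustrative): min_x f(x) = E[0.5 * (x - xi)^2],
# xi ~ N(2, 1), so x* = 2 and the stochastic gradient is x - xi.
def robust_sa(n_steps, x0=0.0):
    """Robust stochastic approximation: a fixed step size tuned to
    the horizon, plus averaging of the whole trajectory."""
    x = x0
    xs = []
    gamma = 1.0 / np.sqrt(n_steps)   # O(1/sqrt(N)) step size
    for _ in range(n_steps):
        xi = rng.normal(2.0, 1.0)
        x -= gamma * (x - xi)        # SA step with a noisy gradient
        xs.append(x)
    return np.mean(xs)               # averaged iterate

x_bar = robust_sa(20000)
print("SA estimate:", x_bar, "(true minimizer 2.0)")
```

The competing approach, Sample Average Approximation, would instead draw all samples up front and minimize the resulting empirical average exactly; the paper compares the two.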

Nonparametric Estimation via Convex Programming

In this paper, we focus primarily on the problem of recovering a linear form g'*x of an unknown "signal" x, known to belong to a given convex compact set X in R^n, from N independent realizations of a random variable taking values in a finite set, the distribution p of the variable being affinely parameterized by … Read more