The block mutual coherence property condition for signal recovery

Compressed sensing shows that a sparse signal can be stably recovered from incomplete linear measurements. In practical applications, however, some signals have additional structure in which the nonzero elements occur in blocks; we call such signals block-sparse signals. In this paper, the $\ell_2/\ell_1-\alpha\ell_2$ minimization method for the stable recovery of block-sparse signals is investigated. … Read more
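
For illustration (notation assumed here; the exact constrained form used in the paper may differ), a natural $\ell_2/\ell_1-\alpha\ell_2$ formulation for a signal $x$ partitioned into blocks $x[1],\dots,x[m]$ is
\[
\min_{x\in\mathbb{R}^n}\ \sum_{i=1}^{m}\|x[i]\|_2 \;-\; \alpha\,\|x\|_2
\quad\text{subject to}\quad \|Ax-b\|_2\le\varepsilon,
\]
where $A$ is the measurement matrix, $b$ the noisy measurements, and $0<\alpha\le 1$; the first term is the mixed $\ell_2/\ell_1$ norm that promotes block sparsity.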

Manifold Identification for Ultimately Communication-Efficient Distributed Optimization

This work proposes a progressive manifold identification approach for distributed optimization, with sound theoretical justification, that greatly reduces both the number of communication rounds and the number of bytes communicated per round for partly smooth regularized problems such as the $\ell_1$- and group-LASSO-regularized ones. Our two-stage method first uses an inexact proximal quasi-Newton method to iteratively identify a sequence … Read more
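
For context, in the group-LASSO case mentioned above, the regularized problem and the manifold to be identified can be sketched as follows (generic notation assumed here, not taken from the paper):
\[
\min_{w\in\mathbb{R}^d}\ f(w) + \lambda \sum_{g\in\mathcal{G}} \|w_g\|_2,
\qquad
\mathcal{M} = \{\, w : w_g = 0 \ \text{for all } g \in \mathcal{Z} \,\},
\]
where $\mathcal{G}$ partitions the coordinates into groups and $\mathcal{Z}\subseteq\mathcal{G}$ is the set of groups that vanish at the solution; once $\mathcal{M}$ is identified, only the (typically few) nonzero groups need to be communicated and optimized over.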

Proximity in Concave Integer Quadratic Programming

A classic result by Cook, Gerards, Schrijver, and Tardos provides an upper bound of $n\Delta$ on the proximity of optimal solutions of an Integer Linear Programming problem and its standard linear relaxation. In this bound, $n$ is the number of variables and $\Delta$ denotes the maximum of the absolute values of the subdeterminants of the … Read more
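
Concretely, the result says that if both the integer program $\max\{c^\top x : Ax\le b,\ x\in\mathbb{Z}^n\}$ and its linear relaxation have optimal solutions, then for every optimal solution $x^\ast$ of the relaxation there is an optimal integer solution $z^\ast$ with
\[
\|z^\ast - x^\ast\|_\infty \;\le\; n\,\Delta .
\]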

Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity

We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims to minimize the conditional value-at-risk of the distance to misclassification, and we explore links to adversarial classification models proposed earlier and to maximum-margin classifiers. We also provide a reformulation of the distributionally robust … Read more
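
Schematically (symbols here are assumptions for illustration, not the paper's exact notation), a distributionally robust chance constraint over a Wasserstein ball of radius $\varepsilon$ around the empirical distribution $\widehat{\mathbb{P}}_N$ reads
\[
\sup_{\mathbb{Q}\,:\,W(\mathbb{Q},\widehat{\mathbb{P}}_N)\le\varepsilon} \mathbb{Q}\big[\, y\,\langle w, x\rangle \le 0 \,\big] \;\le\; \tau,
\]
i.e., even under the worst-case distribution within Wasserstein distance $\varepsilon$ of the data, the probability of misclassifying a sample $(x,y)$ with the linear classifier $w$ stays below $\tau$.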

A Personalized Switched Systems Approach for the Optimal Control of Ventricular Assist Devices based on Atrioventricular Plane Displacement

Objective: A promising treatment for congestive heart failure is the implantation of a left ventricular assist device (LVAD) that works as a mechanical pump. Modern LVADs operate at an adjustable but constant rotor speed and therefore provide continuous blood flow; recent efforts, however, aim to mimic pulsatile blood flow by oscillating the pump speed. This work … Read more

Riemannian conjugate gradient methods with inverse retraction

We propose a new class of Riemannian conjugate gradient (CG) methods, in which inverse retraction is used instead of vector transport for search direction construction. In existing methods, differentiated retraction is often used for vector transport to move the previous search direction to the current tangent space. However, a different perspective is adopted here, motivated … Read more
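
To make the idea concrete (a sketch under assumed notation, not the paper's exact scheme): standard Riemannian CG builds the search direction with a vector transport $\mathcal{T}$, and when $x_k = R_{x_{k-1}}(\alpha_{k-1}\eta_{k-1})$ the transported direction can instead be approximated via the inverse retraction,
\[
\eta_k = -\operatorname{grad} f(x_k) + \beta_k\, \mathcal{T}_{x_{k-1}\to x_k}(\eta_{k-1}),
\qquad
\mathcal{T}_{x_{k-1}\to x_k}(\eta_{k-1}) \;\approx\; -\tfrac{1}{\alpha_{k-1}}\, R_{x_k}^{-1}(x_{k-1}),
\]
which avoids differentiating the retraction while keeping the direction in the current tangent space $T_{x_k}\mathcal{M}$.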

Two novel gradient methods with optimal step sizes

In this work we introduce two new Barzilai-Borwein-like step sizes for the classical gradient method applied to strictly convex quadratic optimization problems. The proposed step sizes employ second-order information in order to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information of the Hessian of … Read more
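
For reference (standard facts, not the step sizes proposed here), the two classical Barzilai-Borwein step sizes are, with $s_{k-1}=x_k-x_{k-1}$ and $y_{k-1}=\nabla f(x_k)-\nabla f(x_{k-1})$,
\[
\alpha_k^{\mathrm{BB1}}=\frac{s_{k-1}^\top s_{k-1}}{s_{k-1}^\top y_{k-1}},
\qquad
\alpha_k^{\mathrm{BB2}}=\frac{s_{k-1}^\top y_{k-1}}{y_{k-1}^\top y_{k-1}};
\]
the new step sizes are of a similar flavour but are derived from models that incorporate approximate Hessian information, as described in the full abstract.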

Solving nonlinear systems of equations via spectral residual methods: stepsize selection and applications

Spectral residual methods are derivative-free procedures with a low cost per iteration for solving nonlinear systems of equations. They are generally coupled with a nonmonotone linesearch strategy and compare well with Newton-based methods on large nonlinear systems and sequences of nonlinear systems. The residual vector is used as the search direction, and choosing the steplength has a … Read more
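
In the generic form of such schemes (notation assumed here for illustration), the residual itself drives the iteration for solving $F(x)=0$:
\[
x_{k+1} = x_k - \alpha_k\,\sigma_k\,F(x_k),
\qquad
\sigma_k = \frac{s_{k-1}^\top s_{k-1}}{s_{k-1}^\top y_{k-1}},
\quad
s_{k-1}=x_k-x_{k-1},\ \
y_{k-1}=F(x_k)-F(x_{k-1}),
\]
where $\sigma_k$ is the spectral (Barzilai-Borwein-type) coefficient and $\alpha_k$ is a damping factor produced by the nonmonotone linesearch; no Jacobian or other derivative of $F$ is required.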

High-order Evaluation Complexity of a Stochastic Adaptive Regularization Algorithm for Nonconvex Optimization Using Inexact Function Evaluations and Randomly Perturbed Derivatives

A stochastic adaptive regularization algorithm allowing random noise in derivatives and inexact function values is proposed for computing strong approximate minimizers of any order for inexpensively constrained smooth optimization problems. For an objective function with Lipschitz continuous $p$-th derivative in a convex neighbourhood of the feasible set and given an arbitrary optimality order $q$, it … Read more
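
For orientation (generic adaptive-regularization notation assumed here, not specific to the stochastic analysis of this paper), each iteration of an order-$p$ adaptive regularization method approximately minimizes the regularized Taylor model
\[
m_k(s) \;=\; f(x_k) + \sum_{j=1}^{p}\frac{1}{j!}\,\nabla^j f(x_k)[s]^j \;+\; \frac{\sigma_k}{(p+1)!}\,\|s\|^{p+1},
\]
where $\sigma_k>0$ is the adaptive regularization parameter (the weighting convention for the last term varies between papers); in the setting above, the derivatives $\nabla^j f(x_k)$ are randomly perturbed and $f(x_k)$ is evaluated inexactly.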

A derivative-free method for structured optimization problems

Structured optimization problems are ubiquitous in fields like data science and engineering. The goal in structured optimization is to use a prescribed set of points, called atoms, to build up a solution that minimizes or maximizes a given function. In the present paper, we want to minimize a black-box function over the convex hull of a … Read more
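
In symbols (generic notation assumed here, with the atom set taken to be finite for illustration), the problem is to minimize a black-box function over the convex hull of atoms $\{a_1,\dots,a_m\}$:
\[
\min_{x\in\operatorname{conv}\{a_1,\dots,a_m\}} f(x)
\quad\Longleftrightarrow\quad
\min_{\lambda\in\Delta_m} f\!\Big(\sum_{i=1}^{m}\lambda_i a_i\Big),
\qquad
\Delta_m=\Big\{\lambda\ge 0:\ \textstyle\sum_i\lambda_i=1\Big\},
\]
so every candidate solution is a convex combination of the prescribed atoms, and $f$ is accessed only through function values, with no derivatives available.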