Sharpness and well-conditioning of nonsmooth convex formulations in statistical signal recovery

We study a sample complexity vs. conditioning tradeoff in modern signal recovery problems where convex optimization problems are built from sampled observations. We begin by introducing a set of condition numbers related to sharpness in \(\ell_p\) or Schatten-\(p\) norms (\(p\in[1,2]\)), based on nonsmooth reformulations of a class of convex optimization problems, including sparse recovery, low-rank …
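For orientation, the sharpness property these condition numbers quantify is usually stated as a lower bound on the objective gap in terms of the distance to the solution set. A generic form is sketched below; the paper's precise constants, norm choices, and resulting condition numbers are not reproduced here.

```latex
% Generic sharpness condition (standard form; illustrative, not the
% paper's exact statement). Here f is the nonsmooth recovery objective,
% X* its solution set, and dist_p the distance in the chosen l_p
% (or Schatten-p) norm.
\[
  f(x) - \min_{z} f(z) \;\ge\; \mu \, \operatorname{dist}_p\!\left(x, \mathcal{X}^{*}\right)
  \qquad \text{for all } x,
\]
% with sharpness constant mu > 0. Pairing mu with a Lipschitz-type
% constant L of f yields a condition number of the form kappa = L / mu.
```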

The nonsmooth landscape of phase retrieval

We consider a popular nonsmooth formulation of the real phase retrieval problem. We show that under standard statistical assumptions, a simple subgradient method converges linearly when initialized within a constant relative distance of an optimal solution. Seeking to understand the distribution of the stationary points of the problem, we complete the paper by proving that …
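As a concrete illustration of the kind of method analyzed here, the sketch below runs the Polyak subgradient step on the standard nonsmooth objective \(f(x) = \frac{1}{m}\sum_{i=1}^m \lvert \langle a_i, x\rangle^2 - b_i \rvert\), whose minimum value is zero under noiseless measurements. The problem sizes, initialization radius, and iteration count are illustrative choices, not the paper's.

```python
# Minimal sketch of the Polyak subgradient method for real phase
# retrieval (illustrative; not the paper's exact algorithm or constants).
import numpy as np

def polyak_subgradient(A, b, x0, iters=500):
    x = x0.copy()
    m = A.shape[0]
    for _ in range(iters):
        Ax = A @ x
        r = Ax ** 2 - b                            # residuals <a_i, x>^2 - b_i
        f = np.abs(r).mean()                       # objective value
        g = (2.0 / m) * (A.T @ (np.sign(r) * Ax))  # a subgradient of f
        gn = g @ g
        if gn == 0.0:                              # stationary (possibly optimal)
            break
        x = x - (f / gn) * g                       # Polyak step, using min f = 0
    return x

rng = np.random.default_rng(0)
n, m = 50, 400
x_star = rng.standard_normal(n)
A = rng.standard_normal((m, n))                    # Gaussian measurement vectors
b = (A @ x_star) ** 2                              # noiseless measurements
x0 = x_star + 0.1 * rng.standard_normal(n)         # constant relative distance
x_hat = polyak_subgradient(A, b, x0)
# report distance up to the global sign ambiguity of phase retrieval
print(min(np.linalg.norm(x_hat - x_star), np.linalg.norm(x_hat + x_star)))
```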

First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems

We focus on nonconvex and nonsmooth minimization problems with a composite objective, where the differentiable part of the objective is freed from the usual and restrictive global Lipschitz gradient continuity assumption. This longstanding smoothness restriction is pervasive in first order methods (FOM) and was recently circumvented for convex composite optimization by Bauschke, Bolte and Teboulle, …
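To make the mechanism concrete: for the quadratic inverse problem \(g(x) = \tfrac{1}{4}\sum_i (x^{\top} A_i x - b_i)^2\), whose gradient is not globally Lipschitz, the Bregman gradient step relative to the kernel \(h(x) = \tfrac{1}{4}\lVert x\rVert^4 + \tfrac{1}{2}\lVert x\rVert^2\) reduces to one real cubic root. The sketch below is a minimal illustration under these standard choices, not the authors' code; the step size is an ad hoc conservative constant.

```python
# Sketch of a Bregman (relative-smoothness) gradient step for a
# quadratic inverse problem; illustrative only.
import numpy as np

def grad_g(x, As, b):
    # gradient of (1/4) * sum_i (x'A_i x - b_i)^2 with symmetric A_i
    return sum((x @ A @ x - bi) * (A @ x) for A, bi in zip(As, b))

def bregman_step(x, As, b, step):
    # solve grad h(x+) = grad h(x) - step * grad g(x),
    # where grad h(x) = (||x||^2 + 1) x
    p = (x @ x + 1.0) * x - step * grad_g(x, As, b)
    pn = np.linalg.norm(p)
    if pn == 0.0:
        return np.zeros_like(x)
    roots = np.roots([1.0, 0.0, 1.0, -pn])                # t^3 + t - ||p|| = 0
    t = float(roots[np.argmin(np.abs(roots.imag))].real)  # unique real root
    return (t / pn) * p                                   # x+ = t * p / ||p||

rng = np.random.default_rng(1)
n, m = 5, 10
x_star = rng.standard_normal(n)
As = [(M + M.T) / 2 for M in rng.standard_normal((m, n, n))]
b = np.array([x_star @ A @ x_star for A in As])
x = rng.standard_normal(n)
for _ in range(3000):
    x = bregman_step(x, As, b, step=1e-3)                 # conservative fixed step
print(0.25 * sum((x @ A @ x - bi) ** 2 for A, bi in zip(As, b)))
```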

Globally Convergent Levenberg-Marquardt Method for Phase Retrieval

In this paper, we consider a nonlinear least squares model for the phase retrieval problem. Since the Hessian matrix may not be positive definite and the Gauss-Newton (GN) matrix is singular at any optimal solution, we propose a modified Levenberg-Marquardt (LM) method in which the Hessian is replaced by the sum of the GN matrix and …
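For reference, a bare-bones residual-damped LM iteration on this least squares model is sketched below. The damping \(\mu_k \propto \lVert r(x_k)\rVert\) is the classical Levenberg-Marquardt regularization for singular GN matrices and stands in for the paper's specific modification, which is not reproduced here; problem sizes and the initial point are illustrative.

```python
# Sketch of a residual-damped Levenberg-Marquardt method for the
# nonlinear least squares phase retrieval model (illustrative only):
#   min_x (1/2)||r(x)||^2,  r_i(x) = <a_i, x>^2 - b_i.
import numpy as np

def lm_phase_retrieval(A, b, x0, iters=100, mu_scale=1.0):
    x = x0.copy()
    for _ in range(iters):
        Ax = A @ x
        r = Ax ** 2 - b                          # residual vector
        J = 2.0 * Ax[:, None] * A                # Jacobian rows: 2<a_i,x> a_i
        mu = mu_scale * np.linalg.norm(r)        # damping -> 0 near a solution
        # regularized Gauss-Newton system (J'J + mu I) d = -J'r
        d = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -J.T @ r)
        x = x + d                                # full step; no line search
    return x

rng = np.random.default_rng(2)
n, m = 40, 320
x_star = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = (A @ x_star) ** 2
x0 = x_star + 0.05 * rng.standard_normal(n)      # good initial point
x_hat = lm_phase_retrieval(A, b, x0)
print(min(np.linalg.norm(x_hat - x_star), np.linalg.norm(x_hat + x_star)))
```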