Analysis non-sparse recovery for non-convex relaxed $\ell_q$ minimization

This paper studies the reconstruction of signals that are sparse or nearly sparse with respect to a tight frame $D$ from underdetermined linear systems. We propose a non-convex relaxed $\ell_q$ $(0 < q < 1)$ minimization …
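For orientation, a minimal sketch of an analysis $\ell_q$ minimization problem of this type (the measurement matrix $A$, observations $b$, and noise level $\varepsilon$ are illustrative symbols, not taken from the abstract):

$\min_{x \in \mathbb{R}^n} \ \|D^{*}x\|_q^q \quad \text{subject to} \quad \|Ax - b\|_2 \le \varepsilon, \qquad 0 < q < 1,$

where $D$ is the tight frame and $D^{*}x$ collects the analysis coefficients whose (near-)sparsity is exploited.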

Modeling Design and Control Problems Involving Neural Network Surrogates

We consider nonlinear optimization problems that involve surrogate models represented by neural networks. We first demonstrate how to directly embed neural network evaluation into optimization models, highlight a difficulty with this approach that can prevent convergence, and then characterize stationarity of such models. We then present two alternative formulations of these problems in the specific …
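As a rough illustration (not the paper's formulations) of what "directly embedding neural network evaluation into an optimization model" can look like, the sketch below evaluates a tiny, hand-specified ReLU surrogate inside an objective handed to scipy.optimize.minimize; the weights, target value, and solver choice are all assumptions made for the example. The ReLU kinks are the kind of nonsmoothness alluded to in the abstract.

```python
# Minimal sketch: embed a small, hypothetical ReLU network surrogate directly
# into a nonlinear optimization model. The ReLU kinks make the objective
# nonsmooth, which can stall gradient-based NLP solvers; a derivative-free
# method is used here only to keep the sketch runnable.
import numpy as np
from scipy.optimize import minimize

# Hypothetical fixed surrogate weights: y = w2 @ relu(W1 @ x + b1) + b2.
W1 = np.array([[1.0, -2.0], [0.5, 1.5]])
b1 = np.array([0.1, -0.3])
w2 = np.array([1.0, -1.0])
b2 = 0.2

def surrogate(x):
    """Evaluate the neural-network surrogate at design variables x."""
    return w2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def objective(x):
    """Design objective: drive the surrogate output to an (assumed) target of 1.0."""
    return (surrogate(x) - 1.0) ** 2

# Direct embedding: the optimizer sees the network only through function values.
result = minimize(objective, x0=np.zeros(2), method="Nelder-Mead")
print(result.x, result.fun)
```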

Global Complexity Bound of a Proximal ADMM for Linearly-Constrained Nonseparable Nonconvex Composite Programming

This paper proposes and analyzes a dampened proximal alternating direction method of multipliers (DP.ADMM) for solving linearly-constrained nonconvex optimization problems where the smooth part of the objective function is nonseparable. Each iteration of DP.ADMM consists of: (i) a sequence of partial proximal augmented Lagrangian (AL) updates, (ii) an under-relaxed Lagrange multiplier update, and (iii) a …
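For context, a generic (not DP.ADMM-specific) proximal AL iteration for $\min_{x,y}\{f(x)+g(y) : Ax+By=b\}$ works with the augmented Lagrangian $\mathcal{L}_c(x,y;\lambda)=f(x)+g(y)+\langle\lambda, Ax+By-b\rangle+\tfrac{c}{2}\|Ax+By-b\|^2$, performs partial proximal updates such as $x^{k+1}\in\arg\min_x \{\mathcal{L}_c(x,y^k;\lambda^k)+\tfrac{1}{2\tau}\|x-x^k\|^2\}$ (and similarly for $y$), and then applies an under-relaxed multiplier step $\lambda^{k+1}=\lambda^k+\theta c\,(Ax^{k+1}+By^{k+1}-b)$ with $\theta\in(0,1)$; the dampening and parameter choices specific to DP.ADMM are given in the paper.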

An Accelerated Inexact Dampened Augmented Lagrangian Method for Linearly-Constrained Nonconvex Composite Optimization Problems

This paper proposes and analyzes an accelerated inexact dampened augmented Lagrangian (AIDAL) method for solving linearly-constrained nonconvex composite optimization problems. Each iteration of the AIDAL method consists of: (i) inexactly solving a dampened proximal augmented Lagrangian (AL) subproblem by calling an accelerated composite gradient (ACG) subroutine; (ii) applying a dampened and under-relaxed Lagrange multiplier update; …
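As a hedged illustration of step (ii), one common form of a dampened, under-relaxed multiplier update (symbols illustrative, not necessarily the paper's) is $\lambda^{k+1}=(1-\theta)\lambda^k+\chi\beta\,(Ax^{k+1}-b)$ with $\theta,\chi\in(0,1)$ and penalty parameter $\beta$, where the $(1-\theta)$ factor dampens the previous multiplier and $\chi<1$ under-relaxes the feasibility correction.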

MPCC Strategies for Nonsmooth NLPs

This paper develops solution strategies for large-scale nonsmooth optimization problems. We transform nonsmooth programs into equivalent mathematical programs with complementarity constraints (MPCCs), and then employ NLP-based strategies for their solution. For this purpose, two NLP formulations based on complementarity relaxations are put forward, one of which applies a parameterized formulation and operates with a …
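A standard example of the reformulation idea (illustrative; the paper's specific NLP formulations may differ): a nonsmooth relation $z=\max(x,0)$ is equivalent to the complementarity system $z\ge 0$, $z-x\ge 0$, $z\,(z-x)=0$, and a parameterized NLP relaxation replaces the last condition by $z\,(z-x)\le\tau$ and drives $\tau\downarrow 0$.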

Inertial-relaxed splitting for composite monotone inclusions

In the same spirit as the extension of the proximal point method developed by Alves et al. \cite{alvegm20}, we propose in this work an inertial-relaxed primal-dual splitting method to address the problem of decomposing the minimization of the sum of three convex functions, one of them being smooth, and considering a general coupling subspace. A …
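One plausible reading of the template (the paper's exact formulation may differ): minimize $f(z)+g(z)+h(z)$ over $z$ in a closed coupling subspace $V$, with $h$ convex and smooth, where each iteration combines an inertial extrapolation $w^k = z^k + \alpha_k(z^k - z^{k-1})$ with a relaxed operator step $z^{k+1} = (1-\rho_k)\,w^k + \rho_k\,T w^k$, $T$ denoting the underlying primal-dual splitting map; all symbols here are illustrative.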

Comparing Solution Paths of Sparse Quadratic Minimization with a Stieltjes Matrix

This paper studies several solution paths of sparse quadratic minimization problems as a function of the weighting parameter of the bi-objective of estimation loss versus solution sparsity. Three such paths are considered: the “L0-path”, where the discontinuous L0-function provides the exact sparsity count; the “L1-path”, where the L1-function provides a convex surrogate of the sparsity count; …
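For concreteness, with a Stieltjes matrix $Q$ and an (assumed, illustrative) quadratic loss $\tfrac12 x^{\top}Qx - b^{\top}x$, the paths track solutions of $\min_x \tfrac12 x^{\top}Qx - b^{\top}x + \gamma\,\rho(x)$ as the weighting parameter $\gamma$ varies, with $\rho(x)=\|x\|_0$ for the L0-path and $\rho(x)=\|x\|_1$ for the L1-path.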

A Local MM Subspace Method for Solving Constrained Variational Problems in Image Recovery

This article introduces a new Penalized Majorization-Minimization Subspace algorithm (P-MMS) for solving smooth, constrained optimization problems. In short, our approach consists of embedding a subspace algorithm in an inexact exterior penalty procedure. The subspace strategy, combined with a Majorization-Minimization step-size search, takes great advantage of the smoothness of the penalized cost function, while the penalty …
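As a hedged sketch of the exterior penalty layer: for a constrained problem $\min_{x\in C} f(x)$, one solves a sequence of penalized problems $\min_x f(x)+\beta_j\,\mathrm{dist}^2(x,C)$ with $\beta_j\uparrow\infty$, each only inexactly, here by a subspace MM scheme; the quadratic distance penalty is one common choice and is an assumption, not necessarily the penalty used in the article.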

Subgradient methods near active manifolds: saddle point avoidance, local convergence, and asymptotic normality

Nonsmooth optimization problems arising in practice, whether in signal processing, statistical estimation, or modern machine learning, tend to exhibit beneficial smooth substructure: their domains stratify into “active manifolds” of smooth variation, which common proximal algorithms “identify” in finite time. Identification then entails a transition to smooth dynamics, and permits the use of second-order information for …
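A standard example of this structure (not taken from the abstract): for $\ell_1$-regularized problems, the active manifold around a solution $\bar{x}$ is the set of points sharing its support, $\mathcal{M}=\{x : x_i = 0 \text{ whenever } \bar{x}_i = 0\}$; the objective varies smoothly along $\mathcal{M}$, and proximal-gradient iterates typically reach the correct support, hence $\mathcal{M}$, after finitely many steps.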

On Properties of Univariate Max Functions at Local Maximizers

More than three decades ago, Boyd and Balakrishnan established a regularity result for the two-norm of a transfer function at maximizers. Their result extends easily to the statement that the maximum eigenvalue of a univariate real analytic Hermitian matrix family is twice continuously differentiable, with Lipschitz second derivative, at all local maximizers, a property that …
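A toy example of the distinction being made (illustrative, not from the paper): for $A(t)=\mathrm{diag}(t,-t)$, the maximum eigenvalue is $\lambda_{\max}(A(t))=|t|$, which is not even differentiable at $t=0$; but $t=0$ is a local minimizer of $\lambda_{\max}$, not a local maximizer, so it is consistent with the regularity at local maximizers described above.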