Variational Analysis of the Spectral Abscissa at a Matrix with a Nongeneric Multiple Eigenvalue

The spectral abscissa is a fundamental map from the set of complex matrices to the real numbers. Denoted $\alpha$ and defined as the maximum of the real parts of the eigenvalues of a matrix $X$, it has many applications in stability analysis of dynamical systems. The function $\alpha$ is nonconvex and is non-Lipschitz near matrices … Read more
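
As a small numerical illustration of the quantity itself (not of the paper's variational analysis; the helper name and the perturbation chosen below are ad hoc):

    import numpy as np

    def spectral_abscissa(X):
        """Maximum real part of the eigenvalues of a square matrix X."""
        return float(np.max(np.linalg.eigvals(X).real))

    # A 4x4 nilpotent Jordan block has spectral abscissa 0, but a perturbation
    # of size 1e-8 in one corner moves it to about (1e-8)**(1/4) = 1e-2,
    # illustrating the non-Lipschitz behavior near such nongeneric matrices.
    J = np.diag(np.ones(3), k=1)
    print(spectral_abscissa(J))                            # ~0.0
    print(spectral_abscissa(J + 1e-8 * np.eye(4, k=-3)))   # ~0.01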

Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms

The mesh adaptive direct search (MADS) class of algorithms is designed for nonsmooth optimization, where the objective function and constraints are typically computed by launching a time-consuming computer simulation. Each iteration of a MADS algorithm attempts to improve the current best-known solution by launching the simulation at a finite number of trial points. Common implementations … Read more
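
As a rough sketch of the kind of poll step such methods build on (a simplified coordinate search, not the actual MADS mesh-and-frame machinery or any particular implementation):

    def poll_step(f, x, step):
        """One poll of a simplified coordinate direct search around x.

        Evaluates f at x +/- step * e_i along each coordinate direction and
        returns the best improving trial point, or None if no trial improves
        on f(x). This is only a stand-in for a MADS poll, which works on a
        mesh and uses a richer, adaptively generated set of directions.
        """
        fx = f(x)
        best_point, best_val = None, fx
        for i in range(len(x)):
            for sign in (1.0, -1.0):
                trial = list(x)
                trial[i] += sign * step
                val = f(trial)     # each evaluation may be a costly simulation
                if val < best_val:
                    best_point, best_val = trial, val
        return best_point

    # Example: polling a smooth quadratic from the origin in 2-D.
    print(poll_step(lambda v: (v[0] - 1.0)**2 + v[1]**2, [0.0, 0.0], 0.5))

In a MADS-type iteration, an unsuccessful poll typically triggers a mesh refinement (smaller step) while a success may coarsen it; each trial point costs one simulation, which is exactly the budget the paper aims to reduce.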

On parallelizing dual decomposition in stochastic integer programming

For stochastic mixed-integer programs, we revisit the dual decomposition algorithm of Carøe and Schultz from a computational perspective with the aim of its parallelization. We address an important bottleneck of parallel execution by identifying a formulation that permits the parallel solution of the master program by using structure-exploiting interior-point solvers. Our results demonstrate the potential … Read more
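
For orientation only (generic notation, not the paper's): Carøe–Schultz dual decomposition keeps one copy of the first-stage variables per scenario, dualizes the nonanticipativity constraints, and lets a master program update the multipliers while the scenario subproblems below are solved, potentially in parallel,
\[
  D(\lambda) \;=\; \sum_{s=1}^{S} \min_{x_s,\,y_s}
  \Big\{\, p_s\big(c^{\top} x_s + q_s^{\top} y_s\big) + \lambda^{\top} H_s x_s
  \;:\; (x_s, y_s) \in K_s \,\Big\},
\]
where $p_s$ are scenario probabilities, $K_s$ collects the (mixed-integer) constraints of scenario $s$, and $\sum_s H_s x_s = 0$ expresses nonanticipativity; the dual master problem is $\max_\lambda D(\lambda)$.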

Iterative Reweighted Minimization Methods for $l_p$ Regularized Unconstrained Nonlinear Programming

In this paper we study general $l_p$ regularized unconstrained minimization problems. In particular, we derive lower bounds for nonzero entries of first- and second-order stationary points, and hence also of local minimizers of the $l_p$ minimization problems. We extend some existing iterative reweighted $l_1$ (IRL1) and $l_2$ (IRL2) minimization methods to solve these problems and … Read more
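
As a generic illustration of the reweighting idea (standard form; the exact subproblems and safeguards analyzed in the paper may differ), an IRL1 step for $\min_x f(x) + \lambda \sum_i |x_i|^p$ with $0 < p < 1$ solves a weighted $l_1$ surrogate:
\[
  x^{k+1} \in \arg\min_{x}\; f(x) + \lambda \sum_i w_i^k\,|x_i|,
  \qquad
  w_i^k = p\,\big(|x_i^k| + \epsilon_k\big)^{p-1},
\]
where $\epsilon_k > 0$ is a smoothing parameter, typically driven toward zero as the iterations proceed.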

Variable Neighborhood Search for parameter tuning in Support Vector Machines

As in most Data Mining procedures, how to tune the parameters of a Support Vector Machine (SVM) is a critical, though not sufficiently explored, issue. The default approach is a grid search in the parameter space, which becomes prohibitively time-consuming even when just a few parameters are to be tuned. For this reason, for models … Read more
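
To make the cost of the default approach concrete, here is a minimal grid-search sketch with scikit-learn (purely illustrative: the synthetic dataset and the grid are placeholders, and the paper proposes a Variable Neighborhood Search instead):

    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=10, random_state=0)

    # A modest grid over two parameters already requires 7 * 7 = 49 SVM fits
    # per cross-validation fold; each additional parameter multiplies this count.
    param_grid = {
        "C": [10.0**k for k in range(-3, 4)],
        "gamma": [10.0**k for k in range(-3, 4)],
    }
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)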

On RIC Bounds of Compressed Sensing Matrices for Approximating Sparse Solutions Using $L_q$ Quasi-Norms

This paper follows the recent discussion on sparse solution recovery with $L_q$ quasi-norms, $q \in (0,1)$, when the sensing matrix possesses a Restricted Isometry Constant $\delta_{2k}$ (RIC). Our key tool is an improvement on a version of “the converse of a generalized Cauchy-Schwarz inequality” extended to the quasi-norm setting. We show that, if $\delta_{2k} \le 1/2$, … Read more
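
For reference, the standard definition of the constant involved (the paper may use an equivalent normalization): $\delta_{2k}$ is the smallest $\delta \ge 0$ such that
\[
  (1-\delta)\,\|x\|_2^2 \;\le\; \|Ax\|_2^2 \;\le\; (1+\delta)\,\|x\|_2^2
  \qquad \text{for all $2k$-sparse vectors $x$.}
\]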

On Stable Piecewise Linearization and Generalized Algorithmic Differentiation

It is shown how functions that are defined by evaluation programs involving the absolute value function (besides smooth elementals) can be approximated locally by piecewise-linear models in the style of algorithmic, or automatic, differentiation (AD). The model can be generated by a minor modification of standard AD tools and it is Lipschitz continuous with respect … Read more
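
A one-dimensional toy example of the construction (not taken from the paper): for $f(x) = |\sin x|$, linearizing the smooth elemental $\sin$ at $\bar x$ while keeping the absolute value intact yields the increment model
\[
  \Delta f(\bar x; \Delta x) \;=\; \big|\sin\bar x + \cos(\bar x)\,\Delta x\big| \;-\; \big|\sin\bar x\big|,
\]
which, unlike a single tangent line, matches $f(\bar x + \Delta x) - f(\bar x)$ to second order in $\Delta x$ even across the kink at $\sin\bar x = 0$.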

Second-order variational analysis and characterizations of tilt-stable optimal solutions in finite and infinite dimensions

The paper is devoted to developing second-order tools of variational analysis and their applications to characterizing tilt-stable local minimizers of constrained optimization problems in finite-dimensional and infinite-dimensional spaces. The importance of tilt stability has been well recognized from both theoretical and numerical aspects of optimization. Based on second-order generalized differentiation, we obtain qualitative and quantitative … Read more
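
For reference, in the standard finite-dimensional sense of Poliquin and Rockafellar (the paper works with this notion and its infinite-dimensional extensions), $\bar x$ is a tilt-stable local minimizer of $f$ if there exists $\gamma > 0$ such that the mapping
\[
  M_\gamma(v) \;=\; \operatorname{argmin}\big\{\, f(x) - \langle v, x\rangle \;:\; \|x - \bar x\| \le \gamma \,\big\}
\]
is single-valued and Lipschitz continuous on a neighborhood of $v = 0$ with $M_\gamma(0) = \bar x$.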

On the Global and Linear Convergence of the Generalized Alternating Direction Method of Multipliers

The formulation $\min f(x)+g(y)$ subject to $Ax+By=b$, where $f$ and $g$ are extended-value convex functions, arises in many application areas such as signal processing, imaging and image processing, statistics, and machine learning, either naturally or after variable splitting. In many common problems, one of the two objective functions is strictly convex and has Lipschitz continuous … Read more
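
For context, the classical ADMM for this formulation iterates, with penalty parameter $\rho > 0$ and multiplier $\lambda$ (standard updates; the generalized variant studied in the paper modifies these subproblems):
\[
\begin{aligned}
  x^{k+1} &\in \arg\min_x\; f(x) + \tfrac{\rho}{2}\big\|Ax + By^k - b + \lambda^k/\rho\big\|_2^2,\\
  y^{k+1} &\in \arg\min_y\; g(y) + \tfrac{\rho}{2}\big\|Ax^{k+1} + By - b + \lambda^k/\rho\big\|_2^2,\\
  \lambda^{k+1} &= \lambda^k + \rho\,\big(Ax^{k+1} + By^{k+1} - b\big).
\end{aligned}
\]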

An Efficient Augmented Lagrangian Method with Applications to Total Variation Minimization

Based on the classic augmented Lagrangian multiplier method, we propose, analyze and test an algorithm for solving a class of equality-constrained non-smooth optimization problems (chiefly but not necessarily convex programs) with a particular structure. The algorithm effectively combines an alternating direction technique with a nonmonotone line search to minimize the augmented Lagrangian function at each … Read more
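
For a generic equality-constrained problem $\min_x f(x)$ subject to $c(x) = 0$, the classic augmented Lagrangian referred to is (standard form; the paper applies it to the particular structure arising in total variation models)
\[
  \mathcal{L}_{A}(x, \lambda; \mu) \;=\; f(x) - \lambda^{\top} c(x) + \tfrac{\mu}{2}\,\|c(x)\|_2^2,
\]
which is approximately minimized in $x$ at each outer iteration (in the proposed method via alternating direction steps with a nonmonotone line search) before the multiplier update $\lambda \leftarrow \lambda - \mu\, c(x)$.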