Dependence of bilevel programming on irrelevant data

In 1997, Macal and Hurter found that adding to the lower-level problem a constraint that is not active at the computed globally optimal solution can destroy global optimality. In this paper, this property is reconsidered, and it is shown that this solution remains locally optimal under inner semicontinuity of the original solution set … Read more
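
For context, the setting is the standard bilevel program $\min_{x,y} F(x,y)$ subject to $y \in \operatorname{argmin}_{y'} \{ f(x,y') : g(x,y') \le 0 \}$ (a generic formulation, not the specific example of Macal and Hurter). Roughly, a constraint added to the lower-level problem may be inactive at the computed solution and yet change the lower-level solution set at other upper-level choices, which alters the bilevel feasible region and can destroy global optimality of that point.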

Structured Sparsity via Alternating Direction Methods

We consider a class of sparse learning problems in high dimensional feature space regularized by a structured sparsity-inducing norm which incorporates prior knowledge of the group structure of the features. Such problems often pose a considerable challenge to optimization algorithms due to the non-smoothness and non-separability of the regularization term. In this paper, we focus … Read more
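
As one concrete instance of this problem class, the sketch below applies ADMM to a least-squares loss with a non-overlapping group-lasso regularizer $\lambda \sum_g \|x_g\|_2$; the loss, the group structure, and the function names (`admm_group_lasso`, `group_soft_threshold`) are illustrative assumptions, and the paper's actual formulation and algorithm may differ.

```python
import numpy as np

def group_soft_threshold(v, groups, thresh):
    """Block soft-thresholding: prox of thresh * sum_g ||v_g||_2."""
    z = np.zeros_like(v)
    for g in groups:
        norm_g = np.linalg.norm(v[g])
        if norm_g > thresh:
            z[g] = (1.0 - thresh / norm_g) * v[g]
    return z

def admm_group_lasso(A, b, groups, lam, rho=1.0, n_iter=200):
    """ADMM for 0.5*||Ax - b||^2 + lam * sum_g ||x_g||_2 (non-overlapping groups)."""
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled dual variable
    M = A.T @ A + rho * np.eye(n)        # same matrix in every x-update, so form it once
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))
        z = group_soft_threshold(x + u, groups, lam / rho)
        u = u + x - z
    return z

# Tiny usage example with two groups of features, the second one inactive.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 6))
x_true = np.array([1.0, -2.0, 0.5, 0.0, 0.0, 0.0])
b = A @ x_true + 0.01 * rng.standard_normal(50)
groups = [np.arange(0, 3), np.arange(3, 6)]
print(admm_group_lasso(A, b, groups, lam=1.0))
```

The z-update is where the group structure enters (a block soft-thresholding); the other steps are the generic ADMM updates for the consensus splitting $x = z$.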

An Infeasible-Point Subgradient Method Using Adaptive Approximate Projections

We propose a new subgradient method for the minimization of convex functions over a convex set. Common subgradient algorithms require an exact projection onto the feasible region in every iteration, which can be efficient only for problems that admit a fast projection. In our method, we use inexact adaptive projections that only require moving within a … Read more
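
As a rough illustration of the idea (not the authors' adaptive scheme), the following sketch runs a projected subgradient method in which the exact projection is replaced by a user-supplied approximate projection whose tolerance is tightened over the iterations; the names, the step-size rule, and the toy problem are assumptions.

```python
import numpy as np

def inexact_projected_subgradient(f_and_subgrad, approx_project, x0,
                                  step0=1.0, n_iter=200):
    """Template: x_{k+1} = P_approx(x_k - t_k g_k) with t_k = step0 / sqrt(k+1).

    approx_project(x, tol) should return a point within tol of the feasible
    set; passing an exact projection recovers the usual projected method."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), np.inf
    for k in range(n_iter):
        fx, g = f_and_subgrad(x)
        if fx < best_f:                      # subgradient methods are not monotone,
            best_f, best_x = fx, x.copy()    # so keep the best point seen so far
        t = step0 / np.sqrt(k + 1)
        tol = 1.0 / (k + 1)                  # tighten the projection over time
        x = approx_project(x - t * g, tol)
    return best_x, best_f

# Usage: minimize ||x - c||_1 over the Euclidean unit ball, with an
# "approximate" projection that here is simply the exact ball projection.
c = np.array([2.0, -1.0])
f_sub = lambda x: (np.sum(np.abs(x - c)), np.sign(x - c))
proj = lambda y, tol: y / max(1.0, np.linalg.norm(y))
x_best, f_best = inexact_projected_subgradient(f_sub, proj, np.zeros(2))
print(x_best, f_best)
```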

Explicit Solutions for Root Optimization of a Polynomial Family with One Affine Constraint

Given a family of real or complex monic polynomials of fixed degree with one affine constraint on their coefficients, consider the problem of minimizing the root radius (largest modulus of the roots) or root abscissa (largest real part of the roots). We give constructive methods for efficiently computing the globally optimal value as well as … Read more
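
To make the two objectives concrete, the following sketch evaluates the root radius and root abscissa of a monic polynomial with numpy and brute-forces a toy cubic family with one affine coefficient constraint; the particular constraint and the grid search are illustrative assumptions, and the paper's point is that such problems admit explicit, efficient solutions instead.

```python
import numpy as np

def root_radius(coeffs):
    """Largest modulus of the roots of p(z) = z^n + a_{n-1} z^{n-1} + ... + a_0,
    with coeffs = [a_{n-1}, ..., a_1, a_0]."""
    return np.max(np.abs(np.roots(np.concatenate(([1.0], coeffs)))))

def root_abscissa(coeffs):
    """Largest real part of the roots of the same monic polynomial."""
    return np.max(np.real(np.roots(np.concatenate(([1.0], coeffs)))))

# Toy instance: monic cubics whose coefficients satisfy the single affine
# constraint a_0 + a_1 + a_2 = 1, i.e. p(1) = 2.  A crude grid search over
# the constraint illustrates the root-radius objective.
best = (np.inf, None)
for a2 in np.linspace(-3, 3, 61):
    for a1 in np.linspace(-3, 3, 61):
        a0 = 1.0 - a1 - a2
        r = root_radius(np.array([a2, a1, a0]))
        if r < best[0]:
            best = (r, (a2, a1, a0))
print("approx. optimal root radius:", best[0], "at coefficients", best[1])
print("root abscissa of that polynomial:", root_abscissa(np.array(best[1])))
```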

Fast First-Order Methods for Composite Convex Optimization with Backtracking

We propose new versions of accelerated first-order methods for convex composite optimization, where the prox parameter is allowed to increase from one iteration to the next. In particular, we show that a full backtracking strategy can be used within the FISTA [Beck and Teboulle, 2009] and FALM [Goldfarb, Ma, and Scheinberg, 2010] algorithms while preserving their worst-case iteration complexities of $O(\sqrt{L(f)/\epsilon})$. … Read more
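
For reference, here is a minimal FISTA sketch with the standard Beck–Teboulle backtracking, in which the Lipschitz estimate $L$ (the reciprocal of the prox parameter) only ever increases; the full backtracking strategy of the paper, which also lets the prox parameter grow between iterations, is not reproduced here, and the lasso instance is an illustrative assumption.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_backtracking(grad_f, f, lam, x0, L0=1.0, eta=2.0, n_iter=100):
    """FISTA for F(x) = f(x) + lam*||x||_1 with monotone backtracking on L."""
    x = y = np.asarray(x0, dtype=float)
    t, L = 1.0, L0
    for _ in range(n_iter):
        fy, gy = f(y), grad_f(y)
        while True:
            # Backtracking: increase L until the quadratic model at y
            # actually majorizes f at the candidate point.
            x_new = soft_threshold(y - gy / L, lam / L)
            diff = x_new - y
            if f(x_new) <= fy + gy @ diff + 0.5 * L * (diff @ diff):
                break
            L *= eta
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

# Usage on a small lasso problem: f(x) = 0.5*||Ax - b||^2.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
print(fista_backtracking(grad_f, f, lam=0.5, x0=np.zeros(10)))
```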

Snow water equivalent estimation using blackbox optimization

Accurate measurement of snow water equivalent (SWE) is an important factor in managing water resources for hydroelectric power generation. SWE over a catchment area may be estimated via kriging on measurements obtained by snow monitoring devices positioned at strategic locations. The question studied in this paper is how to find the device locations that minimize the … Read more
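
As a hypothetical stand-in for the interpolation step, the sketch below performs simple kriging with a Gaussian covariance model at one query point; the covariance model, its parameters, and the function name `simple_kriging` are assumptions, and the paper's kriging variant and the subsequent blackbox optimization of the device locations are not shown.

```python
import numpy as np

def simple_kriging(obs_xy, obs_val, query_xy, length=1.0, sigma2=1.0, nugget=1e-8):
    """Simple-kriging interpolation (zero mean) with a Gaussian covariance model."""
    def cov(P, Q):
        d2 = np.sum((P[:, None, :] - Q[None, :, :]) ** 2, axis=-1)
        return sigma2 * np.exp(-d2 / (2.0 * length ** 2))
    K = cov(obs_xy, obs_xy) + nugget * np.eye(len(obs_xy))   # covariance of observations
    weights = np.linalg.solve(K, cov(obs_xy, query_xy))       # kriging weights
    return weights.T @ obs_val

# Usage: interpolate a toy field at one query point from three "devices".
obs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
val = np.array([10.0, 12.0, 9.0])
print(simple_kriging(obs, val, np.array([[0.5, 0.5]])))
```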

Use of quadratic models with mesh adaptive direct search for constrained black box optimization

We consider derivative-free optimization, and in particular black box optimization, where the functions to be minimized and the functions representing the constraints are given by black boxes without derivatives. Two fundamental families of methods are available: model-based methods and directional direct-search algorithms. This work exploits the flexibility of the second family of methods … Read more
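
For orientation, here is a very small directional direct-search sketch (compass search on coordinate directions); MADS itself uses a mesh and frame structure with richer direction sets, and the paper's contribution of guiding the search with quadratic models is not reproduced here. All names and the toy blackbox are assumptions.

```python
import numpy as np

def compass_search(blackbox, x0, delta=1.0, delta_min=1e-6, max_eval=500):
    """Minimal directional direct search: poll the 2n coordinate directions,
    keep the step size on success, halve it on failure."""
    x = np.asarray(x0, dtype=float)
    fx = blackbox(x)
    evals = 1
    n = x.size
    directions = np.vstack([np.eye(n), -np.eye(n)])
    while delta > delta_min and evals < max_eval:
        improved = False
        for d in directions:                # poll step around the incumbent
            trial = x + delta * d
            f_trial = blackbox(trial)
            evals += 1
            if f_trial < fx:                # success: move to the better point
                x, fx, improved = trial, f_trial, True
                break
        if not improved:                    # failure: refine the step size
            delta *= 0.5
    return x, fx

# Usage on a smooth toy blackbox (in practice the blackbox has no known structure).
rosenbrock = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
print(compass_search(rosenbrock, np.array([-1.2, 1.0])))
```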

Convergence analysis of a proximal Gauss-Newton method

An extension of the Gauss-Newton algorithm is proposed to find local minimizers of penalized nonlinear least squares problems under generalized Lipschitz assumptions. Local convergence results are obtained, together with an estimate of the radius of the convergence ball. Some applications to solving constrained nonlinear equations are discussed, and the numerical performance of … Read more
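
For reference, the sketch below shows the classical Gauss-Newton core iteration on an unpenalized nonlinear least-squares problem; the paper's proximal variant additionally handles the penalty term through a regularized linearized subproblem, which is not implemented here, and the exponential-fitting example is an assumption.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, n_iter=50, tol=1e-10):
    """Classical Gauss-Newton for min 0.5*||F(x)||^2."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        F, J = residual(x), jacobian(x)
        # Solve the linearized least-squares subproblem min_d ||F + J d||^2.
        d, *_ = np.linalg.lstsq(J, -F, rcond=None)
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x

# Usage: fit y = exp(a*t) to data by least squares over the scalar a.
t = np.linspace(0, 1, 20)
y = np.exp(0.7 * t)
residual = lambda a: np.exp(a[0] * t) - y
jacobian = lambda a: (t * np.exp(a[0] * t)).reshape(-1, 1)
print(gauss_newton(residual, jacobian, np.array([0.0])))
```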

The dimension of semialgebraic subdifferential graphs

Examples exist of extended-real-valued closed functions on $\mathbb{R}^n$ whose subdifferentials (in the standard, limiting sense) have large graphs. By contrast, if such a function is semialgebraic, then its subdifferential graph must have everywhere constant local dimension $n$. This result is related to a celebrated theorem of Minty, and surprisingly may fail for the Clarke subdifferential. … Read more

Some Regularity Results for the Pseudospectral Abscissa and Pseudospectral Radius of a Matrix

The $\epsilon$-pseudospectral abscissa $\alpha_\epsilon$ and radius $\rho_\epsilon$ of an $n \times n$ matrix are respectively the maximal real part and the maximal modulus of points in its $\epsilon$-pseudospectrum, defined using the spectral norm. It was proved in [A.S. Lewis and C.H.J. Pang, Variational analysis of pseudospectra, SIAM Journal on Optimization, 19:1048-1072, 2008] that for fixed … Read more
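
To illustrate the definition, the sketch below estimates $\alpha_\epsilon$ by scanning a grid in the complex plane and testing the spectral-norm characterization $\sigma_{\min}(zI - A) \le \epsilon$; this crude scan is for illustration only and is not one of the specialized algorithms (e.g. criss-cross methods) used in practice, and the grid bounds and test matrix are assumptions.

```python
import numpy as np

def pseudospectral_abscissa_grid(A, eps, re_range=(-3, 3), im_range=(-3, 3), n=200):
    """Crude grid estimate of the eps-pseudospectral abscissa alpha_eps(A).

    A point z lies in the eps-pseudospectrum iff sigma_min(z*I - A) <= eps
    (spectral-norm definition); we return the largest real part found."""
    I = np.eye(A.shape[0])
    best = -np.inf
    for re in np.linspace(*re_range, n):
        for im in np.linspace(*im_range, n):
            z = re + 1j * im
            if np.linalg.svd(z * I - A, compute_uv=False)[-1] <= eps:
                best = max(best, re)
    return best

A = np.array([[0.0, 1.0], [-1.0, -0.1]])
print("alpha_0.1 approx:", pseudospectral_abscissa_grid(A, 0.1))
```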