Direct Search Methods on Reductive Homogeneous Spaces

Direct search methods are designed mainly for problems without equality constraints. However, there are many instances where the feasible set is of measure zero in the ambient space and no mesh point lies within it. There are methods for working with feasible sets that are (Riemannian) manifolds, but not all manifolds are … Read more
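
To make the difficulty concrete, here is a minimal sketch (our illustration, not the paper's method) of a coordinate-style direct search on the unit sphere: ambient mesh points almost never lie on the sphere, so every poll point is pulled back to the manifold by a retraction. All function names are ours.

```python
import numpy as np

def retract(x):
    """Retraction onto the unit sphere: map an ambient point back to the manifold."""
    return x / np.linalg.norm(x)

def direct_search_sphere(f, x0, step=0.5, tol=1e-8, shrink=0.5):
    """Naive coordinate search on the unit sphere S^{n-1}.

    Trial points x + step*d almost never lie on the sphere (a set of
    measure zero in R^n), so each poll point is retracted before
    evaluation; the mesh is refined whenever a poll fails.
    """
    x = retract(np.asarray(x0, dtype=float))
    n = x.size
    fx = f(x)
    while step > tol:
        improved = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            y = retract(x + step * d)
            fy = f(y)
            if fy < fx:
                x, fx, improved = y, fy, True
                break
        if not improved:
            step *= shrink  # poll failed: refine the mesh
    return x, fx

# Toy example: minimize v^T A v over the sphere; local minimizers are
# eigenvectors of A, the best having the smallest eigenvalue (here 1.0).
A = np.diag([3.0, 1.0, 2.0])
x, fx = direct_search_sphere(lambda v: v @ A @ v, np.ones(3))
print(x, fx)
```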

A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications

For a symmetric positive semidefinite linear system of equations $\mathcal{Q} {\bf x} = {\bf b}$, where ${\bf x} = (x_1,\ldots,x_s)$ is partitioned into $s$ blocks, with $s \geq 2$, we show that each cycle of the classical block symmetric Gauss-Seidel (block sGS) method exactly solves the associated quadratic programming (QP) problem but added with an … Read more
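​
For concreteness, here is a minimal numerical sketch of one block sGS cycle in its standard forward-then-backward form (the function names and toy data are illustrative; the paper's QP interpretation of the cycle is not reproduced here):

```python
import numpy as np

def block_sgs_cycle(Q, b, x, blocks):
    """One cycle of block symmetric Gauss-Seidel for Q x = b.

    `blocks` is a list of index arrays partitioning x into s blocks.
    A cycle is a forward sweep over blocks 1..s followed by a backward
    sweep over s..1; each block update solves its diagonal-block system
    exactly with the remaining blocks held fixed.
    """
    x = x.copy()
    for sweep in (blocks, blocks[::-1]):
        for idx in sweep:
            rest = np.setdiff1d(np.arange(len(b)), idx)
            rhs = b[idx] - Q[np.ix_(idx, rest)] @ x[rest]
            x[idx] = np.linalg.solve(Q[np.ix_(idx, idx)], rhs)
    return x

# Toy instance with s = 2 blocks
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
Q = M @ M.T + np.eye(4)               # symmetric positive definite
b = rng.standard_normal(4)
x = np.zeros(4)
blocks = [np.arange(2), np.arange(2, 4)]
for _ in range(100):
    x = block_sgs_cycle(Q, b, x, blocks)
print(np.linalg.norm(Q @ x - b))      # residual shrinks geometrically
```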

BFGS convergence to nonsmooth minimizers of convex functions

Under reasonable conditions, the popular BFGS quasi-Newton minimization algorithm converges globally on smooth convex functions. This result was proved by Powell in 1976; we consider its implications for functions that are not smooth. In particular, an analogous convergence result holds for functions, like the Euclidean norm, that are nonsmooth at the minimizer. Citation: Manuscript, School … Read more
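
A quick numerical illustration of this behavior (ours, not the paper's proof; using SciPy's stock BFGS) is to run BFGS on the Euclidean norm, which is nonsmooth exactly at its minimizer $x = 0$:

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: np.linalg.norm(x)      # nonsmooth at the minimizer x = 0
grad = lambda x: x / np.linalg.norm(x)  # gradient, valid away from 0

res = minimize(f, np.array([1.0, -2.0, 0.5]), jac=grad, method="BFGS")
# SciPy may warn about line-search precision loss near the kink, but the
# iterates nonetheless approach the nonsmooth minimizer at the origin.
print(res.x, f(res.x))
```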

Convex Geometry of the Generalized Matrix-Fractional Function

Generalized matrix-fractional (GMF) functions are a class of matrix support functions introduced by Burke and Hoheisel as a tool for unifying a range of seemingly divergent matrix optimization problems associated with inverse problems, regularization and learning. In this paper we dramatically simplify the support function representation for GMF functions as well as the representation of … Read more
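
As background (standard convex analysis, not a claim about the paper's specific representation): the support function of a closed convex set $C$ is $\sigma_C(x) = \sup_{c \in C} \langle c, x \rangle$, so a GMF function, being a matrix support function, is of this form for a suitable set of matrices, and simplifying its representation amounts to describing that set.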

Algorithmic Differentiation for Piecewise Smooth Functions: A Case Study for Robust Optimization

This paper presents a minimization method for Lipschitz continuous, piecewise smooth objective functions based on algorithmic differentiation (AD). We assume that all nondifferentiabilities are caused by abs(), min(), and max(). The optimization method successively generates piecewise linearizations in abs-normal form and solves these local subproblems by exploiting the resulting kink structure. Both the generation of … Read more
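
As a hedged sketch of the central object, the snippet below propagates value/increment pairs through abs() to build the piecewise linearization of a simple composite function at a point (the PL class and its scope are our illustration; the abs-normal machinery of the paper is far more general):

```python
import numpy as np

class PL:
    """A value plus its piecewise-linear increment function dx -> delta."""
    def __init__(self, val, inc):
        self.val, self.inc = val, inc
    def __add__(self, o):
        return PL(self.val + o.val, lambda dx: self.inc(dx) + o.inc(dx))
    def __mul__(self, c):  # scalar multiple only, for brevity
        return PL(c * self.val, lambda dx: c * self.inc(dx))

def pl_abs(u):
    # Piecewise linearization of abs: delta = |u + du| - |u|,
    # exact and piecewise linear in the increment du (one kink).
    return PL(abs(u.val), lambda dx: abs(u.val + u.inc(dx)) - abs(u.val))

def var(x0, i):
    return PL(x0[i], lambda dx: dx[i])

# f(x) = |x_0| + 2 x_1, linearized at x = (1, -3)
x0 = np.array([1.0, -3.0])
f = pl_abs(var(x0, 0)) + var(x0, 1) * 2
dx = np.array([-2.0, 0.5])
print(f.val, f.inc(dx))  # -5.0 and 1.0: the increment captures the kink at x_0 + dx_0 = -1
```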

On Nonconvex Decentralized Gradient Descent

Consensus optimization has received considerable attention in recent years. A number of decentralized algorithms have been proposed for \emph{convex} consensus optimization. However, our understanding of the behavior of \emph{nonconvex} consensus optimization is more limited. When we lose convexity, we cannot hope that our algorithms will always return global solutions, though they sometimes still do. Somewhat surprisingly, … Read more
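
For reference, the standard decentralized gradient descent (DGD) iteration, the usual baseline in this literature, is $x_i^{k+1} = \sum_j W_{ij} x_j^k - \alpha \nabla f_i(x_i^k)$. Below is a hedged sketch on illustrative (convex, for simplicity) problem data:

```python
import numpy as np

def dgd(grads, W, x0, alpha=0.05, iters=500):
    """Decentralized gradient descent: each agent mixes with its neighbors
    via the doubly stochastic matrix W, then takes a local gradient step."""
    X = x0.copy()  # row i holds agent i's local iterate
    for _ in range(iters):
        X = W @ X - alpha * np.array([g(x) for g, x in zip(grads, X)])
    return X

# Toy consensus problem: f_i(x) = 0.5 (x - t_i)^2 over a 3-agent network
targets = [1.0, 2.0, 6.0]
grads = [lambda x, t=t: x - t for t in targets]
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])   # doubly stochastic mixing matrix
X = dgd(grads, W, np.zeros((3, 1)))
print(X.ravel())  # agents hover near the average target 3.0, up to O(alpha) consensus error
```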

A TVSCAD approach for image deblurring with impulsive noise

We consider the image deblurring problem in the presence of impulsive noise. It is known that \emph{total variation} (TV) regularization with L1-norm penalized data fitting (TVL1 for short) works reasonably well only when the level of impulsive noise is relatively low. For high levels of impulsive noise, TVL1 works poorly. The reason is that all data, both … Read more
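
For reference, the TVL1 model discussed above is the standard $\min_u \mathrm{TV}(u) + \lambda \|Ku - f\|_1$, with $K$ the blurring operator and $f$ the observed image. Below is a hedged sketch of the Fan-Li SCAD penalty that presumably gives TVSCAD its name (the parameter values are the customary defaults, used here only for illustration):

```python
import numpy as np

def scad(t, lam=1.0, a=3.7):
    """SCAD penalty of Fan & Li (2001): L1-like near zero, quadratic in
    between, and constant for |t| > a*lam, so large (outlier-driven)
    residuals are not penalized further."""
    t = np.abs(t)
    return np.where(
        t <= lam, lam * t,
        np.where(t <= a * lam,
                 (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
                 (a + 1) * lam**2 / 2))

print(scad(np.array([0.5, 2.0, 10.0])))  # flat tail beyond a*lam = 3.7
```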

Convergence rate bounds for a proximal ADMM with over-relaxation stepsize parameter for solving nonconvex linearly constrained problems

This paper establishes convergence rate bounds for a variant of the proximal alternating direction method of multipliers (ADMM) for solving nonconvex linearly constrained optimization problems. The variant of the proximal ADMM allows the inclusion of an over-relaxation stepsize parameter belonging to the interval (0,2). To the best of our knowledge, all related papers in the … Read more
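
For orientation, a generic template of the iteration in question (hedged: not necessarily the paper's exact variant): for $\min\{f(x)+g(y) : Ax + By = b\}$ with penalty parameter $\beta > 0$, a proximal ADMM step computes $x^{k+1} = \arg\min_x L_\beta(x, y^k, \lambda^k) + \frac{1}{2}\|x - x^k\|_P^2$ and $y^{k+1} = \arg\min_y L_\beta(x^{k+1}, y, \lambda^k) + \frac{1}{2}\|y - y^k\|_Q^2$, and the over-relaxation stepsize $\theta \in (0,2)$ enters the multiplier update $\lambda^{k+1} = \lambda^k - \theta\beta\,(Ax^{k+1} + By^{k+1} - b)$.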

On level regularization with normal solutions in decomposition methods for multistage stochastic programming problems

We consider well-known decomposition techniques for multistage stochastic programming and a new scheme based on normal solutions for stabilizing iterates during the solution process. The proposed algorithms combine ideas from finite perturbation of convex programs and level bundle methods to regularize the so-called forward step of these decomposition methods. Numerical experiments on a hydrothermal scheduling … Read more
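
For context, the classical level-bundle step that such regularizations build on (standard form, not the paper's exact scheme) is $x^{k+1} = \arg\min\{\tfrac{1}{2}\|x - \hat{x}^k\|^2 : \check{f}_k(x) \le f_{\mathrm{lev}}^k,\ x \in X\}$, i.e., the projection of the stability center $\hat{x}^k$ onto the level set of the cutting-plane model $\check{f}_k$.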

A Limited-Memory Quasi-Newton Algorithm for Bound-Constrained Nonsmooth Optimization

We consider the problem of minimizing a continuous function that may be nonsmooth and nonconvex, subject to bound constraints. We propose an algorithm that uses the L-BFGS quasi-Newton approximation of the problem’s curvature together with a variant of the weak Wolfe line search. The key ingredient of the method is an active-set selection strategy that … Read more
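
As a point of reference only (not the paper's algorithm, which pairs L-BFGS with a weak Wolfe line search and an active-set strategy), here is what an off-the-shelf bound-constrained quasi-Newton solver does on a small nonsmooth instance; the problem data is illustrative:

```python
import numpy as np
from scipy.optimize import minimize

# Nonsmooth, bound-constrained toy: f(x) = ||x - c||_1 over [0, 1]^3,
# whose solution is the clipped vector [0.3, 0.0, 1.0].
c = np.array([0.3, -0.7, 1.5])
f = lambda x: np.abs(x - c).sum()
grad = lambda x: np.sign(x - c)   # a subgradient, valid almost everywhere

res = minimize(f, np.full(3, 0.5), jac=grad, method="L-BFGS-B",
               bounds=[(0.0, 1.0)] * 3)
# Stock L-BFGS-B assumes smoothness and may stop early at the interior
# kink x_0 = 0.3 -- exactly the gap nonsmooth variants aim to close.
print(res.x)
```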