On the Convergence and Properties of a Proximal-Gradient Method on Hadamard Manifolds

In this paper, we address composite optimization problems on Hadamard manifolds, where the objective function is given by the sum of a smooth term (not necessarily convex) and a convex term (not necessarily differentiable). To solve such problems, we develop a proximal gradient method defined directly on the manifold, employing a strategy that enforces monotonicity …
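
As a concrete illustration, here is a minimal sketch of one monotone proximal-gradient step in the flat special case M = R^n (itself a Hadamard manifold), with g = lam*||.||_1 so that the proximal step is soft-thresholding; the function names and the backtracking rule below are our illustrative assumptions, not the paper's exact scheme.

    import numpy as np

    def soft_threshold(z, t):
        # Proximal operator of t * ||.||_1 (soft-thresholding).
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def prox_grad_step(x, grad_f, F, lam, t=1.0, beta=0.5):
        # One monotone step: shrink the step size t until the composite
        # objective F = f + lam*||.||_1 does not increase.
        Fx = F(x)
        while True:
            x_new = soft_threshold(x - t * grad_f(x), t * lam)
            if F(x_new) <= Fx or t < 1e-12:
                return x_new
            t *= beta

On a general Hadamard manifold the gradient step x - t*grad_f(x) would be replaced by the exponential map exp_x(-t grad f(x)) and the prox would involve the Riemannian distance, but the monotone backtracking idea carries over.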

A second-order cone representable class of nonconvex quadratic programs

We consider the problem of minimizing a sparse nonconvex quadratic function over the unit hypercube. By developing an extension of the Reformulation-Linearization Technique (RLT) to continuous quadratic sets, we propose a novel second-order cone (SOC) representable relaxation for this problem. By exploiting the sparsity of the quadratic function, we establish a sufficient condition under …
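
For background, the classical first-level RLT relaxation on the unit box linearizes each product x_i x_j by a variable X_ij subject to McCormick-type bounds (notation ours; the paper's SOC-representable relaxation refines constructions of this kind):

    \[
      \min\; \langle Q, X \rangle + q^\top x \quad \text{s.t.} \quad
      X_{ij} \ge 0,\;\; X_{ij} \ge x_i + x_j - 1,\;\; X_{ij} \le x_i,\;\; X_{ij} \le x_j,\;\; x \in [0,1]^n.
    \]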

A Graphical Global Optimization Framework for Parameter Estimation of Statistical Models with Nonconvex Regularization Functions

Optimization problems with norm-bounding constraints appear in various applications, from portfolio optimization to machine learning, feature selection, and beyond. A widely used variant of these problems relaxes the norm-bounding constraint through Lagrangian relaxation and moves it to the objective function as a form of penalty or regularization term. A challenging class of these models uses …
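
Schematically, the relaxation described here replaces the constrained model by a penalized one (notation ours):

    \[
      \min_x\; f(x) \ \text{ s.t. } \ \|x\| \le \tau
      \qquad \longrightarrow \qquad
      \min_x\; f(x) + \lambda\, \rho(x),
    \]

where \lambda > 0 and \rho is the regularization term derived from the norm, possibly nonconvex in the models studied here.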

The improvement function reformulation for graphs of minimal point mappings

Graphs of minimal point mappings of parametric optimization problems appear in the definition of feasible sets of bilevel optimization problems and of semi-infinite optimization problems, and the intersection of multiple such graphs defines (generalized) Nash equilibria. This paper shows how minimal point graphs of nonconvex parametric optimization problems can be written with the help of …
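
To fix ideas with notation of our own (the paper's setting may be more general): for a parametric problem with solution mapping

    \[
      S(x) \;=\; \operatorname*{argmin}_{y \in Y(x)} f(x,y),
      \qquad
      \operatorname{gph} S \;=\; \{(x,y) : y \in S(x)\},
    \]

it is graphs of this kind that appear in the feasible sets of the bilevel and semi-infinite programs mentioned above.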

The Least Singular Value Function in Variational Analysis

Metric regularity is among the central concepts of nonlinear and variational analysis, constrained optimization, and their numerous applications. However, metric regularity can be elusive for some important ill-posed classes of problems including polynomial equations, parametric variational systems, smooth reformulations of complementarity systems with degenerate solutions, etc. The study of stability issues for such …
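
For reference, the standard definition: a set-valued mapping F is metrically regular around a point of its graph if there is a constant kappa > 0 such that

    \[
      \operatorname{dist}\bigl(x, F^{-1}(y)\bigr) \;\le\; \kappa\, \operatorname{dist}\bigl(y, F(x)\bigr)
      \qquad \text{for all } (x, y) \text{ near } (\bar{x}, \bar{y}).
    \]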

New Sufficient and Necessary Conditions for Constrained and Unconstrained Lipschitzian Error Bounds

Local error bounds play a fundamental role in mathematical programming and variational analysis. They are used, e.g., as constraint qualifications in optimization, in developing calculus rules for generalized derivatives in nonsmooth and set-valued analysis, and they serve as a key ingredient in the design and convergence analysis of Newton-type methods for solving systems of possibly …
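
Concretely, a local Lipschitzian error bound for a mapping F with solution set S = F^{-1}(0) holds at a reference solution if, for some kappa > 0,

    \[
      \operatorname{dist}(x, S) \;\le\; \kappa\, \operatorname{dist}\bigl(0, F(x)\bigr)
      \qquad \text{for all } x \text{ near the reference point},
    \]

with the constrained variant additionally restricting x to a given set. (Our formulation; the paper's conditions characterize when such bounds hold.)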

On Sum-Rules for Second-Order Contingent Derivatives

We are concerned with contingent derivatives of set-valued mappings and with their second-order counterparts introduced by Ngai et al. Special attention is given to the development of new sum-rules for second-order contingent derivatives. To be precise, we want to find conditions under which the second-order contingent derivative of the sum of a smooth and a set-valued …
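
For orientation, the first-order notion is standard: the contingent derivative of a set-valued mapping F at a point of its graph is defined through the contingent (Bouligand) cone to the graph,

    \[
      DF(\bar{x}, \bar{y})(u) \;=\; \{\, v : (u, v) \in T_{\operatorname{gph} F}(\bar{x}, \bar{y}) \,\},
    \]

and the second-order derivatives of Ngai et al. refine this with additional base directions; we do not reproduce their exact definition here.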

Extended Triangle Inequalities for Nonconvex Box-Constrained Quadratic Programming

Let Box_n = {x in R^n : 0 <= x <= e}, and let QPB_n denote the convex hull of {(1, x')'(1, x') : x in Box_n}. The quadratic programming problem min{x'Q x + q'x : x in Box_n}, where Q is not positive semidefinite (PSD), is equivalent to a linear optimization problem over QPB_n …
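
For comparison, the classical triangle inequalities valid for the Boolean quadric polytope, presumably the starting point for the extended versions in the title (writing X_ij for the linearization of x_i x_j):

    \[
      X_{ij} + X_{ik} \;\le\; x_i + X_{jk},
      \qquad
      x_i + x_j + x_k \;\le\; 1 + X_{ij} + X_{ik} + X_{jk}.
    \]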

Sparse Polynomial Matrix Optimization

A polynomial matrix inequality is a statement that a symmetric polynomial matrix is positive semidefinite over a given constraint set. Polynomial matrix optimization concerns minimizing the smallest eigenvalue of a symmetric polynomial matrix subject to a tuple of polynomial matrix inequalities. This work explores the use of sparsity methods in reducing the complexity of sum-of-squares …
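
In symbols (our notation): given symmetric polynomial matrices P and G_1, ..., G_s, the problem is

    \[
      \lambda^\star \;=\; \min_{x \in K} \lambda_{\min}\bigl(P(x)\bigr),
      \qquad
      K \;=\; \{\, x \in \mathbb{R}^n : G_j(x) \succeq 0,\ j = 1, \dots, s \,\},
    \]

and it is lower-bounded by the largest \lambda such that P(x) - \lambda I is positive semidefinite on K, the form to which sum-of-squares certificates (and the sparsity techniques studied here) apply.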

Global Optimization Algorithm through High-Resolution Sampling

We present an optimization algorithm that can identify a global minimum of a potentially nonconvex smooth function with high probability, assuming the Gibbs measure of the potential satisfies a logarithmic Sobolev inequality. Our contribution is twofold: on the one hand, we propose a global optimization method, which is built on an oracle sampling algorithm producing …
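
For context, this is the standard Langevin/Gibbs setting (the paper's oracle sampler may differ in its details): the target Gibbs measure at inverse temperature \beta and the overdamped Langevin dynamics are

    \[
      \pi_\beta(dx) \;\propto\; e^{-\beta f(x)}\, dx,
      \qquad
      dX_t = -\nabla f(X_t)\, dt + \sqrt{2\beta^{-1}}\, dW_t,
    \]

and a logarithmic Sobolev inequality for \pi_\beta gives exponential convergence of the dynamics, so that for large \beta samples concentrate near global minimizers of f.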