Lipschitz-Based Lower Bound Construction for Surrogate Optimization

Bounds play a vital role in guiding optimization algorithms by enhancing convergence, improving solution quality, and quantifying optimality gaps. While Lipschitz-based lower bounds are well-established, their effectiveness is often constrained by the function’s topological properties. To address these limitations, we propose an approach that integrates nonlinear distance metrics with surrogate approximations, yielding more adaptive and …
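The classical ingredient behind such constructions is short to state: if $f$ is $L$-Lipschitz, then $f(x) \ge f(x_i) - L\|x - x_i\|$ for every sample $x_i$, so the pointwise maximum over samples is a valid lower bound. A minimal Python sketch of this baseline (the nonlinear distance metrics and surrogate coupling proposed in the paper are not reproduced; the toy function and the constant L are chosen for illustration):

import numpy as np

def lipschitz_lower_bound(x, samples, values, L):
    """Pointwise lower bound max_i { f(x_i) - L * ||x - x_i|| },
    valid whenever f is L-Lipschitz in the Euclidean metric."""
    dists = np.linalg.norm(samples - x, axis=1)
    return np.max(values - L * dists)

# Toy usage: f(x) = sin(3x) + x on [0, 2]; |f'| <= 4, so L = 4 is valid.
f = lambda x: np.sin(3 * x) + x
xs = np.linspace(0.0, 2.0, 5).reshape(-1, 1)
ys = f(xs).ravel()
print(lipschitz_lower_bound(np.array([1.1]), xs, ys, L=4.0))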

Refined TSSOS

The moment-sum-of-squares hierarchy of Lasserre has become an established technique for solving polynomial optimization problems. It provides a monotonically increasing sequence of tight bounds, but has well-known scalability limitations. For structured optimization problems, the term-sparsity SOS (TSSOS) approach scales much better due to block-diagonal matrices, obtained by completing the connected components of adjacency …
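For orientation, the level-$r$ bound of the dense hierarchy for an unconstrained problem $f^{*} = \min_x f(x)$ can be written as

\[
\theta_r \;=\; \sup\{\lambda \in \mathbb{R} \;:\; f - \lambda \in \Sigma[x]_{2r}\} \;\le\; f^{*},
\]

where $\Sigma[x]_{2r}$ is the cone of sum-of-squares polynomials of degree at most $2r$, and the bounds $\theta_r$ are nondecreasing in $r$. TSSOS keeps this shape but restricts the Gram matrix to the blocks induced by a term-sparsity graph, which is the source of the block-diagonal structure mentioned above (a schematic summary, not the refined construction of this paper).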

Sparse Polynomial Optimization with Unbounded Sets

This paper considers sparse polynomial optimization with unbounded sets. When the problem possesses correlative sparsity, we propose a sparse homogenized Moment-SOS hierarchy with perturbations to solve it. The new hierarchy introduces one extra auxiliary variable for each variable clique according to the correlative sparsity pattern. Under the running intersection property, we prove that this hierarchy …
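A schematic of the homogenization step for a single clique (the paper's perturbed, clique-wise formulation may differ in detail): for $f$ of degree $d$, introduce an auxiliary variable $x_0$ and set

\[
\tilde f(x, x_0) \;=\; x_0^{\,d}\, f(x/x_0),
\]

which is again a polynomial, and relate the unbounded problem $\min_{x \in \mathbb{R}^n} f(x)$ to minimizing $\tilde f$ over the compact set $\{(x, x_0) : \|x\|^2 + x_0^2 = 1,\; x_0 \ge 0\}$. Correlative sparsity then permits one such $x_0$ per variable clique, as described above.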

A low-rank augmented Lagrangian method for large-scale semidefinite programming based on a hybrid convex-nonconvex approach

This paper introduces HALLaR, a new first-order method for solving large-scale semidefinite programs (SDPs) with bounded domain. HALLaR is an inexact augmented Lagrangian (AL) method where the AL subproblems are solved by a novel hybrid low-rank (HLR) method. The recipe behind HLR is based on two key ingredients: 1) an adaptive inexact proximal point method …
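The general pattern of a low-rank augmented Lagrangian scheme for $\min \langle C, X\rangle$ subject to $\mathcal{A}(X) = b$, $X \succeq 0$ is easy to sketch: factor $X = YY^\top$ so the cone constraint holds by construction, solve each AL subproblem inexactly in $Y$, then update the multipliers. The Python toy below (plain gradient descent on the subproblems, a fixed penalty, and a hypothetical one-constraint SDP whose optimum is the smallest eigenvalue of C) illustrates only that pattern; HALLaR's hybrid low-rank subproblem solver is considerably more elaborate:

import numpy as np

def al_grad(Y, C, A_list, b, lam, beta):
    # AL(Y) = <C, YY^T> + lam . r + (beta/2)||r||^2 with r_i = <A_i, YY^T> - b_i;
    # X = YY^T is PSD by construction, so the cone constraint costs nothing.
    X = Y @ Y.T
    r = np.array([np.sum(A * X) for A in A_list]) - b
    S = C + sum((lam[i] + beta * r[i]) * A_list[i] for i in range(len(b)))
    return r, 2.0 * S @ Y                      # gradient of the AL w.r.t. Y

# Hypothetical toy SDP: min <C, X> s.t. tr(X) = 1, X PSD; optimum = eigmin(C).
rng = np.random.default_rng(0)
n, rank, beta = 6, 2, 10.0
C = rng.standard_normal((n, n)); C = (C + C.T) / 2
A_list, b = [np.eye(n)], np.array([1.0])
Y, lam = 0.1 * rng.standard_normal((n, rank)), np.zeros(1)
for _ in range(50):                            # outer multiplier updates
    for _ in range(300):                       # inexact subproblem: gradient steps
        r, G = al_grad(Y, C, A_list, b, lam, beta)
        Y -= 0.01 * G
    lam += beta * r                            # dual ascent
print(np.sum(C * (Y @ Y.T)), "vs eigmin(C):", np.linalg.eigvalsh(C)[0])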

Design Guidelines for Noise-Tolerant Optimization with Applications in Robust Design

The development of nonlinear optimization algorithms capable of performing reliably in the presence of noise has garnered considerable attention lately. This paper advocates for strategies to create noise-tolerant nonlinear optimization algorithms by adapting classical deterministic methods. These adaptations follow certain design guidelines described here, which make use of estimates of the noise level in the …
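One concrete instance of such a guideline, widely used in this literature (the paper's exact rules may differ), is to relax the Armijo sufficient-decrease test by the noise level $\epsilon_f$, so that function-value noise cannot cause arbitrarily good steps to be rejected. A self-contained Python sketch on a hypothetical noisy quadratic:

import numpy as np

def noise_tolerant_armijo(f, x, g, p, eps_f, c1=1e-4, alpha0=1.0, shrink=0.5):
    """Backtracking with the relaxed test
       f(x + a p) <= f(x) + c1 * a * g.p + 2 * eps_f,
    where eps_f bounds the noise in each function evaluation."""
    fx, alpha = f(x), alpha0
    while alpha > 1e-12:
        if f(x + alpha * p) <= fx + c1 * alpha * (g @ p) + 2 * eps_f:
            return alpha
        alpha *= shrink
    return 0.0

# Hypothetical noisy quadratic: f(x) = 0.5||x||^2 + uniform noise in [-eps_f, eps_f].
rng = np.random.default_rng(1)
eps_f = 1e-3
f = lambda x: 0.5 * (x @ x) + rng.uniform(-eps_f, eps_f)
x = np.array([1.0, -2.0]); g = x               # exact gradient of the noiseless part
print(noise_tolerant_armijo(f, x, g, -g, eps_f))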

Using generalized simplex methods to approximate derivatives

This paper presents two methods for approximating a proper subset of the entries of a Hessian using only function evaluations. These approximations are obtained via two techniques: the generalized simplex Hessian and the generalized centered simplex Hessian. We show how to choose the matrices of directions involved in the computation of these two techniques depending on …
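The basic mechanism, stripped of the generalized-matrix machinery, is a second-order difference that touches only function values; choosing which index pairs $(i, j)$ to evaluate is what yields a proper subset of Hessian entries. A minimal Python sketch (step size and test function are illustrative; the paper's generalized simplex constructions allow much more general matrices of directions):

import numpy as np

def hessian_entry(f, x, i, j, h=1e-4):
    """Centered second-order difference for the single entry H[i, j],
    using only four function evaluations."""
    ei = np.zeros_like(x); ei[i] = h
    ej = np.zeros_like(x); ej[j] = h
    return (f(x + ei + ej) - f(x + ei - ej)
            - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)

f = lambda x: x[0]**2 * x[1] + np.sin(x[2])    # d2f / dx0 dx1 = 2 * x0
x = np.array([1.0, 2.0, 0.5])
print(hessian_entry(f, x, 0, 1))               # approximately 2.0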

The cosine measure relative to a subspace

The cosine measure was introduced in 2003 to quantify the richness of a finite positive spanning set of directions in the context of derivative-free directional methods. A positive spanning set is a set of vectors whose nonnegative linear combinations span the whole space. The present work extends the definition of the cosine measure. In particular, the …
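For concreteness, the cosine measure of a finite set $D$ of nonzero directions is $\mathrm{cm}(D) = \min_{\|u\|=1} \max_{d \in D} u^\top d / \|d\|$, and $D$ positively spans $\mathbb{R}^n$ exactly when $\mathrm{cm}(D) > 0$. A crude Monte Carlo estimator in Python (sampling $u$ only gives an upper bound that tightens with more samples; the subspace-relative extension studied here is not implemented):

import numpy as np

def cosine_measure_mc(D, n_samples=200000, seed=0):
    """Estimate cm(D) = min_{||u||=1} max_{d in D} (u . d) / ||d||
    by sampling unit vectors u uniformly."""
    rng = np.random.default_rng(seed)
    Dn = D / np.linalg.norm(D, axis=1, keepdims=True)
    U = rng.standard_normal((n_samples, D.shape[1]))
    U /= np.linalg.norm(U, axis=1, keepdims=True)
    return (U @ Dn.T).max(axis=1).min()

# {+-e1, +-e2} positively spans R^2; its cosine measure is 1/sqrt(2) ~ 0.7071.
D = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
print(cosine_measure_mc(D))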

Quadratic Optimization Through the Lens of Adjustable Robust Optimization

Quadratic optimization (QO) has been studied extensively in the literature due to its applicability to many practical problems. Despite this practical relevance, QO problems are known to be NP-hard in general, and researchers have therefore developed many approximation methods to find good solutions. In this paper, we analyze QO problems using robust optimization techniques. To this end, we first …
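For reference, a generic instance of the problem class (notation chosen for illustration) is

\[
\min_{x \in \mathbb{R}^n} \; x^\top Q x + c^\top x \quad \text{s.t.} \quad Ax \le b,
\]

which is convex and tractable when $Q \succeq 0$; for indefinite $Q$ it is NP-hard in general, e.g. because the Motzkin-Straus theorem reduces maximum clique to maximizing a quadratic over the standard simplex.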

A proof for multilinear error bounds

We derive error bounds for multilinear terms in $[0,1]^n$ using a proof methodology based on the polyhedral representation of the convex hull. We extend the result to multilinear terms in $[\boldsymbol{L},\boldsymbol{0}] \times [\boldsymbol{0},\boldsymbol{U}]\subset\mathbb{R}^n$.
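The bilinear case $n = 2$ gives a classical worked instance of such a bound: over $[0,1]^2$ the convex envelope of $xy$ is $\max\{0,\, x + y - 1\}$, and

\[
\max_{(x,y) \in [0,1]^2} \bigl( xy - \max\{0,\, x + y - 1\} \bigr) \;=\; \tfrac{1}{4},
\]

attained at $(x,y) = (\tfrac12, \tfrac12)$; the bounds above generalize this picture to $n$-fold products.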

Block Majorization Minimization with Extrapolation and Application to $\beta$-NMF

We propose a Block Majorization Minimization method with Extrapolation (BMMe) for solving a class of multi-convex optimization problems. The extrapolation parameters of BMMe are updated using a novel adaptive update rule. By showing that block majorization minimization can be reformulated as a block mirror descent method, with the Bregman divergence adaptively updated at each iteration, …
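To fix ideas, the well-known multiplicative MM updates for the $\beta = 2$ (Frobenius) case, combined with a naive fixed-weight extrapolation, look as follows in Python; BMMe's adaptive extrapolation rule and its general $\beta$-divergence majorizers are not reproduced here:

import numpy as np

def nmf_mu_extrapolated(V, r, iters=200, gamma=0.5, seed=0):
    """Multiplicative MM updates for min ||V - WH||_F^2 over W, H >= 0,
    with a fixed-weight extrapolation after each block update
    (illustrative only; BMMe adapts the extrapolation weight)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W, H = rng.random((m, r)), rng.random((r, n))
    We, He = W.copy(), H.copy()                 # extrapolated iterates
    eps = 1e-12                                 # guards against division by zero
    for _ in range(iters):
        W_new = We * (V @ He.T) / (We @ (He @ He.T) + eps)
        We, W = np.maximum(W_new + gamma * (W_new - W), 0), W_new
        H_new = He * (We.T @ V) / ((We.T @ We) @ He + eps)
        He, H = np.maximum(H_new + gamma * (H_new - H), 0), H_new
    return W, H

V = np.random.default_rng(1).random((20, 15))
W, H = nmf_mu_extrapolated(V, r=4)
print("relative error:", np.linalg.norm(V - W @ H) / np.linalg.norm(V))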