Moment-SOS and spectral hierarchies for polynomial optimization on the sphere and quantum de Finetti theorems

We revisit the convergence analysis of two approximation hierarchies for polynomial optimization on the unit sphere. The first one is based on the moment-SOS approach and gives semidefinite bounds, for which Fang and Fawzi (2021) established an \(O(1/r^2)\) convergence analysis for the \(r\)-th level bound using the polynomial kernel method. The second hierarchy was recently …
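For context, a standard way to write the level-\(r\) moment-SOS bound on the unit sphere (a textbook formulation, not necessarily the exact one analyzed in the paper) is
\[
f^{(r)}_{\mathrm{sos}} \;=\; \max\Big\{\lambda \;:\; f - \lambda = \sigma + q\,(1 - \|x\|^2),\ \sigma \text{ a sum of squares},\ \deg\sigma \le 2r\Big\},
\]
which is computable by semidefinite programming and lower bounds the minimum of \(f\) over the sphere.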

A Riemannian AdaGrad-Norm Method

We propose a manifold AdaGrad-Norm method (\textsc{MAdaGrad}), which extends the norm version of AdaGrad (AdaGrad-Norm) to Riemannian optimization. In contrast to line-search schemes, which may require several exponential map computations per iteration, \textsc{MAdaGrad} requires only one. Assuming the objective function $f$ has a Lipschitz continuous Riemannian gradient, we show that the method requires at most $\mathcal{O}(\varepsilon^{-2})$ …
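As a rough sketch of what an AdaGrad-Norm step looks like when carried over to a manifold (the paper's precise update may differ; $\eta > 0$ and $b_0 > 0$ are assumed hyperparameters):
\[
b_{k+1}^2 = b_k^2 + \|\operatorname{grad} f(x_k)\|_{x_k}^2, \qquad x_{k+1} = \exp_{x_k}\!\Big(-\frac{\eta}{b_{k+1}}\,\operatorname{grad} f(x_k)\Big),
\]
which makes explicit that each iteration evaluates the exponential map only once.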

On the Convergence and Properties of a Proximal-Gradient Method on Hadamard Manifolds

In this paper, we address composite optimization problems on Hadamard manifolds, where the objective function is given by the sum of a smooth term (not necessarily convex) and a convex term (not necessarily differentiable). To solve this problem, we develop a proximal gradient method defined directly on the manifold, employing a strategy that enforces monotonicity …
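A common template for such a method (one plausible form; the paper's exact subproblem may differ) replaces the Euclidean proximal step with a manifold one: with stepsize \(t_k > 0\),
\[
x_{k+1} \in \operatorname*{argmin}_{y \in \mathcal{M}} \Big\{ \big\langle \operatorname{grad} f(x_k),\, \exp_{x_k}^{-1}(y) \big\rangle_{x_k} + \tfrac{1}{2 t_k}\, d(x_k, y)^2 + g(y) \Big\},
\]
where \(f\) is the smooth term, \(g\) the convex nonsmooth term, and \(d\) the Riemannian distance.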

New insights and algorithms for optimal diagonal preconditioning

Preconditioning (scaling) is essential in many areas of mathematics, and in particular in optimization. In this work, we study the problem of finding an optimal diagonal preconditioner. We focus on minimizing two different notions of condition number: the classical, worst-case type \(\kappa\)-condition number, and the more averaging-motivated \(\omega\)-condition number. We provide affine-based pseudoconvex …
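For reference, for a symmetric positive definite matrix \(A \in \mathbb{R}^{n \times n}\), the two notions are commonly defined as
\[
\kappa(A) = \frac{\lambda_{\max}(A)}{\lambda_{\min}(A)}, \qquad \omega(A) = \frac{\operatorname{tr}(A)/n}{\det(A)^{1/n}},
\]
and one seeks a diagonal \(D \succ 0\) minimizing the condition number of the scaled matrix, e.g. \(\kappa(DAD)\) (the exact scaling convention may differ in the paper). Note that \(\omega\) averages over all eigenvalues rather than comparing only the two extreme ones.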

Approximating inequality systems within probability functions: studying implications for problems and consistency of first-order information

In this work, we are concerned with the study of optimization problems featuring so-called probability or chance constraints. Probability constraints measure the level of satisfaction of an underlying random inequality system and ensure that this level is high enough. Such an underlying inequality system could be expressed by an abstractly known or perhaps costly to …
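In symbols, a probability constraint of the kind discussed here (written in a generic form for illustration) requires, for a random vector \(\xi\), an inequality mapping \(g\), and a safety level \(p \in (0,1]\),
\[
\varphi(x) := \mathbb{P}\big[\, g_j(x,\xi) \le 0,\ j = 1,\dots,m \,\big] \;\ge\; p,
\]
i.e., the underlying random inequality system must be satisfied with probability at least \(p\).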

Bregman Douglas-Rachford Splitting Method

In this paper, we propose the Bregman Douglas-Rachford splitting (BDRS) method and its variant, the Bregman Peaceman-Rachford splitting method, for solving maximal monotone inclusion problems. We show that BDRS is equivalent to a Bregman alternating direction method of multipliers (ADMM) when applied to the dual of the problem. A special case of the Bregman ADMM is …
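For orientation, the classical Douglas-Rachford recursion for finding \(0 \in A(x) + B(x)\) with \(A, B\) maximally monotone reads
\[
z_{k+1} = z_k + J_{\gamma B}\big(2 J_{\gamma A}(z_k) - z_k\big) - J_{\gamma A}(z_k), \qquad J_{\gamma A} := (I + \gamma A)^{-1};
\]
as we understand the construction, the Bregman variant replaces these resolvents by Bregman resolvents of the form \((\nabla h + \gamma A)^{-1} \circ \nabla h\) for a suitable Legendre function \(h\) (the paper's precise operators may differ).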

Gradient Methods with Online Scaling Part II. Practical Aspects

Part I of this work [Gao25] establishes online scaled gradient methods (OSGM), a framework that utilizes online convex optimization to adapt stepsizes in gradient methods. This paper focuses on the practical aspects of OSGM. We leverage the OSGM framework to design new adaptive first-order methods and provide insights into their empirical behavior. The resulting method, …
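A minimal sketch of the idea, based on our reading of Part I (the surrogate loss \(\ell_k\) and the online update below are schematic placeholders, not the paper's exact choices):
\[
x_{k+1} = x_k - P_k \nabla f(x_k), \qquad P_{k+1} = P_k - \rho\, \nabla_P \ell_k(P_k),
\]
i.e., the scaling \(P_k\) (a matrix or a scalar stepsize) is itself updated by an online gradient step on a per-iteration measure of progress \(\ell_k\).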

On the convergence rate of the Douglas-Rachford splitting algorithm

This work is concerned with the convergence rate analysis of the Douglas–Rachford splitting (DRS) method for finding a zero of the sum of two maximally monotone operators. We obtain an exact rate of convergence for the DRS algorithm and demonstrate its sharpness in the setting of convex feasibility problems. Furthermore, we investigate the …
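In the convex feasibility setting, with \(A\) and \(B\) the normal cones of closed convex sets \(C\) and \(D\), the DRS operator takes its familiar averaged-reflections form
\[
z_{k+1} = \tfrac{1}{2}\big(I + R_D R_C\big) z_k, \qquad R_C := 2 P_C - I,
\]
where \(P_C\) denotes the projection onto \(C\); sharpness of linear rates is typically exhibited on instances of this type, such as two subspaces meeting at a given Friedrichs angle (our illustration, not necessarily the paper's example).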

Alternating Iteratively Reweighted \(\ell_1\) and Subspace Newton Algorithms for Nonconvex Sparse Optimization

This paper presents a novel hybrid algorithm for minimizing the sum of a continuously differentiable loss function and a nonsmooth, possibly nonconvex, sparsity-promoting regularizer. The proposed method adaptively switches between solving a reweighted \(\ell_1\)-regularized subproblem and performing an inexact subspace Newton step. The reweighted \(\ell_1\)-subproblem admits an efficient closed-form solution via the soft-thresholding operator, thereby …
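The closed form referred to is the usual componentwise soft-thresholding solution of a weighted \(\ell_1\) proximal subproblem: with stepsize \(t\), weights \(w_i \ge 0\), and our (assumed) notation \(y = x_k - t \nabla \mathcal{L}(x_k)\) for the gradient step on the smooth loss \(\mathcal{L}\),
\[
x_{k+1,i} = \operatorname{sign}(y_i)\,\max\big(|y_i| - t\lambda w_i,\ 0\big),
\]
where \(\lambda > 0\) is the regularization weight.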

Polyconvex double well functions

We investigate polyconvexity of the double well function $f(X) := |X-X_1|^2|X-X_2|^2$ for given matrices $X_1, X_2 \in \mathbb{R}^{n \times n}$. Such functions are fundamental in the modeling of phase transitions in materials, but their non-convex nature presents challenges for the analysis of variational problems. We prove that $f$ is polyconvex if and only if the …
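Recall that $f : \mathbb{R}^{n \times n} \to \mathbb{R}$ is polyconvex if it is a convex function of the vector of all minors of its argument; for $n = 2$ this reads
\[
f(X) = g(X, \det X) \quad \text{for some convex } g,
\]
a strictly weaker requirement than convexity of $f$, which is why non-convex double well energies can still be polyconvex.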