An inertial Riemannian gradient ADMM for nonsmooth manifold optimization

The Alternating Direction Method of Multipliers (ADMM) is widely recognized for its efficiency in solving separable optimization problems. However, its application to optimization on Riemannian manifolds remains a significant challenge. In this paper, we propose a novel inertial Riemannian gradient ADMM (iRG-ADMM) to solve Riemannian optimization problems with nonlinear constraints. Our key contributions are as … Read more
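
For intuition only, here is a heavily hedged numerical sketch of what an inertial Riemannian-gradient ADMM iteration could look like on the unit sphere, applied to a sparse-PCA-style splitting $\min_{x,z} x^\top C x + \lambda\|z\|_1$ s.t. $x = z$, $\|x\|_2 = 1$. The update order, the retraction-after-extrapolation step, and every parameter below are illustrative assumptions, not the paper's iRG-ADMM:

```python
# Hypothetical iRG-ADMM-style iteration on the unit sphere for
#   min_{x,z}  x^T C x + lam*||z||_1   s.t.  x - z = 0,  ||x||_2 = 1.
# Every update below is an illustrative assumption, not the paper's method.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def irg_admm_sketch(C, lam=0.1, rho=1.0, beta=0.3, eta=0.05, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    x_prev, z, u = x.copy(), x.copy(), np.zeros(n)   # u: scaled dual variable
    for _ in range(iters):
        # inertial extrapolation, retracted back onto the sphere
        y = x + beta * (x - x_prev)
        y /= np.linalg.norm(y)
        # Euclidean gradient of the augmented Lagrangian in x, evaluated at y
        g = 2 * C @ y + rho * (y - z + u)
        # Riemannian gradient step: project onto the tangent space, then retract
        rg = g - (g @ y) * y
        x_prev, x = x, y - eta * rg
        x /= np.linalg.norm(x)
        z = soft_threshold(x + u, lam / rho)   # prox of lam*||.||_1
        u += x - z                             # dual update
    return x, z

if __name__ == "__main__":
    A = np.random.default_rng(1).standard_normal((50, 10))
    x, z = irg_admm_sketch(-(A.T @ A) / 50)    # sparse leading direction of A
    print(np.linalg.norm(x), np.count_nonzero(np.abs(z) > 1e-8))
```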

Accelerating Proximal Gradient Descent via Silver Stepsizes

Surprisingly, recent work has shown that gradient descent can be accelerated without using momentum, just by judiciously choosing stepsizes. An open question raised by several papers is whether this phenomenon of stepsize-based acceleration holds more generally for constrained and/or composite convex optimization via projected and/or proximal versions of gradient descent. We answer this in the affirmative … Read more
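
As context for the schedule in question, the sketch below runs proximal gradient descent on a lasso instance using the silver stepsizes $h_t = 1 + \rho^{\nu(t+1)-1}$ with $\rho = 1+\sqrt{2}$ and $\nu$ the 2-adic valuation, as defined for the smooth setting; whether the paper's proximal extension uses this exact schedule is an assumption here:

```python
# Proximal gradient descent on a lasso instance with the silver stepsize
# schedule h_t = 1 + rho^(v2(t+1) - 1), rho = 1 + sqrt(2), where v2 is the
# 2-adic valuation. Using this exact schedule in the proximal setting is an
# assumption; the formula is from the smooth unconstrained case.
import numpy as np

RHO = 1 + np.sqrt(2)                      # the silver ratio

def v2(n):
    """2-adic valuation: the largest k such that 2^k divides n."""
    k = 0
    while n % 2 == 0:
        n //= 2
        k += 1
    return k

def silver_stepsizes(T):
    return [1 + RHO ** (v2(t + 1) - 1) for t in range(T)]

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_silver(A, b, lam, T=63):
    """Lasso  min 0.5*||Ax - b||^2 + lam*||x||_1  with stepsizes h_t / L."""
    L = np.linalg.norm(A, 2) ** 2         # gradient Lipschitz constant
    x = np.zeros(A.shape[1])
    for h in silver_stepsizes(T):
        step = h / L
        x = prox_l1(x - step * (A.T @ (A @ x - b)), lam * step)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((40, 20)), rng.standard_normal(40)
    x = proximal_gradient_silver(A, b, lam=0.1)
    print(0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.1 * np.abs(x).sum())
```

Note the schedule is deliberately nonmonotone: most steps are short, but occasional long steps exceed the classical $2/L$ ceiling, which is where the stepsize-based acceleration comes from.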

Acceleration by Random Stepsizes: Hedging, Equalization, and the Arcsine Stepsize Schedule

We show that for separable convex optimization, random stepsizes fully accelerate Gradient Descent. Specifically, using inverse stepsizes i.i.d. from the Arcsine distribution improves the iteration complexity from $O(k)$ to $O(k^{1/2})$, where $k$ is the condition number. No momentum or other algorithmic modifications are required. This result is incomparable to the (deterministic) Silver Stepsize Schedule which … Read more
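
A minimal sketch of the idea, assuming the arcsine law (Beta(1/2, 1/2)) is rescaled to the eigenvalue interval $[\mu, L]$ of a separable quadratic; the abstract fixes only the distribution of the inverse stepsizes, so the rescaling interval is an assumption:

```python
# Gradient descent on a separable (diagonal) quadratic with random stepsizes:
# inverse stepsizes drawn i.i.d. from the arcsine law, here taken as
# Beta(1/2, 1/2) rescaled to the eigenvalue range [mu, L].
import numpy as np

def gd_arcsine_stepsizes(h, b, mu, L, T=300, seed=0):
    """min 0.5*x^T diag(h) x - b^T x, with h the Hessian diagonal."""
    rng = np.random.default_rng(seed)
    inv_steps = mu + (L - mu) * rng.beta(0.5, 0.5, size=T)   # arcsine on [mu, L]
    x = np.zeros(len(b))
    for inv_eta in inv_steps:
        x -= (h * x - b) / inv_eta        # no momentum, no other modification
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    h = rng.uniform(1.0, 100.0, size=50)  # condition number ~ 100
    b = rng.standard_normal(50)
    x = gd_arcsine_stepsizes(h, b, mu=h.min(), L=h.max())
    print(np.linalg.norm(h * x - b))      # gradient norm after T steps
```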

Reduced Sample Complexity in Scenario-Based Control System Design via Constraint Scaling

The scenario approach is widely used in robust control system design and chance-constrained optimization, maintaining convexity without requiring assumptions about the probability distribution of uncertain parameters. However, the approach can demand large sample sizes, making it intractable for safety-critical applications that require very low levels of constraint violation. To address this challenge, we propose a … Read more
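
For reference, the classical sample-size requirement that such refinements aim to reduce is the Campi-Garatti bound: the smallest $N$ with $\sum_{i=0}^{d-1} \binom{N}{i} \varepsilon^i (1-\varepsilon)^{N-i} \le \beta$ for $d$ decision variables, violation level $\varepsilon$, and confidence $1-\beta$. A small calculator for it follows; the paper's constraint-scaling refinement itself is not reproduced here:

```python
# Calculator for the classical Campi-Garatti scenario bound: the smallest N
# such that sum_{i<d} C(N,i) * eps^i * (1-eps)^(N-i) <= beta.
from math import lgamma, exp, log

def log_binom(n, k):
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def scenario_sample_size(d, eps, beta):
    """d: decision variables; eps: violation level; 1 - beta: confidence."""
    N = d
    while True:
        tail = sum(exp(log_binom(N, i) + i * log(eps) + (N - i) * log(1 - eps))
                   for i in range(d))
        if tail <= beta:
            return N
        N += 1

if __name__ == "__main__":
    # e.g. 10 decision variables, 1% violation level, 99.9% confidence
    print(scenario_sample_size(d=10, eps=0.01, beta=1e-3))
```

The binomial tail is evaluated in log space so the loop stays numerically stable even for the large $N$ that very small $\varepsilon$ demands.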

Jordan and isometric cone automorphisms in Euclidean Jordan algebras

Every symmetric cone K arises as the cone of squares in a Euclidean Jordan algebra V. As V is a real inner-product space, we may denote by Isom(V) its group of isometries. The groups JAut(V) of its Jordan-algebra automorphisms and Aut(K) of the linear cone automorphisms are then related. For certain inner products, JAut(V) = … Read more
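
A concrete instance of this relationship (my example, not necessarily the paper's): take $V = \mathbb{R}^n$ with the componentwise Jordan product $x \circ y = (x_1 y_1, \dots, x_n y_n)$ and the standard inner product, so that $K = \mathbb{R}^n_+$. Then $\mathrm{Aut}(K)$ consists of the products $PD$ of a permutation matrix $P$ with a positive diagonal matrix $D$, $\mathrm{Isom}(V) = O(n)$, and $\mathrm{JAut}(V)$ is exactly the permutation matrices; intersecting the first two groups forces $D = I$, so $\mathrm{Aut}(K) \cap \mathrm{Isom}(V) = \mathrm{JAut}(V)$ in this case.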

New efficient accelerated and stochastic gradient descent algorithms based on locally Lipschitz gradient constants

In this paper, we revisit the recent stepsize for the gradient descent scheme called NGD, proposed by Hoai et al. [A novel stepsize for gradient descent method, Operations Research Letters (2024) 53, doi: 10.1016/j.orl.2024.107072]. We first investigate the NGD stepsize in combination with two well-known acceleration techniques, the heavy-ball and Nesterov methods. In … Read more
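
Since the NGD stepsize itself sits behind the truncation, the sketch below only illustrates the general pattern of pairing a locally estimated Lipschitz constant with heavy-ball momentum; the running estimate $L_k = \|\nabla f(x_k) - \nabla f(x_{k-1})\| / \|x_k - x_{k-1}\|$ and all parameters are stand-in assumptions, not Hoai et al.'s formula:

```python
# Stand-in sketch: heavy-ball momentum driven by a running local Lipschitz
# estimate of the gradient. This is NOT the NGD stepsize of Hoai et al.;
# it only illustrates the combination pattern the abstract describes.
import numpy as np

def heavy_ball_local_lipschitz(grad, x0, iters=500, beta=0.3, eps=1e-12):
    x_prev = x0.copy()
    g_prev = grad(x_prev)
    x = x_prev - 1e-3 * g_prev            # tiny bootstrap step
    L_est = eps
    for _ in range(iters):
        g = grad(x)
        # local Lipschitz constant along the last step, kept as a running max
        L_est = max(L_est, np.linalg.norm(g - g_prev)
                    / max(np.linalg.norm(x - x_prev), eps))
        x_prev, g_prev, x = x, g, x - g / L_est + beta * (x - x_prev)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 30))
    Q = A.T @ A / 30 + np.eye(30)         # modest condition number
    b = rng.standard_normal(30)
    x = heavy_ball_local_lipschitz(lambda v: Q @ v - b, np.zeros(30))
    print(np.linalg.norm(Q @ x - b))      # gradient norm at the final iterate
```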

Gradient Methods with Online Scaling

We introduce a framework to accelerate the convergence of gradient-based methods with online learning. The framework learns to scale the gradient at each iteration through an online learning algorithm and provably accelerates gradient-based methods asymptotically. In contrast with previous literature, where convergence is established based on worst-case analysis, our framework provides a strong convergence guarantee … Read more
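
In a similar spirit (though not the paper's algorithm or its guarantee), a diagonal gradient scaling can be learned online by hypergradient descent, using the fact that for the update $x_{k+1} = x_k - p \odot g_k$ the derivative of $f(x_{k+1})$ with respect to $p$ is $-g_{k+1} \odot g_k$ elementwise:

```python
# Hypergradient-style sketch of learning a diagonal gradient scaling online.
# The paper's online learning framework and convergence guarantee are not
# reproduced; the update of p follows d f(x_{k+1}) / d p = -g_{k+1} * g_k.
import numpy as np

def gd_online_diagonal_scaling(grad, x0, p0=1e-3, hyper_lr=1e-4, iters=2000):
    x = x0.copy()
    p = np.full_like(x, p0)               # learned per-coordinate stepsizes
    g_prev = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        p = np.maximum(p + hyper_lr * g * g_prev, 0.0)  # online scaling update
        x = x - p * g
        g_prev = g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = rng.uniform(1.0, 50.0, size=20)   # ill-conditioned diagonal quadratic
    b = rng.standard_normal(20)
    x = gd_online_diagonal_scaling(lambda v: d * v - b, np.zeros(20))
    print(np.linalg.norm(d * x - b))
```

The update is self-correcting: if a coordinate's stepsize overshoots, consecutive gradients alternate in sign, $g_{k+1} g_k < 0$, and the scaling shrinks.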

New results related to cutters and to an extrapolated block-iterative method for finding a common fixed point of a collection of them

Given a Hilbert space and a finite family of operators defined on the space, the common fixed point problem (CFPP) is to find a point in the intersection of the fixed point sets of these operators. Instances of the problem have numerous applications in science and engineering. We consider an extrapolated block-iterative method with dynamic … Read more
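
For illustration, the sketch below applies the classical extrapolation factor $\sigma = \sum_i w_i \|T_i x - x\|^2 / \|\sum_i w_i (T_i x - x)\|^2$ to a simultaneous iteration of halfspace projections (metric projections are cutters); the paper's dynamic block selection and weighting are not reproduced:

```python
# Sketch: simultaneous iteration over a family of cutters (here metric
# projections onto halfspaces) with the classical extrapolation factor
#   sigma = sum_i w_i ||T_i x - x||^2 / ||sum_i w_i (T_i x - x)||^2  (>= 1).
import numpy as np

def project_halfspace(x, a, b):
    """Metric projection onto {y : <a, y> <= b}; such projections are cutters."""
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def extrapolated_parallel_step(x, halfspaces, weights):
    disps = [w * (project_halfspace(x, a, b) - x)
             for (a, b), w in zip(halfspaces, weights)]
    d = np.sum(disps, axis=0)                 # weighted mean displacement
    if np.linalg.norm(d) < 1e-14:
        return x                              # already a common fixed point
    num = sum(np.linalg.norm(p) ** 2 / w for p, w in zip(disps, weights))
    sigma = num / np.linalg.norm(d) ** 2      # extrapolation factor
    return x + sigma * d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hs = [(rng.standard_normal(5), 1.0) for _ in range(8)]   # all contain 0
    w = np.full(8, 1.0 / 8)
    x = 10 * rng.standard_normal(5)
    for _ in range(100):
        x = extrapolated_parallel_step(x, hs, w)
    print(max(a @ x - b for a, b in hs))      # max violation; should be <= 0
```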

Parameter-free proximal bundle methods with adaptive stepsizes for hybrid convex composite optimization problems

This paper develops a parameter-free adaptive proximal bundle method with two important features: 1) adaptive choice of variable prox stepsizes that “closely fits” the instance under consideration; and 2) adaptive criterion for making the occurrence of serious steps easier. Computational experiments show that our method performs substantially fewer consecutive null steps (i.e., a shorter cycle) … Read more
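
For orientation, here is a vanilla proximal bundle skeleton with a fixed prox stepsize $t$ and the standard serious/null-step test; the paper's contributions, the adaptive stepsize and the relaxed serious-step criterion, are precisely what this sketch omits. The use of cvxpy for the bundle subproblem and all parameter values are my choices:

```python
# Vanilla proximal bundle sketch with a FIXED prox stepsize t and the standard
# descent test f(xc) - f(xk) >= m * (f(xc) - model(xk)).
import numpy as np
import cvxpy as cp

def prox_bundle(f, subgrad, x0, t=1.0, m=0.1, iters=30):
    xc = x0.copy()                         # stability (prox) center
    bundle = [(f(xc), subgrad(xc), xc.copy())]
    n = len(x0)
    for _ in range(iters):
        x = cp.Variable(n)
        cuts = [fv + gv @ (x - xv) for fv, gv, xv in bundle]
        model = cp.max(cp.hstack(cuts)) if len(cuts) > 1 else cuts[0]
        cp.Problem(cp.Minimize(model + cp.sum_squares(x - xc) / (2 * t))).solve()
        xk = x.value
        model_val = max(fv + gv @ (xk - xv) for fv, gv, xv in bundle)
        if f(xc) - f(xk) >= m * (f(xc) - model_val):
            xc = xk                        # serious step: move the center
        bundle.append((f(xk), subgrad(xk), xk))  # null or serious: add a cut
    return xc

if __name__ == "__main__":
    c = np.array([2.0, -0.3, 0.7, -1.5])
    f = lambda v: np.abs(v).sum() + 0.5 * ((v - c) ** 2).sum()
    sg = lambda v: np.sign(v) + (v - c)    # a subgradient of f
    x = prox_bundle(f, sg, np.zeros(4))
    print(x, f(x))                         # minimizer is soft-threshold(c, 1)
```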

Global non-asymptotic super-linear convergence rates of regularized proximal quasi-Newton methods on non-smooth composite problems

In this paper, we propose two regularized proximal quasi-Newton methods with symmetric rank-1 update of the metric (SR1 quasi-Newton) to solve non-smooth convex additive composite problems. Both algorithms avoid using line search or other trust region strategies. For each of them, we prove a super-linear convergence rate that is independent of the initialization of the … Read more
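
A sketch of the ingredients the abstract names: an SR1 metric update with the usual skipping safeguard, a $\rho I$ regularization of the metric, and a proximal subproblem in that metric, solved here by a few inner proximal-gradient iterations. The inner solver and all parameter choices are assumptions, not the paper's method:

```python
# Sketch of a regularized SR1 proximal quasi-Newton iteration for
#   min f(x) + lam*||x||_1.
# The rho*I regularization and SR1 skipping rule are standard; the inner
# proximal-gradient solver for the metric subproblem is an assumption.
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sr1_update(B, s, y, tol=1e-8):
    r = y - B @ s
    denom = r @ s
    # standard SR1 safeguard: skip the update when the denominator is tiny
    if abs(denom) <= tol * max(np.linalg.norm(r) * np.linalg.norm(s), 1e-16):
        return B
    return B + np.outer(r, r) / denom

def prox_sr1_step(x, g, B, lam, rho, inner=50):
    M = B + rho * np.eye(len(x))           # regularized SR1 metric
    w0 = np.linalg.eigvalsh(M)[0]
    if w0 <= 1e-8:                         # extra shift if M is near-indefinite
        M += (1e-3 - w0) * np.eye(len(x))
    L = np.linalg.eigvalsh(M)[-1]
    z = x.copy()                           # prox-gradient on the subproblem
    for _ in range(inner):                 # min g.(z-x) + 0.5(z-x)'M(z-x) + lam|z|_1
        z = soft(z - (g + M @ (z - x)) / L, lam / L)
    return z

def run(f_grad, x0, lam=0.1, rho=1.0, outer=50):
    x = x0.copy()
    B = np.eye(len(x0))
    g = f_grad(x)
    for _ in range(outer):
        x_new = prox_sr1_step(x, g, B, lam, rho)
        g_new = f_grad(x_new)
        B = sr1_update(B, x_new - x, g_new - g)
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((30, 15)), rng.standard_normal(30)
    x = run(lambda v: A.T @ (A @ v - b), np.zeros(15))
    print(0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.1 * np.abs(x).sum())
```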