Accuracy Certificates for Convex Optimization at Accelerated Rates via Primal-Dual Averaging

Many works in convex optimization provide rates for achieving a small primal gap. However, this quantity is typically unavailable in practice. In this work, we show that solving a regularized surrogate with algorithms based on simple primal-dual averaging provides non-asymptotic convergence guarantees for a computable optimality certificate. We first analyze primal and dual methods based … Read more
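The central object here, a computable optimality certificate, can be illustrated with a minimal sketch (not the paper's algorithm): for a smooth convex objective over a simple set, averaging the iterates' linearizations yields a lower bound on the optimal value by convexity, hence a gap one can actually evaluate. The problem instance, stepsizes, and names below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's method): each iterate's
# linearization under-estimates a convex f, so the minimum of their average
# over the feasible set lower-bounds f*, giving a computable gap.
# Toy problem: min 0.5*||Ax - b||^2 over the box X = [-1, 1]^n.
rng = np.random.default_rng(0)
n, T = 5, 200
A = rng.normal(size=(n, n))
b = rng.normal(size=n)

f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)

L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x = np.zeros(n)
x_sum = np.zeros(n)
g_sum = np.zeros(n)                    # running sum of gradients g_i
aff_sum = 0.0                          # running sum of f(x_i) - g_i @ x_i
for _ in range(T):
    g = grad(x)
    x_sum += x
    g_sum += g
    aff_sum += f(x) - g @ x
    x = np.clip(x - g / L, -1.0, 1.0)  # projected gradient step onto the box

x_bar = x_sum / T
c = g_sum / T
# Minimum over the box of the averaged linear model has the closed
# form -sum_j |c_j| for the linear part, so the lower bound is computable.
lower_bound = aff_sum / T - np.sum(np.abs(c))
gap_certificate = f(x_bar) - lower_bound   # >= f(x_bar) - f* >= 0
```

By construction the certificate dominates the (uncomputable) primal gap, which is the basic mechanism the abstract refers to.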

A semi-smooth Newton method for the nonlinear conic problem with generalized simplicial cones

In this work we develop and analyze a semi-smooth Newton method for the general nonlinear conic programming problem. In particular, we study the problem with a generalized simplicial cone, i.e., the image of a symmetric cone under a linear mapping. We generalize Robinson’s normal equations to a conic setting, yielding what we call the … Read more
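A minimal sketch of the underlying idea, assuming only the classical orthant case: semismooth Newton applied to Robinson's normal map for a linear complementarity problem over the nonnegative orthant, which is the simplest instance of the simplicial-cone setting (the function and variable names are illustrative, not the paper's).

```python
import numpy as np

# Minimal sketch: semismooth Newton on Robinson's normal map
#   F(z) = M @ max(z, 0) + (z - max(z, 0)) + q,
# whose zero z* yields the complementarity solution x* = max(z*, 0).
# The projection max(., 0) is piecewise linear, so we use an element of
# its Clarke generalized Jacobian in the Newton system.

def semismooth_newton_lcp(M, q, z0, tol=1e-10, max_iter=50):
    n = len(q)
    z = z0.astype(float).copy()
    for _ in range(max_iter):
        zp = np.maximum(z, 0.0)
        F = M @ zp + (z - zp) + q
        if np.linalg.norm(F) < tol:
            break
        # Generalized Jacobian element: D = diag(1{z_i > 0}), J = M D + (I - D)
        D = np.diag((z > 0.0).astype(float))
        J = M @ D + (np.eye(n) - D)
        z = z - np.linalg.solve(J, F)
    return np.maximum(z, 0.0)

# Usage: x solves  x >= 0,  M x + q >= 0,  x^T (M x + q) = 0
M = np.array([[2.0, 0.5], [0.5, 1.0]])
q = np.array([-1.0, 1.0])
x = semismooth_newton_lcp(M, q, np.zeros(2))
```

Because the normal map is piecewise linear here, the iteration identifies the correct active pieces in a few steps; the paper's contribution is extending this machinery to generalized simplicial cones.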

A unified convergence theory for adaptive first-order methods in the nonconvex case, including AdaNorm, full and diagonal AdaGrad, Shampoo and Muon

A unified framework for first-order optimization algorithms for nonconvex unconstrained optimization is proposed that uses adaptively preconditioned gradients and includes popular methods such as full and diagonal AdaGrad, AdaNorm, as well as adaptive variants of Shampoo and Muon. This framework also allows combining heterogeneous geometries across different groups of variables while preserving a unified convergence … Read more
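One member of such a framework can be sketched concretely: diagonal AdaGrad, read as a gradient method whose preconditioner adapts to the accumulated per-coordinate gradient magnitudes. The stepsize and iteration budget below are illustrative assumptions, not the paper's rules.

```python
import numpy as np

# Minimal sketch: diagonal AdaGrad as an adaptively preconditioned
# gradient method (illustrative constants; the paper's setting is broader).

def diagonal_adagrad(grad, x0, lr=0.5, eps=1e-8, iters=500):
    x = x0.astype(float).copy()
    accum = np.zeros_like(x)              # running sum of squared gradients
    for _ in range(iters):
        g = grad(x)
        accum += g * g
        # Preconditioned step: coordinates with large accumulated gradient
        # magnitude automatically get a smaller effective stepsize.
        x -= lr * g / (np.sqrt(accum) + eps)
    return x

# Usage on a badly scaled quadratic f(x) = 0.5*(x1^2 + 100*x2^2)
grad = lambda x: np.array([1.0, 100.0]) * x
x_star = diagonal_adagrad(grad, np.array([3.0, 3.0]))
```

Full AdaGrad replaces the diagonal accumulator with a full matrix of gradient outer products, and Shampoo/Muon use structured (Kronecker-factored or orthogonalized) preconditioners; the framework treats all of these through one preconditioner abstraction.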

Complexity of an inexact stochastic SQP algorithm for equality constrained optimization

In this paper, we consider nonlinear optimization problems with a stochastic objective function and deterministic equality constraints. We propose an inexact two-stepsize stochastic sequential quadratic programming (SQP) algorithm and analyze its worst-case complexity under mild assumptions. The method utilizes a step decomposition strategy and handles stochastic gradient estimates by assigning different stepsizes to different components … Read more
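The deterministic, exact building block that such algorithms extend is a single SQP step: solve the KKT system of a quadratic model subject to the linearized constraints. The toy problem and names below are illustrative; the paper's method replaces the exact gradient with stochastic estimates, decomposes the step, and assigns separate stepsizes to the components.

```python
import numpy as np

# Minimal sketch: one exact SQP step for an equality-constrained problem
#   min f(x) = x1^2 + x2^2   s.t.   c(x) = x1 + x2 - 1 = 0.
# (Illustrative only; the paper's algorithm is inexact and stochastic.)

def sqp_step(x):
    g = 2.0 * x                       # gradient of f
    H = 2.0 * np.eye(2)               # Hessian of the Lagrangian
    A = np.array([[1.0, 1.0]])        # constraint Jacobian
    c = np.array([x[0] + x[1] - 1.0])
    # KKT system of the SQP subproblem:
    #   [H  A^T] [d  ]   [-g]
    #   [A   0 ] [lam] = [-c]
    K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    sol = np.linalg.solve(K, np.concatenate([-g, -c]))
    return x + sol[:2], sol[2:]       # new iterate, multiplier estimate

x, lam = sqp_step(np.zeros(2))        # quadratic problem: one step suffices
```

On this quadratic instance a single step lands on the minimizer (0.5, 0.5); the complexity analysis in the abstract concerns how many such inexact stochastic steps are needed in the worst case.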

A flexible block coordinate descent method for unconstrained optimization under Hölder continuity

In this work, we propose a flexible block coordinate method for unconstrained optimization problems under Hölder continuity assumptions. The method guarantees convergence to stationary points and has worst-case complexity results comparable to those obtained by single-block methods that assume Lipschitz or Hölder continuity. The approach is based on quadratic models of the objective function combined … Read more
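The basic mechanism, minimizing a quadratic model block by block, can be sketched in a few lines. The per-block curvature constants and the cyclic order below are illustrative assumptions; the paper's rules under Hölder continuity are more refined.

```python
import numpy as np

# Minimal sketch: cyclic block coordinate descent where each block update
# minimizes the quadratic model m(d) = g @ d + (L/2)*||d||^2, i.e. d = -g/L.
# (Illustrative; not the paper's stepsize or block-selection rule.)

def block_coordinate_descent(grad, x0, blocks, block_L, iters=300):
    """blocks: list of index arrays; block_L: per-block curvature estimates."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        for idx, L in zip(blocks, block_L):
            g = grad(x)[idx]
            x[idx] -= g / L           # exact minimizer of the block model
    return x

# Usage: a quadratic whose two blocks have very different scaling
Q = np.diag([1.0, 1.0, 50.0, 50.0])
grad = lambda x: Q @ x
blocks = [np.array([0, 1]), np.array([2, 3])]
x_min = block_coordinate_descent(grad, np.ones(4), blocks, block_L=[1.0, 50.0])
```

The point of the block structure is visible even in this toy: each block uses its own curvature constant, so the ill-conditioned block does not force a tiny stepsize on the well-conditioned one.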

Adaptive Newton-CG methods with global and local analysis for unconstrained optimization with Hölder continuous Hessian

In this paper, we study Newton-conjugate gradient (Newton-CG) methods for minimizing a nonconvex function $f$ whose Hessian is $(H_f,\nu)$-Hölder continuous with modulus $H_f>0$ and exponent $\nu\in(0,1]$. Recently proposed Newton-CG methods for this problem \cite{he2025newton} adopt (i) non-adaptive regularization and (ii) a nested line-search procedure, where (i) often leads to inefficient early progress and the loss … Read more

Separable QCQPs and Their Exact SDP Relaxations

This paper studies exact semidefinite programming relaxations (SDPRs) for separable quadratically constrained quadratic programs (QCQPs). We consider the construction of a larger separable QCQP from multiple QCQPs with exact SDPRs. We show that exactness is preserved when such QCQPs are combined through a separable horizontal connection, where the coupling is induced through the right-hand-side parameters … Read more

A Gradient Sampling Algorithm for Noisy Nonsmooth Optimization

An algorithm is proposed, analyzed, and tested for minimizing locally Lipschitz objective functions that may be nonconvex and/or nonsmooth. The algorithm, which is built upon the gradient-sampling methodology, is designed specifically for cases in which the objective function and generalized gradient values might be subject to bounded, uncontrollable errors. Similarly to state-of-the-art guarantees for noisy smooth optimization … Read more
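The noiseless core of the gradient-sampling methodology can be sketched as follows: sample gradients near the current point, take the minimum-norm element of their convex hull as a search direction, and backtrack on the function value. Everything below (the toy objective, the Frank-Wolfe min-norm solver, the constants) is an illustrative assumption; the paper's contribution is handling bounded errors in these quantities.

```python
import numpy as np

# Minimal sketch of a basic gradient-sampling step (noiseless setting;
# illustrative only). f is nonsmooth but differentiable almost everywhere.
rng = np.random.default_rng(1)

f = lambda x: np.abs(x[0]) + 2.0 * np.abs(x[1])
grad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])  # a.e.

def min_norm_in_hull(G, iters=300):
    # Frank-Wolfe for min ||G @ lam||^2 over the probability simplex.
    m = G.shape[1]
    lam = np.ones(m) / m
    for k in range(iters):
        g_lam = 2.0 * G.T @ (G @ lam)
        s = np.zeros(m)
        s[np.argmin(g_lam)] = 1.0     # best vertex of the simplex
        step = 2.0 / (k + 2.0)
        lam = (1.0 - step) * lam + step * s
    return G @ lam

x = np.array([1.0, -1.5])
eps = 0.1                             # sampling radius
for _ in range(60):
    # Gradients at the point itself and at nearby random samples.
    pts = [x] + [x + eps * rng.uniform(-1.0, 1.0, size=2) for _ in range(6)]
    G = np.column_stack([grad(p) for p in pts])
    d = min_norm_in_hull(G)
    if np.linalg.norm(d) < 1e-6:      # approximate stationarity certificate
        break
    t = 1.0                           # backtracking line search on f
    while f(x - t * d) > f(x) - 0.5 * t * (d @ d) and t > 1e-12:
        t *= 0.5
    x = x - t * d
```

A short min-norm direction over the sampled hull serves as an approximate stationarity certificate; in the noisy setting of the paper, both the function values in the line search and the sampled gradients carry bounded errors, which is what the analysis must absorb.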

An objective-function-free algorithm for nonconvex stochastic optimization with deterministic equality and inequality constraints

An algorithm is proposed for solving optimization problems with stochastic objective and deterministic equality and inequality constraints. This algorithm is objective-function-free in the sense that it only uses the objective’s gradient and never evaluates the function value. It is based on an adaptive selection of function-decreasing and constraint-improving iterations, the first ones using an Adagrad-type … Read more

Beyond binarity: Semidefinite programming for ternary quadratic problems

We study the ternary quadratic problem (TQP), a quadratic optimization problem with linear constraints where the variables take values in {0,±1}. While semidefinite programming (SDP) techniques are well established for {0,1}- and {±1}-valued quadratic problems, no dedicated integer semidefinite programming framework exists for the ternary case. In this paper, we introduce a ternary SDP formulation … Read more