Sequential test sampling for stochastic derivative-free optimization

In many derivative-free optimization algorithms, a sufficient decrease condition decides whether to accept a trial step in each iteration. This condition typically requires that the potential decrease in the objective function value produced by the trial step, i.e., the true reduction in the objective function value that would be achieved by moving from the current point to the …
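
To make the idea concrete, here is a minimal sketch of a sequential acceptance test of this kind, assuming noisy objective evaluations and a sufficient-decrease threshold such as c·Δ² for a trust-region radius Δ. All names and constants (noisy_f, batch, z) are hypothetical; this is an illustration, not the paper's procedure.

```python
import numpy as np

def sequential_decrease_test(noisy_f, x, trial, threshold,
                             batch=20, max_samples=2000, z=2.0, rng=None):
    """Sequentially sample noisy values of f(x) - f(trial) until a
    z-sigma confidence interval around the estimated decrease lies
    entirely above or below `threshold`, or the budget runs out."""
    rng = rng or np.random.default_rng()
    diffs = []
    while len(diffs) < max_samples:
        diffs.extend(noisy_f(x, rng) - noisy_f(trial, rng) for _ in range(batch))
        d = np.asarray(diffs)
        mean = d.mean()
        half = z * d.std(ddof=1) / np.sqrt(len(d))
        if mean - half >= threshold:   # confidently sufficient decrease: accept
            return True
        if mean + half < threshold:    # confidently insufficient: reject
            return False
    return mean >= threshold           # budget exhausted: use the point estimate
```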

Gradient Methods with Online Scaling Part II. Practical Aspects

Part I of this work [Gao25] establishes online scaled gradient methods (OSGM), a framework that utilizes online convex optimization to adapt stepsizes in gradient methods. This paper focuses on the practical aspects of OSGM. We leverage the OSGM framework to design new adaptive first-order methods and provide insights into their empirical behavior. The resulting method, …
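
As a rough illustration of the stepsize-adaptation idea (not the exact OSGM surrogate losses of the paper), one can update a diagonal scaling by a hypergradient step on the one-iteration progress f(x - p ⊙ g); all names below are ours:

```python
import numpy as np

def scaled_gradient_descent(grad, x0, iters=200, eta=1e-2, p0=1e-3):
    """Gradient descent with a per-coordinate stepsize vector p that is
    adapted online: p is moved along the (hyper)gradient of the
    one-step objective value f(x - p * g) with respect to p."""
    x = np.asarray(x0, dtype=float)
    p = np.full_like(x, p0)              # diagonal scaling, one stepsize per coordinate
    for _ in range(iters):
        g = grad(x)
        x_next = x - p * g
        # d/dp_i of f(x - p*g) equals -g_i * grad_i(x - p*g)
        p -= eta * (-g * grad(x_next))
        x = x_next
    return x
```

For a diagonal quadratic f(x) = ½ Σ_i a_i x_i², this update has its fixed point at p_i = 1/a_i, the ideal per-coordinate stepsize, which conveys why online scaling can help on ill-conditioned problems.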

Properties of Enclosures in Multiobjective Optimization

A widely used approximation concept in multiobjective optimization is that of enclosures. These are unions of boxes defined by lower and upper bound sets that are used to cover optimal sets of multiobjective optimization problems in the image space. The width of an enclosure is taken as a quality measure. In this paper, we …
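
For reference, this construction is commonly formalized as follows (notation ours; the paper may use a slightly different definition). Given a lower bound set L and an upper bound set U in the image space, ordered componentwise:

```latex
\[
  E(L, U) \;=\; \bigcup_{\substack{\ell \in L,\; u \in U \\ \ell \le u}} [\ell, u]
  \;=\; \bigl\{\, z \in \mathbb{R}^m \;:\; \ell \le z \le u
        \text{ for some } \ell \in L,\ u \in U \,\bigr\}.
\]
```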

Visiting exactly once all the vertices of {0,1,2}^3 with a 13-segment path that avoids self-crossing

In the Euclidean space \(\mathbb{R}^3\), we ask whether one can visit each of the \(27\) vertices of the grid \(G_3:=\{0,1,2\}^3\) exactly once using as few straight-line segments, connected end to end, as possible (an optimal polygonal chain). We give a constructive proof that there exists a \(13\)-segment perfect simple path (i.e., an optimal chain that …

Alternating Iteratively Reweighted \(\ell_1\) and Subspace Newton Algorithms for Nonconvex Sparse Optimization

This paper presents a novel hybrid algorithm for minimizing the sum of a continuously differentiable loss function and a nonsmooth, possibly nonconvex, sparsity-promoting regularizer. The proposed method adaptively switches between solving a reweighted \(\ell_1\)-regularized subproblem and performing an inexact subspace Newton step. The reweighted \(\ell_1\)-subproblem admits an efficient closed-form solution via the soft-thresholding operator, thereby …
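
The closed-form solution referred to here is the standard soft-thresholding (proximal) map; a minimal sketch for the separable case (names ours, with any stepsize/prox parameter absorbed into the weights):

```python
import numpy as np

def soft_threshold(v, w):
    """Elementwise minimizer of 0.5*(x - v)**2 + w*|x| over x, for
    nonnegative weights w: x* = sign(v) * max(|v| - w, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - w, 0.0)
```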

Nonlinear Model Predictive Control with an Infinite Horizon Approximation

Current nonlinear model predictive control (NMPC) strategies are formulated as finite predictive horizon nonlinear programs (NLPs), which maintain NMPC stability and recursive feasibility through the construction of terminal cost functions and/or terminal constraints. However, computing these terminal properties may pose formidable challenges with a fixed horizon, particularly in the context of nonlinear dynamic processes. Motivated …

Active-Set Identification in Noisy and Stochastic Optimization

Identifying active constraints from a point near an optimal solution is important both theoretically and practically in constrained continuous optimization, as it can help identify optimal Lagrange multipliers and essentially reduces an inequality-constrained problem to an equality-constrained one. Traditional active-set identification guarantees have been proved under assumptions of smoothness and constraint qualifications, and assume exact …
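
A textbook example of such an identification rule (not necessarily the estimator analyzed in this paper): for inequality constraints \(c_i(x) \le 0\), estimate the active set with a tolerance,

```latex
\[
  \mathcal{A}_\varepsilon(x) \;=\; \{\, i \;:\; c_i(x) \ge -\varepsilon \,\},
\]
```

which, under smoothness and a suitable constraint qualification, recovers the true active set \(\{i : c_i(x^*) = 0\}\) once \(x\) is close enough to \(x^*\) and \(\varepsilon\) is chosen appropriately.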

AS-BOX: Additional Sampling Method for Weighted Sum Problems with Box Constraints

A class of optimization problems characterized by a weighted finite-sum objective function subject to box constraints is considered. We propose a novel stochastic optimization method, named AS-BOX (Additional Sampling for BOX constraints), that combines projected gradient directions with adaptive variable sample size strategies and nonmonotone line search. The method dynamically adjusts the batch size based …
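
Two of the ingredients named here are standard and easy to sketch (our simplified version; AS-BOX's adaptive sample-size logic is omitted): projection onto a box is elementwise clipping, and a nonmonotone Armijo test compares against the maximum of a few recent objective values rather than the current one.

```python
import numpy as np

def projected_gradient_trial(x, g, alpha, lb, ub):
    """Projected gradient trial point: the Euclidean projection onto
    the box [lb, ub] is just elementwise clipping."""
    return np.clip(x - alpha * g, lb, ub)

def nonmonotone_armijo(f_trial, recent_f, x, trial, g, eta=1e-4):
    """Nonmonotone Armijo-type acceptance: require sufficient decrease
    relative to the max of the last few stored objective values."""
    return f_trial <= max(recent_f) + eta * g @ (trial - x)
```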

Active-set Newton-MR methods for nonconvex optimization problems with bound constraints

This paper presents active-set methods for minimizing nonconvex twice continuously differentiable functions subject to bound constraints. Within the faces of the feasible set, we employ descent methods with Armijo line search, utilizing approximated Newton directions obtained through the Minimum Residual (MINRES) method. To escape the faces, we investigate the use of the Spectral Projected Gradient (SPG) …
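
A minimal sketch of a MINRES-based Newton direction (ours, using SciPy, whose recent versions take rtol rather than tol), with a steepest-descent fallback when the computed direction is unusable:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

def newton_mr_direction(hess_vec, grad, rtol=1e-6, maxiter=100):
    """Approximately solve H d = -g with MINRES, which needs only
    Hessian-vector products and tolerates the indefinite Hessians
    that arise in nonconvex problems."""
    n = grad.shape[0]
    H = LinearOperator((n, n), matvec=hess_vec, dtype=float)
    d, info = minres(H, -grad, rtol=rtol, maxiter=maxiter)
    if info < 0 or grad @ d >= 0:     # breakdown or not a descent direction
        return -grad                  # fall back to steepest descent
    return d
```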

A User Manual for cuHALLaR: A GPU-Accelerated Low-Rank Semidefinite Programming Solver

We present a Julia-based interface to the precompiled HALLaR and cuHALLaR binaries for large-scale semidefinite programs (SDPs). Both solvers have been shown to be fast and numerically stable, and accept problem data in formats compatible with SDPA as well as a new enhanced data format that takes advantage of Hybrid Sparse Low-Rank (HSLR) structure. The interface allows users to load …