Visiting all the vertices of {0,1,2}^3 exactly once with a 13-segment path that avoids self-crossing

In the Euclidean space \(\mathbb{R}^3\), we ask whether one can visit each of the \(27\) vertices of the grid \(G_3:=\{0,1,2\}^3\) exactly once using as few straight-line segments, connected end to end, as possible (an optimal polygonal chain). We give a constructive proof that there exists a \(13\)-segment perfect simple path (i.e., an optimal chain that …

Alternating Iteratively Reweighted \(\ell_1\) and Subspace Newton Algorithms for Nonconvex Sparse Optimization

This paper presents a novel hybrid algorithm for minimizing the sum of a continuously differentiable loss function and a nonsmooth, possibly nonconvex, sparsity‑promoting regularizer. The proposed method adaptively switches between solving a reweighted \(\ell_1\)-regularized subproblem and performing an inexact subspace Newton step. The reweighted \(\ell_1\)-subproblem admits an efficient closed-form solution via the soft-thresholding operator, thereby …
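The closed-form solution mentioned above is the standard soft-thresholding operator, the proximal map of a (weighted) \(\ell_1\) term. A minimal NumPy sketch of one proximal-gradient-style step on such a subproblem (the function names and the quadratic model below are illustrative, not the paper's notation):

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft-thresholding: the closed-form proximal operator of tau * |.|_1,
    applied componentwise (tau may be a scalar or a vector of weights)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def reweighted_l1_step(x, grad, step, weights):
    """One proximal-gradient step on a weighted l1 model around x:
    argmin_z  grad.(z - x) + (1/(2*step))*||z - x||^2 + sum_i weights_i*|z_i|,
    which reduces to soft-thresholding the gradient step."""
    return soft_threshold(x - step * grad, step * weights)
```

A common reweighting choice in this literature sets \(w_i = 1/(|x_i| + \varepsilon)\), so coefficients already near zero are thresholded more aggressively.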

Nonlinear Model Predictive Control with an Infinite Horizon Approximation

Current nonlinear model predictive control (NMPC) strategies are formulated as finite predictive horizon nonlinear programs (NLPs), which maintain NMPC stability and recursive feasibility through the construction of terminal cost functions and/or terminal constraints. However, computing these terminal properties may pose formidable challenges with a fixed horizon, particularly in the context of nonlinear dynamic processes. Motivated …
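For concreteness, the finite-horizon NLP solved at each sampling instant typically takes the following generic form (a textbook sketch, not the paper's exact formulation), where the terminal cost \(V_f\) and terminal set \(\mathcal{X}_f\) are the terminal ingredients referred to above:

```latex
\min_{x_0,\dots,x_N,\; u_0,\dots,u_{N-1}} \;
\sum_{k=0}^{N-1} \ell(x_k, u_k) \;+\; V_f(x_N)
\quad \text{s.t.} \quad
x_{k+1} = f(x_k, u_k),\;\; x_0 = \hat{x},\;\;
(x_k, u_k) \in \mathcal{Z},\;\; x_N \in \mathcal{X}_f .
```

Designing \(V_f\) and \(\mathcal{X}_f\) so that the scheme is stabilizing and recursively feasible is exactly the step that becomes difficult for general nonlinear dynamics with a fixed horizon \(N\).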

Active-Set Identification in Noisy and Stochastic Optimization

Identifying active constraints from a point near an optimal solution is important both theoretically and practically in constrained continuous optimization, as it can help identify optimal Lagrange multipliers and essentially reduces an inequality-constrained problem to an equality-constrained one. Traditional active-set identification guarantees have been proved under assumptions of smoothness and constraint qualifications, and assume exact …
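To make the idea concrete, here is a minimal sketch of a distance-based identification rule for the special case of bound constraints \(l \le x \le u\) (an illustrative baseline only; the paper's noise-aware criteria are necessarily more refined than a fixed tolerance):

```python
import numpy as np

def identify_active_set(x, lower, upper, tol=1e-6):
    """Estimate the active bound constraints at x: a bound is declared
    active when it is attained up to the tolerance tol. Near a solution,
    the estimate matches the true active set under standard assumptions."""
    at_lower = np.flatnonzero(x - lower <= tol)
    at_upper = np.flatnonzero(upper - x <= tol)
    return at_lower, at_upper
```

Once the active set is (correctly) identified, the bounds in it can be treated as equalities, which is the reduction to an equality-constrained problem mentioned above.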

AS-BOX: Additional Sampling Method for Weighted Sum Problems with Box Constraints

A class of optimization problems characterized by a weighted finite-sum objective function subject to box constraints is considered. We propose a novel stochastic optimization method, named AS-BOX (Additional Sampling for BOX constraints), that combines projected gradient directions with adaptive variable sample size strategies and nonmonotone line search. The method dynamically adjusts the batch size based …
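The projected gradient direction used here relies on the fact that Euclidean projection onto a box is a componentwise clip. A minimal sketch of that projection and of one projected-gradient step (names are illustrative, not the AS-BOX implementation):

```python
import numpy as np

def project_box(x, lower, upper):
    # Euclidean projection onto the box [lower, upper] is a componentwise clip.
    return np.clip(x, lower, upper)

def projected_gradient_step(x, grad, step, lower, upper):
    # One projected-gradient step: move along -grad, then project back
    # onto the feasible box.
    return project_box(x - step * grad, lower, upper)
```

In a stochastic variant such as the one described, `grad` would be a subsampled (batch) gradient whose sample size the method adapts between iterations.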

Active-set Newton-MR methods for nonconvex optimization problems with bound constraints

This paper presents active-set methods for minimizing nonconvex twice-continuously differentiable functions subject to bound constraints. Within the faces of the feasible set, we employ descent methods with Armijo line search, utilizing approximated Newton directions obtained through the Minimum Residual (MINRES) method. To escape the faces, we investigate the use of the Spectral Projected Gradient (SPG) …
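The Armijo line search mentioned above is standard backtracking along a descent direction; a minimal sketch (illustrative only, with the direction `d` supplied externally, e.g. by an inexact Newton solve such as MINRES):

```python
import numpy as np

def armijo_backtracking(f, x, d, grad, c1=1e-4, beta=0.5, t0=1.0, max_iter=50):
    """Backtracking line search enforcing the Armijo sufficient-decrease
    condition  f(x + t d) <= f(x) + c1 * t * grad.d  along a descent
    direction d (grad.d < 0). Halves the step until the condition holds."""
    fx = f(x)
    slope = float(np.dot(grad, d))  # negative for a descent direction
    t = t0
    for _ in range(max_iter):
        if f(x + t * d) <= fx + c1 * t * slope:
            return t
        t *= beta
    return t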

A user manual for cuHALLaR: a GPU-accelerated low-rank semidefinite programming solver

We present a Julia-based interface to the precompiled HALLaR and cuHALLaR binaries for large-scale semidefinite programs (SDPs). Both solvers are established as fast and numerically stable, and accept problem data in formats compatible with SDPA and a new enhanced data format taking advantage of Hybrid Sparse Low-Rank (HSLR) structure. The interface allows users to load …

Polyconvex double well functions

We investigate polyconvexity of the double well function \(f(X) := |X-X_1|^2|X-X_2|^2\) for given matrices \(X_1, X_2 \in \mathbb{R}^{n \times n}\). Such functions are fundamental in the modeling of phase transitions in materials, but their non-convex nature presents challenges for the analysis of variational problems. We prove that \(f\) is polyconvex if and only if the …
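For intuition, the two "wells" are the matrices \(X_1\) and \(X_2\): \(f\) vanishes exactly there and is positive elsewhere. A quick numerical check using the Frobenius norm (an illustrative sketch, not part of the paper):

```python
import numpy as np

def double_well(X, X1, X2):
    # f(X) = |X - X1|^2 * |X - X2|^2 with |.| the Frobenius norm
    # (NumPy's default matrix norm).
    return np.linalg.norm(X - X1)**2 * np.linalg.norm(X - X2)**2

# Two example wells: the zero matrix and the identity.
X1 = np.zeros((2, 2))
X2 = np.eye(2)
```

Evaluating at either well gives zero, while any other matrix, e.g. the midpoint \(\tfrac{1}{2}(X_1 + X_2)\), gives a strictly positive value.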

On the boundedness of multipliers in augmented Lagrangian methods for mathematical programs with complementarity constraints

In this paper, we present a theoretical analysis of augmented Lagrangian (AL) methods applied to mathematical programs with complementarity constraints (MPCCs). Our focus is on a variant that reformulates the complementarity constraints using slack variables, where these constraints are handled directly in the subproblems rather than being penalized. We introduce specialized constraint qualifications (CQs) of …
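A standard slack-variable reformulation of the kind described replaces the complementarity condition \(0 \le G(x) \perp H(x) \ge 0\) by equality-coupled slacks (a generic sketch; the paper's precise variant may differ):

```latex
\min_{x,\,u,\,v} \; f(x)
\quad \text{s.t.} \quad
G(x) = u,\;\; H(x) = v,\;\;
u \ge 0,\;\; v \ge 0,\;\; u_i v_i = 0 \;\; (i = 1, \dots, m),
```

where the complementarity constraints \(u_i v_i = 0\) are kept explicitly in the AL subproblems rather than moved into the penalty term.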

A First-Order Algorithm for an Optimization Problem with Improved Convergence when the Problem is Convex

We propose a first-order algorithm, a modified version of FISTA, to solve an optimization problem with an objective function that is the sum of a possibly nonconvex function, with Lipschitz continuous gradient, and a convex function which can be nonsmooth. The algorithm is shown to have an iteration complexity of \(\mathcal{O}(\epsilon^{-2})\) to find an …
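For reference, the baseline that the paper modifies is standard FISTA: a proximal-gradient step on the composite objective plus Nesterov momentum. A minimal sketch (this is the textbook method, not the paper's modified algorithm):

```python
import numpy as np

def fista(grad_f, prox_g, x0, step, n_iters=200):
    """Standard FISTA for min_x f(x) + g(x): gradient step on the smooth
    part f, proximal step on the nonsmooth part g, plus Nesterov momentum.
    prox_g(z, s) must return argmin_w g(w) + (1/(2s))*||w - z||^2."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iters):
        x_next = prox_g(y - step * grad_f(y), step)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Tiny usage example (illustrative): a one-dimensional-per-coordinate lasso
# f(x) = 0.5*||x - b||^2, g(x) = lam*||x||_1, whose solution is soft(b, lam).
b = np.array([2.0, 0.2])
lam = 0.5
sol = fista(lambda x: x - b,
            lambda z, s: np.sign(z) * np.maximum(np.abs(z) - lam * s, 0.0),
            np.zeros(2), step=1.0)
```

The paper's contribution lies in modifying this scheme so the \(\mathcal{O}(\epsilon^{-2})\) nonconvex rate holds while convergence automatically improves when the smooth part happens to be convex.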