A first-order method for nonconvex-nonconcave minimax problems under a local Kurdyka-Łojasiewicz condition

We study a class of nonconvex–nonconcave minimax problems in which the inner maximization problem satisfies a local Kurdyka–Łojasiewicz (KL) condition that may vary with the outer minimization variable. In contrast to the global KL or Polyak–Łojasiewicz (PL) conditions commonly assumed in the literature—which are significantly stronger and often too restrictive in practice—this local KL condition … Read more
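For orientation, a minimal sketch of the two conditions contrasted above, written for the inner maximization \( \max_{y} f(x,y) \); the exact exponent, constants, and neighborhood used in the paper are not reproduced here, and \( \mu \), \( c(x) \), \( \theta \) are illustrative:

\[
\text{(global PL in } y\text{):}\quad \tfrac12\,\|\nabla_y f(x,y)\|^2 \;\ge\; \mu\,\Bigl(\max_{y'} f(x,y') - f(x,y)\Bigr)\quad\text{for all } y,
\]
\[
\text{(local KL in } y\text{):}\quad \|\nabla_y f(x,y)\| \;\ge\; c(x)\,\Bigl(\max_{y'} f(x,y') - f(x,y)\Bigr)^{\theta}\quad\text{for } y \text{ near the inner solution set } Y^{*}(x),
\]

with exponent \( \theta \in (0,1) \) (the PL case corresponds to \( \theta = \tfrac12 \)); in the local variant the constant and the neighborhood may both depend on the outer variable \( x \).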

General Perturbation Resilient Dynamic String-Averaging for Inconsistent Problems with Superiorization

In this paper we introduce a General Dynamic String-Averaging (GDSA) iterative scheme and investigate its convergence properties in the inconsistent case, that is, when the input operators do not have a common fixed point. The Dynamic String-Averaging Projection (DSAP) algorithm itself was introduced in a 2013 paper, where its strong convergence and bounded perturbation resilience were … Read more
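To make the string-averaging idea concrete, here is a minimal Python sketch of one string-averaging step with fixed strings and weights; in the DSAP/GDSA schemes the strings and weights may change dynamically between iterations and perturbations are allowed, none of which is modeled here. The operators, strings, and weights below are illustrative assumptions.

```python
import numpy as np

def string_averaging_step(x, strings, weights):
    """One string-averaging step: apply the operators of each string
    sequentially to x, then take a convex combination of the endpoints.
    `strings` is a list of lists of operators (e.g. projections onto sets);
    `weights` are nonnegative and sum to one."""
    endpoints = []
    for string in strings:
        y = x
        for op in string:          # sequential sweep along one string
            y = op(y)
        endpoints.append(y)
    return sum(w * y for w, y in zip(weights, endpoints))

# Toy example: projections onto two halfspaces {z : a.z <= b} in R^2.
def proj_halfspace(a, b):
    a = np.asarray(a, dtype=float)
    def P(z):
        viol = a @ z - b
        return z if viol <= 0 else z - (viol / (a @ a)) * a
    return P

P1 = proj_halfspace([1.0, 0.0], 1.0)
P2 = proj_halfspace([0.0, 1.0], 1.0)
x = np.array([3.0, 3.0])
for _ in range(20):
    x = string_averaging_step(x, strings=[[P1, P2], [P2]], weights=[0.5, 0.5])
print(x)   # approaches a point satisfying both halfspace constraints
```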

Swapping objectives accelerates Davis-Yin splitting

In this work, we investigate the application of Davis–Yin splitting (DYS) to convex optimization problems and demonstrate that swapping the roles of the two nonsmooth convex functions can result in a faster convergence rate. Such a swap typically yields a different sequence of iterates, but its impact on convergence behavior has been largely understudied or … Read more
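To make the "swap" concrete, below is a minimal sketch of the standard Davis–Yin iteration for \( \min_x f(x)+g(x)+h(x) \) with \( h \) smooth, run once with \( f \) handled first and once with the roles of \( f \) and \( g \) exchanged. The problem instance, step sizes, and choice of returned point are illustrative and not taken from the paper.

```python
import numpy as np

def dys(prox_f, prox_g, grad_h, z0, gamma, lam=1.0, iters=500):
    """Davis-Yin splitting for min_x f(x) + g(x) + h(x), h smooth:
       x_f = prox_{gamma f}(z)
       x_g = prox_{gamma g}(2*x_f - z - gamma*grad_h(x_f))
       z  <- z + lam*(x_g - x_f)"""
    z = z0.copy()
    for _ in range(iters):
        x_f = prox_f(z, gamma)
        x_g = prox_g(2 * x_f - z - gamma * grad_h(x_f), gamma)
        z = z + lam * (x_g - x_f)
    return x_g

# Toy instance: h(x) = 0.5*||Ax - b||^2, f = ||.||_1, g = indicator of [-1, 1]^n.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
grad_h = lambda x: A.T @ (A @ x - b)
prox_l1 = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
prox_box = lambda v, t: np.clip(v, -1.0, 1.0)

L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of grad_h
x1 = dys(prox_l1, prox_box, grad_h, np.zeros(10), gamma=1.0 / L)   # f = l1, g = box
x2 = dys(prox_box, prox_l1, grad_h, np.zeros(10), gamma=1.0 / L)   # roles swapped
print(np.linalg.norm(x1 - x2))   # different iterate sequences; both settle near a solution
```

Swapping which function's prox is applied first changes the iterate sequence (and, as the paper argues, can change the convergence rate), even though both orderings target the same problem.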

Preconditioning for rational approximation

In this paper, we show that minimax rational approximations can be enhanced by introducing a controlling parameter on the denominator of the rational function. This is implemented by adding a small set of linear constraints to the underlying optimization problem. The modification integrates naturally into approximation models formulated as linear programming problems. We demonstrate our … Read more
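As a hedged illustration of where a denominator control can enter an LP-based formulation (this is a generic discretized minimax rational approximation setup, not necessarily the paper's exact model): with grid points \( x_i \), target \( f \), numerator \( p \) and denominator \( q \), and a fixed error level \( \delta \), one seeks coefficients satisfying

\[
-\delta\, q(x_i) \;\le\; p(x_i) - f(x_i)\, q(x_i) \;\le\; \delta\, q(x_i),
\qquad \tau \;\le\; q(x_i) \;\le\; 1, \qquad i = 1,\dots,m,
\]

which, for fixed \( \delta \), is a set of linear constraints in the coefficients of \( p \) and \( q \). A lower bound \( \tau > 0 \) on the denominator is the kind of controlling parameter that can be imposed with a small set of additional linear constraints.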

Optimized methods for composite optimization: a reduction perspective

Recent advances in convex optimization have leveraged computer-assisted proofs to develop optimized first-order methods that improve over classical algorithms. However, each optimized method is specially tailored for a particular problem setting, and it is a well-documented challenge to extend optimized methods to other settings due to their highly bespoke design and analysis. We provide a general … Read more

First-order methods for stochastic and finite-sum convex optimization with deterministic constraints

In this paper, we study a class of stochastic and finite-sum convex optimization problems with deterministic constraints. Existing methods typically aim to find an \(\epsilon\)-expectedly feasible stochastic optimal solution, in which the expected constraint violation and expected optimality gap are both within a prescribed tolerance \(\epsilon\). However, in many practical applications, constraints must be nearly … Read more
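For a problem \( \min_x \{ f(x) : g(x) \le 0 \} \) and a (random) output \( x \) of a stochastic algorithm, the distinction described above can be sketched as follows; the exact norm and aggregation of the constraint violation used in the paper are not reproduced here:

\[
\mathbb{E}\bigl[f(x)\bigr] - f^{*} \;\le\; \epsilon, \qquad \mathbb{E}\bigl[\|[g(x)]_{+}\|\bigr] \;\le\; \epsilon
\qquad\text{($\epsilon$-expected feasibility),}
\]
\[
\mathbb{E}\bigl[f(x)\bigr] - f^{*} \;\le\; \epsilon, \qquad \|[g(x)]_{+}\| \;\le\; \epsilon \ \text{ surely}
\qquad\text{(deterministic near-feasibility).}
\]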

A Variational Analysis Approach for Bilevel Hyperparameter Optimization with Sparse Regularization

We study a bilevel optimization framework for hyperparameter learning in variational models, with a focus on sparse regression and classification tasks. In particular, we consider a weighted elastic-net regularizer, where feature-wise regularization parameters are learned through a bilevel formulation. A key novelty of our approach is the use of a Forward-Backward (FB) reformulation of the … Read more
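A minimal sketch of the ingredients named above, assuming a lower-level least-squares loss \( L(w) = \tfrac12\|Xw-y\|^2 \) and featurewise elastic-net weights \( \lambda_1, \lambda_2 \); the paper's bilevel formulation and the way the hypergradient is computed through the FB reformulation are not reproduced here.

```python
import numpy as np

def prox_weighted_elastic_net(v, gamma, lam1, lam2):
    """Prox of R(w) = sum_j lam1_j*|w_j| + 0.5*lam2_j*w_j^2 (featurewise weights):
    soft-thresholding followed by componentwise shrinkage."""
    soft = np.sign(v) * np.maximum(np.abs(v) - gamma * lam1, 0.0)
    return soft / (1.0 + gamma * lam2)

def fb_map(w, X, y, gamma, lam1, lam2):
    """Forward-backward map whose fixed points solve the lower-level problem:
    w = prox_{gamma R}(w - gamma * grad L(w)), with L(w) = 0.5*||Xw - y||^2."""
    grad = X.T @ (X @ w - y)
    return prox_weighted_elastic_net(w - gamma * grad, gamma, lam1, lam2)

# Solving the lower level for fixed hyperparameters by fixed-point iteration:
rng = np.random.default_rng(0)
X, y = rng.standard_normal((30, 8)), rng.standard_normal(30)
lam1, lam2 = 0.1 * np.ones(8), 0.05 * np.ones(8)
gamma = 1.0 / np.linalg.norm(X, 2) ** 2
w = np.zeros(8)
for _ in range(300):
    w = fb_map(w, X, y, gamma, lam1, lam2)
```

Expressing the lower-level solution as a fixed point of such an FB map is what allows the upper-level (hyperparameter) problem to be analyzed and differentiated in a bilevel framework.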

Lipschitz Stability for a Class of Parametric Optimization Problems with Polyhedral Feasible Set Mapping

This paper is devoted to the Lipschitz analysis of the solution sets and optimal values for a class of parametric optimization problems involving a polyhedral feasible set mapping and a quadratic objective function with a parametric linear part. Recall that a multifunction is said to be polyhedral if its graph is the union of finitely many polyhedral … Read more
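A schematic form of this problem class, written with illustrative notation (\( Q \), \( c(p) \), \( A \), \( b \)) that is not taken from the paper:

\[
\min_{x}\ \tfrac12\, x^{\top} Q\, x + c(p)^{\top} x
\quad\text{s.t.}\quad x \in \mathcal{F}(p),
\qquad \operatorname{gph}\mathcal{F} = \{(p,x) : x \in \mathcal{F}(p)\}\ \text{a finite union of convex polyhedra},
\]

with, for example, \( \mathcal{F}(p) = \{x : Ax \le b + p\} \) as a standard polyhedral feasible set mapping.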

Efficient QUIC-Based Damped Inexact Iterative Reweighting for Sparse Inverse Covariance Estimation with Nonconvex Partly Smooth Regularization

In this paper, we study sparse inverse covariance matrix estimation incorporating partly smooth nonconvex regularizers. To solve the resulting regularized log-determinant problem, we develop DIIR-QUIC, a novel Damped Inexact Iteratively Reweighted algorithm based on the QUadratic approximate Inverse Covariance (QUIC) method. Our approach generalizes the classic iteratively reweighted \(\ell_1\) scheme through damped fixed-point updates. A key novelty … Read more
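As a hedged illustration of a damped iteratively reweighted \(\ell_1\) scheme for the log-determinant problem (the precise subproblem, inexactness criterion, and damping rule of DIIR-QUIC are in the paper): with sample covariance \( S \), nonconvex penalty \( \rho \), and damping parameter \( \tau \in (0,1] \),

\[
W^{k}_{ij} = \rho'\bigl(|\Theta^{k}_{ij}|\bigr),\qquad
\widehat{\Theta}^{k} \approx \arg\min_{\Theta \succ 0}\ \Bigl\{ -\log\det\Theta + \langle S,\Theta\rangle + \sum_{i \neq j} W^{k}_{ij}\,|\Theta_{ij}| \Bigr\},\qquad
\Theta^{k+1} = (1-\tau)\,\Theta^{k} + \tau\,\widehat{\Theta}^{k},
\]

where the weighted \(\ell_1\)-regularized subproblem is the one a QUIC-type second-order solver handles, possibly inexactly.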

Dual certificates of primal cone membership

We discuss easily verifiable cone membership certificates, that is, certificates, consisting of vectors in the dual cone \(K^*\), that prove relations of the form \( b\in K \) for a convex cone \(K\). Vectors in the dual cone are usually associated with separating hyperplanes, and so they are interpreted as certificates of non-membership in the standard … Read more
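For context, the standard separating-hyperplane reading of dual vectors that the abstract contrasts with:

\[
K^{*} = \bigl\{\, y \;:\; \langle y, z\rangle \ge 0 \ \ \text{for all } z \in K \,\bigr\},
\qquad
y \in K^{*},\ \langle y, b\rangle < 0 \;\Longrightarrow\; b \notin K .
\]

The paper's point is that suitably chosen dual vectors can instead serve as easily verifiable certificates of membership \( b \in K \); the construction of such certificates is the subject of the paper.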