Sub-sampled Trust-Region Methods with Deterministic Worst-Case Complexity Guarantees

In this paper, we develop and analyze sub-sampled trust-region methods for solving finite-sum optimization problems. These methods employ subsampling strategies to approximate the gradient and Hessian of the objective function, significantly reducing the overall computational cost. We propose a novel adaptive procedure for deterministically adjusting the sample size used for gradient (or gradient and Hessian) … Read more
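The core idea of sub-sampling in a finite-sum setting can be illustrated with a minimal sketch: the full gradient \((1/n)\sum_i \nabla f_i(x)\) is replaced by an average over a random subsample. This is only an illustration of the estimator being approximated; the paper's adaptive rule for choosing the sample size (and the trust-region machinery around it) is not reproduced here.

```python
import numpy as np

def subsampled_gradient(grad_i, n, x, sample_size, rng):
    """Estimate the full finite-sum gradient (1/n) * sum_i grad_i(i, x)
    from a uniform random subsample of the component indices.
    Illustrative sketch only; not the paper's adaptive sampling rule."""
    idx = rng.choice(n, size=sample_size, replace=False)
    return np.mean([grad_i(i, x) for i in idx], axis=0)

# Toy finite sum: f(x) = (1/n) * sum_i (x - a_i)^2, so grad_i(i, x) = 2*(x - a_i).
rng = np.random.default_rng(0)
a = rng.standard_normal(1000)
g = subsampled_gradient(lambda i, x: 2 * (x - a[i]), 1000, 0.5, 200, rng)
```

With a sample of 200 out of 1000 components, the estimate concentrates around the true gradient \(2(0.5 - \bar a)\), which is the accuracy/cost trade-off such methods exploit.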

Recursive Bound-Constrained AdaGrad with Applications to Multilevel and Domain Decomposition Minimization

Two OFFO (Objective-Function Free Optimization) noise-tolerant algorithms are presented that handle bound constraints, allow inexact gradients, and use second-order information when available. The first is a multilevel method exploiting a hierarchical description of the problem, and the second is a domain-decomposition method covering the standard additive Schwarz decompositions. Both are generalizations of the first-order AdaGrad … Read more
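The first-order AdaGrad core that both methods generalize, combined with bound constraints, can be sketched as follows. This is a minimal projected-AdaGrad sketch (projection onto the box is done by clipping); the multilevel and domain-decomposition machinery of the paper is not shown.

```python
import numpy as np

def projected_adagrad(grad, x0, lo, hi, steps=500, eta=1.0, eps=1e-8):
    """First-order AdaGrad with bound constraints enforced by projection
    onto the box [lo, hi]. Minimal sketch; the paper's multilevel and
    domain-decomposition variants are not reproduced here."""
    x = np.array(x0, dtype=float)
    acc = np.zeros_like(x)                    # running sum of squared gradients
    for _ in range(steps):
        g = grad(x)
        acc += g * g
        x -= eta * g / (np.sqrt(acc) + eps)   # coordinate-wise adaptive step
        x = np.clip(x, lo, hi)                # projection onto the bounds
    return x

# Minimize f(x) = ||x - 2||^2 subject to 0 <= x <= 1: the optimum sits at x = 1.
x = projected_adagrad(lambda x: 2 * (x - 2.0), np.zeros(3), 0.0, 1.0)
```

Note that AdaGrad's step size is built from past gradients only, which is what makes the scheme "objective-function free": no function values are ever evaluated.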

A first-order method for nonconvex-nonconcave minimax problems under a local Kurdyka-Łojasiewicz condition

We study a class of nonconvex–nonconcave minimax problems in which the inner maximization problem satisfies a local Kurdyka–Łojasiewicz (KL) condition that may vary with the outer minimization variable. In contrast to the global KL or Polyak–Łojasiewicz (PL) conditions commonly assumed in the literature—which are significantly stronger and often too restrictive in practice—this local KL condition … Read more

General Perturbation Resilient Dynamic String-Averaging for Inconsistent Problems with Superiorization

In this paper we introduce a General Dynamic String-Averaging (GDSA) iterative scheme and investigate its convergence properties in the inconsistent case, that is, when the input operators do not have a common fixed point. The Dynamic String-Averaging Projection (DSAP) algorithm itself was introduced in a 2013 paper, where its strong convergence and bounded perturbation resilience were … Read more
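The string-averaging pattern underlying DSAP can be sketched in a few lines: each "string" applies a sequence of projections, and the string endpoints are then averaged with convex weights. The sketch below uses fixed strings and weights; the GDSA scheme of the paper varies both dynamically and additionally tolerates perturbations.

```python
import numpy as np

def string_averaging_step(x, strings, projections, weights):
    """One sweep of a (static) string-averaging projection step:
    run each string's projections sequentially from x, then take a
    convex combination of the string endpoints. Simplified sketch;
    the paper's GDSA scheme is dynamic and perturbation-resilient."""
    ends = []
    for s in strings:
        y = x
        for i in s:                    # apply the string's projections in order
            y = projections[i](y)
        ends.append(y)
    return sum(w * e for w, e in zip(weights, ends))

# Two convex sets in R^2: the x-axis and the diagonal line {x = y}.
P = [lambda v: np.array([v[0], 0.0]),                          # onto x-axis
     lambda v: np.array([(v[0] + v[1]) / 2, (v[0] + v[1]) / 2])]  # onto {x = y}
x = np.array([4.0, 2.0])
for _ in range(50):
    x = string_averaging_step(x, strings=[[0], [1]], projections=P, weights=[0.5, 0.5])
```

Here the two sets intersect only at the origin, and the averaged iterates converge to that common fixed point; the inconsistent case studied in the paper is precisely when no such common point exists.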

Complexity of normalized stochastic first-order methods with momentum under heavy-tailed noise

In this paper, we propose practical normalized stochastic first-order methods with Polyak momentum, multi-extrapolated momentum, and recursive momentum for solving unconstrained optimization problems. These methods employ dynamically updated algorithmic parameters and do not require explicit knowledge of problem-dependent quantities such as the Lipschitz constant or noise bound. We establish first-order oracle complexity results for finding … Read more
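The normalization-plus-momentum idea can be illustrated with a minimal sketch of normalized gradient descent with Polyak (heavy-ball style) momentum: the update direction is the momentum buffer divided by its norm, so the step length is controlled regardless of gradient magnitude, which is the mechanism that helps under heavy-tailed noise. Fixed, hand-picked parameters are used here; the paper's dynamically updated, parameter-free rules are not reproduced.

```python
import numpy as np

def normalized_gd_momentum(grad, x0, steps=400, eta=0.1, beta=0.9):
    """Normalized first-order method with Polyak momentum: the step
    uses the normalized direction m/||m|| with a diminishing step size,
    so gradient magnitude never inflates the step. Illustrative sketch
    with fixed parameters, not the paper's adaptive update rules."""
    x = np.array(x0, dtype=float)
    m = np.zeros_like(x)
    for k in range(steps):
        g = grad(x)
        m = beta * m + (1.0 - beta) * g              # momentum averaging
        x -= (eta / np.sqrt(k + 1)) * m / (np.linalg.norm(m) + 1e-12)
    return x

# Minimize f(x) = ||x||^2 from a point away from the origin.
x = normalized_gd_momentum(lambda x: 2 * x, np.array([2.0, -1.0]))
```

Because only the direction of the (momentum-averaged) gradient is used, a single heavy-tailed sample cannot produce an arbitrarily large step, which is the intuition behind the complexity guarantees for this class of methods.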

First-order methods for stochastic and finite-sum convex optimization with deterministic constraints

In this paper, we study a class of stochastic and finite-sum convex optimization problems with deterministic constraints. Existing methods typically aim to find an \(\epsilon\)-expectedly feasible stochastic optimal solution, in which the expected constraint violation and expected optimality gap are both within a prescribed tolerance \(\epsilon\). However, in many practical applications, constraints must be nearly … Read more

Lipschitz Stability for a Class of Parametric Optimization Problems with Polyhedral Feasible Set Mapping

This paper is devoted to the Lipschitz analysis of the solution sets and optimal values for a class of parametric optimization problems involving a polyhedral feasible set mapping and a quadratic objective function with parametric linear part. Recall that a multifunction is said to be polyhedral if its graph is the union of finitely many polyhedral … Read more

Novel closed-loop controllers for fractional linear quadratic tracking systems

A new method for finding closed-loop optimal controllers of fractional linear quadratic tracking problems is introduced. The optimality conditions for the fractional optimal control problem are derived, and illustrative examples are presented to show the applicability and capabilities of the method. … Read more

Efficient QUIC-Based Damped Inexact Iterative Reweighting for Sparse Inverse Covariance Estimation with Nonconvex Partly Smooth Regularization

In this paper, we study sparse inverse covariance matrix estimation incorporating partly smooth nonconvex regularizers. To solve the resulting regularized log-determinant problem, we develop DIIR-QUIC—a novel Damped Inexact Iteratively Reweighted algorithm based on the QUadratic approximation of Inverse Covariance (QUIC) method. Our approach generalizes the classic iteratively reweighted \(\ell_1\) scheme through damped fixed-point updates. A key novelty … Read more
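The iteratively reweighted \(\ell_1\) principle being generalized can be shown on a toy problem. Below, a separable denoising objective with a nonconvex log-penalty is solved by repeatedly (i) building \(\ell_1\) weights from the current iterate and (ii) solving the resulting weighted \(\ell_1\) subproblem in closed form by soft-thresholding. This is only a sketch of the reweighting principle; DIIR-QUIC's damped updates, inexact inner solves, and log-determinant model are not shown.

```python
import numpy as np

def soft_threshold(y, t):
    """Closed-form prox of the weighted l1 norm for a quadratic data term."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def reweighted_l1_denoise(y, lam=0.5, a=0.1, iters=20):
    """Iteratively reweighted l1 for the nonconvex log-penalty problem
    min_x 0.5*||x - y||^2 + lam * sum_i log(1 + |x_i|/a).
    Each outer iteration majorizes the penalty by a weighted l1 term with
    weights lam/(a + |x_k|) and solves the subproblem by soft-thresholding.
    Toy sketch of the reweighting principle only, not DIIR-QUIC itself."""
    x = np.array(y, dtype=float)
    for _ in range(iters):
        w = lam / (a + np.abs(x))      # l1 weights from the current iterate
        x = soft_threshold(y, w)       # exact weighted-l1 subproblem solve
    return x

x = reweighted_l1_denoise(np.array([3.0, 0.05, -2.0]))
```

Large entries receive small weights and are barely shrunk, while small entries receive large weights and are driven exactly to zero, which is the bias-reduction property that motivates nonconvex regularizers over plain \(\ell_1\).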