Decomposition-Based Reformulation of Nonseparable Quadratic Expressions in Convex MINLP

In this paper, we present a reformulation technique for convex mixed-integer nonlinear programming (MINLP) problems with nonseparable quadratic terms. For each non-diagonal positive semidefinite matrix that defines a quadratic expression in the problem, we show that an eigenvalue or $LDL^T$ decomposition can be performed to transform the quadratic expressions into convex additively separable constraints. The reformulated problem …
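The core separability idea can be sketched numerically. The snippet below is a minimal illustration (not the paper's reformulation pipeline; the matrix and variable names are invented here): an eigendecomposition $Q = V \Lambda V^T$ rewrites a nonseparable quadratic $x^T Q x$ as the additively separable sum $\sum_i \lambda_i y_i^2$ with auxiliary variables $y = V^T x$.

```python
import numpy as np

# Sketch of the separability idea: a convex quadratic x^T Q x with a
# non-diagonal PSD matrix Q becomes a sum of independent squares via
# Q = V diag(lam) V^T, giving x^T Q x = sum_i lam_i * y_i^2, y = V^T x.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Q = A @ A.T                      # a dense (non-diagonal) PSD matrix
x = rng.standard_normal(4)

lam, V = np.linalg.eigh(Q)       # eigendecomposition of the symmetric Q
y = V.T @ x                      # auxiliary variables y_i = v_i^T x
separable = np.sum(lam * y**2)   # additively separable form

assert np.isclose(x @ Q @ x, separable)
```

In a MINLP model, the linear coupling constraints $y = V^T x$ would be added explicitly, after which each univariate term $\lambda_i y_i^2$ can be treated on its own.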

An Inexact Trust-Region Method for Structured Nonsmooth Optimization with Application to Risk-Averse Stochastic Programming

We develop a trust-region method for efficiently minimizing the sum of a smooth function, a nonsmooth convex function, and the composition of a finite-valued support function with a smooth function. Optimization problems with this structure arise in numerous applications including risk-averse stochastic programming and subproblems for nonsmooth penalty nonlinear programming methods. Our method permits the …

A Gradient Sampling Algorithm for Noisy Nonsmooth Optimization

An algorithm is proposed, analyzed, and tested for minimizing locally Lipschitz objective functions that may be nonconvex and/or nonsmooth. The algorithm, which is built upon the gradient-sampling methodology, is designed specifically for cases when objective function and generalized gradient values might be subject to bounded uncontrollable errors. Similarly to state-of-the-art guarantees for noisy smooth optimization …
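As a rough illustration of the gradient-sampling methodology the abstract builds on (a generic textbook-style sketch, not the proposed noise-tolerant algorithm; the test function, sampling radius, and stepsize are all invented here), one step samples gradients at nearby points and takes the least-norm element of their convex hull as a stabilized descent direction.

```python
import numpy as np

def f(x):                        # nonsmooth test function f(x) = |x0| + x1^2
    return abs(x[0]) + x[1]**2

def grad(x):                     # a generalized gradient (subgradient) of f
    return np.array([np.sign(x[0]) if x[0] != 0 else 0.0, 2*x[1]])

def min_norm_in_hull(G, iters=500):
    # projected gradient on the simplex for min_w ||G w||^2, w in simplex
    m = G.shape[1]
    w = np.full(m, 1.0/m)
    L = 2*np.linalg.norm(G.T @ G, 2)        # Lipschitz constant of the gradient
    for _ in range(iters):
        w = w - (2.0/L) * (G.T @ (G @ w))
        # Euclidean projection onto the probability simplex
        u = np.sort(w)[::-1]
        css = np.cumsum(u) - 1.0
        rho = np.nonzero(u - css/np.arange(1, m + 1) > 0)[0][-1]
        w = np.maximum(w - css[rho]/(rho + 1.0), 0.0)
    return G @ w

rng = np.random.default_rng(1)
x = np.array([1e-6, 1.0])                        # near the kink of |x0|
samples = x + 1e-3*rng.standard_normal((5, 2))   # sampling radius 1e-3
G = np.column_stack([grad(s) for s in samples] + [grad(x)])
g = min_norm_in_hull(G)                          # stabilized descent direction
x_new = x - 0.1*g
assert f(x_new) < f(x)
```

Sampling gradients around the iterate is what lets the method cope with kinks: a single (sub)gradient at a point of nondifferentiability can be a poor direction, while the least-norm hull element aggregates local first-order information.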

Bregman Regularized Proximal Point Methods for Computing Projected Solutions of Quasi-equilibrium Problems

In this paper, we propose two Bregman regularized proximal point methods that provide flexibility to compute projected solutions of quasi-equilibrium problems. Each iteration of each method involves one Bregman projection onto the feasible set and the solution of a regularized equilibrium problem. Under standard assumptions, we prove that the methods are well-defined and that the sequences they generate converge to a …
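To make the notion of a Bregman projection concrete, here is a standard example (assumed for illustration, not taken from the paper): under the negative-entropy kernel $h(x) = \sum_i x_i \log x_i$, the Bregman (Kullback-Leibler) projection of a positive point $y$ onto the probability simplex has the closed form $y / \sum_i y_i$, in contrast to the Euclidean projection, which requires a thresholding procedure.

```python
import numpy as np

def kl(x, y):
    # Bregman divergence of the negative-entropy kernel (KL divergence)
    return np.sum(x*np.log(x/y) - x + y)

y = np.array([0.2, 1.5, 0.8])
p = y / y.sum()                  # KL projection of y onto the simplex

# sanity check: p beats random feasible points in Bregman distance to y
rng = np.random.default_rng(2)
for _ in range(100):
    q = rng.random(3) + 1e-6
    q /= q.sum()
    assert kl(p, y) <= kl(q, y) + 1e-12
```

This is why Bregman (rather than Euclidean) projections are attractive algorithmically: a well-chosen kernel can make the per-iteration projection essentially free.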

Normal cones and subdifferentials at infinity for convex analysis and optimization

Motivated by recent developments, this paper further investigates normal cones and subdifferentials at infinity within the framework of convex analysis. We establish fundamental properties of these constructions and derive basic calculus rules. The obtained results extend and refine existing concepts in variational analysis and nonsmooth optimization, providing new insights into the asymptotic structure of functions …

Preconditioned Proximal Gradient Methods with Conjugate Momentum: A Subspace Perspective

In this paper, we propose a descent method for composite optimization problems with linear operators. Specifically, we first design a structure-exploiting preconditioner tailored to the linear operator so that the resulting preconditioned proximal subproblem admits a closed-form solution through its dual formulation. However, such a structure-driven preconditioner may be poorly aligned with the local curvature …

Strong convergence, perturbation resilience and superiorization of Generalized Modular String-Averaging with infinitely many input operators

We study the strong convergence and bounded perturbation resilience of iterative algorithms based on the Generalized Modular String-Averaging (GMSA) procedure for infinite sequences of input operators under a general admissible control. These methods address a variety of feasibility-seeking problems in real Hilbert spaces, including the common fixed point problem and the convex feasibility problem. In …

On the Complexity of Subgradient Methods for Trilevel Hierarchical Generalized Variational Inequalities

We study generalized variational inequalities with a three-level hierarchical structure. This setting extends nested GVI models beyond the bilevel case, for which $\mathcal{O}(\delta^{-4})$ complexity bounds are known for any prescribed positive tolerance $\delta$, to a fully three-level hierarchical structure. We analyze a projected averaged subgradient method combined with a Tikhonov-like regularization scheme. Under compactness, maximal …
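The ingredients named in the abstract, projected subgradient steps and Tikhonov-like regularization, can be sketched in a simple single-level setting (an assumed illustration, not the trilevel scheme; the objective, stepsizes, and regularization schedule are invented here): at iteration $k$ the subgradient is augmented by $\varepsilon_k x$ with $\varepsilon_k \to 0$, which biases the iterates toward the least-norm point of the solution set.

```python
import numpy as np

# f(x) = sum_i (|x_i - 1| + |x_i + 1|): minimized on the box [-1, 1]^n
f = lambda x: np.abs(x - 1.0).sum() + np.abs(x + 1.0).sum()
subgrad = lambda x: np.sign(x - 1.0) + np.sign(x + 1.0)

proj = lambda x: np.clip(x, -5.0, 5.0)     # projection onto a compact box

x = np.array([4.0, -3.0])
for k in range(1, 1001):
    eps = 1.0/np.sqrt(k)                   # Tikhonov parameter, driven to 0
    step = 1.0/k                           # diminishing stepsize
    x = proj(x - step*(subgrad(x) + eps*x))

# the iterates settle in the solution set [-1, 1]^2
assert np.all(np.abs(x) <= 1.0 + 1e-2)
```

The vanishing regularization is what singles out a particular solution (here, one close to the origin) among the continuum of minimizers, the same role it plays in hierarchical schemes.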

A Modified Projected Gradient Algorithm for Solving Quasiconvex Programming with Applications

In this manuscript, we introduce a novel projected gradient algorithm for solving quasiconvex optimization problems over closed convex sets. The key innovation of our algorithm is an adaptive, parameter-free stepsize rule that requires no line search and avoids estimating constants such as the Lipschitz modulus. Unlike the recent self-adaptive approach given in [17], which typically produces …
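For intuition only, the snippet below shows a projected gradient iteration whose stepsize adapts from observed iterate and gradient differences with no line search. This is a generic Barzilai-Borwein-style rule assumed for illustration, not the manuscript's stepsize; the quasiconvex test function and box are invented here.

```python
import numpy as np

def f(x):
    # quasiconvex but nonconvex: log of a convex function
    return np.log1p(np.sum((x - 3.0)**2))

def grad(x):
    d = x - 3.0
    return 2.0*d / (1.0 + np.sum(d*d))

proj = lambda x: np.clip(x, -1.0, 2.0)   # projection onto the box [-1, 2]^n

x = np.zeros(2)
g = grad(x)
step = 1.0
for _ in range(100):
    x_new = proj(x - step*g)
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    if np.dot(s, y) > 1e-12:             # BB1 stepsize, used when well defined
        step = np.dot(s, s)/np.dot(s, y)
    x, g = x_new, g_new

# constrained minimizer of f over the box is the corner (2, 2)
assert np.allclose(x, [2.0, 2.0], atol=1e-6)
```

The attraction of such rules is exactly what the abstract highlights: the stepsize is computed from quantities the iteration already produces, so no global constant needs to be known or estimated in advance.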

Copositive and completely positive cones over symmetric cones of rank at least 5

We focus on copositive and completely positive cones over symmetric cones of rank at least $5$, and in particular investigate whether these cones are spectrahedral shadows. We extend known results for nonnegative orthants of dimension at least $5$ to general symmetric cones of rank at least $5$. Specifically, we prove that when the rank of …