Decision-focused predictions via pessimistic bilevel optimization: complexity and algorithms

Dealing with uncertainty in optimization parameters is an important and longstanding challenge. Typically, uncertain parameters are predicted as accurately as possible, and a deterministic optimization problem is then solved. However, the decisions produced by this so-called predict-then-optimize procedure can be highly sensitive to the uncertain parameters. In this work, we contribute to recent efforts in producing decision-focused predictions, i.e., to …
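
As a minimal sketch of the predict-then-optimize pipeline being criticized (a generic Python illustration; the data, model, and toy LP below are invented for the example, and the paper's bilevel formulation is not reproduced):

```python
# Minimal predict-then-optimize sketch: fit a predictor for the cost
# vector, then solve the deterministic LP at the point prediction.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Stage 1: predict uncertain costs c from features z via least squares.
Z = rng.normal(size=(100, 5))                      # historical features
W_true = rng.normal(size=(5, 3))                   # unknown true mapping
C = Z @ W_true + 0.1 * rng.normal(size=(100, 3))   # observed costs
W_hat, *_ = np.linalg.lstsq(Z, C, rcond=None)

z_new = rng.normal(size=5)
c_hat = z_new @ W_hat                              # point prediction

# Stage 2: solve min c_hat^T x over the simplex.
res = linprog(c_hat, A_eq=np.ones((1, 3)), b_eq=[1.0], bounds=[(0, 1)] * 3)
print("decision:", res.x)

# The decision can flip entirely under a small perturbation of c_hat,
# which is the sensitivity that decision-focused predictions address.
res_pert = linprog(c_hat + 0.05 * rng.normal(size=3),
                   A_eq=np.ones((1, 3)), b_eq=[1.0], bounds=[(0, 1)] * 3)
print("perturbed decision:", res_pert.x)
```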

Unboundedness in Bilevel Optimization

Bilevel optimization has garnered growing interest over the past decade. However, little attention has been paid to detecting and dealing with unboundedness in these problems, with most research assuming a bounded high-point relaxation. In this paper, we address unboundedness in bilevel optimization by studying its computational complexity and developing algorithmic approaches to detect it. We …
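
For reference, the high-point relaxation (HPR) is obtained from a bilevel program by keeping both levels' constraints while dropping the lower-level optimality requirement (notation below is generic, not taken from the paper):

\[
\text{(BP)}\quad \min_{x,\,y}\ F(x,y)\ \ \text{s.t.}\ \ G(x,y)\le 0,\ \ y\in\arg\min_{y'}\{\,f(x,y') : g(x,y')\le 0\,\},
\]
\[
\text{(HPR)}\quad \min_{x,\,y}\ F(x,y)\ \ \text{s.t.}\ \ G(x,y)\le 0,\ \ g(x,y)\le 0.
\]

Every bilevel-feasible point is HPR-feasible, so a bounded HPR guarantees a bounded bilevel problem; the assumption questioned here is precisely this boundedness.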

An analytical lower bound for a class of minimizing quadratic integer optimization problems

Lower bounds on minimization problems are essential for convergence of both branching-based and iterative solution methods for optimization problems. They are also required for evaluating the quality of feasible solutions by providing conservative optimality gaps. We provide an analytical lower bound for a class of quadratic optimization problems with binary decision variables. In contrast to …
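
The paper's bound itself sits behind the break; as a simple reminder of what an analytical (closed-form) lower bound looks like in this setting, a classical eigenvalue argument gives, for symmetric \(Q \in \mathbb{R}^{n\times n}\),

\[
\min_{x\in\{0,1\}^n} x^\top Q x \;\ge\; n\cdot\min\{\lambda_{\min}(Q),\,0\},
\]

since \(x^\top Q x \ge \lambda_{\min}(Q)\,\|x\|_2^2\) and \(\|x\|_2^2 \le n\) for binary \(x\). This classical bound is not the one derived in the paper; it merely illustrates the genre.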

Exploiting Negative Curvature in Conjunction with Adaptive Sampling: Theoretical Results and a Practical Algorithm

In this paper, we propose algorithms that exploit negative curvature for solving noisy nonlinear nonconvex unconstrained optimization problems. We consider both deterministic and stochastic inexact settings, and develop two-step algorithms that combine directions of negative curvature and descent directions to update the iterates. Under reasonable assumptions, we prove second-order convergence results and derive complexity guarantees …
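
A generic two-step template of this kind, pairing a gradient step with a negative-curvature step, can be sketched as follows (illustrative only; the paper's sampling strategies, step sizes, and convergence machinery are not reproduced):

```python
# Two-step iteration: a descent step along the negative gradient,
# followed by a step along a direction of negative curvature.
import numpy as np

def two_step_update(x, grad, hess, alpha=0.1, beta=0.1):
    # Descent step.
    x = x - alpha * grad(x)
    # Negative-curvature step: follow an eigenvector of the Hessian
    # associated with its most negative eigenvalue, if one exists.
    lam, V = np.linalg.eigh(hess(x))
    if lam[0] < 0:
        d = V[:, 0]
        d = -d if grad(x) @ d > 0 else d   # orient d as a non-ascent direction
        x = x + beta * abs(lam[0]) * d
    return x

# Toy nonconvex problem: f(x) = x0^2 - x1^2 has negative curvature in x1.
grad = lambda x: np.array([2 * x[0], -2 * x[1]])
hess = lambda x: np.diag([2.0, -2.0])
x = np.array([1.0, 1e-3])
for _ in range(50):
    x = two_step_update(x, grad, hess)
print(x)  # escapes the saddle at the origin along the x1 axis
```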

Some new accelerated and stochastic gradient descent algorithms based on locally Lipschitz gradient constants

In this paper, we revisit a recently proposed stepsize for the gradient descent scheme, called NGD, introduced by [Hoai et al., A novel stepsize for gradient descent method, Operations Research Letters (2024) 53, doi: 10.1016/j.orl.2024.107072]. We first investigate the NGD stepsize in combination with two well-known acceleration techniques, the heavy-ball and Nesterov’s methods. In …
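
As a rough illustration of pairing a momentum method with a locally estimated Lipschitz constant of the gradient (the exact NGD stepsize is not reproduced here; `local_L` below is a common difference-quotient estimate, and all names are invented for the sketch):

```python
# Heavy-ball iteration whose stepsize 1/L is driven by a running local
# estimate of the Lipschitz constant of the gradient.
import numpy as np

def local_L(g_new, g_old, x_new, x_old, eps=1e-12):
    """Difference-quotient estimate of a local Lipschitz constant."""
    return np.linalg.norm(g_new - g_old) / (np.linalg.norm(x_new - x_old) + eps)

def heavy_ball(grad, x0, iters=200, beta=0.9):
    x_old = x0
    x = x0 - 1e-3 * grad(x0)                 # small bootstrap step
    L = local_L(grad(x), grad(x_old), x, x_old)
    for _ in range(iters):
        g = grad(x)
        L = max(L, local_L(g, grad(x_old), x, x_old))   # monotone estimate
        x_old, x = x, x - (1.0 / L) * g + beta * (x - x_old)
    return x

A = np.diag([2.0, 20.0])                     # ill-conditioned quadratic
print(heavy_ball(lambda z: A @ z, np.array([1.0, 1.0])))   # -> near the origin
```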

Gradient Methods with Online Scaling

We introduce a framework to accelerate the convergence of gradient-based methods with online learning. The framework learns to scale the gradient at each iteration through an online learning algorithm and provably accelerates gradient-based methods asymptotically. In contrast with previous literature, where convergence is established based on worst-case analysis, our framework provides a strong convergence guarantee …
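
For contrast, a classical example of scaling the gradient from observed information is AdaGrad's diagonal scaling, sketched below; the paper's framework instead learns the scaling through an online learning algorithm with provable acceleration, which this sketch does not attempt to reproduce:

```python
# AdaGrad-style per-coordinate scaling of the gradient.
import numpy as np

def adagrad(grad, x0, iters=500, eta=0.5, eps=1e-8):
    x = x0.copy()
    accum = np.zeros_like(x)            # running sum of squared gradients
    for _ in range(iters):
        g = grad(x)
        accum += g * g
        x = x - eta * g / (np.sqrt(accum) + eps)   # scaled gradient step
    return x

A = np.diag([1.0, 100.0])               # ill-conditioned quadratic
print(adagrad(lambda z: A @ z, np.array([1.0, 1.0])))   # -> near the origin
```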

A Trust-Region Algorithm for Noisy Equality Constrained Optimization

This paper introduces a modified Byrd-Omojokun (BO) trust-region algorithm to address the challenges posed by noisy function and gradient evaluations. The original BO method was designed to solve equality-constrained problems, and it forms the backbone of some interior-point methods for general large-scale constrained optimization. A key strength of the BO method is …
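
For reference, the step decomposition at the heart of the BO approach, in its standard form for \(\min f(x)\) s.t. \(c(x)=0\) (the paper's noise-aware modifications are not shown): with \(g_k = \nabla f(x_k)\), \(A_k = \nabla c(x_k)^\top\), \(H_k\) an approximation of the Hessian of the Lagrangian, and trust-region radius \(\Delta_k\), the trial step is \(d_k = v_k + w_k\), where

\[
v_k \in \arg\min_{v}\ \|c(x_k) + A_k v\|_2 \quad \text{s.t.}\ \ \|v\|_2 \le \zeta\Delta_k,\quad \zeta\in(0,1),
\]
\[
w_k \in \arg\min_{w}\ (g_k + H_k v_k)^\top w + \tfrac12\, w^\top H_k w \quad \text{s.t.}\ \ A_k w = 0,\ \ \|v_k + w\|_2 \le \Delta_k.
\]

The normal step \(v_k\) reduces infeasibility, while the tangential step \(w_k\) reduces the objective model without degrading the linearized constraints.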

New results related to cutters and to an extrapolated block-iterative method for finding a common fixed point of a collection of them

Given a Hilbert space and a finite family of operators defined on the space, the common fixed point problem (CFPP) is the problem of finding a point in the intersection of the fixed point sets of these operators. A particular case of the problem, when the operators are orthogonal projections, is the convex feasibility problem …
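
The special case mentioned here is easy to sketch: cyclic orthogonal projections onto halfspaces solve a convex feasibility problem (illustrative Python; the paper's extrapolated block-iterative method for general cutters is more sophisticated):

```python
# Cyclic projections onto halfspaces {x : a_i^T x <= b_i}.
import numpy as np

def project_halfspace(x, a, b):
    viol = a @ x - b
    return x - (viol / (a @ a)) * a if viol > 0 else x

def cyclic_projections(x, A, b, sweeps=100):
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            x = project_halfspace(x, a_i, b_i)
    return x

A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])   # x+y<=1, x>=0, y>=0
b = np.array([1.0, 0.0, 0.0])
x = cyclic_projections(np.array([3.0, 2.0]), A, b)
print(x, A @ x <= b + 1e-9)   # a point in the intersection
```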

A tutorial on properties of the epigraph reformulation

This paper systematically surveys useful properties of the epigraph reformulation for optimization problems and complements them with some new results. We focus on the complete compatibility of the original formulation and the epigraph reformulation with respect to solvability and unsolvability, the compatibility with respect to some, but not all, basic constraint qualifications, the formulation of …
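
For concreteness, the reformulation in question replaces the objective by a bound variable \(\alpha\):

\[
\min_{x\in M} f(x)
\qquad\Longleftrightarrow\qquad
\min_{(x,\alpha)\in M\times\mathbb{R}}\ \alpha \quad \text{s.t.}\ \ f(x)\le\alpha,
\]

with matching optimal values; \(x^\ast\) solves the left problem if and only if \((x^\ast, f(x^\ast))\) solves the right one, which is the compatibility with respect to solvability alluded to above.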

Parameter-free proximal bundle methods with adaptive stepsizes for hybrid convex composite optimization problems

This paper develops a parameter-free adaptive proximal bundle method with two important features: 1) an adaptive choice of variable prox stepsizes that “closely fits” the instance under consideration; and 2) an adaptive criterion that makes serious steps occur more easily. Computational experiments show that our method performs substantially fewer consecutive null steps (i.e., has shorter cycles) …
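
For orientation, a generic proximal bundle iteration around stability center \(\hat{x}_k\), with bundle \(B_k\) and prox stepsize \(\lambda_k\) (the quantity chosen adaptively here), reads

\[
m_k(x) = \max_{j\in B_k}\ \{\, f(x_j) + \langle s_j,\, x - x_j\rangle \,\},\qquad s_j \in \partial f(x_j),
\]
\[
x_{k+1} = \arg\min_x\ \Big\{\, m_k(x) + \tfrac{1}{2\lambda_k}\,\|x - \hat{x}_k\|^2 \Big\},
\]

and the step is declared serious (the center moves, \(\hat{x}_{k+1} = x_{k+1}\)) when \(f(\hat{x}_k) - f(x_{k+1}) \ge \beta\,[\,f(\hat{x}_k) - m_k(x_{k+1})\,]\) for a fixed \(\beta\in(0,1)\); otherwise a null step adds the new cut to the bundle. The adaptive serious-step criterion mentioned in the abstract relaxes this test; the rule above is the classical one.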