Second-order directional derivative of the optimal solution function in a parametric programming problem

In this paper, the second-order directional derivatives of the optimal value function and the optimal solution function are obtained for a strongly stable parametric problem with non-unique Lagrange multipliers. Some properties of the Lagrange multipliers are proved. It is shown that the second-order directional derivative of the optimal solution function for the parametric problem can …
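For context, one standard definition of the second-order directional derivative of the optimal value function v at a parameter t_0 in a direction d reads as follows (one common convention; the paper's precise setting may differ):

```latex
v''(t_0; d) \;=\; \lim_{t \downarrow 0}
  \frac{2\bigl( v(t_0 + t\,d) - v(t_0) - t\, v'(t_0; d) \bigr)}{t^{2}},
```

where v'(t_0; d) denotes the first-order directional derivative of v at t_0 along d.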

A Primal Approach to Facial Reduction for SDP Relaxations of Combinatorial Optimization Problems

We propose a novel facial reduction algorithm tailored to semidefinite programming relaxations of combinatorial optimization problems with quadratic objective functions. Our method leverages the specific structure of these relaxations, particularly the availability of feasible solutions that can often be generated efficiently in practice. By incorporating such solutions into the facial reduction process, we substantially simplify …
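The core primal idea can be sketched as follows: if feasible PSD matrices are available, the minimal face of the PSD cone containing them is determined by the range of their sum, and the SDP can be restricted to that face. This is a minimal illustrative sketch, not the paper's algorithm; all names are hypothetical.

```python
import numpy as np

def face_from_feasible_points(mats, tol=1e-9):
    # Primal-style facial reduction: for feasible PSD matrices X_1,...,X_k,
    # the minimal face of the PSD cone containing them is V S_+^r V^T,
    # where the columns of V span the range of X_1 + ... + X_k.
    S = sum(mats)
    w, U = np.linalg.eigh(S)       # eigenvalues in ascending order
    return U[:, w > tol]           # keep eigenvectors of nonzero eigenvalues

# Toy feasible set in which every feasible X has its first row/column zero.
X1 = np.diag([0.0, 1.0, 0.0])
X2 = np.diag([0.0, 0.0, 2.0])
V = face_from_feasible_points([X1, X2])
# V has 2 columns: the SDP can be restricted via X = V Y V^T, Y a 2x2 PSD block.
```

The substitution X = V Y V^T then yields a smaller, strictly feasible SDP over Y.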

On the Convergence and Complexity of Proximal Gradient and Accelerated Proximal Gradient Methods under Adaptive Gradient Estimation

In this paper, we propose a proximal gradient method and an accelerated proximal gradient method for solving composite optimization problems, where the objective function is the sum of a smooth and a convex, possibly nonsmooth, function. We consider settings where the smooth component is either a finite-sum function or an expectation of a stochastic function, …
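As a concrete anchor for the basic scheme, here is a minimal deterministic proximal gradient iteration on a toy lasso instance (illustrative only; not the paper's adaptive gradient-estimation method, and the instance is invented):

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def proximal_gradient(grad_f, prox_g, x0, step, iters=500):
    # x_{k+1} = prox_{step * g}(x_k - step * grad_f(x_k))
    x = x0
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Toy composite problem: min_x 0.5 ||Ax - b||^2 + lam ||x||_1  (lasso).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda z, t: soft_threshold(z, t * lam)
step = 1.0 / np.linalg.norm(A.T @ A, 2)   # 1/L, L the Lipschitz constant of grad_f
x_hat = proximal_gradient(grad_f, prox_g, np.zeros(5), step)
```

The accelerated variant adds a Nesterov extrapolation step between iterations; the stochastic settings studied in the paper replace grad_f with sampled estimates.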

Faster stochastic cubic regularized Newton methods with momentum

Cubic regularized Newton (CRN) methods have attracted significant research interest because they offer stronger solution guarantees and lower iteration complexity. With the rise of the big-data era, there is growing interest in developing stochastic cubic regularized Newton (SCRN) methods that do not require exact gradient and Hessian evaluations. In this paper, we propose faster SCRN …
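For orientation, one deterministic CRN outer iteration minimizes the cubic model g^T s + 0.5 s^T H s + (sigma/3)||s||^3 at each step. The sketch below solves the subproblem with a simple fixed-point scheme that is adequate for positive definite H (not a robust general-purpose solver, and not the paper's momentum-based stochastic method):

```python
import numpy as np

def cubic_newton_step(g, H, sigma, inner_iters=50):
    # Approximately solve  min_s g^T s + 0.5 s^T H s + (sigma/3)||s||^3
    # via the fixed point  s = -(H + sigma ||s|| I)^{-1} g.
    n = g.size
    s = np.zeros(n)
    for _ in range(inner_iters):
        s = np.linalg.solve(H + sigma * np.linalg.norm(s) * np.eye(n), -g)
    return s

def crn_minimize(grad, hess, x0, sigma=1.0, iters=30):
    # Deterministic CRN outer loop; SCRN variants replace grad/hess with
    # sub-sampled or stochastic estimates.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x + cubic_newton_step(grad(x), hess(x), sigma)
    return x

# Toy strongly convex quartic: f(x) = 0.25 * sum(x^4) + 0.5 * ||x - c||^2.
c = np.array([1.0, -2.0])
grad = lambda x: x**3 + (x - c)
hess = lambda x: np.diag(3 * x**2 + 1)
x_star = crn_minimize(grad, hess, np.zeros(2))
```

The cubic term makes the step well defined even when the Hessian estimate is poor, which is what makes the framework attractive in stochastic settings.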

Sub-sampled Trust-Region Methods with Deterministic Worst-Case Complexity Guarantees

In this paper, we develop and analyze sub-sampled trust-region methods for solving finite-sum optimization problems. These methods employ subsampling strategies to approximate the gradient and Hessian of the objective function, significantly reducing the overall computational cost. We propose a novel adaptive procedure for deterministically adjusting the sample size used for gradient (or gradient and Hessian) …
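The basic subsampling idea is easy to state: estimate the full gradient by averaging component gradients over a random subset, and grow the subset as the iterates approach a solution. Below is a minimal sketch with a simple geometric growth rule on plain gradient steps (the paper's deterministic adaptive rule inside a trust-region framework is different; this instance is invented):

```python
import numpy as np

def subsampled_gradient(grad_i, n, x, sample_size, rng):
    # Average the component gradients over a uniform random subset of {0,...,n-1}.
    idx = rng.choice(n, size=sample_size, replace=False)
    return sum(grad_i(i, x) for i in idx) / sample_size

# Toy finite-sum least squares: f(x) = (1/n) sum_i 0.5 (a_i^T x - b_i)^2.
rng = np.random.default_rng(1)
n, d = 200, 3
A = rng.standard_normal((n, d))
b = A @ np.array([1.0, -1.0, 2.0])
grad_i = lambda i, x: A[i] * (A[i] @ x - b[i])
full_grad = lambda x: A.T @ (A @ x - b) / n

x, m, step = np.zeros(d), 10, 0.05
for _ in range(400):
    x = x - step * subsampled_gradient(grad_i, n, x, m, rng)
    m = min(n, int(np.ceil(m * 1.02)))   # geometric sample growth (generic rule)
```

Early iterations are cheap (small samples); late iterations pay the full-gradient cost only once the noise would otherwise dominate.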

The complete edge relaxation for binary polynomial optimization

We consider the multilinear polytope defined as the convex hull of the feasible region of a linearized binary polynomial optimization problem. We define a relaxation in an extended space for this polytope, which we refer to as the complete edge relaxation. The complete edge relaxation is stronger than several well-known relaxations of the multilinear polytope, …
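To fix ideas, the linearization introduces a variable y_e = prod_{v in e} x_v for each hyperedge e, and the simplest ("standard") relaxation bounds y_e by the McCormick-style inequalities below. This sketch only checks that every vertex of the multilinear polytope satisfies the standard relaxation; it is not the complete edge relaxation of the paper:

```python
from itertools import product

# Hyperedges of a toy hypergraph on vertices {0, 1, 2}.
edges = [(0, 1), (1, 2), (0, 1, 2)]

def standard_relaxation_holds(x, y):
    # Standard linearization of y_e = prod_{v in e} x_v for binary x:
    #   0 <= y_e <= x_v for all v in e,  and  y_e >= sum_{v in e} x_v - |e| + 1.
    for e, ye in zip(edges, y):
        if ye < 0 or any(ye > x[v] for v in e):
            return False
        if ye < sum(x[v] for v in e) - len(e) + 1:
            return False
    return True

# Every vertex of the multilinear polytope (x binary, y_e the edge product)
# satisfies the standard relaxation.
for x in product([0, 1], repeat=3):
    y = [min(x[v] for v in e) for e in edges]   # product of binaries = min
    assert standard_relaxation_holds(x, y)
```

Stronger relaxations such as the complete edge relaxation add inequalities (in an extended space) that cut off fractional points the standard relaxation admits.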

Stochastic Approximation with Block Coordinate Optimal Stepsizes

We consider stochastic approximation with block-coordinate stepsizes and propose adaptive stepsize rules that aim to minimize the expected distance from the next iterate to an optimal point. These stepsize rules employ online estimates of the second moment of the search direction along each block coordinate. The popular Adam algorithm can be interpreted as a particular …
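The Adam connection is easiest to see with blocks of size one: each coordinate gets stepsize lr / sqrt(v_hat), where v_hat is a bias-corrected EMA estimate of the search direction's second moment. The sketch below is Adam stripped of its first-moment (momentum) average, purely for illustration of the stepsize mechanism; it is not the paper's proposed rule:

```python
import numpy as np

def adam_like_update(x, g, state, lr=0.1, beta2=0.999, eps=1e-8):
    # Per-coordinate stepsize lr / (sqrt(v_hat) + eps), where v is an online
    # EMA estimate of the second moment of the search direction.
    state["t"] += 1
    state["v"] = beta2 * state["v"] + (1 - beta2) * g * g
    v_hat = state["v"] / (1 - beta2 ** state["t"])   # bias correction
    return x - lr * g / (np.sqrt(v_hat) + eps)

# Badly scaled quadratic: the per-coordinate stepsizes equalize progress.
D = np.array([1.0, 100.0])
grad = lambda x: D * x
x = np.array([3.0, 2.0])
state = {"t": 0, "v": np.zeros(2)}
for _ in range(1000):
    x = adam_like_update(x, grad(x), state)
```

Replacing the scalar coordinates with blocks, and the EMA with other second-moment estimators, recovers the family of block-coordinate stepsize rules the abstract describes.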

Anesthesiologist Scheduling with Handoffs: A Combined Approach of Optimization and Human Factors

We present a two-stage stochastic programming model for optimizing anesthesiologist schedules, explicitly accounting for uncertainty in surgery durations and anesthesiologist handoffs. To inform model design, we conducted an online survey at our partner institution to identify key factors affecting the quality of intraoperative anesthesiologist handoffs. Insights from the survey results are incorporated into the model, …

Non-smooth stochastic gradient descent using smoothing functions

In this paper, we address stochastic optimization problems involving a composition of a non-smooth outer function and a smooth inner function, a formulation frequently encountered in machine learning and operations research. To deal with the non-differentiability of the outer function, we approximate the original non-smooth function using smoothing functions, which are continuously differentiable and approach …
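A classic example of such a smoothing function is the Huber smoothing of the absolute value: it is continuously differentiable and approaches |t| uniformly as the smoothing parameter mu shrinks. This is one standard choice for illustration, not necessarily the family used in the paper:

```python
import numpy as np

def huber_smooth_abs(t, mu):
    # Huber smoothing of |t|: quadratic near 0, linear outside [-mu, mu].
    # It is C^1 and satisfies  |f_mu(t) - |t||  <=  mu / 2  for all t.
    t = np.asarray(t, dtype=float)
    return np.where(np.abs(t) <= mu, t * t / (2 * mu), np.abs(t) - mu / 2)

def huber_smooth_abs_grad(t, mu):
    # Gradient of the smoothed function: t/mu near 0, sign(t) outside.
    t = np.asarray(t, dtype=float)
    return np.where(np.abs(t) <= mu, t / mu, np.sign(t))
```

In the composite setting, the smoothed outer function is chained with the smooth inner function, so ordinary stochastic gradient steps apply, with mu driven to zero on a schedule.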

Recursive Bound-Constrained AdaGrad with Applications to Multilevel and Domain Decomposition Minimization

Two OFFO (Objective-Function Free Optimization) noise-tolerant algorithms are presented that handle bound constraints and inexact gradients, and use second-order information when available. The first is a multi-level method exploiting a hierarchical description of the problem and the second is a domain-decomposition method covering the standard additive Schwarz decompositions. Both are generalizations of the first-order AdaGrad …
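The first-order building block being generalized can be sketched as AdaGrad with a box projection: per-coordinate stepsizes from accumulated squared gradients, then a clip onto the bounds. This is a generic sketch of that building block only; the paper's multilevel and domain-decomposition variants are considerably richer:

```python
import numpy as np

def projected_adagrad(grad, x0, lo, hi, lr=1.0, iters=100, eps=1e-8):
    # AdaGrad: per-coordinate stepsize lr / sqrt(accumulated squared gradients),
    # followed by projection onto the box [lo, hi].  Objective-function free:
    # only gradients are evaluated, never f itself.
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    G = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        G += g * g
        x = np.clip(x - lr * g / (np.sqrt(G) + eps), lo, hi)
    return x

# Toy bound-constrained problem: min 0.5 ||x - c||^2  s.t.  0 <= x <= 1,
# with unconstrained minimizer c outside the box in two coordinates.
c = np.array([2.0, -1.0, 0.5])
x_hat = projected_adagrad(lambda x: x - c, np.full(3, 0.5), 0.0, 1.0)
```

Because no function values are needed, the scheme tolerates gradient noise without a line search, which is the OFFO property the abstract refers to.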