A Fast Newton Method Under Local Lipschitz Smoothness

A new, fast second-order method is proposed that achieves the optimal $\mathcal{O}\left(|\log(\epsilon)|\,\epsilon^{-3/2}\right)$ complexity to obtain first-order $\epsilon$-stationary points. Crucially, this is deduced without assuming the standard global Lipschitz Hessian continuity condition, but only using an appropriate local smoothness requirement. The algorithm exploits Hessian information to compute a Newton step and a negative curvature step when …
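The Newton-plus-negative-curvature idea can be sketched as follows. This is a minimal illustration of the two step types only, not the paper's algorithm (its regularization, step acceptance, and local-smoothness estimation are omitted):

```python
import numpy as np

def second_order_direction(grad, hess, curvature_tol=1e-8):
    """Return a Newton step when the Hessian is sufficiently positive
    definite, and a negative-curvature direction otherwise (sketch)."""
    eigvals, eigvecs = np.linalg.eigh(hess)   # eigenvalues in ascending order
    if eigvals[0] > curvature_tol:
        # Newton step: solve H d = -g
        return np.linalg.solve(hess, -grad), "newton"
    # Eigenvector of the most negative eigenvalue, signed so that it
    # is a non-ascent direction for the gradient.
    d = eigvecs[:, 0]
    if grad @ d > 0:
        d = -d
    return d, "negative_curvature"

# Strictly convex model: the Newton step solves H d = -g exactly.
d1, kind1 = second_order_direction(np.array([1.0, -1.0]),
                                   np.diag([2.0, 1.0]))
# Indefinite model: a negative-curvature direction is returned instead.
d2, kind2 = second_order_direction(np.array([0.0, 1.0]),
                                   np.diag([1.0, -1.0]))
```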

An adaptive single-loop stochastic penalty method for nonconvex constrained stochastic optimization

Adaptive update schemes for penalty parameters are crucial to enhancing robustness and practical applicability of penalty methods for constrained optimization. However, in the context of general constrained stochastic optimization, additional challenges arise due to the randomness introduced by adaptive penalty parameters. To address these challenges, we propose an Adaptive Single-loop Stochastic Penalty method (AdaSSP) in …
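As a generic illustration of the adaptive-penalty idea, the following toy sketch runs a single-loop stochastic quadratic-penalty scheme on a scalar problem; it is not the AdaSSP method itself, and the update rule and constants are illustrative assumptions:

```python
import random

random.seed(0)

# Toy problem: minimize f(x) = (x - 2)^2 subject to x <= 1, with
# violation c(x) = max(0, x - 1). Single loop: noisy gradient steps on
# the penalty function f(x) + (rho / 2) * c(x)^2, doubling rho whenever
# the constraint violation stops improving between periodic checks.
x, rho = 3.0, 1.0
prev_violation = float("inf")
for k in range(3000):
    c = max(0.0, x - 1.0)
    noise = random.gauss(0.0, 0.1)           # stochastic gradient noise
    g = 2.0 * (x - 2.0) + rho * c + noise    # gradient of the penalty function
    x -= 0.01 * g
    if k % 200 == 199:                       # periodic adaptive update
        if c > 0.9 * prev_violation:         # violation stalled: grow rho
            rho = min(rho * 2.0, 100.0)      # cap keeps the fixed step stable
        prev_violation = c
# As rho grows, x approaches the constrained minimizer x* = 1.
```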

Sensitivity analysis for parametric nonlinear programming: A tutorial

This tutorial provides an overview of the current state of the art in sensitivity analysis for nonlinear programming. Building upon the fundamental work of Fiacco, it derives the sensitivity of primal-dual solutions for regular nonlinear programs and explores the extent to which Fiacco’s framework can be extended to degenerate nonlinear programs with non-unique dual solutions. The survey …
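The classical result underlying this analysis can be recalled in generic form (a standard statement consistent with Fiacco's framework, not the tutorial's exact notation):

```latex
% Parametric NLP: min_x f(x,p)  s.t.  h(x,p) = 0,  g(x,p) <= 0.
% At a KKT point z^* = (x^*, \lambda^*, \mu^*) satisfying LICQ, SOSC and
% strict complementarity, the KKT system F(z,p) = 0 with
\[
F(z,p) \;=\;
\begin{pmatrix}
\nabla_x \mathcal{L}(x,\lambda,\mu,p) \\
h(x,p) \\
\operatorname{diag}(\mu)\, g(x,p)
\end{pmatrix}
\]
% has a nonsingular Jacobian in z, so the implicit function theorem yields
% a locally unique, differentiable primal-dual solution map p -> z(p) with
\[
\frac{\mathrm{d}z}{\mathrm{d}p}(p) \;=\;
-\,\nabla_z F\bigl(z(p),p\bigr)^{-1}\,\nabla_p F\bigl(z(p),p\bigr).
\]
```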

The improvement function in branch-and-bound methods for complete global optimization

We present a new spatial branch-and-bound approach for treating optimization problems with nonconvex inequality constraints. It is able to approximate the set of all global minimal points when the problem is solvable, and to detect infeasibility otherwise. The new technique covers the nonconvex constraints by means of an improvement function which, although nonsmooth, can be treated …

IPAS: An Adaptive Sample Size Method for Weighted Finite Sum Problems with Linear Equality Constraints

Optimization problems whose objective function is a weighted sum, subject to linear equality constraints, are considered. Since both the number of local cost functions and the number of constraints can be large, a stochastic optimization method is proposed. The method belongs to the class of variable-sample-size first-order methods, …

The improvement function reformulation for graphs of minimal point mappings

Graphs of minimal point mappings of parametric optimization problems appear in the definition of feasible sets of bilevel optimization problems and of semi-infinite optimization problems, and the intersection of multiple such graphs defines (generalized) Nash equilibria. This paper shows how minimal point graphs of nonconvex parametric optimization problems can be written with the help of …

Superiorization and Perturbation Resilience of Algorithms: A Continuously Updated Bibliography

This document presents a (mostly) chronologically ordered bibliography of scientific publications on the superiorization methodology and perturbation resilience of algorithms, compiled and continuously updated by us at: http://math.haifa.ac.il/yair/bib-superiorization-censor.html. Since the inception of this topic, we have tried to trace all the work published about it. To the best of our …

A characterization of positive spanning sets with ties to strongly edge-connected digraphs

Positive spanning sets (PSSs) are families of vectors that span a given linear space through non-negative linear combinations. Despite certain classes of PSSs being well understood, a complete characterization of PSSs remains elusive. In this paper, we explore a relatively understudied relationship between positive spanning sets and strongly edge-connected digraphs, in that the former can …
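To make the notion concrete, here is a minimal positive spanning set of $\mathbb{R}^2$ together with a direct check of the defining property. This illustrates the definition only; the paper's digraph characterization is not reproduced:

```python
# The minimal PSS of R^2 with three vectors: D = {e1, e2, -(e1+e2)}.
# A set positively spans R^n when every vector is a non-negative
# combination of its members; it suffices to express +-e1 and +-e2
# that way, since those generate all of R^2 under such combinations.
e1, e2, v3 = (1, 0), (0, 1), (-1, -1)
D = [e1, e2, v3]

def combine(coeffs, vecs):
    """Non-negative combination sum_i coeffs[i] * vecs[i] in R^2."""
    assert all(c >= 0 for c in coeffs)
    return tuple(sum(c * v[k] for c, v in zip(coeffs, vecs)) for k in range(2))

targets = {
    (1, 0):  (1, 0, 0),   # e1  = 1*e1
    (0, 1):  (0, 1, 0),   # e2  = 1*e2
    (-1, 0): (0, 1, 1),   # -e1 = e2 + v3
    (0, -1): (1, 0, 1),   # -e2 = e1 + v3
}
checks = {t: combine(c, D) == t for t, c in targets.items()}
```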

Tight Semidefinite Relaxations for Verifying Robustness of Neural Networks

For verifying the safety of neural networks (NNs), Fazlyab et al. (2019) introduced a semidefinite programming (SDP) approach called DeepSDP. This formulation can be viewed as the dual of the SDP relaxation for a problem formulated as a quadratically constrained quadratic program (QCQP). While SDP relaxations of QCQPs generally provide approximate solutions with some gaps, …
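For context, the standard (Shor) SDP relaxation of a generic QCQP is sketched below; DeepSDP's specific neural-network constraints are not reproduced here:

```latex
% QCQP: min_x  x^T Q_0 x + 2 q_0^T x
%       s.t.   x^T Q_i x + 2 q_i^T x + c_i <= 0,   i = 1, ..., m.
% Lifting X = x x^T and relaxing this equality to X - x x^T being
% positive semidefinite (via a Schur complement) gives the SDP
\[
\min_{x,\,X} \;\; \langle Q_0, X\rangle + 2 q_0^\top x
\quad \text{s.t.} \quad
\langle Q_i, X\rangle + 2 q_i^\top x + c_i \le 0, \;\;
\begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0 .
\]
% Its optimal value lower-bounds the QCQP; the relaxation is tight
% exactly when the optimal X has rank one, i.e. X = x x^T.
```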

A relaxed version of Ryu’s three-operator splitting method for structured nonconvex optimization

In this work, we propose a modification of Ryu’s splitting algorithm for minimizing the sum of three functions, where two of them are convex with Lipschitz continuous gradients, and the third is an arbitrary proper closed function that is not necessarily convex. The modification is essential to facilitate the convergence analysis, particularly in establishing a …