The improvement function in branch-and-bound methods for complete global optimization

We present a new spatial branch-and-bound approach for treating optimization problems with nonconvex inequality constraints. It approximates the set of all global minimal points if the problem is solvable, and detects infeasibility otherwise. The new technique covers the nonconvex constraints by means of an improvement function which, although nonsmooth, can be treated … Read more
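For readers unfamiliar with the device, a standard form of the improvement function (generic notation, not necessarily the exact variant used in this work) merges the objective and the constraints of $\min_x f(x)$ s.t. $g_i(x)\le 0$ at a reference point $\hat y$:

```latex
\varphi(x,\hat y) \;=\; \max\Bigl\{\, f(x)-f(\hat y),\; \max_{i=1,\dots,m} g_i(x) \Bigr\}.
```

A feasible $\hat y$ is then globally minimal exactly when $\varphi(\cdot,\hat y)\ge 0$ holds everywhere, which lets a branch-and-bound scheme certify global optimality through a single nonsmooth function.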

IPAS: An Adaptive Sample Size Method for Weighted Finite Sum Problems with Linear Equality Constraints

Optimization problems whose objective function is a weighted sum, subject to linear equality constraints, are considered. Since the number of local cost functions, as well as the number of constraints, can be large, a stochastic optimization method is proposed. The method belongs to the class of variable sample size first order methods, … Read more
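As a rough illustration of this class of methods, here is a minimal sketch (all problem data, step size, and the sample-growth rule are hypothetical, not taken from the paper): a projected stochastic gradient method whose sample size grows geometrically, applied to a weighted quadratic sum over an affine set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: f(x) = sum_i w_i * 0.5*||x - c_i||^2, subject to A x = b.
n, N = 5, 200
C = rng.normal(size=(N, n))              # centers c_i of the local cost functions
w = rng.random(N); w /= w.sum()          # positive weights summing to one
A = rng.normal(size=(2, n)); b = rng.normal(size=2)

AAT_inv = np.linalg.inv(A @ A.T)
def project(x):
    # Orthogonal projection onto the affine feasible set {x : A x = b}.
    return x - A.T @ (AAT_inv @ (A @ x - b))

def sampled_grad(x, idx):
    # Subsampled gradient estimate using renormalized weights on the sample.
    wi = w[idx] / w[idx].sum()
    return x - wi @ C[idx]

x = project(np.zeros(n))
sample = 10
for _ in range(60):
    idx = rng.choice(N, size=min(sample, N), replace=False)
    x = project(x - 0.5 * sampled_grad(x, idx))
    sample = int(1.2 * sample) + 1       # grow the sample size geometrically

full_grad = x - w @ C                    # exact gradient of the weighted sum
tangential = full_grad - A.T @ (AAT_inv @ (A @ full_grad))
print(np.linalg.norm(A @ x - b))         # feasibility: ~ 0
print(np.linalg.norm(tangential))        # stationarity over {A x = b}: ~ 0
```

Once the sample reaches the full sum, the iteration is exact projected gradient descent, so both residuals shrink to numerical precision.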

The improvement function reformulation for graphs of minimal point mappings

Graphs of minimal point mappings of parametric optimization problems appear in the definition of feasible sets of bilevel optimization problems and of semi-infinite optimization problems, and the intersection of multiple such graphs defines (generalized) Nash equilibria. This paper shows how minimal point graphs of nonconvex parametric optimization problems can be written with the help of … Read more
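In generic notation (a sketch of the standard setup, not necessarily the paper's), for a parametric problem $P(y)\colon \min_x f(x,y)$ s.t. $g(x,y)\le 0$, the minimal point mapping and its graph are

```latex
\psi(y) \;=\; \operatorname*{argmin}_x \{\, f(x,y) \;:\; g(x,y)\le 0 \,\},
\qquad
\operatorname{gph}\psi \;=\; \{\,(y,x) \;:\; x\in\psi(y)\,\},
```

and the feasible set of a bilevel problem with lower level $P(y)$ is exactly this graph.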

Superiorization and Perturbation Resilience of Algorithms: A Continuously Updated Bibliography

This document presents a (mostly) chronologically ordered bibliography of scientific publications on the superiorization methodology and perturbation resilience of algorithms, compiled and continuously updated by us at: http://math.haifa.ac.il/yair/bib-superiorization-censor.html. We try to trace the work that has been published on this topic since its inception. To the best of our … Read more

An iterative process for the feasibility-seeking problem with sets that are unions of convex sets

In this paper we deal with the feasibility-seeking problem for unions of convex sets (UCS) and propose an iterative process for its solution. Renewed interest in this problem stems from the fact that it was recently discovered to serve as a modeling approach in applied fields, and from the ongoing research efforts … Read more
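To make the setting concrete, here is a minimal sketch of one natural iteration for UCS feasibility (the nearest-component selection rule and the one-dimensional instance are illustrative assumptions, not the paper's algorithm): cyclically project onto each union by projecting onto its nearest convex component.

```python
# Two UCS in R: each a union of closed intervals, given as (lo, hi) pairs.
UCS = [
    [(-3.0, -2.0), (1.0, 2.0)],
    [(1.5, 4.0), (-6.0, -5.0)],
]

def project_interval(x, lo, hi):
    # Euclidean projection onto a closed interval.
    return min(max(x, lo), hi)

def project_ucs(x, components):
    # Project onto the nearest convex component of the union.
    cands = [project_interval(x, lo, hi) for lo, hi in components]
    return min(cands, key=lambda p: abs(p - x))

x = 0.0
for _ in range(50):              # cyclic projections over all sets
    for comps in UCS:
        x = project_ucs(x, comps)

print(x)   # a point in the intersection of both unions, here 1.5
```

Unlike the purely convex case, such iterations can stall at points that are only locally optimal for the nearest-component rule, which is part of what makes the UCS problem harder.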

Optimistic Noise-Aware Sequential Quadratic Programming for Equality Constrained Optimization with Rank-Deficient Jacobians

We propose and analyze a sequential quadratic programming algorithm for minimizing a noisy nonlinear smooth function subject to noisy nonlinear smooth equality constraints. The algorithm uses a step decomposition strategy and, as a result, is robust to potential rank-deficiency in the constraints, allows for two different step size strategies, and has an early stopping mechanism. … Read more
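The step decomposition idea can be illustrated on a toy problem (the model, step rules, and tolerances below are simplifications of mine, not the paper's algorithm): a normal step reduces constraint violation via a min-norm least-squares solve, which stays well defined under rank deficiency, and a tangential step decreases the objective within the null space of the Jacobian.

```python
import numpy as np

# Toy problem: min 0.5*||x||^2 + x_0  s.t.  c(x) = 0, where both constraints
# are identical, so the Jacobian is deliberately rank deficient (rank 1).
def grad(x): return x + np.array([1.0, 0.0, 0.0])
def c(x):    return np.array([x[0] + x[1] - 1.0, x[0] + x[1] - 1.0])
J = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0]])          # constant constraint Jacobian

x = np.zeros(3)
for _ in range(20):
    # Normal step: min-norm least-squares solution of J v = -c(x).
    v = np.linalg.lstsq(J, -c(x), rcond=None)[0]
    # Tangential step: move against the model gradient within null(J).
    _, s, Vt = np.linalg.svd(J)
    Z = Vt[int((s > 1e-10).sum()):].T    # orthonormal null-space basis
    u = -Z @ (Z.T @ (grad(x) + v))       # unit-Hessian model, a simplification
    x = x + v + u

print(np.round(x, 6))          # the KKT point, approximately [0, 1, 0]
print(abs(c(x)).max())         # constraint residual, approximately 0
```

Because the normal step uses a least-squares solve rather than a direct linear solve, the identical rows of `J` cause no breakdown; the iteration still reaches the KKT point.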

The Least Singular Value Function in Variational Analysis

Metric regularity is among the central concepts of nonlinear and variational analysis, constrained optimization, and their numerous applications. However, metric regularity can be elusive for some important ill-posed classes of problems, including polynomial equations, parametric variational systems, smooth reformulations of complementarity systems with degenerate solutions, etc. The study of stability issues for such … Read more
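For reference, the standard definition: a set-valued map $F\colon X\rightrightarrows Y$ is metrically regular around $(\bar x,\bar y)\in\operatorname{gph}F$ with modulus $\kappa>0$ if

```latex
\operatorname{dist}\bigl(x,\,F^{-1}(y)\bigr) \;\le\; \kappa\,\operatorname{dist}\bigl(y,\,F(x)\bigr)
\qquad \text{for all } (x,y) \text{ close to } (\bar x,\bar y).
```

The "elusive" classes mentioned above are exactly those where no such $\kappa$ exists, so weaker or problem-adapted regularity notions are needed.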

A Universally Optimal Primal-Dual Method for Minimizing Heterogeneous Compositions

This paper proposes a universal, optimal algorithm for convex minimization problems of the composite form $g_0(x)+h(g_1(x),\dots, g_m(x)) + u(x)$. We allow each $g_j$ to independently range from being nonsmooth Lipschitz to smooth, from convex to strongly convex, described by notions of Hölder continuous gradients and uniform convexity. Note that, although the objective is built from … Read more
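As one illustrative instance of the composite form (this example is mine, not necessarily one from the paper), choosing $h(z_1,\dots,z_m)=\max_j z_j$, which is nonsmooth Lipschitz, and $u\equiv 0$ recovers convex minimax problems:

```latex
\min_x \; g_0(x) + \max_{1\le j\le m} g_j(x).
```

Allowing each $g_j$ its own smoothness and convexity level is what makes the composition "heterogeneous".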

On Sum-Rules for Second-Order Contingent Derivatives

We are concerned with contingent derivatives and their second-order counterparts (introduced by Ngai et al.) of set-valued mappings. Special attention is given to the development of new sum-rules for second-order contingent derivatives. To be precise, we want to find conditions under which the second-order contingent derivative of the sum of a smooth and a set-valued … Read more
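For orientation, the (first-order) contingent derivative of $F\colon\mathbb{R}^n\rightrightarrows\mathbb{R}^m$ at $(\bar x,\bar y)\in\operatorname{gph}F$ is defined through the contingent (Bouligand) cone to the graph:

```latex
v \in DF(\bar x,\bar y)(u)
\;\Longleftrightarrow\;
\exists\, t_k\downarrow 0,\ (u_k,v_k)\to(u,v)\ \text{with}\ \bar y + t_k v_k \in F(\bar x + t_k u_k),
```

i.e. $(u,v)\in T_{\operatorname{gph}F}(\bar x,\bar y)$; the second-order counterparts studied here extend this notion.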

A Study of First-Order Methods with a Probabilistic Relative-Error Gradient Oracle

This paper investigates the problem of minimizing a smooth function over a compact set with a probabilistic relative-error gradient oracle. The oracle succeeds with some probability, in which case it provides a relative-error approximation of the true gradient, or fails and returns an arbitrary vector, while the optimizer cannot distinguish between successful and failed queries … Read more
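A minimal sketch of such an oracle and its use inside projected descent (the success model, constants, and test function are my assumptions, not the paper's): on a successful query the oracle returns a vector within relative error $\delta$ of the true gradient; on a failed one it returns an arbitrary vector, and the caller cannot tell which case occurred.

```python
import numpy as np

def make_oracle(grad, p=0.8, delta=0.1, rng=None):
    # Success with probability p: return g with ||g - grad(x)|| <= delta*||grad(x)||.
    # Failure otherwise: return an arbitrary vector. The caller cannot tell which.
    if rng is None:
        rng = np.random.default_rng(0)
    def oracle(x):
        g = grad(x)
        if rng.random() < p:
            noise = rng.normal(size=g.shape)
            noise *= delta * np.linalg.norm(g) / max(np.linalg.norm(noise), 1e-16)
            return g + rng.random() * noise     # stays in the relative-error ball
        return 10.0 * rng.normal(size=g.shape)  # failed query: arbitrary output
    return oracle

# Projected descent on f(x) = ||x||^2 over the compact box [-1, 1]^2.
oracle = make_oracle(lambda x: 2.0 * x, p=0.9, delta=0.1,
                     rng=np.random.default_rng(1))
x = np.array([1.0, -1.0])
for _ in range(200):
    x = np.clip(x - 0.1 * oracle(x), -1.0, 1.0)
print(x)   # typically near the minimizer 0 despite occasional failed queries
```

The projection onto the box keeps even failed queries from throwing the iterate out of the feasible set, which is why compactness matters in this setting.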