Strong local convergence properties of adaptive regularized methods for nonlinear least-squares

This paper studies adaptive regularized methods for nonlinear least-squares problems where the model of the objective function used at each iteration is either the Euclidean residual regularized by a quadratic term or the Gauss-Newton model regularized by a cubic term. For suitable choices of the regularization parameter, the role of the regularization term is to … Read more
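A minimal sketch of the two models in standard adaptive-regularization notation (the symbols $F$, $J_k$, and $\sigma_k$ are assumptions, not taken from the paper): for a residual $F:\mathbb{R}^n \to \mathbb{R}^m$ with Jacobian $J_k = J(x_k)$, the step $s$ is typically computed by minimizing one of

```latex
% Euclidean residual regularized by a quadratic term:
m_k^{\mathrm{QR}}(s) = \|F(x_k) + J_k s\| + \sigma_k \|s\|^2
% Gauss--Newton model regularized by a cubic term:
m_k^{\mathrm{CR}}(s) = \tfrac{1}{2}\|F(x_k) + J_k s\|^2 + \tfrac{\sigma_k}{3}\|s\|^3
```

with the parameter $\sigma_k$ adapted from iteration to iteration.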

Attraction of Newton method to critical Lagrange multipliers: fully quadratic case

All previously known results concerned with attraction of Newton-type iterations for optimality systems to critical Lagrange multipliers were a posteriori in nature: they showed that, in the case of convergence, the dual limit is in a sense unlikely to be noncritical. This paper suggests the first a priori result in this direction, showing that critical … Read more
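As a reminder of the notion involved (the standard definition from the literature on critical multipliers, not quoted from this paper): for an equality-constrained problem with Lagrangian $L(x,\lambda) = f(x) + \langle \lambda, h(x) \rangle$, a multiplier $\bar\lambda$ associated with a stationary point $\bar x$ is called critical when

```latex
\exists\, \xi \in \ker h'(\bar x) \setminus \{0\}
\quad \text{such that} \quad
\nabla_{xx}^2 L(\bar x, \bar\lambda)\, \xi \in \operatorname{im} h'(\bar x)^{T},
```

and noncritical otherwise; attraction to critical multipliers is precisely the degeneracy that destroys the usual superlinear convergence of Newton-type methods.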

Abstract Newtonian Frameworks and Their Applications

We unify and extend some Newtonian iterative frameworks developed earlier in the literature, which results in a collection of convenient tools for the local convergence analysis of various algorithms under various sets of assumptions, including strong metric regularity, semistability, or upper-Lipschitz stability, the latter allowing for nonisolated solutions. These abstract schemes are further applied for deriving … Read more

Extension of Completely Positive Cone Relaxation to Polynomial Optimization

We propose the moment cone relaxation for a class of polynomial optimization problems (POPs) to extend the results on the completely positive cone programming relaxation for the quadratic optimization (QOP) model by Arima, Kim and Kojima. The moment cone relaxation is constructed to take advantage of sparsity of the POPs, so that efficient numerical methods … Read more
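For orientation (a standard formulation with hypothetical symbols, not the paper's notation): the completely positive cone programming relaxation of a QOP replaces the nonconvex rank-one constraint $X = xx^{T}$, $x \ge 0$, by membership in the completely positive cone

```latex
\mathcal{C}^{*} \;=\; \operatorname{conv}\{\, x x^{T} \;:\; x \in \mathbb{R}^{n}_{+} \,\},
```

and the moment cone relaxation extends this idea from quadratic forms to moment matrices built from higher-degree monomials.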

Proximal bundle methods in depth: a unified analysis for inexact oracles

The last few years have seen the advent of a new generation of bundle methods, capable of handling inexact oracles polluted by “noise”. Proving convergence of a bundle method is never simple, and coping with inexact oracles substantially increases the technicalities. Besides, several variants exist to deal with noise, each one needing an ad hoc proof … Read more
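A minimal sketch of the proximal bundle step with an inexact oracle (the notation is assumed, not the paper's): given approximate function values $f_i \approx f(x_i)$ and subgradients $g_i$, accurate up to a noise level $\eta$, the cutting-plane model and proximal step read

```latex
\check f_k(x) = \max_{i \le k}\{\, f_i + \langle g_i,\, x - x_i \rangle \,\},
\qquad
x_{k+1} = \operatorname*{arg\,min}_x \; \check f_k(x) + \tfrac{1}{2 t_k}\|x - \hat x_k\|^2,
```

where $\hat x_k$ is the current stability center. With an exact oracle $\check f_k \le f$ holds by convexity; noise can invalidate the cuts, and it is in how they repair this that the variants differ.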

Distributionally Robust Convex Optimization

Distributionally robust optimization is a paradigm for decision-making under uncertainty where the uncertain problem data is governed by a probability distribution that is itself subject to uncertainty. The distribution is then assumed to belong to an ambiguity set comprising all distributions that are compatible with the decision maker’s prior information. In this paper, we propose … Read more
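The generic distributionally robust program described above can be sketched (in standard notation, assumed here) as

```latex
\min_{x \in X} \ \sup_{\mathbb{P} \in \mathcal{P}} \ \mathbb{E}_{\mathbb{P}}\big[ f(x, \tilde\xi) \big],
```

where $\mathcal{P}$ is the ambiguity set of all distributions of the uncertain data $\tilde\xi$ compatible with the decision maker's prior information.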

Sparse Recovery on Euclidean Jordan Algebras

We consider the sparse recovery problem on a Euclidean Jordan algebra (SREJA), which includes sparse signal recovery and low-rank symmetric matrix recovery as special cases. We introduce the restricted isometry property, the null space property (NSP), and $s$-goodness for linear transformations in $s$-sparse element recovery on a Euclidean Jordan algebra, all of which provide sufficient conditions for … Read more
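For reference, in the classical vector case the restricted isometry property mentioned above reads (a standard definition; its Jordan-algebraic extension is the subject of the paper):

```latex
(1 - \delta_s)\,\|x\|_2^2 \ \le\ \|A x\|_2^2 \ \le\ (1 + \delta_s)\,\|x\|_2^2
\qquad \text{for all } s\text{-sparse } x,
```

with exact or stable recovery guaranteed when the constant $\delta_s$ is small enough.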

Shipping Data Generation for the Hunter Valley Coal Chain

Strategic capacity planning is a core activity for the Hunter Valley Coal Chain Coordinator as demand for coal is expected to double in the next decade. Optimization and simulation models are used to suggest and evaluate infrastructure expansions and operating policy changes. These models require input data in the form of shipping stems, which are … Read more

Quadratic Outer Approximation for Convex Integer Programming

We present a quadratic outer approximation scheme for solving general convex integer programs, in which suitable quadratic approximations are used to underestimate the objective function instead of classical linear approximations. The resulting surrogate problem is that of minimizing a function given as the maximum of finitely many convex quadratic functions having the same … Read more
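A toy one-dimensional illustration of the idea (the sample function, sample points, and the bound $\mu$ are all hypothetical, not from the paper): when a global lower bound $\mu$ on the objective's curvature is known, each linearization can be tightened into a quadratic underestimator, and the surrogate to minimize is the pointwise maximum of these quadratics.

```python
import math

def f(x):
    # Example convex objective: f''(x) = e^x + 2 >= 2 everywhere.
    return math.exp(x) + x * x

def fprime(x):
    return math.exp(x) + 2.0 * x

MU = 2.0  # assumed global lower bound on f''

sample_points = [-1.0, 0.0, 1.0]

def surrogate(x):
    """Pointwise max of quadratic underestimators anchored at the sample points.

    Each piece f(p) + f'(p)(x - p) + (MU/2)(x - p)^2 lies below f by Taylor's
    theorem, since f'' >= MU, so the max is still an underestimator of f.
    """
    return max(
        f(p) + fprime(p) * (x - p) + 0.5 * MU * (x - p) ** 2
        for p in sample_points
    )

# Minimize the piecewise-quadratic surrogate on a fine grid.
grid = [-2.0 + 4.0 * i / 4000 for i in range(4001)]
x_best = min(grid, key=surrogate)
```

In the full scheme the surrogate is minimized over the integer lattice rather than a grid, and new sample points (cuts) are added iteratively; the sketch only shows why the max-of-quadratics surrogate bounds the objective from below.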