A new family of high order directions for unconstrained optimization inspired by Chebyshev and Shamanskii methods

The 1669-1670 Newton-Raphson method is still used to solve systems of equations and unconstrained optimization problems. Since then, other algorithms inspired by Newton's method have been proposed: in 1839 Chebyshev developed a high-order algorithm with cubic convergence, and in 1967 Shamanskii proposed an acceleration of Newton's method. By considering Newton-type methods as displacement directions, …
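As a rough illustration of the Shamanskii idea (not of the new direction family proposed in the paper), the sketch below runs Newton's method for unconstrained minimization but factorizes the Hessian only once every m inner steps; the function names and the toy problem are our own.

```python
import numpy as np

def shamanskii_newton(grad, hess, x0, m=2, tol=1e-10, max_iter=100):
    """Newton's method with Shamanskii-style Hessian reuse:
    the Hessian is factorized once and reused for m steps."""
    x = x0.astype(float)
    for _ in range(max_iter):
        L = np.linalg.cholesky(hess(x))      # factorize once (assumes PD Hessian) ...
        for _ in range(m):                   # ... reuse it for m Newton-like steps
            g = grad(x)
            if np.linalg.norm(g) < tol:
                return x
            y = np.linalg.solve(L, g)        # forward solve L y = g
            x = x - np.linalg.solve(L.T, y)  # back solve L^T s = y, step x - s
    return x

# Toy test: f(x) = 0.5 x^T A x + 0.25 (x^T x)^2, a strictly convex function
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x + np.dot(x, x) * x
hess = lambda x: A + np.dot(x, x) * np.eye(2) + 2.0 * np.outer(x, x)
print(shamanskii_newton(grad, hess, np.array([3.0, -2.0])))  # -> near the origin
```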

On the Difficulty of Deciding Asymptotic Stability of Cubic Homogeneous Vector Fields

It is well-known that asymptotic stability (AS) of homogeneous polynomial vector fields of degree one (i.e., linear systems) can be decided in polynomial time, e.g., by searching for a quadratic Lyapunov function. Since homogeneous vector fields of even degree can never be AS, the next interesting degree to consider is three. In this …
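For the degree-one case mentioned above, the decision procedure is indeed a Lyapunov equation; a minimal sketch using SciPy's solver (the test matrix is our own) follows.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def is_asymptotically_stable(A, tol=1e-9):
    """Decide AS of x' = A x by solving A^T P + P A = -I and checking
    that the Lyapunov candidate V(x) = x^T P x is positive definite."""
    n = A.shape[0]
    # solve_continuous_lyapunov solves A X + X A^H = Q, so pass A^T and -I
    P = solve_continuous_lyapunov(A.T, -np.eye(n))
    P = (P + P.T) / 2.0  # symmetrize against round-off
    return bool(np.all(np.linalg.eigvalsh(P) > tol))

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # eigenvalues -1 and -2: stable
print(is_asymptotically_stable(A))         # True
```

The Lyapunov equation is solvable whenever no two eigenvalues of A sum to zero, and the resulting P is positive definite exactly when A is Hurwitz, which is what the check exploits.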

Global Search Strategies for Solving Multilinear Least-squares Problems

The multilinear least-squares (MLLS) problem is an extension of the linear least-squares problem. The difference is that a multilinear operator is used in place of a matrix-vector product. The MLLS problem is typically a large-scale problem characterized by a large number of local minimizers. It originates, for instance, from the design of filter networks. We present …
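As a toy instance of the setup (our own notation, a bilinear operator rather than the higher-order ones arising in filter networks), one simple global search strategy is multi-start alternating least squares: each subproblem is linear, and random restarts probe different basins.

```python
import numpy as np

def als_rank_one(B, v0, iters=200):
    """Alternating least squares for min_{u,v} ||u v^T - B||_F^2:
    with one factor fixed, the other solves a linear least-squares problem."""
    v = v0.copy()
    for _ in range(iters):
        u = B @ v / (v @ v)        # best u for fixed v
        v = B.T @ u / (u @ u)      # best v for fixed u
    return u, v, np.linalg.norm(np.outer(u, v) - B)

def multistart(B, n_starts=20, seed=0):
    """Crude global search: restart ALS from random points, keep the best."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        u, v, res = als_rank_one(B, rng.standard_normal(B.shape[1]))
        if best is None or res < best[2]:
            best = (u, v, res)
    return best

B = np.random.default_rng(1).standard_normal((8, 5))
print(multistart(B)[2])  # residual of the best local minimizer found
```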

Inexact Restoration method for Derivative-Free Optimization with smooth constraints

A new method is introduced for solving constrained optimization problems in which the derivatives of the constraints are available but the derivatives of the objective function are not. The method is based on the Inexact Restoration framework, by means of which each iteration is divided into two phases. In the first phase, one considers only …
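The sketch below is only a crude caricature of the two-phase idea under our own simplifications (equality constraints, a Gauss-Newton restoration phase, and a derivative-free poll along tangent directions in the minimization phase); it is not the paper's algorithm.

```python
import numpy as np

def inexact_restoration(f, h, Jh, x0, iters=30, t0=1.0):
    """Two-phase sketch: (1) restoration: Gauss-Newton on h(x) = 0, whose
    Jacobian Jh is available; (2) minimization: a simple derivative-free
    search for f along tangent directions of the constraints."""
    x = x0.astype(float)
    for _ in range(iters):
        # Phase 1: reduce infeasibility using constraint derivatives.
        x = x - np.linalg.lstsq(Jh(x), h(x), rcond=None)[0]
        # Phase 2: derivative-free decrease of f in the tangent space.
        _, _, Vt = np.linalg.svd(Jh(x))
        T = Vt[Jh(x).shape[0]:]          # basis of the null space of Jh(x)
        for d in np.vstack([T, -T]):     # poll +/- each tangent direction
            while f(x + t0 * d) < f(x) - 1e-8:
                x = x + t0 * d
        t0 *= 0.5                        # shrink the poll step
    return x

# Toy problem: minimize f on the unit circle h(x) = x1^2 + x2^2 - 1 = 0
f  = lambda x: (x[0] - 2.0)**2 + x[1]**2          # derivatives treated as unavailable
h  = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0])
Jh = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]]])
print(inexact_restoration(f, h, Jh, np.array([0.0, 1.5])))  # -> near (1, 0)
```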

Sobolev Seminorm of Quadratic Functions with Applications to Derivative-Free Optimization

This paper studies the $H^1$ Sobolev seminorm of quadratic functions. The research is motivated by the least-norm interpolation that is widely used in derivative-free optimization. We express the $H^1$ seminorm of a quadratic function explicitly in terms of the Hessian and the gradient when the underlying domain is a ball. The seminorm gives new insights …
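For a ball centered at the origin (our assumption), such an explicit expression follows from a direct computation: writing $q(x) = c + g^T x + \frac{1}{2} x^T G x$, the odd terms of $\|\nabla q\|^2 = \|g + Gx\|^2$ integrate to zero, and $\int_{B(0,r)} x_i^2\,dx = \mathrm{vol}(B)\, r^2/(n+2)$, which gives $|q|_{H^1}^2 = \mathrm{vol}(B)\,(\|g\|^2 + \frac{r^2}{n+2}\|G\|_F^2)$. The snippet below checks this identity numerically on random data.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
n, r = 3, 2.0
g = rng.standard_normal(n)
G = rng.standard_normal((n, n))
G = (G + G.T) / 2.0                                # symmetric Hessian

# Closed form: |q|_{H^1}^2 = vol(B) * (||g||^2 + r^2/(n+2) * ||G||_F^2)
vol = math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n
closed = vol * (g @ g + r**2 / (n + 2) * np.sum(G * G))

# Monte Carlo: uniform points in B(0, r), average ||g + G x||^2
X = rng.standard_normal((200_000, n))
X /= np.linalg.norm(X, axis=1, keepdims=True)      # uniform directions
X *= r * rng.uniform(size=(len(X), 1)) ** (1.0 / n)  # radius r * U^(1/n)
mc = vol * np.mean(np.sum((g + X @ G) ** 2, axis=1))  # X @ G: rows (G x_i)^T

print(closed, mc)  # the two values should agree to within ~1%
```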

Robust inversion, dimensionality reduction, and randomized sampling

We consider a class of inverse problems in which the forward model is the solution operator to linear ODEs or PDEs. This class admits several dimensionality-reduction techniques based on data averaging or sampling, which are especially useful for large-scale problems. We survey these approaches and their connection to stochastic optimization. The data-averaging approach is only …
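One concrete instance of the sampling idea (our own minimal setup, not the paper's): when the misfit is a sum over many right-hand sides, it can be estimated from a few random mixtures of the data, as in stochastic trace estimation.

```python
import numpy as np

def full_misfit(R):
    """Full data misfit ||R||_F^2, where R holds residuals for all sources."""
    return np.sum(R * R)

def sampled_misfit(R, k, rng):
    """Unbiased estimate of ||R||_F^2 from k random source mixtures:
    E ||R w||^2 = ||R||_F^2 for Rademacher (+/-1) weights w."""
    W = rng.choice([-1.0, 1.0], size=(R.shape[1], k))
    return np.sum((R @ W) ** 2) / k

rng = np.random.default_rng(0)
R = rng.standard_normal((50, 400))       # residuals for 400 sources
print(full_misfit(R))
print(sampled_misfit(R, k=20, rng=rng))  # close, at 5% of the sources
```

Since E[w w^T] = I makes the estimator unbiased, the practical question is its variance, i.e., how small k can be made before the optimization is misled.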

A globally and R-linearly convergent hybrid HS and PRP method and its inexact version with applications

A hybrid HS and PRP type conjugate gradient method for smooth optimization is presented, which reduces to the classical PRP or HS method if an exact line search is used, and which converges globally and R-linearly for nonconvex functions with an inexact backtracking line search under standard assumptions. An inexact version of the proposed method, which admits …
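The paper's precise hybrid rule is not reproduced here; the sketch below uses our own stand-in, beta = max(0, min(beta_HS, beta_PRP)), together with an Armijo backtracking line search and a steepest-descent safeguard.

```python
import numpy as np

def hybrid_cg(f, grad, x0, iters=200, tol=1e-8):
    """Nonlinear CG with a hybrid HS/PRP beta and Armijo backtracking."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0                          # Armijo backtracking (inexact search)
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        beta_hs  = (g_new @ y) / dy if dy != 0 else 0.0  # Hestenes-Stiefel
        beta_prp = (g_new @ y) / (g @ g)                 # Polak-Ribiere-Polyak
        beta = max(0.0, min(beta_hs, beta_prp))          # hybrid, kept nonnegative
        d = -g_new + beta * d
        if g_new @ d >= 0:               # safeguard: ensure a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Rosenbrock test (nonconvex); iterates should move toward the minimizer (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(hybrid_cg(f, grad, np.array([-1.2, 1.0])))
```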

Global Error bounds for systems of convex polynomials over polyhedral constraints

This paper is devoted to studying Lipschitzian/Hölderian-type global error bounds for systems of finitely many convex polynomial inequalities over a polyhedral constraint. Firstly, for systems of this type, we show that under a suitable asymptotic qualification condition, the Lipschitzian-type global error bound property is equivalent to the Abadie qualification condition; in particular, …
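For concreteness, writing S = {x in P : g_i(x) <= 0, i = 1,...,m} with convex polynomials g_i and a polyhedron P, the two notions of global error bound can be stated as follows (a standard formulation assumed here; the constant tau and exponent gamma are problem dependent).

```latex
% Residual of the system at x \in P:
[g(x)]_+ := \max_{1 \le i \le m} \max\{ g_i(x),\, 0 \}.

% Lipschitzian (linear) global error bound:
\operatorname{dist}(x, S) \le \tau \, [g(x)]_+ \qquad \forall\, x \in P.

% Holderian global error bound with exponent \gamma \in (0, 1]:
\operatorname{dist}(x, S) \le \tau \left( [g(x)]_+ + [g(x)]_+^{\gamma} \right) \qquad \forall\, x \in P.
```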

TACO – A Toolkit for AMPL Control Optimization

We describe a set of extensions to the AMPL modeling language to conveniently model mixed-integer optimal control problems for ODE or DAE dynamic processes. These extensions are realized as AMPL user functions and suffixes and do not require intrusive changes to the AMPL language standard or implementation itself. We describe and provide TACO, a Toolkit …

Sensitivity analysis for two-level value functions with applications to bilevel programming

This paper contributes to a deeper understanding of the link between a now-conventional framework in hierarchical optimization, known as the optimistic bilevel problem, and its initial, more difficult formulation, which we call here the original optimistic bilevel optimization problem. It follows from this research that, although the process of deriving necessary …
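In the notation that is standard for this literature (assumed here, not taken from the paper), the two-level value function and the problem it defines read as follows.

```latex
% Lower-level solution map and two-level (optimistic) value function:
S(x) := \operatorname*{argmin}_{y} \{\, f(x, y) : g(x, y) \le 0 \,\},
\qquad
\varphi_o(x) := \min_{y} \{\, F(x, y) : y \in S(x) \,\}.

% Original optimistic bilevel problem: minimize the value function itself:
\min_{x \in X} \; \varphi_o(x).
```

Sensitivity analysis of the two-level value function $\varphi_o$ is what allows optimality conditions to be transferred between this original formulation and the conventional one.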