Necessary optimality conditions in pessimistic bilevel programming

This paper is devoted to the so-called pessimistic version of bilevel programs. Minimization problems of this type are challenging to handle partly because the corresponding value functions are often merely upper (while not lower) semicontinuous. Employing advanced tools of variational analysis and generalized differentiation, we provide rather general frameworks ensuring the Lipschitz continuity of … Read more

Squeeze-and-Breathe Evolutionary Monte Carlo Optimisation with Local Search Acceleration and its application to parameter fitting

Estimating parameters from data is a key stage of the modelling process, particularly in biological systems where many parameters need to be estimated from sparse and noisy data sets. Over the years, a variety of heuristics have been proposed to solve this complex optimisation problem, with good results in some cases yet with limitations in … Read more

Parallel algebraic multilevel Schwarz preconditioners for a class of elliptic PDE systems

We present algebraic multilevel preconditioners for linear systems arising from the discretization of systems of coupled elliptic partial differential equations (PDEs). These preconditioners are based on modifications of Schwarz methods and of the smoothed aggregation technique, where the coarsening strategy and the restriction and prolongation operators are defined using a point-based approach with a primary … Read more
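The paper's multilevel, smoothed-aggregation construction is considerably more elaborate, but the underlying Schwarz idea can be sketched in a few lines: split the unknowns into overlapping blocks, solve each block locally, and sum the corrections to form a preconditioner. The one-level additive Schwarz preconditioner below, applied inside CG to a 1-D model problem, is a minimal illustration only (the matrix, block sizes, and overlap are illustrative choices, not the paper's setup).

```python
import numpy as np

# 1-D Laplacian (tridiagonal) as a stand-in for a discretized elliptic PDE.
n = 64
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Overlapping index blocks (subdomains) for one-level additive Schwarz.
block, overlap = 16, 4
subdomains = [np.arange(max(0, s - overlap), min(n, s + block + overlap))
              for s in range(0, n, block)]

def additive_schwarz(r):
    """Apply M^{-1} r = sum_i R_i^T A_i^{-1} R_i r over the subdomains."""
    z = np.zeros_like(r)
    for idx in subdomains:
        Ai = A[np.ix_(idx, idx)]
        z[idx] += np.linalg.solve(Ai, r[idx])
    return z

# Preconditioned conjugate gradients with the Schwarz preconditioner.
b = np.ones(n)
x = np.zeros(n)
r = b - A @ x
z = additive_schwarz(r)
p = z.copy()
rz = r @ z
for _ in range(200):
    Ap = A @ p
    alpha = rz / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    if np.linalg.norm(r) < 1e-10:
        break
    z = additive_schwarz(r)
    rz_new = r @ z
    p = z + (rz_new / rz) * p
    rz = rz_new
```

A multilevel variant would add a coarse-space correction on top of these local solves, which is where the aggregation-based coarsening enters.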

A preconditioning framework for sequences of diagonally modified linear systems arising in optimization

We propose a framework for building preconditioners for sequences of linear systems of the form $(A+\Delta_k) x_k=b_k$, where $A$ is symmetric positive semidefinite and $\Delta_k$ is diagonal positive semidefinite. Such sequences arise in several optimization methods, e.g., in affine-scaling methods for bound-constrained convex quadratic programming and bound-constrained linear least squares, as well as in trust-region … Read more
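The core economy such a framework exploits can be sketched simply: factorize the fixed part $A$ once and reuse that factorization as a preconditioner for every diagonally modified system $(A+\Delta_k)x_k=b_k$. The snippet below is a minimal sketch under the assumption that $A$ is positive definite (the paper allows semidefinite $A$), with randomly generated $\Delta_k$ and right-hand sides; it is not the paper's preconditioner construction.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
n = 50
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)        # symmetric positive definite "base" matrix

# Factor A once; reuse cho_solve as the preconditioner for every A + Delta_k.
factor = cho_factor(A)

def pcg(Amat, b, precond, tol=1e-10, maxit=500):
    """Preconditioned conjugate gradients."""
    x = np.zeros_like(b)
    r = b.copy()
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = Amat @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

for k in range(3):                  # a short sequence of diagonal modifications
    Delta = np.diag(rng.uniform(0, 1, n))
    b = rng.standard_normal(n)
    x = pcg(A + Delta, b, lambda r: cho_solve(factor, r))
```

Since $\Delta_k$ only shifts the spectrum of $A$, the frozen factorization remains an effective preconditioner across the whole sequence, avoiding a refactorization at every $k$.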

The Lagrange method and SAO with bounds on the dual variables

We consider the general nonlinear programming problem with equality and inequality constraints when the variables x are confined to a compact set. We regard the Lagrange multipliers as dual variables lambda, those of the inequalities being nonnegative. For each lambda, we let phi(lambda) be the least value of the Lagrange function, which occurs at x=x(lambda), … Read more
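The objects in this abstract can be made concrete on a toy problem: minimizing the Lagrange function over the compact set gives phi(lambda), and maximizing phi over nonnegative lambda recovers the primal optimum when strong duality holds. The example below (problem data and grid search are illustrative choices, not the paper's method) works this out for a one-dimensional problem.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy problem: minimize f(x) = x^2  subject to  g(x) = 1 - x <= 0,
# with x confined to the compact set [-2, 2].  Optimum: x* = 1, f* = 1.
def phi(lam):
    """phi(lambda) = least value of the Lagrange function over the compact set."""
    L = lambda x: x**2 + lam * (1.0 - x)
    return minimize_scalar(L, bounds=(-2.0, 2.0), method="bounded").fun

# Maximize the (concave) dual function over lambda >= 0 by a simple grid search.
lams = np.linspace(0.0, 4.0, 401)
vals = [phi(l) for l in lams]
best = max(vals)
```

Here phi(lambda) = lambda - lambda^2/4 for lambda in [0, 4], so the dual maximum is attained at lambda = 2 with phi(2) = 1, matching the primal optimal value.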

A New Probabilistic Algorithm for Solving Nonlinear Equations Systems

In this paper, we consider a class of optimization problems with the following characteristic: there exists a fixed number k, independent of the problem size n, such that randomly changing the values of k variables can yield a new solution that is better than … Read more
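The structural property described above suggests a very simple randomized local search: repeatedly resample k randomly chosen variables and keep the trial point whenever it improves the objective. The sketch below illustrates this on a separable test function; the objective, bounds, and iteration budget are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    # Separable test function; global minimum 0 at x = 0.
    return float(np.sum(x**2))

n, k = 30, 3          # problem size n; k perturbed variables, independent of n
x = rng.uniform(-5, 5, n)
fx = objective(x)

for _ in range(20000):
    trial = x.copy()
    idx = rng.choice(n, size=k, replace=False)   # pick k coordinates at random
    trial[idx] = rng.uniform(-5, 5, k)           # resample just those values
    ft = objective(trial)
    if ft < fx:                                  # keep only improvements
        x, fx = trial, ft
```

Because only k coordinates change per step, each trial evaluation could in principle be updated incrementally for separable objectives, which is what makes this scheme cheap even for large n.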

Modifications of the limited-memory BNS method for better satisfaction of previous quasi-Newton conditions

Several modifications of the limited-memory variable metric BNS method for large scale unconstrained optimization are proposed, which consist in corrections (derived from the idea of conjugate directions) of the difference vectors used, to improve satisfaction of previous quasi-Newton conditions, utilizing information from previous or subsequent iterations. In case of quadratic objective functions, conjugacy of all … Read more
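The phenomenon these corrections address is easy to demonstrate: a standard quasi-Newton update satisfies the most recent secant (quasi-Newton) condition B s = y exactly, but generally violates the conditions from earlier iterations. The sketch below checks this for the BFGS update on a quadratic (the BNS method's limited-memory machinery and the proposed corrections are not reproduced here; the quadratic and the random directions are illustrative).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)            # SPD Hessian of the quadratic f(x) = x^T A x / 2

def bfgs_update(B, s, y):
    """Standard BFGS update of an approximate Hessian B with the pair (s, y)."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)

B = np.eye(n)
pairs = []
for _ in range(3):
    s = rng.standard_normal(n)     # random step directions (no line search)
    y = A @ s                      # for a quadratic, y_k = A s_k exactly
    B = bfgs_update(B, s, y)
    pairs.append((s, y))

# The latest quasi-Newton (secant) condition holds exactly ...
s_last, y_last = pairs[-1]
err_last = np.linalg.norm(B @ s_last - y_last)
# ... while earlier ones are generally violated, which is what correcting
# the difference vectors aims to mitigate.
s_old, y_old = pairs[0]
err_old = np.linalg.norm(B @ s_old - y_old)
```

With conjugate step directions the older conditions would also hold on a quadratic, which is precisely the motivation for deriving the corrections from conjugacy.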

A Family of Newton Methods for Nonsmooth Constrained Systems with Nonisolated Solutions

We propose a new family of Newton-type methods for the solution of constrained systems of equations. Under suitable conditions, that do not include differentiability or local uniqueness of solutions, local, quadratic convergence to a solution of the system of equations can be established. We show that as particular instances of the method we obtain inexact … Read more

A new family of high order directions for unconstrained optimization inspired by Chebyshev and Shamanskii methods

The Newton-Raphson method, which dates back to 1669-1670, is still used to solve systems of equations and unconstrained optimization problems. Since then, other algorithms inspired by Newton’s method have been proposed: in 1839 Chebyshev developed a high-order algorithm with cubic convergence, and in 1967 Shamanskii proposed an acceleration of Newton’s method. By considering Newton-type methods as displacement directions, … Read more
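The two classical schemes named above differ in how often the derivative is recomputed: Newton's method re-evaluates it at every step, while Shamanskii's acceleration reuses one derivative for several inner steps. A minimal one-dimensional comparison (the test equation and iteration counts are illustrative):

```python
# Solve f(x) = x^3 - 2 = 0 (root 2^(1/3)) with Newton's method and with
# Shamanskii's acceleration, which reuses one derivative for m inner steps.
f = lambda x: x**3 - 2.0
fp = lambda x: 3.0 * x**2

def newton(x, iters=8):
    for _ in range(iters):
        x -= f(x) / fp(x)          # derivative recomputed at every step
    return x

def shamanskii(x, outer=4, m=2):
    for _ in range(outer):
        d = fp(x)                  # derivative evaluated once per outer iteration
        for _ in range(m):         # ... then reused for m Newton-like steps
            x -= f(x) / d
    return x

root = 2.0 ** (1.0 / 3.0)
```

Both reach the root to machine precision here; Shamanskii's variant does so with half as many derivative evaluations, which is the point of the acceleration when derivatives (or Jacobians) are expensive.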

On the Difficulty of Deciding Asymptotic Stability of Cubic Homogeneous Vector Fields

It is well-known that asymptotic stability (AS) of homogeneous polynomial vector fields of degree one (i.e., linear systems) can be decided in polynomial time e.g. by searching for a quadratic Lyapunov function. Since homogeneous vector fields of even degree can never be AS, the next interesting degree to consider is equal to three. In this … Read more