Non-Convex Mixed-Integer Nonlinear Programming: A Survey

A wide range of problems arising in practical applications can be formulated as Mixed-Integer Nonlinear Programs (MINLPs). For the case in which the objective and constraint functions are convex, some quite effective exact and heuristic algorithms are available. When non-convexities are present, however, things become much more difficult, since then even the continuous relaxation is … Read more
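As a minimal illustration (not taken from the survey itself), a single bilinear term is already enough to destroy convexity of the continuous relaxation: in
\[
\min \; z \quad \text{s.t.} \quad z \ge x\,y, \qquad 0 \le x, y \le 1, \qquad y \in \mathbb{Z},
\]
dropping the integrality of $y$ still leaves the non-convex set $\{(x,y,z) : z \ge xy\}$; for instance, $(1,0,0)$ and $(0,1,0)$ are feasible while their midpoint $(\tfrac12,\tfrac12,0)$ is not.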

A Stochastic Gradient Method with an Exponential Convergence Rate for Strongly-Convex Optimization with Finite Training Sets

We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. Numerical … Read more
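A minimal sketch of the idea of keeping a memory of per-component gradients, applied here to an $\ell_2$-regularized least-squares problem (the data, step size, and iteration count are illustrative, not taken from the paper):

```python
import numpy as np

# Sketch of a stochastic gradient method that keeps a memory of per-example
# gradients, applied to an l2-regularized least-squares problem.
rng = np.random.default_rng(0)
n, d, lam = 200, 10, 0.1
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of f_i(x) = 0.5*(a_i^T x - b_i)^2 + 0.5*lam*||x||^2.
    return A[i] * (A[i] @ x - b[i]) + lam * x

x = np.zeros(d)
memory = np.zeros((n, d))          # last stored gradient of each example
grad_sum = memory.sum(axis=0)      # running sum of the stored gradients
step = 1.0 / (16 * (np.max(np.sum(A ** 2, axis=1)) + lam))

for _ in range(20000):
    i = rng.integers(n)
    g_new = grad_i(x, i)
    grad_sum += g_new - memory[i]  # refresh the memory of example i only
    memory[i] = g_new
    x -= step * grad_sum / n       # step along the average of stored gradients

print(0.5 * np.mean((A @ x - b) ** 2) + 0.5 * lam * (x @ x))
```

Each iteration costs one gradient evaluation, as in plain stochastic gradient, but the update direction is the average of the stored gradients rather than the latest one alone.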

Numerical Optimization of Eigenvalues of Hermitian Matrix Functions

The eigenvalues of a Hermitian matrix function that depends analytically on one parameter can be ordered so that each eigenvalue is an analytic function of the parameter. Ordering these analytic eigenvalues from largest to smallest yields continuous, piecewise analytic functions. For multi-variate Hermitian matrix functions that depend analytically on $d$ parameters, the … Read more
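A quick illustration of the one-parameter case (generic, not the authors' algorithm): the eigenvalues of $H(t) = A + tB$ on a grid, sorted from largest to smallest at each $t$, trace out the continuous, piecewise analytic curves described above.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_hermitian(n):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A, B = random_hermitian(4), random_hermitian(4)
ts = np.linspace(-2.0, 2.0, 401)
# curves[:, k] is the k-th largest eigenvalue of H(t) = A + t*B over the grid.
curves = np.array([np.sort(np.linalg.eigvalsh(A + t * B))[::-1] for t in ts])
print(curves.shape)           # (401, 4)
print(curves[:, 0].max())     # crude grid maximization of the largest eigenvalue
```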

A Fast Algorithm for Constructing Efficient Event-Related fMRI Designs

We propose a novel, efficient approach for obtaining high-quality experimental designs for event-related functional magnetic resonance imaging (ER-fMRI). Our approach combines a greedy hill-climbing algorithm and a cyclic permutation method. When searching for optimal ER-fMRI designs, the proposed approach focuses only on a promising restricted class of designs with equal frequency of occurrence across stimulus … Read more
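A rough sketch of how the two named ingredients might fit together, with a placeholder criterion standing in for the actual ER-fMRI design efficiency (the real criterion involves the hemodynamic response and estimation model, which are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)

def design_efficiency(seq):
    # Placeholder criterion (hypothetical): penalize autocorrelation in the
    # stimulus sequence; the real ER-fMRI efficiency is model-based.
    x = seq - seq.mean()
    return -np.abs(np.correlate(x, x, mode="full")[len(x):]).sum()

def best_cyclic_shift(seq):
    # Cyclic-permutation step: keep the best rotation of the current design.
    return max((np.roll(seq, s) for s in range(len(seq))), key=design_efficiency)

# Start from a design with equal frequency of each stimulus type, then greedily
# accept swap moves (which preserve the frequencies) followed by a cyclic-shift pass.
seq = np.repeat([1.0, 2.0, 3.0, 4.0], 10)
rng.shuffle(seq)
best = best_cyclic_shift(seq)
for _ in range(300):
    cand = best.copy()
    i, j = rng.integers(len(cand), size=2)
    cand[i], cand[j] = cand[j], cand[i]
    cand = best_cyclic_shift(cand)
    if design_efficiency(cand) > design_efficiency(best):
        best = cand
print(design_efficiency(best))
```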

Proximal Point Method for Minimizing Quasiconvex Locally Lipschitz Functions on Hadamard Manifolds

In this paper we propose an extension of the proximal point method to solve minimization problems with quasiconvex, locally Lipschitz objective functions on Hadamard manifolds. To this end, we use the concept of the Clarke subdifferential on Hadamard manifolds and, assuming that the function is bounded from below, we prove the global convergence of the … Read more
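For orientation, a generic form of the proximal iteration in this setting (stated in general terms, not as the paper's exact scheme) is
\[
x^{k+1} \in \arg\min_{x \in M} \left\{ f(x) + \frac{\lambda_k}{2}\, d^2(x, x^k) \right\},
\]
where $M$ is the Hadamard manifold, $d$ its Riemannian distance, and $\lambda_k > 0$ a regularization parameter; the quasiconvexity and local Lipschitz continuity of $f$ are what the Clarke subdifferential machinery has to accommodate.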

A Probability-Driven Search Algorithm for Solving Multi-Objective Optimization Problems

This paper proposes a new probabilistic algorithm for solving multi-objective optimization problems, the Probability-Driven Search Algorithm. The algorithm uses probabilities to control the search for Pareto-optimal solutions. In particular, we use an absorbing Markov chain to establish the convergence of the algorithm. We test this approach by implementing the algorithm on some benchmark … Read more
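For readers unfamiliar with the tool invoked in the convergence argument: writing the transition matrix of an absorbing Markov chain in canonical form
\[
P = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}, \qquad N = (I - Q)^{-1}, \qquad B = N R,
\]
where $Q$ collects transitions among transient states, the chain is absorbed with probability one, $N$ is the fundamental matrix, and $B_{ij}$ is the probability of ending in absorbing state $j$ from transient state $i$; how the algorithm's states map onto this structure is described in the paper itself.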

Pessimistic Bi-Level Optimisation

Bi-level problems are optimisation problems in which some of the decision variables must optimise a subordinate (lower-level) problem. In general, the lower-level problem can possess multiple optimal solutions. One therefore distinguishes between optimistic formulations, which assume that the most favourable lower-level solution is implemented, and pessimistic formulations, in which the most adverse lower-level solution is … Read more
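In symbols, with $S(x)$ denoting the set of optimal lower-level solutions for the upper-level decision $x$, the two formulations read
\[
\min_{x} \; \min_{y \in S(x)} F(x, y) \quad \text{(optimistic)}
\qquad\text{versus}\qquad
\min_{x} \; \max_{y \in S(x)} F(x, y) \quad \text{(pessimistic)}.
\]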

On feasibility based bounds tightening

Mathematical programming problems involving nonconvexities are usually solved to optimality using a (spatial) Branch-and-Bound algorithm. Algorithmic efficiency depends on many factors, among which the width of the bounding box for the problem variables at each Branch-and-Bound node naturally plays a critical role. The fastest box-tightening algorithm in practice is known as FBBT (Feasibility-Based Bounds Tightening): an … Read more
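A minimal sketch of one FBBT pass on a single linear constraint $\sum_i a_i x_i \le b$ (FBBT in spatial Branch-and-Bound also propagates through nonlinear expressions via interval arithmetic; only the linear case is illustrated here):

```python
import numpy as np

def fbbt_pass(a, b, lo, hi):
    # One feasibility-based tightening pass for sum_i a[i]*x[i] <= b,
    # with variable bounds given by the arrays lo and hi.
    a = np.asarray(a, dtype=float)
    lo, hi = np.asarray(lo, dtype=float).copy(), np.asarray(hi, dtype=float).copy()
    for j in range(len(a)):
        if a[j] == 0:
            continue
        # Smallest value the rest of the constraint can take over the current box.
        rest_min = sum(a[i] * (lo[i] if a[i] > 0 else hi[i])
                       for i in range(len(a)) if i != j)
        slack = b - rest_min
        if a[j] > 0:
            hi[j] = min(hi[j], slack / a[j])
        else:
            lo[j] = max(lo[j], slack / a[j])
    return lo, hi

# Example: x0 + 2*x1 <= 4 with x0, x1 in [0, 10] tightens to x0 in [0, 4], x1 in [0, 2].
print(fbbt_pass([1.0, 2.0], 4.0, [0.0, 0.0], [10.0, 10.0]))
```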

Squeeze-and-Breathe Evolutionary Monte Carlo Optimisation with Local Search Acceleration and its application to parameter fitting

Estimating parameters from data is a key stage of the modelling process, particularly in biological systems where many parameters need to be estimated from sparse and noisy data sets. Over the years, a variety of heuristics have been proposed to solve this complex optimisation problem, with good results in some cases yet with limitations in … Read more

Proximal Methods with Bregman Distances to Solve VIP on Hadamard Manifolds

We present an extension of the proximal point method with Bregman distances to solve Variational Inequality Problems (VIP) on Hadamard manifolds (simply connected, finite-dimensional Riemannian manifolds with nonpositive sectional curvature). Under some natural assumptions, such as the existence of solutions of the VIP and the monotonicity of the multivalued vector field, we prove … Read more
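For orientation, in Euclidean notation a generic Bregman proximal step for a monotone operator $T$ reads
\[
0 \in T(x^{k+1}) + \lambda_k \bigl( \nabla h(x^{k+1}) - \nabla h(x^k) \bigr),
\qquad
D_h(x, y) = h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle,
\]
where $D_h$ is the Bregman distance induced by a strictly convex function $h$; the paper's contribution is to carry a scheme of this type over to multivalued vector fields on Hadamard manifolds.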