An implementation of the steepest descent method using retractions on Riemannian manifolds

In 2008, Absil et al. published a book on optimization methods on Riemannian manifolds. The authors developed steepest descent, Newton, trust-region, and conjugate gradient methods using an approximation of the geodesic called a retraction. In this paper we present implementations of the steepest descent method of Absil et al. in MATLAB. We show the …
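
Below is a minimal sketch of the general technique the abstract describes: gradient descent on a manifold where each step is pulled back onto the manifold by a retraction. The unit sphere, the Rayleigh-quotient objective, and the fixed step size are assumptions chosen for illustration (the book's method uses an Armijo line search); this is not the paper's MATLAB implementation.

```python
# Retraction-based steepest descent on the unit sphere (illustrative sketch).
# Objective f(x) = x^T A x restricted to ||x|| = 1; its minimizer is the
# eigenvector of the smallest eigenvalue of A. A fixed step size stands in
# for the Armijo line search used in practice.
import numpy as np

def sphere_steepest_descent(A, x0, step=0.05, tol=1e-8, max_iter=1000):
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        egrad = 2 * A @ x                    # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
        if np.linalg.norm(rgrad) < tol:
            break
        y = x - step * rgrad                 # step in the tangent direction
        x = y / np.linalg.norm(y)            # retraction: renormalize onto the sphere
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2
x = sphere_steepest_descent(A, rng.standard_normal(5))
print("f(x) =", x @ A @ x, " vs  lambda_min =", np.linalg.eigvalsh(A)[0])
```

On the sphere, renormalizing the tangent step is the canonical retraction: it is a first-order approximation of the geodesic (great-circle) step, which is exactly the role retractions play in this framework.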

A Multi-Layer Line Search Method to Improve the Initialization of Optimization Algorithms

We introduce a novel metaheuristic methodology to improve the initialization of a given deterministic or stochastic optimization algorithm. Our objective is to improve the performance of the considered algorithm, called the core optimization algorithm, by reducing its number of cost-function evaluations, increasing its success rate, and improving the precision of its results. In …
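
Since the abstract is truncated, the following is only a generic illustration of the initialization idea, not the paper's multi-layer method: spend a few cheap evaluations ranking candidate starting points, probe along the segment joining the two best (a one-layer "line search" between candidates), and hand the winner to the core algorithm. The helper names, the Rastrigin test function, and the choice of Nelder-Mead as the core algorithm are all hypothetical.

```python
# Generic sketch: cheap initialization pass before a "core" optimizer.
import numpy as np
from scipy.optimize import minimize

def coarse_init(f, candidates, n_line=5):
    # Rank random candidates, then probe points on the segment joining
    # the two best ones before picking the overall best starting point.
    vals = np.array([f(c) for c in candidates])
    a, b = candidates[np.argsort(vals)[:2]]
    probes = [(1 - t) * a + t * b for t in np.linspace(0.0, 1.0, n_line)]
    return min(list(candidates) + probes, key=f)

# Rastrigin: a standard multimodal test function.
f = lambda x: float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)) + 10 * x.size)
rng = np.random.default_rng(1)
cands = rng.uniform(-5, 5, size=(20, 2))
x0 = coarse_init(f, cands)
res = minimize(f, x0, method="Nelder-Mead")   # the "core" algorithm
print(res.x, res.fun)
```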

Inexact Proximal Point Methods for Quasiconvex Minimization on Hadamard Manifolds

In this paper we present two inexact proximal point algorithms to solve minimization problems for quasiconvex objective functions on Hadamard manifolds. We prove that, under natural assumptions, the sequences generated by the algorithms are well defined and converge to critical points of the problem. We also present an application of the method to demand theory …
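
As a point of reference, here is a minimal sketch of an inexact proximal point iteration in the simplest Hadamard manifold, R^n with the Euclidean metric: each outer step approximately minimizes f(y) + (λ/2)||y − x_k||², the inexactness coming from solving the subproblem with only a few gradient steps. The quasiconvex test function and all parameter values are assumptions for illustration; the paper's algorithms work on general Hadamard manifolds with the Riemannian distance in the proximal term.

```python
# Inexact proximal point sketch in R^n (a Hadamard manifold).
import numpy as np

def inexact_prox_point(grad_f, x0, lam=1.0, outer=50, inner=20, lr=0.1):
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        y = x.copy()
        for _ in range(inner):                      # inexact subproblem solve:
            y -= lr * (grad_f(y) + lam * (y - x))   # a few gradient steps only
        x = y
    return x

# Example: f(x) = sqrt(||x - c||) is quasiconvex (a monotone transform of a
# convex function) but not convex; its minimizer is c.
c = np.array([2.0, -1.0])
grad_f = lambda x: (x - c) / (2 * np.linalg.norm(x - c) ** 1.5 + 1e-12)
print(inexact_prox_point(grad_f, np.zeros(2)))      # approaches [2, -1]
```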

Successive Rank-One Approximations of Nearly Orthogonally Decomposable Symmetric Tensors

Many idealized problems in signal processing, machine learning, and statistics can be reduced to finding the symmetric canonical decomposition of an underlying symmetric and orthogonally decomposable (SOD) tensor. Drawing inspiration from the matrix case, the successive rank-one approximations (SROA) scheme has been proposed and shown to yield this tensor decomposition exactly, and …
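
A minimal sketch of the SROA idea for a symmetric 3-way tensor: repeatedly extract the best symmetric rank-one term (here via plain tensor power iteration) and deflate. The exactly decomposable test tensor and all parameters are assumptions for illustration; the paper's subject is what happens when the tensor is only nearly orthogonally decomposable.

```python
# Successive rank-one approximations (SROA) for a symmetric 3-tensor.
import numpy as np

def rank1_sym(T, iters=200, seed=0):
    # Symmetric rank-one approximation lam * u⊗u⊗u via tensor power iteration.
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(iters):
        v = np.einsum('ijk,j,k->i', T, u, u)   # apply T(., u, u)
        u = v / np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)
    return lam, u

def sroa(T, rank):
    factors = []
    for _ in range(rank):
        lam, u = rank1_sym(T)
        factors.append((lam, u))
        T = T - lam * np.einsum('i,j,k->ijk', u, u, u)   # deflate
    return factors

# Build an exactly orthogonally decomposable tensor: sum_r lam_r * e_r^{⊗3}.
n = 4
lams = [3.0, 2.0, 1.0]
T = sum(l * np.einsum('i,j,k->ijk', e, e, e) for l, e in zip(lams, np.eye(n)[:3]))
for lam, u in sroa(T, 3):
    print(round(lam, 3), np.round(u, 3))   # recovers (lam_r, e_r) pairs
```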

An Adaptive Unified Differential Evolution Algorithm for Global Optimization

In this paper, we propose a new adaptive unified differential evolution algorithm for single-objective global optimization. Instead of the multiple mutation strategies proposed in conventional differential evolution algorithms, this algorithm employs a single equation unifying multiple strategies into one expression. It has the virtue of mathematical simplicity and also provides users with the flexibility for broader …
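
To make the "single unifying equation" idea concrete, here is a sketch of differential evolution whose donor vector v_i = x_i + F1(x_best − x_i) + F2(x_r1 − x_i) + F3(x_r2 − x_r3) specializes to classic strategies for particular coefficient choices: (F1, F2, F3) = (0, 1, F) recovers DE/rand/1 and (1, 0, F) recovers DE/best/1. This unified form, the fixed coefficients, and the sphere test function are assumptions for illustration; the paper's unified equation and its adaptation rules may differ.

```python
# Differential evolution with one unified mutation expression (sketch).
import numpy as np

def unified_de(f, lo, hi, pop=30, gens=200, F1=0.3, F2=0.7, F3=0.7, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    d = lo.size
    X = rng.uniform(lo, hi, size=(pop, d))
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        best = X[np.argmin(fit)]
        for i in range(pop):
            r1, r2, r3 = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            # Single unified mutation equation covering several strategies:
            v = X[i] + F1*(best - X[i]) + F2*(X[r1] - X[i]) + F3*(X[r2] - X[r3])
            mask = rng.random(d) < CR
            mask[rng.integers(d)] = True          # ensure at least one donor gene
            u = np.clip(np.where(mask, v, X[i]), lo, hi)
            if (fu := f(u)) <= fit[i]:            # greedy selection
                X[i], fit[i] = u, fu
        # (an adaptive variant would update F1, F2, F3, CR here)
    return X[np.argmin(fit)], fit.min()

sphere = lambda x: float(np.sum(x**2))
x, fx = unified_de(sphere, np.full(5, -5.0), np.full(5, 5.0))
print(np.round(x, 4), fx)
```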

A polyhedral study of binary polynomial programs

We study the polyhedral convex hull of a mixed-integer set S defined by a collection of multilinear equations over the 0-1 cube. Such sets appear frequently in the factorable reformulation of mixed-integer nonlinear optimization problems. In particular, the set S represents the feasible region of a linearized unconstrained binary polynomial optimization problem. We define an equivalent …
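
For context, the standard linearization of a single multilinear term z = ∏_{i∈I} x_i over binary variables is shown below; for one term in isolation these inequalities are known to describe the convex hull, whereas the sets S studied in the paper couple many such equations, where the polyhedral structure is substantially richer.

```latex
% Standard linearization of one multilinear term over the 0-1 cube.
\begin{align*}
  z &\le x_i && \text{for all } i \in I,\\
  z &\ge \sum_{i \in I} x_i - \bigl(|I| - 1\bigr),\\
  z &\ge 0.
\end{align*}
```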

Six mathematical gems from the history of Distance Geometry

This is a partial account of the fascinating history of Distance Geometry. We make no claim to completeness, but we do promise a dazzling display of beautiful, elementary mathematics. We prove Heron’s formula, Cauchy’s theorem on the rigidity of polyhedra, Cayley’s generalization of Heron’s formula to higher dimensions, Menger’s characterization of abstract semi-metric spaces, a …
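
Two of the results mentioned above admit a quick numeric illustration: Heron's formula for the area of a triangle from its side lengths, and Cayley's higher-dimensional generalization via what is now called the Cayley-Menger determinant (for a triangle, that determinant equals −16 times the squared area). The 3-4-5 triangle is just a convenient test case.

```python
# Heron's formula vs. the Cayley-Menger determinant for a triangle.
import numpy as np

def heron(a, b, c):
    s = (a + b + c) / 2
    return np.sqrt(s * (s - a) * (s - b) * (s - c))

def cayley_menger_area(a, b, c):
    d01, d02, d12 = a*a, b*b, c*c          # squared pairwise distances
    M = np.array([[0, 1,   1,   1  ],
                  [1, 0,   d01, d02],
                  [1, d01, 0,   d12],
                  [1, d02, d12, 0  ]], dtype=float)
    return np.sqrt(-np.linalg.det(M) / 16)  # det(M) = -16 * area^2

print(heron(3, 4, 5), cayley_menger_area(3, 4, 5))   # both print 6.0
```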

A Fast Branch-and-Bound Algorithm for Non-convex Quadratic Integer Optimization Subject to Linear Constraints Using Ellipsoidal Relaxations

We propose two exact approaches for non-convex quadratic integer minimization subject to linear constraints, where lower bounds are computed by considering ellipsoidal relaxations of the feasible set. In the first approach, we intersect the ellipsoids with the feasible linear subspace. In the second approach, we penalize the linear constraints exactly. We investigate the connection between …
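
The following sketch shows the kind of bound that makes ellipsoidal relaxations attractive: a possibly non-convex quadratic can be minimized exactly over a ball (the classical trust-region subproblem) by a one-dimensional search on the Lagrange multiplier, so relaxing the feasible set to an enclosing ball yields a valid lower bound. This is a generic textbook construction, not the paper's algorithm, and the subproblem's "hard case" is ignored for brevity.

```python
# Exact minimization of a quadratic over a ball via bisection on the multiplier.
import numpy as np

def min_quadratic_over_ball(A, b, r, tol=1e-10):
    # minimize 0.5 x^T A x + b^T x  subject to  ||x|| <= r
    lam_min = np.linalg.eigvalsh(A)[0]
    if lam_min > 0:
        x = np.linalg.solve(A, -b)
        if np.linalg.norm(x) <= r:               # unconstrained minimizer is interior
            return x
    lo = max(0.0, -lam_min) + 1e-12              # multiplier must exceed -lam_min
    hi = lo + 1.0
    while np.linalg.norm(np.linalg.solve(A + hi*np.eye(len(b)), -b)) > r:
        hi *= 2                                   # bracket the boundary multiplier
    for _ in range(200):                          # bisect on ||x(lam)|| = r
        mid = (lo + hi) / 2
        x = np.linalg.solve(A + mid*np.eye(len(b)), -b)
        lo, hi = (mid, hi) if np.linalg.norm(x) > r else (lo, mid)
        if hi - lo < tol:
            break
    return x

A = np.array([[1.0, 0.0], [0.0, -2.0]])           # indefinite: non-convex objective
b = np.array([1.0, 1.0])
x = min_quadratic_over_ball(A, b, r=1.0)
print(x, 0.5 * x @ A @ x + b @ x)                 # exact minimum over the ball
```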

A note on the ergodic convergence of symmetric alternating proximal gradient method

We consider the alternating proximal gradient method (APGM), proposed to solve a convex minimization model with linear constraints and a separable objective function that is the sum of two functions without coupled variables. Inspired by the Peaceman-Rachford splitting method (PRSM), a natural idea is to extend APGM to the symmetric alternating proximal gradient method (SAPGM), which can …
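
Here is a sketch of a symmetric alternating scheme for min f(x) + g(y) subject to x − y = 0, with f = ||·||_1 and g = ½||· − d||², both prox-friendly. The two dual updates per iteration, before and after the y-step, are the "symmetric", PRSM-style ingredient; the concrete operators, step factors, and test instance are assumptions for illustration, not the paper's exact SAPGM scheme.

```python
# Symmetric (PRSM-style) alternating scheme with two dual updates per sweep.
import numpy as np

def soft(v, t):                                   # prox of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def symmetric_alternating(d, beta=1.0, alpha=0.9, iters=300):
    x = y = u = np.zeros_like(d)                  # u: scaled dual variable
    for _ in range(iters):
        x = soft(y - u, 1.0 / beta)               # x-step (prox of f)
        u = u + alpha * (x - y)                   # first (symmetric) dual update
        y = (d + beta * (x + u)) / (1.0 + beta)   # y-step (prox of g)
        u = u + alpha * (x - y)                   # second dual update
    return x, y

d = np.array([3.0, 0.5, -2.0])
x, y = symmetric_alternating(d)
print(x, soft(d, 1.0))                            # both match: [ 2.  0. -1.]
```

At a fixed point x = y, the iteration solves min ||x||_1 + ½||x − d||², whose closed-form solution soft(d, 1) gives the check printed above; the step factor alpha < 1 is the usual device for guaranteeing contraction in strictly contractive PRSM variants.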