An inertial Tseng’s type proximal algorithm for nonsmooth and nonconvex optimization problems

We investigate the convergence of a forward-backward-forward proximal-type algorithm with inertial and memory effects when minimizing the sum of a nonsmooth function with a smooth one in the absence of convexity. The convergence is obtained provided an appropriate regularization of the objective satisfies the Kurdyka-\L{}ojasiewicz inequality, which is for instance fulfilled for semi-algebraic functions. … Read more
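For context, the Kurdyka-\L{}ojasiewicz inequality invoked here can be stated, in one common form, as follows (a generic textbook statement, not the paper's exact formulation): a proper lower semicontinuous function $F$ has the KL property at $\bar x$ if there exist $\eta > 0$ and a concave desingularizing function $\varphi : [0,\eta) \to [0,\infty)$ with $\varphi(0) = 0$ such that
\[
\varphi'\bigl(F(x) - F(\bar x)\bigr)\,\operatorname{dist}\bigl(0, \partial F(x)\bigr) \ge 1
\]
for all $x$ close to $\bar x$ with $F(\bar x) < F(x) < F(\bar x) + \eta$.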

Gradient Sliding for Composite Optimization

We consider in this paper a class of composite optimization problems whose objective function is given by the sum of a general smooth component and a general nonsmooth component, together with a relatively simple nonsmooth term. We present a new class of first-order methods, namely the gradient sliding algorithms, which can skip the computation of the gradient for … Read more
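Schematically (my own notation, not necessarily the paper's), such an objective has the composite form
\[
\min_{x \in X} \; f(x) + h(x) + \chi(x),
\]
where $f$ is smooth, $h$ is a general nonsmooth term and $\chi$ is a simple nonsmooth term (e.g., one with a cheap proximal mapping); the aim of gradient sliding is to evaluate $\nabla f$ far less often than the subgradients of the nonsmooth part.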

An Efficient Gauss-Newton Algorithm for Symmetric Low-Rank Product Matrix Approximations

We derive and study a Gauss-Newton method for computing a symmetric low-rank product that is the closest to a given symmetric matrix in Frobenius norm. Our Gauss-Newton method, which has a particularly simple form, shares the same order of iteration-complexity as a gradient method when the size of the desired eigenspace is small, but can be … Read more
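As a hedged illustration of the underlying fitting problem (my own formulation, not the paper's Gauss-Newton step), the symmetric low-rank product approximation seeks $U \in \mathbb{R}^{n \times k}$ minimizing $\|UU^{\top} - A\|_F$ for a given symmetric $A$; the objective and its gradient are straightforward to form, e.g. in Python:

import numpy as np

def objective_and_gradient(U, A):
    # Objective 0.5 * ||U U^T - A||_F^2 and its gradient with respect to U,
    # assuming A is symmetric (illustrative only, not the paper's algorithm).
    R = U @ U.T - A                       # residual, symmetric when A is
    f = 0.5 * np.linalg.norm(R, "fro") ** 2
    g = 2.0 * R @ U                       # d f / d U, using the symmetry of R
    return f, g

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)); A = (A + A.T) / 2   # random symmetric test matrix
U = rng.standard_normal((50, 3))                        # rank-3 factor
f, g = objective_and_gradient(U, A)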

A Quasi-Newton Algorithm for Nonconvex, Nonsmooth Optimization with Global Convergence Guarantees

A line search algorithm for minimizing nonconvex and/or nonsmooth objective functions is presented. The algorithm is a hybrid between a standard Broyden–Fletcher–Goldfarb–Shanno (BFGS) and an adaptive gradient sampling (GS) method. The BFGS strategy is employed because it typically yields fast convergence to the vicinity of a stationary point, and together with the adaptive GS strategy … Read more
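For orientation, the search direction in a gradient sampling step is commonly taken as the minimum-norm element of the convex hull of nearby gradients (a textbook description, not a statement of this paper's specific adaptive variant):
\[
d_k = -\operatorname*{arg\,min}\bigl\{\|g\| : g \in \operatorname{conv}\{\nabla f(y_{k,1}),\dots,\nabla f(y_{k,m})\}\bigr\},
\]
where $y_{k,1},\dots,y_{k,m}$ are points in a small ball around the current iterate $x_k$ at which $f$ is differentiable.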

A Cut-and-Branch Algorithm for the Quadratic Knapsack Problem

The Quadratic Knapsack Problem (QKP) is a well-known NP-hard combinatorial optimisation problem, with many practical applications. We present a ‘cut-and-branch’ algorithm for the QKP, in which a cutting-plane phase is followed by a branch-and-bound phase. The cutting-plane phase is more sophisticated than the existing ones in the literature, incorporating several classes of cutting planes, two … Read more
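For readers unfamiliar with the problem, the QKP can be written in its standard 0-1 form (generic notation):
\[
\max\; x^{\top} Q\, x \quad \text{s.t.} \quad w^{\top} x \le c, \;\; x \in \{0,1\}^n,
\]
where $Q$ collects the (nonnegative) profits of selecting items individually and in pairs, $w$ is the vector of item weights and $c$ is the knapsack capacity.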

Solving a Huff-like Stackelberg problem on networks

This work deals with a Huff-like Stackelberg problem, where the leader wants to choose the location of its facility so that its profit is maximal after the competitor (the follower) has also built its facility. It is assumed that the follower makes a rational decision, maximizing its own profit. The inelastic demand is aggregated into the vertices of a … Read more
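In Huff-type models, the demand at a vertex is typically split between the competing facilities in proportion to an attraction function (a generic statement of the model, not the paper's exact assumptions):
\[
P_v(\text{leader}) = \frac{a_\ell / d(v, x_\ell)^{\lambda}}{a_\ell / d(v, x_\ell)^{\lambda} + a_f / d(v, x_f)^{\lambda}},
\]
where $a_\ell$ and $a_f$ are the attractiveness of the leader's and the follower's facilities, $d(\cdot,\cdot)$ is the network distance and $\lambda > 0$ is a distance-decay parameter.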

Strict Fejér Monotonicity by Superiorization of Feasibility-Seeking Projection Methods

We consider the superiorization methodology, which can be thought of as lying between feasibility-seeking and constrained minimization. It is not quite trying to solve the full-fledged constrained minimization problem; rather, the task is to find a feasible point which is superior (with respect to the objective function value) to one returned by a feasibility-seeking … Read more
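Schematically, a superiorized version of a feasibility-seeking method perturbs each iterate before applying the basic algorithmic operator (a generic sketch, not this paper's specific scheme):
\[
x^{k+1} = T\bigl(x^{k} + \beta_k v^{k}\bigr), \qquad \beta_k > 0, \;\; \sum_k \beta_k < \infty,
\]
where $T$ is the feasibility-seeking (projection) operator and each $v^{k}$ is a nonascent direction for the objective at $x^{k}$; the summable perturbations steer the iterates toward lower objective values while relying on the bounded perturbation resilience of the underlying projection method.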

Strong duality in Lasserre’s hierarchy for polynomial optimization

A polynomial optimization problem (POP) consists of minimizing a multivariate real polynomial on a semi-algebraic set $K$ described by polynomial inequalities and equations. In its full generality it is a non-convex, multi-extremal, difficult global optimization problem. More than a decade ago, J.~B.~Lasserre proposed to solve POPs by a hierarchy of convex semidefinite programming (SDP) relaxations … Read more
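In symbols, the POP referred to here is (standard notation)
\[
\min_{x \in K} \; p(x), \qquad K = \{\, x \in \mathbb{R}^n : g_i(x) \ge 0, \; i=1,\dots,m, \;\; h_j(x) = 0, \; j=1,\dots,\ell \,\},
\]
with $p$, $g_i$ and $h_j$ real polynomials; the hierarchy replaces this problem by a sequence of SDP relaxations of increasing size whose optimal values converge to the global optimum under an Archimedean-type assumption on $K$.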

An Optimization Approach to the Design of Multi-Size Heliostat fields

In this paper, the problem of optimizing the heliostat field configuration of a Solar Power Tower system with heliostats of different sizes is addressed. Maximizing the efficiency of the plant, i.e., optimizing the energy generated per unit cost, leads to a difficult high-dimensional optimization problem (of variable dimension) with an objective function hard to … Read more
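Schematically (my own compression of the objective described above), the design problem has the fractional form
\[
\max_{n,\; x_1,\dots,x_n} \; \frac{E(x_1,\dots,x_n)}{C(x_1,\dots,x_n)},
\]
where the $x_i$ encode the positions (and sizes) of the $n$ heliostats, $E$ is the energy collected and $C$ the cost of the field; since $n$ is itself a decision, the number of variables, and hence the dimension of the problem, is variable.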

Linear equalities in blackbox optimization

The Mesh Adaptive Direct Search (Mads) algorithm is designed for blackbox optimization problems subject to general inequality constraints. Currently, Mads supports equality constraints neither in theory nor in practice. The present work proposes extensions to treat problems with linear equalities whose expression is known. The main idea consists in reformulating the optimization problem into … Read more
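Since the abstract is truncated before the reformulation is stated, the following is only one standard way to eliminate known linear equalities, shown as an illustrative sketch rather than as the paper's reduction: parameterize the affine set $\{x : Ax = b\}$ as $x = x_0 + Z t$ with $Z$ a basis of the null space of $A$, so a Mads-type method can search over the reduced variables $t$ alone.

import numpy as np

def nullspace_parameterization(A, b, rtol=1e-12):
    # Return (x0, Z) such that every solution of A x = b is x = x0 + Z @ t.
    # Illustrative reduction only; not necessarily the reformulation used in the paper.
    x0 = np.linalg.lstsq(A, b, rcond=None)[0]        # a particular solution
    _, s, Vt = np.linalg.svd(A)
    rank = int((s > rtol * s[0]).sum()) if s.size else 0
    Z = Vt[rank:].T                                   # orthonormal basis of null(A)
    return x0, Z

A = np.array([[1.0, 1.0, 1.0]])                       # hypothetical known equality A x = b
b = np.array([1.0])
x0, Z = nullspace_parameterization(A, b)

def reduced_blackbox(t):
    # toy objective standing in for the true blackbox; the equalities hold by construction
    x = x0 + Z @ t
    return float(np.sum(x ** 2))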