Radial Duality Part II: Applications and Algorithms

The first part of this work established the foundations of a radial duality between nonnegative optimization problems, inspired by the work of Renegar (2016). Here we utilize our radial duality theory to design and analyze projection-free optimization algorithms that operate by solving a radially dual problem. In particular, we consider radial subgradient, smoothing, and accelerated … Read more
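For readers new to this framework, one common way the radial transformation of a nonnegative function f is written (recalled here from the radial duality literature rather than quoted from Part I, so the exact definition used by the authors may differ) is:

```latex
% Radial transformation of a nonnegative function f (assumed form):
f^{\Gamma}(y) \;=\; \sup\{\, v > 0 \;:\; v\, f(y/v) \le 1 \,\}.
```

Under suitable conditions, minimizing f corresponds to maximizing the radially dual function f^Γ, a problem whose constraints can often be handled without projections.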

An Accelerated Minimal Gradient Method with Momentum for Convex Quadratic Optimization

In this article we address the problem of minimizing a strictly convex quadratic function using a novel iterative method. The new algorithm is based on the well-known accelerated gradient method of Nesterov. At each iteration of our scheme, the new point is computed by performing a line search along a search direction given by a linear … Read more
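As a rough illustration only (not the authors' scheme), the sketch below runs a Nesterov-style momentum iteration for f(x) = ½xᵀAx − bᵀx in which the step along the gradient direction at the extrapolated point is chosen by exact line search; the function and parameter names are hypothetical.

```python
import numpy as np

def accelerated_quadratic(A, b, x0, iters=200):
    """Hedged sketch: Nesterov-style momentum for f(x) = 0.5 x'Ax - b'x,
    with an exact line search along the gradient at the extrapolated point.
    Illustrative only; not the paper's algorithm."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + 2)            # standard momentum weight
        y = x + beta * (x - x_prev)         # extrapolated point
        g = A @ y - b                       # gradient at y
        denom = g @ (A @ g)
        if denom <= 1e-16:                  # gradient numerically zero
            return y
        t = (g @ g) / denom                 # exact line search for quadratics
        x_prev, x = x, y - t * g
    return x

# Tiny usage example on a random symmetric positive definite system
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x_star = accelerated_quadratic(A, b, np.zeros(20))
print(np.linalg.norm(A @ x_star - b))
```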

Directional TGV-based image restoration under Poisson noise

We are interested in the restoration of noisy and blurry images where the texture mainly follows a single direction (i.e., directional images). Problems of this type arise, for example, in microscopy or computed tomography for carbon or glass fibres. In order to deal with these problems, the Directional Total Generalized Variation (DTGV) was developed by … Read more
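Variational models for this task typically pair a Kullback-Leibler-type data fidelity, the natural choice under Poisson noise, with the DTGV regularizer. A generic form, written with assumed notation rather than the paper's exact formulation, is:

```latex
% Generic directional-TGV restoration model for Poisson-corrupted data f
% observed through a blur operator H (assumed notation):
\min_{u \ge 0} \; \sum_i \Big( (Hu)_i - f_i \log (Hu)_i \Big)
\;+\; \lambda\, \mathrm{DTGV}_{\alpha}^{2}(u).
```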

Vector Optimization w.r.t. Relatively Solid Convex Cones in Real Linear Spaces

In vector optimization, it is of increasing interest to study problems where the image space (a real linear space) is preordered by a not necessarily solid (and not necessarily pointed) convex cone. It is well-known that there are many examples where the ordering cone of the image space has an empty (topological / algebraic) interior, … Read more
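For context, the preorder induced on the image space Y by a convex cone K, together with the solidity notions alluded to above, can be recalled as follows (standard definitions, not specific to this paper):

```latex
% Preorder induced by a convex cone K \subseteq Y:
y_1 \leq_K y_2 \;:\Longleftrightarrow\; y_2 - y_1 \in K.
% K is (topologically) solid if \operatorname{int} K \neq \emptyset,
% algebraically solid if its core \operatorname{cor} K \neq \emptyset,
% and relatively solid if its relative (algebraic) interior is nonempty.
```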

A Nonmonotone Accelerated Proximal Gradient Method with Variable Stepsize Strategy for Nonsmooth and Nonconvex Minimization Problems

We propose a new nonmonotone accelerated proximal gradient method with a variable stepsize strategy for minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting. In this algorithm, the objective function value is allowed to increase at some iterations, but it decreases from an overall point of view. The variable stepsize strategy does not … Read more
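One common way to realize a nonmonotone accelerated proximal gradient method is to accept the extrapolated step only when the objective does not exceed the maximum of the last few recorded values, falling back to a plain proximal gradient step otherwise. The sketch below illustrates that generic idea; all names are hypothetical and the paper's acceptance rule and stepsize strategy differ in detail.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def nonmonotone_apg(grad_f, F, prox_g, x0, step=1e-2, iters=500, memory=5):
    """Hedged sketch of a nonmonotone accelerated proximal gradient method:
    an extrapolated step is accepted if the new objective value does not
    exceed the maximum of the last `memory` values; otherwise a plain
    proximal gradient step from x_k is taken. Illustrative only."""
    x_prev = x0.copy()
    x = x0.copy()
    history = [F(x)]
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)           # extrapolation
        z = prox_g(y - step * grad_f(y), step)             # accelerated candidate
        ref = max(history[-memory:])                       # nonmonotone reference value
        if F(z) <= ref:
            x_prev, x = x, z                               # accept accelerated step
        else:
            x_prev, x = x, prox_g(x - step * grad_f(x), step)  # safeguard step
        history.append(F(x))
    return x

# Tiny usage: 0.5||Ax - b||^2 + lam*||x||_1 (illustrative)
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 60)); b = rng.standard_normal(40); lam = 0.1
f  = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
gf = lambda x: A.T @ (A @ x - b)
x = nonmonotone_apg(gf, f, lambda v, t: prox_l1(v, lam * t), np.zeros(60),
                    step=1.0 / np.linalg.norm(A, 2) ** 2)
```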

Factorization of completely positive matrices using iterative projected gradient steps

We aim to factorize a completely positive matrix by using an optimization approach which consists in the minimization of a nonconvex smooth function over a convex and compact set. To solve this problem we propose a projected gradient algorithm with parameters that take into account the effects of relaxation and inertia. Both projection and gradient … Read more
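As a loose illustration of this kind of scheme (the objective, feasible set, and parameter names below are generic stand-ins, not the paper's), one can run projected gradient steps with inertia and relaxation on F(B) = ||A − BBᵀ||_F² over a nonnegative box:

```python
import numpy as np

def cp_factor_sketch(A, r, iters=3000, step=5e-3, inertia=0.5, relax=0.9,
                     box=10.0, seed=0):
    """Hedged sketch: seek B >= 0 with A ~ B @ B.T by projected gradient on
    F(B) = ||A - B B^T||_F^2 over the convex compact box {0 <= B <= box},
    combined with inertial extrapolation and relaxation. The update order
    and parameter names are assumptions, not the paper's method."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    B = rng.uniform(0.0, 1.0, size=(n, r))
    B_prev = B.copy()
    for _ in range(iters):
        Y = B + inertia * (B - B_prev)            # inertial extrapolation
        R = A - Y @ Y.T
        G = -4.0 * R @ Y                          # gradient of ||A - YY^T||_F^2
        P = np.clip(Y - step * G, 0.0, box)       # projection onto the box
        B_prev, B = B, (1.0 - relax) * B + relax * P   # relaxation
    return B

# Usage on a small completely positive matrix A = C C^T with C >= 0
rng = np.random.default_rng(1)
C = rng.uniform(0, 1, size=(5, 8))
A = C @ C.T
B = cp_factor_sketch(A, r=8)
print(np.linalg.norm(A - B @ B.T))
```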

FrankWolfe.jl: a high-performance and flexible toolbox for Frank-Wolfe algorithms and Conditional Gradients

We present FrankWolfe.jl, an open-source implementation of several popular Frank-Wolfe and Conditional Gradients variants for first-order constrained optimization. The package is designed with flexibility and high performance in mind, allowing for easy extension and relying on few assumptions regarding the user-provided functions. It leverages Julia's unique multiple dispatch feature and interfaces smoothly with generic linear optimization … Read more
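Since the package builds on the classical Frank-Wolfe (conditional gradient) template, the language-agnostic sketch below shows the vanilla iteration over the probability simplex; it illustrates the algorithmic family only and does not reflect FrankWolfe.jl's actual Julia API.

```python
import numpy as np

def frank_wolfe_simplex(grad_f, x0, iters=500):
    """Hedged sketch of vanilla Frank-Wolfe / conditional gradient over the
    probability simplex. The linear minimization oracle (LMO) over the
    simplex returns the vertex corresponding to the smallest gradient entry."""
    x = x0.copy()
    for t in range(iters):
        g = grad_f(x)
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0          # LMO over the simplex: best vertex
        gamma = 2.0 / (t + 2.0)        # classical open-loop step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Example: minimize ||x - p||^2 over the simplex for a target p
p = np.array([0.1, 0.5, 0.2, 0.2])
grad = lambda x: 2.0 * (x - p)
print(frank_wolfe_simplex(grad, np.ones(4) / 4))
```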

On the Weak and Strong Convergence of a Conceptual Algorithm for Solving Three Operator Monotone Inclusions

In this paper, a conceptual algorithm modifying the forward-backward-half-forward (FBHF) splitting method for solving three-operator monotone inclusion problems is investigated. The FBHF splitting method adjusts and improves Tseng's forward-backward-forward (FBF) splitting method when the inclusion problem has a third operator that is cocoercive. The FBHF method recovers the FBF iteration (when this aforementioned … Read more
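For reference, for the inclusion 0 ∈ Ax + Bx + Cx with A maximally monotone, B monotone and Lipschitz, and C cocoercive, the FBHF iteration is usually written as follows (recalled from the literature with assumed notation; the conceptual algorithm studied in the paper modifies this scheme):

```latex
% FBHF iteration with stepsize \gamma and resolvent J_{\gamma A} = (\mathrm{Id} + \gamma A)^{-1}:
p_k = J_{\gamma A}\big(x_k - \gamma\,(B x_k + C x_k)\big), \qquad
x_{k+1} = p_k + \gamma\,\big(B x_k - B p_k\big).
```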

Robust Interior Point Method for Quantum Key Distribution Rate Computation

While the security proof method for quantum key distribution (QKD) based on the numerical key rate calculation problem is powerful in principle, the practicality of the method is limited by computational resources and the efficiency of the underlying algorithm for convex optimization. We derive a stable reformulation of the convex nonlinear semidefinite programming (SDP) model … Read more
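In broad strokes, the numerical key-rate calculation is usually posed as a convex nonlinear SDP of the following form (a paraphrase with assumed notation, not the paper's reformulation):

```latex
% Generic numerical key-rate problem (assumed notation): \mathcal{G} and \mathcal{Z}
% are completely positive maps and D is the quantum relative entropy.
\min_{\rho \succeq 0} \; D\big(\mathcal{G}(\rho)\,\big\|\,\mathcal{Z}(\mathcal{G}(\rho))\big)
\quad \text{s.t.} \quad \operatorname{Tr}(\Gamma_i\,\rho) = \gamma_i, \; i = 1,\dots,m.
```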

Algorithms for Difference-of-Convex (DC) Programs Based on Difference-of-Moreau-Envelopes Smoothing

In this paper we consider minimization of a difference-of-convex (DC) function with and without linear constraints. We first study a smooth approximation of a generic DC function, termed difference-of-Moreau-envelopes (DME) smoothing, where both components of the DC function are replaced by their respective Moreau envelopes. The resulting smooth approximation is shown to be Lipschitz differentiable, … Read more
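Concretely, writing the Moreau envelope of a convex component with parameter μ > 0, the DME smoothing described above replaces both components of f = g − h by their envelopes (a single parameter μ is used here for simplicity; the two components may in general use different parameters):

```latex
% Moreau envelope of a convex function g:
\operatorname{env}_{\mu} g(x) = \min_{y} \Big\{ g(y) + \tfrac{1}{2\mu}\,\|y - x\|^2 \Big\},
% DME smoothing of the DC function f = g - h:
f_{\mu}(x) = \operatorname{env}_{\mu} g(x) - \operatorname{env}_{\mu} h(x).
```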