A Fixed Point Approach with a New Solution Concept for Set-valued Optimization

We present a fixed point approach to find the whole solution set of a set-valued optimization problem through a parametric problem, in which the height of the level set of the objective function is regarded as the parameter. First, we consider the solution concept based on the vector approach in this method. Then, we propose …

FISTA and Extensions – Review and New Insights

The purpose of this technical report is to review the main properties of an accelerated composite gradient (ACG) method commonly referred to as the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). In addition, we state a version of FISTA for solving both convex and strongly convex composite minimization problems and derive its iteration complexities to generate iterates …
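For orientation, here is a minimal sketch of the classical FISTA iteration on the l1-regularized least-squares instance. All names (e.g. `fista_lasso`) are illustrative, and the report's convex/strongly convex variant differs from this textbook version:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the "shrinkage" step).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, L, iters=500):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L is a Lipschitz constant of the smooth gradient, e.g. ||A||_2^2."""
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)                       # gradient of smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)  # proximal gradient step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)    # Nesterov extrapolation
        x, t = x_new, t_new
    return x
```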

A new stopping criterion for Krylov solvers applied in Interior Point Methods

A surprising result is presented, with possibly far-reaching consequences for any optimization technique that relies on Krylov subspace methods to solve the underlying systems of linear equations. The advantages of the new technique are illustrated in the context of Interior Point Methods (IPMs). When an iterative method is …
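The truncated abstract does not state the new criterion itself, so as a baseline here is a sketch of conjugate gradients with the generic relative-residual test that such an IPM-tailored criterion would replace; the function name and tolerance are illustrative:

```python
import numpy as np

def cg(A, b, tol=1e-8, maxiter=1000):
    """Conjugate gradients for Ax = b (A symmetric positive definite),
    stopped by the standard relative-residual test ||r|| <= tol * ||b||.
    A criterion tailored to IPMs would replace this generic test."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        if np.sqrt(rs) <= tol * np.linalg.norm(b):   # stopping criterion
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```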

Dealing with inequality constraints in large-scale semidefinite relaxations for graph coloring and maximum clique problems

Semidefinite programs (SDPs) can be solved in polynomial time by interior point methods. However, when the dimension of the problem grows large, interior point methods become impractical in terms of both computational time and memory requirements. Certain first-order methods, such as Alternating Direction Methods of Multipliers (ADMMs), have been established as suitable algorithms to deal with large-scale …
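The workhorse of ADMM-type methods for SDPs is the Euclidean projection onto the positive semidefinite cone, computed by clipping negative eigenvalues; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def project_psd(M):
    """Euclidean projection of a symmetric matrix onto the PSD cone:
    keep the eigenvectors, clip negative eigenvalues at zero. This
    eigendecomposition dominates the per-iteration cost of ADMM for SDPs."""
    w, V = np.linalg.eigh(M)
    return (V * np.maximum(w, 0.0)) @ V.T
```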

Frank-Wolfe and friends: a journey into projection-free first-order optimization methods

Invented some 65 years ago in a seminal paper by Marguerite Straus-Frank and Philip Wolfe, the Frank-Wolfe method has recently enjoyed a remarkable revival, fuelled by the need for fast and reliable first-order optimization methods in Data Science and other relevant application areas. This review tries to explain the success of this approach by illustrating the versatility …
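To make the "projection-free" idea concrete, a minimal sketch of the classical method on the probability simplex, where the linear minimization oracle returns a single vertex (the objective and starting point are whatever the application supplies, e.g. `x0 = np.ones(n)/n`):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=200):
    """Frank-Wolfe for min f(x) over the simplex {x >= 0, sum(x) = 1}.
    Each step solves a *linear* problem over the feasible set instead of
    projecting, which is the hallmark of the method."""
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0                 # LMO: best vertex of the simplex
        gamma = 2.0 / (k + 2.0)               # classical step-size rule
        x = (1 - gamma) * x + gamma * s       # convex combination stays feasible
    return x
```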

Escaping strict saddle points of the Moreau envelope in nonsmooth optimization

Recent work has shown that stochastically perturbed gradient methods can efficiently escape strict saddle points of smooth functions. We extend this body of work to nonsmooth optimization by analyzing an inexact analogue of a stochastically perturbed gradient method applied to the Moreau envelope. The main conclusion is that a variety of algorithms for nonsmooth optimization …
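For reference, the standard definitions the analysis builds on (not specific to this paper): the Moreau envelope smooths f, and for weakly convex f and small enough \lambda its gradient is governed by the proximal map, so stationary points of the envelope match those of f.

```latex
e_\lambda f(x) = \min_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\mathrm{prox}_{\lambda f}(x) = \operatorname*{arg\,min}_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\nabla e_\lambda f(x) = \lambda^{-1}\big(x - \mathrm{prox}_{\lambda f}(x)\big).
```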

Analysis of the Frank-Wolfe Method for Convex Composite Optimization involving a Logarithmically-Homogeneous Barrier

We present and analyze a new generalized Frank-Wolfe method for the composite optimization problem (P): F^* := \min_x f(Ax) + h(x), where f is a \theta-logarithmically-homogeneous self-concordant barrier and the function h has bounded domain but is possibly non-smooth. We show that our generalized Frank-Wolfe method requires O((\mathrm{Gap}_0 + \theta + \mathrm{Var}_h)\ln(\delta_0) + (\theta + \mathrm{Var}_h)^2/\epsilon) …
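For readers unfamiliar with the barrier class, the standard definition of \theta-logarithmic homogeneity (the example is the usual textbook one, not taken from the paper):

```latex
% A self-concordant barrier f is \theta-logarithmically homogeneous if
f(t x) = f(x) - \theta \ln t
\qquad \text{for all } x \in \operatorname{dom} f,\ t > 0.
% Example: f(X) = -\ln\det X on the positive definite cone satisfies
% f(tX) = f(X) - n \ln t, so \theta = n.
```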

Alternative Regularizations for OA Algorithms for Convex MINLP

In this work, we extend the regularization framework of Kronqvist et al. (https://doi.org/10.1007/s10107-018-1356-3) by incorporating several new regularization functions and develop a regularized single-tree search method for solving convex mixed-integer nonlinear programming (MINLP) problems. We propose a set of regularization functions based on distance metrics and Lagrangean approximations, used in the projection problem for finding new …
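As background, a sketch of the standard (unregularized) outer-approximation master problem that regularization functions of this kind modify; the notation is generic, not the paper's:

```latex
% OA master after visiting points x^1, ..., x^K: the convex objective f and
% constraints g are replaced by their supporting hyperplanes at those points.
\min_{x,\,\eta}\ \eta
\quad \text{s.t.}\quad
\eta \ge f(x^k) + \nabla f(x^k)^\top (x - x^k), \quad k = 1,\dots,K,
\\
\phantom{\min_{x,\,\eta}\ \eta \quad \text{s.t.}\quad}
g(x^k) + \nabla g(x^k)^\top (x - x^k) \le 0, \quad k = 1,\dots,K,
\qquad x \in X,\ x_I \in \mathbb{Z}^{|I|}.
```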

Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient

We present PDLP, a practical first-order method for linear programming (LP) that can solve to the high levels of accuracy expected in traditional LP applications. In addition, it can scale to very large problems because its core operations are matrix-vector multiplications. PDLP is derived by applying the primal-dual hybrid gradient (PDHG) method, popularized …
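A minimal sketch of vanilla PDHG on the LP saddle point, to show why matrix-vector products are the only heavy operation. PDLP layers presolve, preconditioning, restarts, and adaptive step sizes on top of this core iteration; the step-size choice below is the textbook one, not PDLP's:

```python
import numpy as np

def pdhg_lp(A, b, c, iters=10000):
    """Vanilla PDHG on  min_{x >= 0} max_y  c'x + y'(b - Ax)
    for the standard-form LP  min c'x  s.t.  Ax = b, x >= 0."""
    m, n = A.shape
    step = 0.9 / np.linalg.norm(A, 2)       # tau = sigma with tau*sigma*||A||^2 < 1
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        x_new = np.maximum(x - step * (c - A.T @ y), 0.0)  # primal: project onto x >= 0
        y = y + step * (b - A @ (2 * x_new - x))           # dual: extrapolated update
        x = x_new
    return x, y
```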

New complexity results and algorithms for min-max-min robust combinatorial optimization

In this work we investigate the min-max-min robust optimization problem applied to combinatorial problems with uncertain cost vectors contained in a convex uncertainty set. The idea of the approach is to calculate a set of k feasible solutions that are worst-case optimal if, in each possible scenario, the best of the k solutions would …
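The formulation behind this idea, written out in generic notation (X is the combinatorial feasible set, U the convex uncertainty set): choose k solutions up front, then, once the cost vector is revealed, pay for the best of them under the worst case.

```latex
\min_{x^{(1)},\dots,x^{(k)} \in X}\ \ \max_{c \in U}\ \ \min_{i = 1,\dots,k}\ c^\top x^{(i)}
```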