Convergence rate analysis of the gradient descent-ascent method for convex-concave saddle-point problems

In this paper, we study the gradient descent-ascent method for convex-concave saddle-point problems. We derive a new non-asymptotic global convergence rate in terms of the distance to the solution set by using the semidefinite programming performance estimation method. The resulting convergence rate incorporates most parameters of the problem and is exact for a large class …
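For orientation, the simultaneous gradient descent-ascent iteration takes a descent step in the primal variable and an ascent step in the dual variable. Below is a minimal sketch on a toy convex-concave quadratic; the test function, step size, and iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy convex-concave saddle function f(x, y) = 0.5*x**2 + x*y - 0.5*y**2,
# whose unique saddle point is (0, 0).  Purely illustrative.
def grad_x(x, y):
    return x + y      # partial derivative of f with respect to x

def grad_y(x, y):
    return x - y      # partial derivative of f with respect to y

def gradient_descent_ascent(x0, y0, step=0.1, iters=1000):
    """Simultaneous gradient descent in x and gradient ascent in y."""
    x, y = x0, y0
    for _ in range(iters):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x, y = x - step * gx, y + step * gy
    return x, y

print(gradient_descent_ascent(1.0, -1.0))   # approaches the saddle point (0, 0)
```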

Using Taylor-Approximated Gradients to Improve the Frank-Wolfe Method for Empirical Risk Minimization

The Frank-Wolfe method has become increasingly useful in statistical and machine learning applications, due to the structure-inducing properties of the iterates, and especially in settings where linear minimization over the feasible set is more computationally efficient than projection. In the setting of Empirical Risk Minimization — one of the fundamental optimization problems in statistical and …
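As background, the classic Frank-Wolfe iteration replaces projection with a linear minimization oracle over the feasible set. The sketch below runs it with exact gradients on a least-squares objective over an ℓ1 ball; the Taylor-approximated gradients introduced in the paper are not reproduced here, and the problem data are made up for illustration.

```python
import numpy as np

def frank_wolfe_l1(A, b, radius=1.0, iters=200):
    """Classic Frank-Wolfe for min 0.5*||Ax - b||^2 over the l1 ball of given radius."""
    n = A.shape[1]
    x = np.zeros(n)
    for k in range(iters):
        grad = A.T @ (A @ x - b)
        # Linear minimization oracle: the l1-ball vertex minimizing <grad, s>.
        i = np.argmax(np.abs(grad))
        s = np.zeros(n)
        s[i] = -radius * np.sign(grad[i])
        gamma = 2.0 / (k + 2)              # standard open-loop step size
        x = (1 - gamma) * x + gamma * s    # convex combination keeps x feasible
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 20)), rng.standard_normal(50)
print(np.round(frank_wolfe_l1(A, b), 3))
```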

Finite convergence of the inexact proximal gradient method to sharp minima

Attractive properties of subgradient methods, such as robust stability and linear convergence, have been emphasized when they are used to solve nonsmooth optimization problems with sharp minima [12, 13]. In this letter we extend the robustness results to composite convex models and show that the basic proximal gradient algorithm in the presence of a …
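For reference, the basic proximal gradient iteration for a composite model F(x) = f(x) + g(x) alternates a gradient step on the smooth part f with the proximal operator of g. Below is a minimal sketch for the ℓ1-regularized least-squares instance with an exact prox; the inexactness and sharp-minima analysis from the letter are not reproduced, and the data are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam=0.1, iters=500):
    """Basic proximal gradient (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(1)
A, b = rng.standard_normal((40, 15)), rng.standard_normal(40)
print(np.round(proximal_gradient(A, b), 3))
```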

On Optimal Universal First-Order Methods for Minimizing Heterogeneous Sums

This work considers minimizing a convex sum of functions, each with potentially different structure, ranging from nonsmooth to smooth and from Lipschitz to non-Lipschitz. Nesterov’s universal fast gradient method provides an optimal black-box first-order method for minimizing a single function, taking advantage of any continuity structure present without requiring prior knowledge. In this paper, we show …
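To illustrate the "universal" ingredient, the sketch below implements the simpler non-accelerated universal gradient method with a backtracking test that adapts to whatever Hölder continuity the gradient has; the accelerated, sum-structured method studied in the paper is not reproduced, and the test function and tolerances are illustrative.

```python
import numpy as np

def universal_gradient(f, grad, x0, eps=1e-3, iters=200, L0=1.0):
    """Non-accelerated universal gradient method with backtracking on L.

    The acceptance test uses the quadratic upper model with slack eps/2, so the
    step size 1/L adapts to the (unknown) Holder continuity of the gradient.
    """
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(iters):
        g = grad(x)
        while True:
            x_new = x - g / L
            model = f(x) + g @ (x_new - x) + 0.5 * L * np.sum((x_new - x) ** 2)
            if f(x_new) <= model + 0.5 * eps:
                break
            L *= 2.0                 # step too long: tighten the model
        x, L = x_new, L / 2.0        # be optimistic again at the next iteration
    return x

# Holder-smooth but not Lipschitz-smooth test function: f(x) = sum |x_i|^{3/2}.
f = lambda x: np.sum(np.abs(x) ** 1.5)
grad = lambda x: 1.5 * np.sign(x) * np.abs(x) ** 0.5
print(universal_gradient(f, grad, x0=np.array([2.0, -3.0])))
```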

Weighted Geometric Mean, Minimum Mediated Set, and Optimal Second-Order Cone Representation

We study optimal second-order cone representations for weighted geometric means, which turn out to be closely related to minimum mediated sets. Several lower and upper bounds on the size of optimal second-order cone representations are proved. In the case of bivariate weighted geometric means (equivalently, one-dimensional mediated sets), we are able to prove …
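As a point of reference for the bivariate, equal-weight case, the standard rotated second-order cone representation of the geometric mean is recalled below; the weighted and higher-dimensional representations studied in the paper are more involved.

\[
\bigl\{(x_1, x_2, t) : x_1, x_2 \ge 0,\ t^2 \le x_1 x_2 \bigr\}
= \Bigl\{(x_1, x_2, t) : \bigl\| (2t,\ x_1 - x_2) \bigr\|_2 \le x_1 + x_2 \Bigr\},
\]

so for \(t \ge 0\) the constraint \(t \le \sqrt{x_1 x_2}\) is modeled by a single three-dimensional second-order cone.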

The Hyperbolic Augmented Lagrangian Algorithm

The hyperbolic augmented Lagrangian algorithm (HALA) is introduced in the area of continuous optimization for solving nonlinear programming problems. Under mild assumptions, such as convexity, Slater’s constraint qualification, and differentiability, the convergence of the proposed algorithm is proved. We also study the duality theory for the case of the hyperbolic augmented Lagrangian function. Finally, in order …
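For orientation only, the classical quadratic augmented Lagrangian for inequality-constrained problems is recalled below; HALA replaces the quadratic penalty with a hyperbolic one, whose exact form is given in the paper and is not reproduced here.

\[
\min_x f(x)\ \text{s.t.}\ g_i(x) \le 0, \qquad
L_\rho(x,\lambda) = f(x) + \frac{1}{2\rho} \sum_i \Bigl( \max\{0,\ \lambda_i + \rho\, g_i(x)\}^2 - \lambda_i^2 \Bigr),
\]

with the updates \(x^{k+1} \in \arg\min_x L_\rho(x, \lambda^k)\) and \(\lambda_i^{k+1} = \max\{0,\ \lambda_i^k + \rho\, g_i(x^{k+1})\}\).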

Superiorization as a novel strategy for linearly constrained inverse radiotherapy treatment planning

Objective: We apply the superiorization methodology to the intensity-modulated radiation therapy (IMRT) treatment planning problem. In superiorization, linear voxel dose inequality constraints are the fundamental modeling tool within which a feasibility-seeking projection algorithm seeks a feasible point. This algorithm is then perturbed with gradient descent steps to reduce a nonlinear objective function. Approach: Within …
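The sketch below illustrates the superiorization pattern on a generic system of linear inequalities: a feasibility-seeking sweep of halfspace projections, interleaved with bounded, shrinking gradient perturbations that reduce a nonlinear objective. The specific constraints, objective, and perturbation schedule of the IMRT application are not reproduced; everything here is an illustrative stand-in.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Orthogonal projection onto the halfspace {x : a @ x <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def superiorized_projections(A, b, grad_phi, x0, iters=200, beta=1.0, decay=0.99):
    """Feasibility-seeking projection sweeps for A x <= b, perturbed by
    shrinking steps along -grad_phi to reduce the objective phi."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        g = grad_phi(x)
        norm = np.linalg.norm(g)
        if norm > 0:                               # bounded, geometrically shrinking perturbation
            x = x - beta * (decay ** k) * g / norm
        for a_i, b_i in zip(A, b):                 # one sequential projection sweep
            x = project_halfspace(x, a_i, b_i)
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 10))
b = np.abs(rng.standard_normal(30)) + 0.5          # keeps x = 0 strictly feasible
target = rng.standard_normal(10)
x = superiorized_projections(A, b, lambda x: 2 * (x - target), np.zeros(10))
print("max constraint violation:", np.max(A @ x - b))
```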

The superiorization method with restarted perturbations for split minimization problems with an application to radiotherapy treatment planning

In this paper we study the split minimization problem, which consists of two constrained minimization problems posed in two separate spaces that are connected by a linear operator mapping one space into the other. To handle the data of such a problem, we develop a superiorization approach that can reach a feasible point with reduced …
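One common way to write such a split minimization problem is given below; this formulation is an assumption for illustration, since the abstract does not spell out the exact problem statement.

\[
\text{find } x^{*} \in \arg\min_{x \in C} f_1(x)
\quad \text{such that} \quad
A x^{*} \in \arg\min_{y \in Q} f_2(y),
\]

where \(C \subseteq H_1\) and \(Q \subseteq H_2\) are the constraint sets in the two spaces, \(f_1\) and \(f_2\) are the respective objectives, and \(A : H_1 \to H_2\) is the linear operator connecting them.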

The exact worst-case convergence rate of the alternating direction method of multipliers

Recently, semidefinite programming performance estimation has been employed as a powerful tool for the worst-case performance analysis of first-order methods. In this paper, we derive new non-ergodic convergence rates for the alternating direction method of multipliers (ADMM) by using performance estimation. We give examples that show the exactness of the given bounds. We …
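For reference, the standard two-block ADMM iteration in scaled form, written for \(\min_{x,z} f(x) + g(z)\) subject to \(Ax + Bz = c\) with penalty parameter \(\rho > 0\); the paper's exact setting and parameter choices may differ.

\[
\begin{aligned}
x^{k+1} &\in \arg\min_x \; f(x) + \tfrac{\rho}{2}\bigl\|A x + B z^{k} - c + u^{k}\bigr\|^2,\\
z^{k+1} &\in \arg\min_z \; g(z) + \tfrac{\rho}{2}\bigl\|A x^{k+1} + B z - c + u^{k}\bigr\|^2,\\
u^{k+1} &= u^{k} + A x^{k+1} + B z^{k+1} - c,
\end{aligned}
\]

where \(u^{k} = \lambda^{k}/\rho\) is the scaled dual variable.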