Hamiltonian Descent Methods

We propose a family of optimization methods that achieve linear convergence using first-order gradient information and constant step sizes on a class of convex functions much larger than the smooth and strongly convex ones. This larger class includes functions whose second derivatives may be singular or unbounded at their minima. Our methods are discretizations of … Read more
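
The abstract above is truncated, but the methods it refers to are discretizations of dissipative (conformal) Hamiltonian dynamics coupling a position variable x with a momentum variable p under friction. Below is a minimal, non-authoritative sketch of one such explicit discretization, assuming a quadratic kinetic energy k(p) = ||p||^2 / 2; the paper's actual methods and kinetic energies may differ.

```python
import numpy as np

def hamiltonian_descent(grad_f, x0, step=0.1, gamma=1.0, iters=1000):
    """Sketch of a dissipative Hamiltonian discretization (assumes the
    quadratic kinetic energy k(p) = ||p||^2 / 2; other choices are possible)."""
    x = np.asarray(x0, dtype=float)
    p = np.zeros_like(x)
    for _ in range(iters):
        # momentum update with friction gamma, semi-implicit in the damping term
        p = (p - step * grad_f(x)) / (1.0 + step * gamma)
        # position update uses grad k(p) = p for the quadratic kinetic energy
        x = x + step * p
    return x

# Example: f(x) = sum(x_i^4), whose Hessian is singular at the minimizer,
# i.e. a convex function outside the smooth-and-strongly-convex class.
x_star = hamiltonian_descent(lambda x: 4.0 * x**3, x0=np.ones(3))
```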

Projective Splitting with Forward Steps only Requires Continuity

A recent innovation in projective splitting algorithms for monotone operator inclusions has been the development of a procedure using two forward steps instead of the customary proximal steps for operators that are Lipschitz continuous. This paper shows that the Lipschitz assumption is unnecessary when the forward steps are performed in finite-dimensional spaces: a backtracking linesearch … Read more
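
As a generic illustration of a backtracking linesearch for a forward (explicit) step on a merely continuous operator F: the acceptance test used inside projective splitting is specific to that framework, so the test below, a local Lipschitz-type estimate, is only an assumed stand-in.

```python
import numpy as np

def backtracked_forward_step(F, x, rho0=1.0, beta=0.5, theta=0.9, max_tries=50):
    """Shrink the stepsize rho until the operator F varies slowly enough
    between x and the trial point (illustrative acceptance test only; the
    projective-splitting test analyzed in the paper is different)."""
    rho, Fx = rho0, F(x)
    for _ in range(max_tries):
        y = x - rho * Fx                  # trial forward step
        Fy = F(y)
        # accept when a local Lipschitz-like bound holds at the trial point
        if rho * np.linalg.norm(Fy - Fx) <= theta * np.linalg.norm(y - x):
            break
        rho *= beta                       # otherwise backtrack
    return y, rho
```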

An inertial extrapolation method for convex simple bilevel optimization

We consider the minimization of a scalar objective over the solution set of another optimization problem. This problem is known as the simple bilevel optimization problem and has drawn significant attention in the last few years. Our inner problem consists of minimizing the sum of a smooth and a nonsmooth function, while the outer one is the minimization … Read more
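
In our own illustrative notation (f for the outer objective, g smooth and h nonsmooth in the inner problem), the problem class described above can be written as follows; the paper's exact formulation may carry additional structure.

```latex
\begin{equation*}
  \min_{x \in \mathbb{R}^n} \; f(x)
  \quad \text{subject to} \quad
  x \in \operatorname*{arg\,min}_{y \in \mathbb{R}^n} \bigl[\, g(y) + h(y) \,\bigr].
\end{equation*}
```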

An Inexact First-order Method for Constrained Nonlinear Optimization

The primary focus of this paper is on designing inexact first-order methods for solving large-scale constrained nonlinear optimization problems. By controlling the inexactness of the subproblem solution, we can significantly reduce the computational cost needed for each iteration. A penalty parameter updating strategy during the subproblem solve enables the algorithm to automatically detect infeasibility. Global … Read more
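
As a rough, assumed illustration of how such penalty-based schemes are commonly organized (not necessarily the paper's exact construction): an equality-constrained problem min f(x) subject to c(x) = 0 can be handled through an exact penalty function whose parameter rho is adjusted while the subproblem is being solved, and a rho that must keep growing without the constraint violation vanishing signals infeasibility.

```latex
\begin{equation*}
  \phi(x;\rho) \;=\; f(x) \;+\; \rho\,\lVert c(x)\rVert_1 ,
  \qquad \rho > 0 .
\end{equation*}
```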

Universal Barrier is n-Self-Concordant

This paper shows that the self-concordance parameter of the universal barrier on any n-dimensional proper convex domain is upper bounded by n. This bound is tight and improves the previous O(n) bound by Nesterov and Nemirovski. The key to our main result is a pair of new, sharp moment inequalities for s-concave distributions, which could … Read more
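
For context, the standard definition of nu-self-concordance and the universal barrier, both classical and stated here in our notation:

```latex
% A barrier F on the interior of a convex domain K is nu-self-concordant if,
% for every x in int(K) and every direction h,
\begin{align*}
  \bigl| D^{3}F(x)[h,h,h] \bigr| \;&\le\; 2\,\bigl( D^{2}F(x)[h,h] \bigr)^{3/2},\\
  \bigl( DF(x)[h] \bigr)^{2} \;&\le\; \nu\, D^{2}F(x)[h,h].
\end{align*}
% The universal barrier of Nesterov and Nemirovski is, up to an additive
% constant, u(x) = \log \mathrm{vol}\bigl( (K - x)^{\circ} \bigr), the
% log-volume of the polar of the translated domain.  The result above states
% that nu <= n for this barrier.
```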

Deep Neural Network Structures Solving Variational Inequalities

We propose a novel theoretical framework to investigate deep neural networks using the formalism of proximal fixed point methods for solving variational inequalities. We first show that almost all activation functions used in neural networks are actually proximity operators. This leads to an algorithmic model alternating firmly nonexpansive and linear operators. We derive new results … Read more
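
Two well-known instances of the proximity-operator viewpoint on activation functions, together with the alternating "linear operator followed by a firmly nonexpansive operator" layer structure mentioned above; this is only a small sample of the catalogue developed in the paper.

```python
import numpy as np

def relu(x):
    """ReLU is the proximity operator of the indicator of the nonnegative
    orthant, i.e. the projection onto {x >= 0} (firmly nonexpansive)."""
    return np.maximum(x, 0.0)

def soft_threshold(x, lam=1.0):
    """Soft thresholding is the proximity operator of lam * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def layer(W, b, x, activation=relu):
    """One layer of the alternating model: an affine (linear) operator
    followed by a proximity-operator activation."""
    return activation(W @ x + b)
```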

Conditional Extragradient Algorithms for Solving Constrained Variational Inequalities

In this paper, we generalize the classical extragradient algorithm for solving variational inequality problems by utilizing non-null normal vectors of the feasible set. In particular, conceptual algorithms are proposed with two different linesearches. We then establish convergence results for these algorithms under mild assumptions. Our study suggests that non-null normal vectors may significantly improve convergence … Read more
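
For background, the classical (Korpelevich) extragradient iteration for a variational inequality VI(F, C) is sketched below; the conditional variants in the paper additionally exploit non-null normal vectors of C and employ linesearches, which this sketch omits.

```python
import numpy as np

def extragradient(F, project_C, x0, alpha=0.1, iters=500):
    """Classical two-projection extragradient method for VI(F, C)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = project_C(x - alpha * F(x))   # prediction (extrapolation) step
        x = project_C(x - alpha * F(y))   # correction step
    return x

# Example: affine monotone operator F(x) = A x + b over the box [0, 1]^2.
A = np.array([[2.0, 1.0], [-1.0, 2.0]])
b = np.array([-1.0, -1.0])
sol = extragradient(lambda x: A @ x + b,
                    lambda z: np.clip(z, 0.0, 1.0),
                    x0=np.zeros(2))
```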

A proximal ADMM with the Broyden family for Convex Optimization Problems

Alternating direction methods of multipliers (ADMM) have been well studied and effectively used in various application fields. The classical ADMM must solve two subproblems exactly at each iteration. To overcome the difficulty of computing exact solutions, proximal terms are added to the subproblems. Recently, Gu and Yamashita studied a special … Read more
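
For reference, the classical ADMM iteration for min f(x) + g(z) subject to Ax + Bz = c, written in scaled form with penalty parameter beta > 0. Proximal variants add a term such as (1/2)||x - x^k||_P^2 to make the x-subproblem cheap to solve; the choice of the matrix P (via the Broyden family in this line of work) is what the paper studies, and it is not shown here.

```latex
\begin{align*}
  x^{k+1} &\in \operatorname*{arg\,min}_{x}\; f(x)
            + \tfrac{\beta}{2}\,\lVert Ax + Bz^{k} - c + u^{k} \rVert^{2},\\
  z^{k+1} &\in \operatorname*{arg\,min}_{z}\; g(z)
            + \tfrac{\beta}{2}\,\lVert Ax^{k+1} + Bz - c + u^{k} \rVert^{2},\\
  u^{k+1} &= u^{k} + Ax^{k+1} + Bz^{k+1} - c .
\end{align*}
```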

Asymptotic results of Stochastic Decomposition for Two-stage Stochastic Quadratic Programming

This paper presents stochastic decomposition (SD) algorithms for two classes of stochastic programming problems: 1) two-stage stochastic quadratic-linear programming (SQLP), in which a quadratic program defines the objective function in the first stage and a linear program defines the value function in the second stage; 2) two-stage stochastic quadratic-quadratic programming (SQQP), which has quadratic programming … Read more
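
An illustrative form of the SQLP class described above, in our own notation: a quadratic first-stage objective plus the expectation of a linear second-stage value function (in the SQQP class, the second stage is itself a quadratic program).

```latex
\begin{gather*}
  \min_{x \in X}\; \tfrac{1}{2}\,x^{\top} Q x + c^{\top} x
    + \mathbb{E}\bigl[\, h(x,\tilde{\omega}) \,\bigr],\\
  h(x,\omega) \;=\; \min_{y \ge 0}\;\bigl\{\, d^{\top} y \;:\;
    W y = r(\omega) - T(\omega)\,x \,\bigr\}.
\end{gather*}
```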

The primal-dual hybrid gradient method reduces to a primal method for linearly constrained optimization problems

In this work, we show that for linearly constrained optimization problems the primal-dual hybrid gradient algorithm, analyzed by Chambolle and Pock [3], can be written as an entirely primal algorithm. This allows us to prove convergence of the iterates even in the degenerate cases when the linear system is inconsistent or when the strong duality … Read more
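
For min f(x) subject to Ax = b, the primal-dual hybrid gradient iteration of Chambolle and Pock can be written as below (our notation; stepsizes tau and sigma with tau*sigma*||A||^2 < 1). Roughly, the paper's observation is that the dual sequence can be eliminated, yielding an entirely primal recursion; that reformulation is not reproduced here.

```latex
\begin{align*}
  x^{k+1} &= \operatorname{prox}_{\tau f}\bigl( x^{k} - \tau A^{\top} y^{k} \bigr),\\
  y^{k+1} &= y^{k} + \sigma \bigl( A (2x^{k+1} - x^{k}) - b \bigr).
\end{align*}
```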