The cone condition and nonsmoothness in linear generalized Nash games

We consider linear generalized Nash games and introduce the so-called cone condition, which characterizes the smoothness of the Nikaido-Isoda function under weak assumptions. The latter mapping arises from a reformulation of the generalized Nash equilibrium problem as a possibly nonsmooth optimization problem. Other regularity conditions like LICQ or SMFCQ are only sufficient for smoothness, but … Read more
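
For context (a standard definition, with notation chosen here for illustration rather than taken from the paper): for an $N$-player game with objective functions $\theta_\nu$, the Nikaido-Isoda function is
\[
\Psi(x,y) \;=\; \sum_{\nu=1}^{N} \bigl[\theta_\nu(x^\nu, x^{-\nu}) - \theta_\nu(y^\nu, x^{-\nu})\bigr],
\]
and the generalized Nash equilibrium problem can be reformulated via the value function $V(x) = \sup_{y \in X(x)} \Psi(x,y)$, where $X(x)$ is the ($x$-dependent) common feasible set; this dependence of the inner feasible set on $x$ is precisely what can make $V$ nonsmooth.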

A Flexible ADMM Algorithm for Big Data Applications

We present a flexible Alternating Direction Method of Multipliers (F-ADMM) algorithm for solving optimization problems involving a strongly convex objective function that is separable into $n \geq 2$ blocks, subject to (non-separable) linear equality constraints. The F-ADMM algorithm uses a \emph{Gauss-Seidel} scheme to update blocks of variables, and a regularization term is added to each … Read more
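
As an illustration of the general pattern only (the exact F-ADMM subproblems and regularizers are not visible in the truncated abstract), a regularized Gauss-Seidel ADMM sweep for $\min \sum_{i=1}^{n} f_i(x_i)$ subject to $\sum_{i=1}^{n} A_i x_i = b$ typically reads
\[
x_i^{k+1} \in \arg\min_{x_i}\; f_i(x_i) + \frac{\rho}{2}\Bigl\|\sum_{j<i} A_j x_j^{k+1} + A_i x_i + \sum_{j>i} A_j x_j^{k} - b + \tfrac{1}{\rho}\lambda^k\Bigr\|^2 + \frac{1}{2}\|x_i - x_i^k\|_{P_i}^2,
\qquad
\lambda^{k+1} = \lambda^k + \rho\Bigl(\sum_{i=1}^{n} A_i x_i^{k+1} - b\Bigr),
\]
where $\rho > 0$ and the matrices $P_i \succeq 0$ supplying the per-block regularization are illustrative assumptions, not the paper's exact choices.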

A weighted Mirror Descent algorithm for nonsmooth convex optimization problem

Large-scale nonsmooth convex optimization is a common problem in a range of computational areas, including machine learning and computer vision. Problems in these areas have special domain structures and characteristics, and treatment that exploits these structures can significantly reduce the computational burden. We present a weighted Mirror Descent method to … Read more
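
For orientation (the specific weighting scheme is not visible in the truncation), the generic mirror descent update for minimizing a convex $f$ over a set $X$, with subgradient $g_k \in \partial f(x_k)$ and step size $t_k > 0$, is
\[
x_{k+1} \in \arg\min_{x \in X}\; \langle g_k, x\rangle + \frac{1}{t_k} D_\psi(x, x_k),
\qquad
D_\psi(x,y) = \psi(x) - \psi(y) - \langle \nabla\psi(y),\, x - y\rangle,
\]
where $\psi$ is the mirror map (distance-generating function); a weighted variant would presumably adapt $\psi$, and hence the Bregman distance $D_\psi$, to the domain structure the abstract refers to.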

A strong polynomial gradient algorithm in Linear Programming

It has been an open question whether the Linear Programming (LP) problem can be solved in strongly polynomial time. The simplex algorithm does not offer a polynomial bound, and the polynomial algorithms of Khachiyan and Karmarkar are not strongly polynomial. The curious fact that non-linear algorithms would be needed to deliver the strongest complexity result … Read more

A Framework for Applying Subgradient Methods to Conic Optimization Problems (version 2)

A framework is presented whereby a general convex conic optimization problem is transformed into an equivalent convex optimization problem whose only constraints are linear equations and whose objective function is Lipschitz continuous. Virtually any subgradient method can be applied to solve the equivalent problem. Two methods are analyzed. (In version 2, the development of algorithms … Read more
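
Once the problem has the stated form $\min\{f(x) : Ax = b\}$ with $f$ Lipschitz continuous, essentially any projected subgradient scheme applies; a minimal sketch in generic notation (not the paper's specific methods) is
\[
x_{k+1} = P_{\{x \,:\, Ax = b\}}\bigl(x_k - t_k g_k\bigr), \qquad g_k \in \partial f(x_k),
\]
where the projection onto the affine set is available in closed form, e.g. $P_{\{x\,:\,Ax=b\}}(v) = v - A^{\top}(AA^{\top})^{-1}(Av - b)$ when $A$ has full row rank.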

A polynomial-time descent method for separable convex optimization problems with linear constraints

We propose a polynomial algorithm for a separable convex optimization problem with linear constraints. We do not make any additional assumptions about the structure of the objective function except for polynomial computability; in particular, the objective function can be non-differentiable. The running time of our algorithm is polynomial in the size of the input … Read more

A Nonmonotone Approach without Differentiability Test for Gradient Sampling Methods

Recently, optimization problems involving nonsmooth and locally Lipschitz functions have been the subject of investigation, and an innovative method known as Gradient Sampling has gained attention. Although the method has shown good results for important real-world problems, some drawbacks remain unaddressed. This study suggests modifications to the gradient sampling class of methods in order to … Read more
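
For readers unfamiliar with the underlying scheme, a basic gradient sampling step (sketched in generic notation; the nonmonotone strategy and the removal of the differentiability test are the contributions the abstract points to) computes the search direction
\[
d_k = -\operatorname*{arg\,min}\Bigl\{\, \|g\|^2 \;:\; g \in \operatorname{conv}\{\nabla f(x_{k,1}),\dots,\nabla f(x_{k,m})\} \Bigr\},
\]
where $x_{k,1},\dots,x_{k,m}$ are points sampled in an $\varepsilon_k$-ball around the current iterate at which $f$ is differentiable, followed by a line search along $d_k$; the classical method additionally requires $f$ to be differentiable at each accepted iterate, which is the test referred to in the title.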

Second order forward-backward dynamical systems for monotone inclusion problems

We begin by considering second order dynamical systems of the form $\ddot x(t) + \Gamma (\dot x(t)) + \lambda(t)B(x(t))=0$, where $\Gamma: {\cal H}\rightarrow{\cal H}$ is an elliptic bounded self-adjoint linear operator defined on a real Hilbert space ${\cal H}$, $B: {\cal H}\rightarrow{\cal H}$ is a cocoercive operator and $\lambda:[0,+\infty)\rightarrow [0,+\infty)$ is a relaxation function depending … Read more
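
Recall that $B$ being $\beta$-cocoercive means
\[
\langle Bx - By,\; x - y\rangle \;\ge\; \beta\,\|Bx - By\|^2 \qquad \text{for all } x, y \in {\cal H},
\]
which in particular makes $B$ monotone and Lipschitz continuous with constant $1/\beta$; this is the kind of assumption that allows $B$ to be handled through forward evaluations in a forward-backward scheme.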

Activity Identification and Local Linear Convergence of Forward–Backward-type methods

In this paper, we consider a class of Forward–Backward (FB) splitting methods that includes several variants (e.g. inertial schemes, FISTA) for minimizing the sum of two proper convex and lower semi-continuous functions, one of which has a Lipschitz continuous gradient, and the other is partly smooth relative to a smooth active manifold $\mathcal{M}$. We propose … Read more
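
Concretely, for $\min_x f(x) + g(x)$ with $\nabla f$ $L$-Lipschitz, the basic forward-backward (proximal-gradient) iteration in this class is
\[
x_{k+1} = \operatorname{prox}_{\gamma g}\bigl(x_k - \gamma \nabla f(x_k)\bigr), \qquad \operatorname{prox}_{\gamma g}(v) = \arg\min_{x}\, g(x) + \tfrac{1}{2\gamma}\|x - v\|^2, \qquad 0 < \gamma < \tfrac{2}{L},
\]
and inertial/FISTA variants apply the same step at an extrapolated point $y_k = x_k + \alpha_k (x_k - x_{k-1})$; activity identification refers to the iterates eventually landing on, and remaining on, the active manifold $\mathcal{M}$, the regime in which the local linear convergence analysis applies.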

Parallel Block Coordinate Minimization with Application to Group Regularized Regression

This paper proposes a method for parallel block coordinate-wise minimization of convex functions. Each iteration involves a first phase in which n independent minimizations are performed over the n variable blocks, followed by a phase in which the results of the first phase are coordinated to obtain the whole variable update. Convergence of the method to the … Read more
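
In generic notation (the precise coordination rule is not given in the truncated abstract), the two-phase iteration reads: in parallel for $i = 1,\dots,n$,
\[
\hat x_i^{\,k} \in \arg\min_{x_i}\; f\bigl(x_1^k,\dots,x_{i-1}^k,\, x_i,\, x_{i+1}^k,\dots,x_n^k\bigr),
\]
followed by a coordination step that combines the block candidates into the full update, for instance $x^{k+1} = x^k + \alpha_k(\hat x^{\,k} - x^k)$ with $\alpha_k \in (0,1]$; the specific combination used in the paper may differ.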