The cone condition and nonsmoothness in linear generalized Nash games

We consider linear generalized Nash games and introduce the so-called cone condition, which characterizes the smoothness of the Nikaido-Isoda function under weak assumptions. The latter mapping arises from a reformulation of the generalized Nash equilibrium problem as a possibly nonsmooth optimization problem. Other regularity conditions like LICQ or SMFCQ are only sufficient for smoothness, but …
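For context, the Nikaido-Isoda function and the optimization reformulation mentioned above are standard; the sketch below uses the usual generic notation, which is assumed rather than taken from this paper:

```latex
% Nikaido-Isoda function of an N-player game with joint strategy
% x = (x^1, \dots, x^N) and player cost functions \theta_\nu:
\Psi(x,y) = \sum_{\nu=1}^{N}
  \bigl[ \theta_\nu(x^\nu, x^{-\nu}) - \theta_\nu(y^\nu, x^{-\nu}) \bigr]
% Value-function reformulation over the shared feasible set \Omega(x):
V(x) = \sup_{y \in \Omega(x)} \Psi(x,y), \qquad V(x) \ge 0,
% with x^* a (normalized) equilibrium precisely when V(x^*) = 0.
% The smoothness question in the abstract concerns this mapping V.
```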

New bounds for the max-$k$-cut and chromatic number of a graph

We consider several semidefinite programming relaxations for the max-$k$-cut problem, of increasing complexity. The optimal solution of the weakest of these relaxations has a closed-form expression involving the largest Laplacian eigenvalue of the graph under consideration. This is the first known eigenvalue bound for the max-$k$-cut when $k>2$ that is applicable to …
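For the special case $k=2$, the classical eigenvalue bound of Mohar and Poljak, $\mathrm{mc}(G) \le \frac{n}{4}\lambda_{\max}(L)$, has the same flavor. A minimal sketch of computing such a bound follows; the example graph is made up, and the paper's closed-form bound for general $k$ is not reproduced here:

```python
import numpy as np

def laplacian(adj):
    """Graph Laplacian L = D - A for a 0/1 adjacency matrix."""
    return np.diag(adj.sum(axis=1)) - adj

# Classical eigenvalue bound for max-cut (the k = 2 case):
# mc(G) <= (n/4) * lambda_max(L)   (Mohar & Poljak).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)  # a small example graph
L = laplacian(A)
lam_max = np.linalg.eigvalsh(L).max()      # largest Laplacian eigenvalue
n = A.shape[0]
print("eigenvalue upper bound on max-cut:", n / 4 * lam_max)
```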

A strong polynomial gradient algorithm in Linear Programming

It has been an open question whether the Linear Programming (LP) problem can be solved in strongly polynomial time. The simplex algorithm does not offer a polynomial bound, and the polynomial-time algorithms of Khachiyan and Karmarkar are not strongly polynomial. The curious fact that non-linear algorithms would be needed to deliver the strongest complexity result …

Partial Relaxation of Equality-constrained Programs

This paper presents a reformulation that is a natural “by-product” of the “variable endogenization” process for equality-constrained programs. The method results in a partial relaxation of the constraints, which in turn confers some computational advantages. A fully annotated example illustrates the technique and presents some comparative numerical results. Citation: Siwale, I.: Partial Relaxation of Equality-constrained Programs. Technical Report …
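The excerpt does not spell out what “variable endogenization” means; one plausible reading, sketched below on a hypothetical instance, is to solve the equality constraints for a subset of variables and substitute, so that the remaining problem in the exogenous variables carries fewer (relaxed) constraints:

```python
from scipy.optimize import minimize

# Hypothetical instance: minimize f(x, y) = x**2 + y**2
# subject to the equality constraint x + y = 1.
# "Endogenizing" y via y = 1 - x eliminates the constraint, so the
# remaining problem in x alone is unconstrained -- a partial
# relaxation in the sense that no constraint machinery is needed.
f_reduced = lambda x: x[0] ** 2 + (1.0 - x[0]) ** 2
res = minimize(f_reduced, x0=[0.0])
x_opt = res.x[0]
print("x =", x_opt, " y =", 1.0 - x_opt)   # both 0.5 at the optimum
```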

A Flexible ADMM Algorithm for Big Data Applications

We present a flexible Alternating Direction Method of Multipliers (F-ADMM) algorithm for solving optimization problems involving a strongly convex objective function that is separable into $n \geq 2$ blocks, subject to (non-separable) linear equality constraints. The F-ADMM algorithm uses a \emph{Gauss-Seidel} scheme to update blocks of variables, and a regularization term is added to each …
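A minimal sketch of one regularized Gauss-Seidel ADMM sweep of the kind described, on strongly convex quadratic blocks; the problem data and the parameters `rho` (penalty) and `mu` (regularization weight) are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_blocks, d, m = 3, 4, 2
# Strongly convex quadratic blocks f_i(x_i) = 0.5*x_i'Q_i x_i + c_i'x_i,
# coupled by the linear equality constraint sum_i A_i x_i = b.
Ms = [rng.standard_normal((d, d)) for _ in range(n_blocks)]
Qs = [np.eye(d) + M @ M.T for M in Ms]
cs = [rng.standard_normal(d) for _ in range(n_blocks)]
As = [rng.standard_normal((m, d)) for _ in range(n_blocks)]
b = rng.standard_normal(m)

rho, mu = 1.0, 1.0                  # penalty / regularization (illustrative)
xs = [np.zeros(d) for _ in range(n_blocks)]
y = np.zeros(m)

for _ in range(300):
    for i in range(n_blocks):       # Gauss-Seidel sweep: freshest iterates
        r = sum(As[j] @ xs[j] for j in range(n_blocks) if j != i) - b
        H = Qs[i] + rho * As[i].T @ As[i] + mu * np.eye(d)
        g = cs[i] + As[i].T @ y + rho * As[i].T @ r - mu * xs[i]
        xs[i] = np.linalg.solve(H, -g)   # block minimizer with prox term
    y += rho * (sum(As[i] @ xs[i] for i in range(n_blocks)) - b)

print("equality residual:",
      np.linalg.norm(sum(As[i] @ xs[i] for i in range(n_blocks)) - b))
```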

A weighted Mirror Descent algorithm for nonsmooth convex optimization problems

Large-scale nonsmooth convex optimization is a common problem in a range of computational areas, including machine learning and computer vision. Problems in these areas have special domain structures and characteristics, and treatment that exploits these structures can significantly reduce the computational burden. We present a weighted Mirror Descent method to …
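To fix ideas, here is plain (unweighted) mirror descent with the negative-entropy mirror map on the probability simplex; the paper's weighting scheme is not reproduced, and the toy objective is made up:

```python
import numpy as np

def entropic_md_step(x, g, eta):
    """One mirror descent step with the negative-entropy mirror map:
    x+ = x * exp(-eta * g), renormalized onto the simplex (the
    exponentiated-gradient / entropic MD iterate)."""
    z = x * np.exp(-eta * g)
    return z / z.sum()

# Toy nonsmooth objective on the simplex: f(x) = max_i a_i * x_i,
# with subgradient a_j * e_j for any maximizing coordinate j.
a = np.array([3.0, 1.0, 2.0])
x = np.ones(3) / 3
for t in range(1, 201):
    j = np.argmax(a * x)
    g = np.zeros(3)
    g[j] = a[j]                           # a subgradient of f at x
    x = entropic_md_step(x, g, eta=0.5 / np.sqrt(t))
print("approx. minimizer of max_i a_i x_i over the simplex:", x)
```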

A Framework for Applying Subgradient Methods to Conic Optimization Problems (version 2)

A framework is presented whereby a general convex conic optimization problem is transformed into an equivalent convex optimization problem whose only constraints are linear equations and whose objective function is Lipschitz continuous. Virtually any subgradient method can be applied to solve the equivalent problem. Two methods are analyzed. (In version 2, the development of algorithms …
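Since the transformed problem has only linear equality constraints and a Lipschitz objective, a projected subgradient method applies directly. A generic sketch follows; the data and the $\ell_1$ objective are illustrative, not the paper's choice of method:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 6
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
A_pinv = np.linalg.pinv(A)

def project_affine(x):
    """Euclidean projection onto the affine set {x : A x = b}."""
    return x - A_pinv @ (A @ x - b)

f = lambda x: np.abs(x).sum()              # Lipschitz, nonsmooth objective
subgrad = lambda x: np.sign(x)             # a subgradient of the l1 norm

x = project_affine(np.zeros(n))            # feasible starting point
best, best_x = np.inf, x
for t in range(1, 1001):
    x = project_affine(x - (1.0 / np.sqrt(t)) * subgrad(x))
    if f(x) < best:                        # track the best iterate
        best, best_x = f(x), x.copy()
print("best l1 value:", best,
      " feasibility:", np.linalg.norm(A @ best_x - b))
```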

A polynomial-time descent method for separable convex optimization problems with linear constraints

We propose a polynomial algorithm for a separable convex optimization problem with linear constraints. We make no assumptions about the structure of the objective function beyond polynomial computability; in particular, the objective function may be non-differentiable. The running time of our algorithm is polynomial in the size of the input …

Projected Reflected Gradient Methods for Monotone Variational Inequalities

This paper is concerned with some new projection methods for solving variational inequality problems with a monotone and Lipschitz-continuous mapping in Hilbert space. First, we propose the projected reflected gradient algorithm with a constant stepsize. It is similar to the projected gradient method in that it requires only one projection onto the feasible set and only …
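As commonly stated, the constant-stepsize iteration is $x_{k+1} = P_C\bigl(x_k - \lambda F(2x_k - x_{k-1})\bigr)$ with $\lambda < (\sqrt{2}-1)/L$ for an $L$-Lipschitz monotone $F$. A sketch on an affine monotone operator over a box follows; the operator and data are made up:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
B = rng.standard_normal((n, n))
M = B @ B.T + (B - B.T)        # PSD symmetric part + skew part => monotone
q = rng.standard_normal(n)
F = lambda x: M @ x + q        # monotone, Lipschitz (L = ||M||_2) operator
L_const = np.linalg.norm(M, 2)

proj = lambda x: np.clip(x, 0.0, 1.0)     # projection onto the box [0,1]^n

lam = 0.4 * (np.sqrt(2) - 1) / L_const    # constant stepsize, safely below
x_prev = proj(np.zeros(n))                # the (sqrt(2)-1)/L threshold
x = proj(rng.standard_normal(n))
for _ in range(2000):
    x_next = proj(x - lam * F(2 * x - x_prev))   # one projection per step
    x_prev, x = x, x_next

# Residual of the natural map; ~0 at a solution of the VI on the box.
print("residual:", np.linalg.norm(x - proj(x - F(x))))
```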