Subdifferentials of nonconvex supremum functions and their applications to semi-infinite and infinite programs with Lipschitzian data

The paper is devoted to the subdifferential study, and to applications, of suprema of uniformly Lipschitzian functions over arbitrary index sets carrying no topology. Based on advanced techniques of variational analysis, we evaluate the major subdifferentials of such supremum functions in the general framework of Asplund (in particular, reflexive) spaces, with no convexity or relaxation assumptions. …
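
For orientation, the object of study is the supremum function

\[
f(x) \;=\; \sup_{t \in T} f_t(x),
\]

where $T$ is an arbitrary index set carrying no topology. As a benchmark (a classical convex-analysis fact, stated informally, not the paper's result): when $T$ is compact, $t \mapsto f_t(x)$ is upper semicontinuous, and each $f_t$ is convex, one has $\partial f(\bar{x}) = \overline{\mathrm{co}}\,\bigcup \{\partial f_t(\bar{x}) : t \in T(\bar{x})\}$ with the active index set $T(\bar{x}) = \{t \in T : f_t(\bar{x}) = f(\bar{x})\}$; the paper dispenses with both the convexity and the topological assumptions.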

A Class of Dantzig-Wolfe Type Decomposition Methods for Variational Inequality Problems

We consider a class of decomposition methods for variational inequalities, which is related to the classical Dantzig–Wolfe decomposition of linear programs. Our approach is rather general, in that it can be used with set-valued or nonmonotone operators, as well as various kinds of approximations in the subproblems of the functions and derivatives in the single-valued …
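
Recall the problem being decomposed: given a closed convex set $C$ and an operator $F$, the variational inequality is to find $\bar{x} \in C$ such that

\[
\langle F(\bar{x}),\, x - \bar{x} \rangle \;\ge\; 0 \quad \text{for all } x \in C,
\]

with $F(\bar{x})$ replaced by some element $w \in F(\bar{x})$ when the operator is set-valued, as the abstract allows.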

Lifts of Convex Sets and Cone Factorizations

In this paper we address the basic geometric question of when a given convex set is the image under a linear map of an affine slice of a given closed convex cone. Such a representation or ‘lift’ of the convex set is especially useful if the cone admits an efficient algorithm for linear optimization over …
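
In the terminology of the abstract, a lift of a convex set $C \subseteq \mathbb{R}^n$ by a closed convex cone $K$ is a representation

\[
C \;=\; \pi\,(K \cap L),
\]

where $L$ is an affine subspace and $\pi$ is a linear map; optimizing a linear functional over $C$ then reduces to a conic program over $K$.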

A Complete Characterization of the Gap between Convexity and SOS-Convexity

Our first contribution in this paper is to prove that three natural sum-of-squares (sos) based sufficient conditions for convexity of polynomials, via the definition of convexity, its first-order characterization, and its second-order characterization, are equivalent. These three equivalent algebraic conditions, henceforth referred to as sos-convexity, can be checked by semidefinite programming …
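
Stated schematically, in the notation standard for this topic (with $H_p$ the Hessian of the polynomial $p$), the three sos conditions are

\[
\tfrac{1}{2}p(x) + \tfrac{1}{2}p(y) - p\!\left(\tfrac{x+y}{2}\right) \ \text{is sos}, \qquad
p(y) - p(x) - \nabla p(x)^{T}(y - x) \ \text{is sos}, \qquad
y^{T} H_p(x)\, y \ \text{is sos},
\]

each an sos strengthening of, respectively, the definition of convexity, its first-order characterization, and its second-order characterization.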

A globally and R-linearly convergent hybrid HS and PRP method and its inexact version with applications

A hybrid HS and PRP type conjugate gradient method for smooth optimization is presented, which reduces to the classical PRP or HS method if an exact line search is used, and which converges globally and R-linearly for nonconvex functions with an inexact backtracking line search under standard assumptions. An inexact version of the proposed method, which admits …
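
The paper's precise hybrid rule and safeguards are not visible in this truncated abstract; the following is a minimal numpy sketch assuming the common safeguarded rule $\beta_k = \max\{0, \min\{\beta_k^{HS}, \beta_k^{PRP}\}\}$ with an Armijo backtracking line search, as a stand-in for the method analyzed in the paper.

import numpy as np

def hybrid_hs_prp(f, grad, x0, tol=1e-6, max_iter=2000,
                  t0=1.0, rho=0.5, c1=1e-4):
    # Sketch of a hybrid HS/PRP conjugate gradient method with an
    # inexact (Armijo) backtracking line search.  The safeguarded rule
    # beta = max(0, min(beta_HS, beta_PRP)) is a common hybrid and is
    # an assumption here, not necessarily the paper's exact rule.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking: accept t with f(x + t d) <= f(x) + c1 t g^T d
        t, fx, gd = t0, f(x), g @ d
        while f(x + t * d) > fx + c1 * t * gd and t > 1e-14:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference
        dy = d @ y
        beta_hs = (g_new @ y) / dy if abs(dy) > 1e-12 else 0.0
        beta_prp = (g_new @ y) / (g @ g)   # g is nonzero above tol
        beta = max(0.0, min(beta_hs, beta_prp))
        d = -g_new + beta * d
        if g_new @ d >= 0.0:               # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Quick check on the Rosenbrock function:
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(hybrid_hs_prp(f, grad, np.array([-1.2, 1.0])))   # approaches (1, 1)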

A First-Order Smoothing Technique for a Class of Large-Scale Linear Programs

We study a class of linear programming (LP) problems motivated by large-scale machine learning applications. After reformulating the LP as a convex nonsmooth problem, we apply Nesterov’s primal-dual smoothing technique. It turns out that the iteration complexity of the smoothing technique depends on a parameter $\theta$ that arises because we need to bound the originally …
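
The reformulation and the parameter $\theta$ are specific to the paper; as a generic illustration of the smoothing step only, here is the standard entropy smoothing of a piecewise-linear max (an assumed example, not the paper's exact construction).

import numpy as np

def smoothed_max(A, b, x, mu):
    # Entropy smoothing of f(x) = max_i (a_i^T x + b_i), a standard
    # instance of Nesterov's smoothing:
    #   f_mu(x) = mu * log( sum_i exp((a_i^T x + b_i) / mu) ),
    # which satisfies f(x) <= f_mu(x) <= f(x) + mu*log(m) and has a
    # Lipschitz gradient with constant of order ||A||^2 / mu.
    z = (A @ x + b) / mu
    zmax = z.max()
    w = np.exp(z - zmax)                 # numerically stable log-sum-exp
    lam = w / w.sum()                    # softmax weights (dual variables)
    f_mu = mu * (np.log(w.sum()) + zmax)
    return f_mu, A.T @ lam               # smoothed value and its gradient

# Example: three affine pieces in R^2; true max at x is 0.3.
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
val, g = smoothed_max(A, np.zeros(3), np.array([0.3, -0.2]), mu=0.1)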

Sensitivity analysis for two-level value functions with applications to bilevel programming

This paper contributes to a deeper understanding of the link between a now-conventional framework in hierarchical optimization, spread under the name of the optimistic bilevel problem, and its initial, more difficult formulation, which we call here the original optimistic bilevel optimization problem. It follows from this research that, although the process of deriving necessary …
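
Schematically, with $F$ the upper-level and $f$ the lower-level objective (the exact setting is in the paper), the two-level value function in question is

\[
\varphi_o(x) \;=\; \min_{y}\ \bigl\{\, F(x,y) \;:\; y \in S(x) \,\bigr\},
\qquad
S(x) \;=\; \operatorname*{arg\,min}_{y}\ \bigl\{\, f(x,y) \;:\; g(x,y) \le 0 \,\bigr\},
\]

so that the original optimistic problem minimizes $\varphi_o$ over $x$, while the conventional optimistic problem minimizes $F(x,y)$ jointly in $(x,y)$.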

New optimality conditions for the semivectorial bilevel optimization problem

The paper is concerned with the optimistic formulation of a bilevel optimization problem with a multiobjective lower-level problem. Considering the scalarization approach for the multiobjective program, we transform our problem into a scalar-objective optimization problem with inequality constraints by means of the well-known optimal value reformulation. Completely detailed first-order necessary optimality conditions are then derived in …
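
Schematically, with the lower-level objectives scalarized by weights $\lambda$, the optimal value reformulation replaces the lower-level solution-set constraint by an inequality on the lower-level value function:

\[
\varphi(x,\lambda) \;=\; \min_{y}\ \bigl\{\, \langle \lambda, f(x,y) \rangle \;:\; g(x,y) \le 0 \,\bigr\},
\qquad
\langle \lambda, f(x,y) \rangle \;\le\; \varphi(x,\lambda), \quad g(x,y) \le 0,
\]

yielding the scalar-objective problem with inequality constraints described above; the precise scalarization used is the one in the paper.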

Optimization problems with value function objectives

The family of optimization problems with value function objectives includes the minmax programming problem and the bilevel optimization problem. In this paper, we derive necessary optimality conditions for this class of problems. The main focus is on the case where the functions involved are nonsmooth and the constraints are very general operator constraints. …
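
The common pattern is a program of the form

\[
\min_{x \in X}\ \varphi(x),
\]

where $\varphi$ is the optimal value function of a parametric problem in $y$: $\varphi(x) = \max_{y \in Y} f(x,y)$ in the minmax case, and $\varphi(x) = \min_y \{F(x,y) : y \in S(x)\}$, with $S(x)$ the lower-level solution set, in the bilevel case.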

An extension of the elimination method for a sparse SOS polynomial

We propose a method to reduce the sizes of SDP relaxation problems for a given polynomial optimization problem (POP). This method is an extension of the elimination method for a sparse SOS polynomial in [Kojima et al., Mathematical Programming] and exploits sparsity of polynomials involved in a given POP. In addition, we show that this …
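
The size reduction acts on the Gram-matrix form of an sos certificate: writing

\[
p(x) \;=\; v(x)^{T} Q\, v(x), \qquad Q \succeq 0,
\]

with $v(x)$ a vector of candidate monomials, every monomial provably eliminated from $v(x)$ removes a row and column of $Q$ and hence shrinks the semidefinite program; the criterion for which monomials can be eliminated is the subject of the paper.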