Error bounds for vector-valued functions: necessary and sufficient conditions

In this paper, we attempt to extend the definition and existing local error bound criteria to vector-valued functions, or more generally, to functions taking values in a normed linear space. Some new derivative-like objects (slopes and subdifferentials) are introduced and a general classification scheme of error bound criteria is presented. Citation: Published in Nonlinear Analysis: Theory, …
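For orientation, the scalar notion being extended is the following standard one (recalled here for context, not quoted from the paper): given $f: X \to \mathbb{R} \cup \{+\infty\}$ with solution set $S = \{x \in X : f(x) \le 0\}$ and $\bar{x} \in S$, $f$ admits a local error bound at $\bar{x}$ if there exist $\tau > 0$ and $\delta > 0$ such that

$$ d(x, S) \le \tau\, [f(x)]_+ \qquad \text{whenever } d(x, \bar{x}) < \delta, $$

where $[f(x)]_+ = \max\{f(x), 0\}$. The paper extends this notion, and the associated criteria, to $f$ taking values in a normed linear space.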

Generalized Decision Rule Approximations for Stochastic Programming via Liftings

Stochastic programming provides a versatile framework for decision-making under uncertainty, but the resulting optimization problems can be computationally demanding. It has recently been shown that primal and dual linear decision rule approximations can yield tractable upper and lower bounds on the optimal value of a stochastic program. Unfortunately, linear decision rules often provide crude approximations …
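A rough sketch of the lifting idea, as we read it (notation ours, not the paper's): instead of restricting a recourse decision to an affine function $y(\xi) = Y\xi$ of the uncertain vector $\xi$, one introduces a lifting operator $L$ that appends nonlinear (for instance piecewise linear) functions of $\xi$,

$$ \xi' = L(\xi) = \big(\xi,\ \ell_1(\xi), \dots, \ell_r(\xi)\big), $$

and applies a linear decision rule $y(\xi) = Y' L(\xi)$ in the lifted space. A rule that is linear in $\xi'$ is nonlinear in the original uncertainty $\xi$, so the approximation becomes richer while the optimization still runs over the finite-dimensional matrix $Y'$.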

Implicit Multifunction Theorems in complete metric spaces

In this paper, we establish some new characterizations of the metric regularity of implicit multifunctions in complete metric spaces by using the lower semicontinuous envelopes of the distance functions for set-valued mappings. Through these new characterizations it is possible to investigate implicit multifunction theorems based on coderivatives and on contingent derivatives as well as the …
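For reference, the standard notion involved (stated generically, not specifically to this paper): a set-valued mapping $F : X \rightrightarrows Y$ between metric spaces is metrically regular at $(\bar{x}, \bar{y}) \in \mathrm{gph}\, F$ if there exist $\kappa > 0$ and neighbourhoods $U$ of $\bar{x}$ and $V$ of $\bar{y}$ such that

$$ d\big(x, F^{-1}(y)\big) \le \kappa\, d\big(y, F(x)\big) \qquad \text{for all } (x, y) \in U \times V. $$

Roughly speaking, implicit multifunction theorems transfer such regularity properties to solution maps of the form $G(p) = \{x : \bar{y} \in F(p, x)\}$ of a parametric generalized equation.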

Stability of error bounds for convex constraint systems in Banach spaces

This paper studies the stability of error bounds for convex constraint systems in Banach spaces. We show that certain known sufficient conditions for local and global error bounds actually ensure error bounds for the whole family of functions that are, in a sense, small perturbations of the given one. A single inequality as well as semi-infinite constraint systems …
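For concreteness, the global version of the condition at stake, stated here only for a single convex inequality (standard formulation, not taken from the abstract): for a proper convex function $f$ on a Banach space $X$ with $S = \{x \in X : f(x) \le 0\} \neq \emptyset$, a global error bound holds if there exists $\tau > 0$ such that

$$ d(x, S) \le \tau\, [f(x)]_+ \qquad \text{for all } x \in X. $$

Stability then asks that an inequality of the same kind survive when $f$ is replaced by a suitably small perturbation $\tilde{f}$.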

Stability of error bounds for semi-infinite convex constraint systems

In this paper, we are concerned with the stability of the error bounds for semi-infinite convex constraint systems. Roughly speaking, the error bound of a system of inequalities is said to be stable if all its “small” perturbations admit a (local or global) error bound. We first establish subdifferential characterizations of the stability of error …
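In the semi-infinite setting the constraint set is typically of the form below (generic formulation, our notation): given an arbitrary, possibly infinite index set $T$ and convex functions $f_t$, $t \in T$,

$$ S = \{x \in X : f_t(x) \le 0 \ \text{for all } t \in T\}, $$

and an error bound compares $d(x, S)$ with the residual $\sup_{t \in T} [f_t(x)]_+$. Stability again means that every sufficiently small perturbation of the family $(f_t)_{t \in T}$ still admits such a (local or global) bound.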

Homogeneous Cone Complementarity Problems and $P$ Properties

We consider existence and uniqueness properties of a solution to the homogeneous cone complementarity problem (HCCP). Employing the $T$-algebraic characterization of homogeneous cones, we generalize the $P, P_0, R_0$ properties for a nonlinear function associated with the standard nonlinear complementarity problem to the setting of the HCCP. We prove that if a continuous function has either the …
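In generic terms (the precise $T$-algebraic setting is in the paper itself), a complementarity problem over a homogeneous cone $K \subseteq \mathbb{R}^n$ with dual cone $K^*$ and a continuous map $F : \mathbb{R}^n \to \mathbb{R}^n$ asks to

$$ \text{find } x \in K \ \text{ such that } \ F(x) \in K^* \ \text{ and } \ \langle x, F(x) \rangle = 0. $$

Taking $K = \mathbb{R}^n_+$ (so that $K^* = K$) recovers the standard nonlinear complementarity problem whose $P, P_0, R_0$ properties the paper generalizes.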

Primal and dual linear decision rules in stochastic and robust optimization

Linear stochastic programming provides a flexible toolbox for analyzing real-life decision situations, but it can become computationally cumbersome when recourse decisions are involved. The latter are usually modelled as decision rules, i.e., functions of the uncertain problem data. It has recently been argued that stochastic programs can quite generally be made tractable by restricting the …
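A bare-bones illustration of the restriction (generic two-stage form; the notation is ours, not the paper's): in a two-stage stochastic LP

$$ \min_{x,\ y(\cdot)} \ c^\top x + \mathbb{E}\big[ q^\top y(\xi) \big] \quad \text{s.t.} \quad A x + B\, y(\xi) \ge b(\xi) \ \ \mathbb{P}\text{-a.s.}, $$

the recourse decision $y(\cdot)$ ranges over arbitrary measurable functions of $\xi$. Restricting it to linear decision rules $y(\xi) = Y \xi$ turns the problem into a finite-dimensional one over $(x, Y)$ and, since the feasible set shrinks, yields an upper bound on the true optimum; the analogous restriction applied to a dual formulation yields a lower bound, and the gap between the two bounds indicates how much is lost by the approximation.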

A Coordinate Gradient Descent Method for Linearly Constrained Smooth Optimization and Support Vector Machines Training

Support vector machine (SVM) training may be posed as a large quadratic program (QP) with bound constraints and a single linear equality constraint. We propose a (block) coordinate gradient descent method for solving this problem and, more generally, linearly constrained smooth optimization. Our method is closely related to decomposition methods currently popular for SVM training. …
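The QP referred to is, in its usual dual form (standard SVM formulation, recalled here for context):

$$ \min_{\alpha \in \mathbb{R}^n} \ \tfrac{1}{2}\, \alpha^\top Q \alpha - e^\top \alpha \quad \text{s.t.} \quad 0 \le \alpha \le C e, \ \ y^\top \alpha = 0, $$

where $Q_{ij} = y_i y_j K(x_i, x_j)$ for labels $y_i \in \{\pm 1\}$ and a kernel $K$, $e$ is the all-ones vector, and $C > 0$ is the regularization parameter. The single equality constraint $y^\top \alpha = 0$ is what forces decomposition and block coordinate methods to update at least two coordinates at a time.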

A Coordinate Gradient Descent Method for Nonsmooth Separable Minimization

We consider the problem of minimizing the sum of a smooth function and a separable convex function. This problem includes as special cases bound-constrained optimization and smooth optimization with $\ell_1$-regularization. We propose a (block) coordinate gradient descent method for solving this class of nonsmooth separable problems. We establish global convergence and, under a local Lipschitzian …
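To make the problem class concrete, here is a minimal Python sketch for the special case $f(x) = \tfrac12\|Ax-b\|^2$ with $\ell_1$-regularization, using plain cyclic coordinate minimization with soft-thresholding. It only illustrates the structure of the coordinate subproblems (a scalar quadratic model plus a separable convex term); it is not the paper's (block) coordinate gradient descent method, which employs a quadratic approximation and an Armijo-type line search.

import numpy as np

def soft_threshold(z, t):
    # proximal operator of t*|.|
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cd_l1(A, b, lam, n_sweeps=100):
    # Illustrative only: cyclic coordinate minimization for
    # 0.5*||Ax - b||^2 + lam*||x||_1 (lasso special case).
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)   # A_j^T A_j, per-coordinate curvature
    r = b - A @ x                   # residual, maintained incrementally
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            # gradient information with x_j's own contribution removed
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)   # keep r = b - A @ x up to date
            x[j] = x_new
    return x

# tiny synthetic usage example
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.5, -2.0, 0.7]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.round(cd_l1(A, b, lam=0.1), 2))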

Sum of Squares Method for Sensor Network Localization

We formulate the sensor network localization problem as finding the global minimizer of a quartic polynomial. Then sum of squares (SOS) relaxations can be applied to solve it. However, the general SOS relaxations are too expensive to implement for large problems. Exploiting the special features of this polynomial, we propose a new structured SOS relaxation, …
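The quartic in question is, up to notation, of the following standard form (our paraphrase): with unknown sensor positions $x_1, \dots, x_s \in \mathbb{R}^d$, known anchor positions $a_k$, and measured distances $d_{ij}$ (sensor–sensor) and $e_{ik}$ (sensor–anchor),

$$ \min_{x_1, \dots, x_s} \ \sum_{(i,j) \in \mathcal{A}} \big( \|x_i - x_j\|^2 - d_{ij}^2 \big)^2 \ + \ \sum_{(i,k) \in \mathcal{B}} \big( \|x_i - a_k\|^2 - e_{ik}^2 \big)^2 , $$

a polynomial of degree four in the coordinates of the $x_i$; in the noiseless case the network is correctly localized exactly when the global minimum value is zero.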