An extragradient method for solving variational inequalities without monotonicity

A new extragradient projection method is devised in this paper. It requires no generalized monotonicity assumption and needs only that the so-called dual variational inequality has a solution in order to ensure global convergence. In particular, it applies to quasimonotone variational inequalities having a nontrivial solution.
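For orientation, here is a minimal sketch of the classical (Korpelevich-style) extragradient iteration for a variational inequality VI(F, C); the box feasible set, fixed step size, and stopping rule are illustrative assumptions, and the paper's modified method and its convergence conditions are not reproduced here.

    import numpy as np

    def project_box(x, lo, hi):
        # Euclidean projection onto a box [lo, hi]; stands in for the
        # projection onto a general closed convex feasible set C.
        return np.clip(x, lo, hi)

    def extragradient(F, x0, lo, hi, tau=0.1, max_iter=500, tol=1e-8):
        # Classical extragradient iteration for VI(F, C):
        #   y_k     = P_C(x_k - tau * F(x_k))   (prediction step)
        #   x_{k+1} = P_C(x_k - tau * F(y_k))   (correction step)
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            y = project_box(x - tau * F(x), lo, hi)
            x_new = project_box(x - tau * F(y), lo, hi)
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    # Illustrative affine operator on the box [0, 10]^2.
    A = np.array([[1.0, 2.0], [-2.0, 1.0]])
    b = np.array([-1.0, 1.0])
    sol = extragradient(lambda z: A @ z + b, x0=np.ones(2), lo=0.0, hi=10.0)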

Learning to Project in Multi-Objective Binary Linear Programming

In this paper, we investigate the possibility of improving the performance of multi-objective optimization solution approaches using machine learning techniques. Specifically, we focus on multi-objective binary linear programs and employ one of the most effective recently developed criterion space search algorithms, the so-called KSA, in our study. This algorithm computes all nondominated points of … Read more
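The notion of a nondominated point can be made concrete with the toy sketch below: a brute-force Pareto filter over binary vectors, intended only to illustrate the terminology, not the KSA or any criterion space search algorithm; the objective and constraint data are made up for the example.

    from itertools import product

    def dominates(u, v):
        # u dominates v (minimization): no worse in every objective,
        # strictly better in at least one.
        return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

    def nondominated_points(objectives, n_vars, feasible):
        # Brute-force enumeration over binary vectors: only workable for tiny
        # instances, but it shows exactly what "nondominated points" means.
        images = [tuple(f(x) for f in objectives)
                  for x in product((0, 1), repeat=n_vars) if feasible(x)]
        return {y for y in images if not any(dominates(z, y) for z in images)}

    # Tiny bi-objective knapsack-style example (illustrative data).
    objs = [lambda x: -(3 * x[0] + 1 * x[1] + 2 * x[2]),   # maximize profit 1
            lambda x: -(1 * x[0] + 3 * x[1] + 2 * x[2])]   # maximize profit 2
    feas = lambda x: 2 * x[0] + 2 * x[1] + 3 * x[2] <= 3   # knapsack capacity
    # Prints the three nondominated images {(-3, -1), (-2, -2), (-1, -3)}.
    print(nondominated_points(objs, 3, feas))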

Rank-one Convexification for Sparse Regression

Sparse regression models are increasingly prevalent due to their ease of interpretability and superior out-of-sample performance. However, the exact model of sparse regression with an L0 constraint restricting the support of the estimators is a challenging non-convex optimization problem. In this paper, we derive new strong convex relaxations for sparse regression. These relaxations are based … Read more
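For reference, the exact L0-constrained problem referred to above can be written generically (in our notation; the paper may work with a regularized or reformulated variant) as

    \min_{\beta \in \mathbb{R}^p} \; \|y - X\beta\|_2^2 \quad \text{s.t.} \quad \|\beta\|_0 \le k,

where \|\beta\|_0 counts the nonzero entries of \beta, so the constraint restricts the support of the estimator to at most k coordinates.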

A general framework for customized transition to smart homes

Smart homes have the potential to achieve efficient energy consumption: households can profit from appropriately scheduled consumption. By 2020, 35% of all households in North America and 20% in Europe are expected to become smart homes. Developing a smart home requires considerable investment, and the householders expect a positive return. In this context, we address … Read more

When a maximal angle among cones is nonobtuse

Principal angles between linear subspaces have been studied for their application to statistics, numerical linear algebra, and other areas. In 2005, Iusem and Seeger defined critical angles within a single convex cone as an extension of antipodality in a compact set. Then, in 2016, Seeger and Sossa extended that notion to two cones. This was … Read more
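As background (stated here in the form commonly used in this literature), the maximal angle between two closed convex cones P and Q is

    \theta_{\max}(P, Q) = \max \{ \arccos \langle u, v \rangle : u \in P,\; v \in Q,\; \|u\| = \|v\| = 1 \},

with critical angles being the angles attained at stationary pairs (u, v) of this problem; a maximal angle is nonobtuse exactly when it is at most \pi/2.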

Analysis of the BFGS Method with Errors

The classical convergence analysis of quasi-Newton methods assumes that the function and gradients employed at each iteration are exact. In this paper, we consider the case when there are (bounded) errors in both computations and establish conditions under which a slight modification of the BFGS algorithm with an Armijo-Wolfe line search converges to a neighborhood … Read more
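For context, BFGS uses the standard inverse-Hessian update (stated here in its exact-gradient form; the paper's modification for inexact evaluations is not reproduced):

    H_{k+1} = (I - \rho_k s_k y_k^{\top}) H_k (I - \rho_k y_k s_k^{\top}) + \rho_k s_k s_k^{\top}, \qquad \rho_k = \frac{1}{y_k^{\top} s_k},

with s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k). With exact gradients the Armijo-Wolfe line search guarantees y_k^{\top} s_k > 0, a property that errors in the computed function and gradient can break.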

Generalized subdifferentials of spectral functions over Euclidean Jordan algebras

This paper is devoted to the study of generalized subdifferentials of spectral functions over Euclidean Jordan algebras. Spectral functions often appear in optimization problems, playing the role of “regularizer”, “barrier”, “penalty function”, and many others. We provide formulae for the regular, approximate and horizon subdifferentials of spectral functions. In addition, under local lower semicontinuity, we … Read more
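As commonly defined (stated here only as background), if x has the spectral decomposition x = \sum_{i=1}^{r} \lambda_i(x) e_i in a Euclidean Jordan algebra of rank r, with Jordan frame \{e_i\}, then a spectral function is a composition

    F(x) = f(\lambda_1(x), \dots, \lambda_r(x)),

where f : \mathbb{R}^r \to (-\infty, +\infty] is invariant under permutations of its arguments; in the algebra of symmetric matrices this recovers the familiar functions of eigenvalues.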

The condition number of a function relative to a set

The condition number of a differentiable convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is the square of the aspect ratio of a canonical ellipsoid associated to the function. Furthermore, the condition … Read more
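Concretely (classical definitions only; the paper's notion relative to a set generalizes them), for a twice-differentiable convex f with \mu I \preceq \nabla^2 f(x) \preceq L I the condition number is

    \kappa(f) = \frac{L}{\mu},

and for a quadratic q(x) = \tfrac12 x^{\top} A x with A \succ 0 this equals \lambda_{\max}(A)/\lambda_{\min}(A): the ellipsoid \{x : x^{\top} A x \le 1\} has semi-axes of length \lambda_i(A)^{-1/2}, so its aspect ratio is \sqrt{\kappa(q)}, matching the statement above.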

Fast Robust Methods for Singular State-Space Models

State-space models are used in a wide range of time series analysis applications. Kalman filtering and smoothing are work-horse algorithms in these settings. While classic algorithms assume Gaussian errors to simplify estimation, recent advances use a broad range of optimization formulations to allow outlier-robust estimation, as well as constraints to capture prior information. Here we … Read more
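For background (the standard linear model; the paper's robust, constrained, and singular variants modify it), a linear state-space model takes the form

    x_{k+1} = A_k x_k + w_k, \qquad y_k = C_k x_k + v_k,

and when w_k and v_k are Gaussian with invertible covariances Q_k and R_k, Kalman smoothing is equivalent (up to the initial-state prior) to the maximum a posteriori problem

    \min_{x_1, \dots, x_N} \; \sum_{k} \tfrac12 \|y_k - C_k x_k\|_{R_k^{-1}}^2 + \tfrac12 \|x_{k+1} - A_k x_k\|_{Q_k^{-1}}^2 .

Robust formulations swap these quadratic penalties for other losses and add constraints, while the singular case, where Q_k^{-1} does not exist, is exactly where this familiar least-squares form breaks down.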

On Data-Driven Prescriptive Analytics with Side Information: A Regularized Nadaraya-Watson Approach

We consider generic stochastic optimization problems in the presence of side information that enables more insightful decisions. The side information consists of observable exogenous covariates that alter the conditional probability distribution of the random problem parameters. A decision maker who adapts her decisions according to the observed side information solves an optimization problem where the … Read more
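One standard way to use such covariates (stated here as background; the paper's regularized estimator modifies the weights) is the Nadaraya-Watson weighted sample average approximation: given historical pairs (x_i, \xi_i), a kernel K, and a bandwidth h, the decision for an observed covariate x solves

    \min_{z \in Z} \; \sum_{i=1}^{n} \frac{K\big((x - x_i)/h\big)}{\sum_{j=1}^{n} K\big((x - x_j)/h\big)} \, c(z, \xi_i),

which approximates \min_{z \in Z} \mathbb{E}[\, c(z, \xi) \mid X = x \,] by reweighting the historical scenarios according to how close their covariates are to the observed x.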