Level Bundle Methods for Oracles with On-Demand Accuracy

For nonsmooth convex optimization, we consider level bundle methods built using an oracle that computes the objective value and a subgradient at any given feasible point. For the problems of interest, exact oracle information is computable but difficult to obtain. In order to save computational effort, the oracle can provide estimations with …
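As a hedged sketch of the level bundle framework the abstract refers to (generic notation assumed here, not taken from the paper): at iteration $k$ the method builds a cutting-plane model from the oracle's values and subgradients, fixes a level between the current bounds, and projects a stability center onto the level set:

```latex
% Generic level bundle iteration (illustrative notation, not the paper's):
% cutting-plane model, level, and projection subproblem.
\check f_k(x) = \max_{j \in J_k}\bigl\{ f(x_j) + \langle g_j,\, x - x_j\rangle \bigr\},
\qquad
\ell_k = f_k^{\mathrm{low}} + \gamma\,\bigl(f_k^{\mathrm{up}} - f_k^{\mathrm{low}}\bigr),
\quad \gamma \in (0,1),
\qquad
x_{k+1} \in \operatorname*{arg\,min}_{x \in X}
\Bigl\{ \tfrac12 \|x - \hat x_k\|^2 \;:\; \check f_k(x) \le \ell_k \Bigr\}.
```

With an on-demand-accuracy oracle, the values $f(x_j)$ and subgradients $g_j$ above would be replaced by estimates whose error tolerance the method can request.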

Reweighted $\ell_1$-Minimization for Sparse Solutions to Underdetermined Linear Systems

Numerical experiments have indicated that reweighted $\ell_1$-minimization performs exceptionally well in locating sparse solutions of underdetermined linear systems of equations. It is therefore important to investigate this class of methods further. In this paper, we point out that reweighted $\ell_1$-methods are intrinsically associated with the minimization of the so-called merit …
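As an illustration of the class of methods discussed, here is a minimal Python sketch of reweighted $\ell_1$-minimization with the common weights $w_i = 1/(|x_i| + \epsilon)$; the helper names, the LP reformulation, and all tolerances are illustrative choices, not taken from the paper.

```python
# A minimal sketch of reweighted l1-minimization for A x = b; the weight
# update w_i = 1/(|x_i| + eps) is the common choice from the literature.
import numpy as np
from scipy.optimize import linprog

def weighted_l1(A, b, w):
    """Solve min sum_i w_i*|x_i| s.t. A x = b as an LP over z = [x, t],
    encoding |x_i| <= t_i via x - t <= 0 and -x - t <= 0."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), w])          # minimize w^T t
    A_eq = np.hstack([A, np.zeros((m, n))])       # A x = b
    A_ub = np.vstack([np.hstack([np.eye(n), -np.eye(n)]),
                      np.hstack([-np.eye(n), -np.eye(n)])])
    b_ub = np.zeros(2 * n)
    bounds = [(None, None)] * n + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=bounds, method="highs")
    return res.x[:n]

def reweighted_l1(A, b, iters=5, eps=1e-3):
    n = A.shape[1]
    x = weighted_l1(A, b, np.ones(n))             # first pass = plain l1
    for _ in range(iters - 1):
        w = 1.0 / (np.abs(x) + eps)               # emphasize small entries
        x = weighted_l1(A, b, w)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 100))
    x_true = np.zeros(100)
    x_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
    b = A @ x_true
    print("recovery error:", np.linalg.norm(reweighted_l1(A, b) - x_true))
```

The reweighting step is what distinguishes the method from plain $\ell_1$: entries that are already small get large weights, pushing them toward exactly zero on the next solve.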

Efficient Cardinality/Mean-Variance Portfolios

A number of variants of the classical Markowitz mean-variance optimization model for portfolio selection have been investigated to render it more realistic. Recently, the imposition of a cardinality constraint, setting an upper bound on the number of active positions taken in the portfolio, has been studied in an attempt to improve its performance and …
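For concreteness, one common way to write the cardinality-constrained mean-variance model the abstract describes (notation assumed, not taken from the paper) is:

```latex
\min_{x \in \mathbb{R}^n} \; x^{\top} \Sigma\, x
\quad \text{s.t.} \quad
\mu^{\top} x \ge \rho, \qquad
\mathbf{1}^{\top} x = 1, \qquad
\|x\|_0 \le K,
```

where $\Sigma$ is the return covariance, $\mu$ the vector of expected returns, $\rho$ a target return, and $\|x\|_0$ counts the active positions, bounded above by $K$.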

Sparse/Robust Estimation and Kalman Smoothing with Nonsmooth Log-Concave Densities: Modeling, Computation, and Theory

Piecewise linear quadratic (PLQ) penalties play a crucial role in many applications, including machine learning, robust statistical inference, sparsity promotion, and inverse problems such as Kalman smoothing. Well-known examples of PLQ penalties include the $\ell_2$, Huber, $\ell_1$, and Vapnik losses. This paper builds on a dual representation for PLQ penalties known from convex analysis. …
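One form of the dual representation alluded to (following the convex-analysis literature; the paper's exact parametrization may differ) writes a PLQ penalty as the supremum

```latex
\rho_{U,M,b,B}(y) \;=\; \sup_{u \in U}\,
\bigl\{ \langle u,\, b + By \rangle - \tfrac12 \langle u,\, Mu \rangle \bigr\},
```

where $U$ is a polyhedral set and $M$ is positive semidefinite. For example, $U=\mathbb{R}^n$, $M=I$, $b=0$, $B=I$ recovers $\tfrac12\|y\|_2^2$, while the box $U=[-\kappa,\kappa]^n$ with the same $M$, $b$, $B$ yields the Huber loss.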

A Stochastic Gradient Method with an Exponential Convergence Rate for Strongly-Convex Optimization with Finite Training Sets

We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. Numerical …
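A minimal Python sketch in the spirit of such a gradient-memory method (a SAG-style update applied to a least-squares problem; the step size, problem, and names are illustrative assumptions, not the paper's implementation):

```python
# Sketch of a stochastic gradient method that keeps a memory of the most
# recent gradient of each term and steps along the average of the memory.
import numpy as np

def sag_least_squares(A, b, epochs=50, seed=0):
    """Minimize (1/n) * sum_i 0.5*(a_i^T x - b_i)^2."""
    n, d = A.shape
    rng = np.random.default_rng(seed)
    step = 1.0 / np.max(np.sum(A * A, axis=1))  # ~1/L for the quadratic terms
    x = np.zeros(d)
    grad_mem = np.zeros((n, d))                 # last stored gradient per term
    grad_sum = np.zeros(d)                      # running sum of the memory
    for _ in range(epochs * n):
        i = rng.integers(n)
        g_new = (A[i] @ x - b[i]) * A[i]        # gradient of term i at x
        grad_sum += g_new - grad_mem[i]         # refresh memory slot i
        grad_mem[i] = g_new
        x -= step * grad_sum / n                # step along averaged memory
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    x_true = rng.standard_normal(10)
    print("error:", np.linalg.norm(sag_least_squares(A, A @ x_true) - x_true))
```

The key contrast with plain SGD is that only one term's gradient is recomputed per iteration, yet the step uses information from all $n$ terms, which is what enables the linear rate on strongly convex sums.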

A Regularized Simplex Method

For a special class of problems, the simplex method can be implemented as a cutting-plane method that approximates a certain convex polyhedral objective function. In this paper we consider a regularized version of this cutting-plane method and interpret the resulting procedure as a regularized simplex method. (Regularization is performed in the dual space and …
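A hedged sketch of what regularizing a cutting-plane method typically looks like (a proximal-type stabilization in generic notation; the paper's regularizer, applied in the dual space, may differ):

```latex
x_{k+1} \in \operatorname*{arg\,min}_{x}
\Bigl\{ \check f_k(x) + \tfrac{1}{2 t_k}\,\|x - \hat x_k\|^2 \Bigr\},
```

where $\check f_k$ is the current cutting-plane model of the polyhedral objective, $\hat x_k$ a stability center, and $t_k > 0$ a stepsize parameter.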

A Parallel Bundle Method for Asynchronous Subspace Optimization in Lagrangian Relaxation

An algorithmic approach is proposed for exploiting parallelization possibilities in large-scale optimization models of the following generic type. Objects change their state over time subject to a limited availability of common resources. These resources are modeled by linear coupling constraints and result in only a few objects competing for the same resource at each point in time. …
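In generic notation (assumed here, not quoted from the paper), the Lagrangian relaxation underlying such a scheme dualizes the coupling constraints, so the dual function separates over the objects and their subproblems can be solved in parallel:

```latex
\min \sum_{j} f_j(x_j)
\;\; \text{s.t.} \;\; \sum_{j} A_j x_j \le b,\; x_j \in X_j
\qquad \Longrightarrow \qquad
L(\lambda) = -\lambda^{\top} b
  + \sum_{j}\, \min_{x_j \in X_j}\bigl\{ f_j(x_j) + \lambda^{\top} A_j x_j \bigr\},
\quad \lambda \ge 0 .
```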

D-ADMM: A Communication-Efficient Distributed Algorithm for Separable Optimization

We propose a distributed algorithm, named D-ADMM, for solving separable optimization problems in networks of interconnected nodes or agents. In a separable optimization problem, the cost function is the sum of all the agents’ private cost functions, and the constraint set is the intersection of all the agents’ private constraint sets. We require the private …
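In symbols (generic notation, assumed rather than quoted from the paper), the separable problem over a network of $P$ agents reads:

```latex
\min_{x} \;\sum_{i=1}^{P} f_i(x)
\quad \text{s.t.} \quad x \in \bigcap_{i=1}^{P} X_i,
```

where $f_i$ and $X_i$ are known only to agent $i$. Distributed ADMM-type methods typically rewrite this with per-agent copies $x_i$ and consensus constraints $x_i = x_j$ along the network's edges, which is what makes per-node computation and neighbor-only communication possible.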

A Constructive Proof of the Existence of a Utility in Revealed Preference Theory

Within the context of the standard model of rationality in economic modelling, we show the existence of a utility function that rationalises a demand correspondence, and hence completely characterizes the associated preference structure, by taking a dense demand sample. This resolves the problem of revealed preferences under some very mild assumptions on the demand correspondence, which …
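For reference, the standard notion being constructed (textbook notation, not necessarily the paper's): a utility $u$ rationalises a demand correspondence $h$ if, for every price–wealth pair $(p, w)$,

```latex
h(p, w) \;\subseteq\; \operatorname*{arg\,max}\,
\bigl\{\, u(x) \;:\; p^{\top} x \le w \,\bigr\}.
```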

Warmstarting the Homogeneous and Self-Dual Interior Point Method for Linear and Conic Quadratic Problems

We present two strategies for warmstarting primal-dual interior point methods for the homogeneous self-dual model when applied to mixed linear and quadratic conic optimization problems. Common to both strategies is that they use only the final (optimal) iterate of the initial problem and have negligible computational cost. This is a major advantage compared to …
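For the linear programming case, the homogeneous self-dual model the abstract refers to embeds the primal-dual pair into a single self-dual feasibility problem (standard formulation; the conic quadratic extension is analogous):

```latex
A x - b\,\tau = 0, \qquad
A^{\top} y + s - c\,\tau = 0, \qquad
b^{\top} y - c^{\top} x - \kappa = 0, \qquad
x,\, s,\, \tau,\, \kappa \ge 0 .
```

A strictly complementary solution with $\tau > 0$ yields the optimum $(x/\tau,\, y/\tau,\, s/\tau)$, while $\kappa > 0$ certifies primal or dual infeasibility; warmstarting amounts to choosing a good initial point for this embedding.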