Minimal Representation of Insurance Prices

This paper addresses law invariant coherent risk measures and their Kusuoka representations. By establishing the existence of a minimal representation we show that every Kusuoka representation can be reduced to its minimal representation. Uniqueness, in a sense specified in the paper, of the risk measure’s Kusuoka representation is derived from this initial result. … Read more
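
For context, the Kusuoka representation referred to above is usually stated as follows (conventions vary slightly across references): on an atomless probability space, a law invariant coherent risk measure $\rho$ admits
\[ \rho(X) \;=\; \sup_{\mu \in \mathcal{M}} \int_0^1 \mathrm{AVaR}_\alpha(X)\, \mu(d\alpha), \]
where $\mathcal{M}$ is a set of probability measures on $[0,1]$ and $\mathrm{AVaR}_\alpha$ denotes the Average Value-at-Risk at level $\alpha$. The set $\mathcal{M}$ is in general not unique, which is what makes the notion of a minimal representation meaningful.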

Fast global convergence of gradient methods for high-dimensional statistical recovery

Many statistical $M$-estimators are based on convex optimization problems formed by the combination of a data-dependent loss function with a norm-based regularizer. We analyze the convergence rates of projected gradient and composite gradient methods for solving such problems, working within a high-dimensional framework that allows the data dimension to grow with (and possibly exceed) … Read more
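
For a concrete instance, in the Lasso special case (squared loss plus an $\ell_1$ regularizer) the composite gradient update reduces to the familiar proximal gradient iteration. The Python sketch below illustrates that special case only; it is not the authors' code, and the step size rule and parameter names are assumptions.

    import numpy as np

    def soft_threshold(z, tau):
        # Proximal operator of tau * ||.||_1 (soft-thresholding).
        return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

    def composite_gradient_lasso(X, y, lam, step=None, n_iters=500):
        # Composite (proximal) gradient method for
        #   minimize (1/(2n)) * ||y - X b||_2^2 + lam * ||b||_1.
        n, p = X.shape
        if step is None:
            # 1/L, where L is the Lipschitz constant of the smooth part's gradient.
            step = n / np.linalg.norm(X, 2) ** 2
        b = np.zeros(p)
        for _ in range(n_iters):
            grad = X.T @ (X @ b - y) / n                      # gradient of the smooth loss
            b = soft_threshold(b - step * grad, step * lam)   # prox step for the regularizer
        return b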

pcaL1: An Implementation in R of Three Methods for L1-Norm Principal Component Analysis

pcaL1 is a package for the R environment for finding principal components using methods based on the L1 norm. The principal components derived using traditional principal component analysis (PCA) can be interpreted as optimal solutions to several optimization problems involving the L2 norm. Using the L1 norm in these problems provides an alternative that … Read more
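
To make the L2-versus-L1 distinction concrete: for a centered data matrix $X$, the leading classical principal component solves $\max_{\|w\|_2 = 1} \|Xw\|_2^2$, while common L1-norm analogues replace either the projection objective or the reconstruction error by an L1 norm, e.g.
\[ \max_{\|w\|_2 = 1} \|Xw\|_1 \qquad\text{or}\qquad \min_{W}\; \sum_i \| x_i - W W^\top x_i \|_1 . \]
These are generic formulations from the L1-PCA literature, given here only for orientation; the three specific methods implemented in pcaL1 are described in the full abstract and the package documentation.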

A Low-Memory Approach For Best-State Estimation Of Hidden Markov Models With Model Error

We present a low-memory approach for the best-state estimate (data assimilation) of hidden Markov models where model error is considered. In particular, our findings apply to the 4D-Var framework. The novelty of our approach resides in the fact that the storage needed by our estimation framework, while including model error, is dramatically reduced from … Read more
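
For orientation, 4D-Var with model error (the weak-constraint formulation) is typically posed as minimizing a cost function of the form
\[ J(x_0,\dots,x_N) = \tfrac12 \|x_0 - x_b\|^2_{B^{-1}} + \tfrac12 \sum_{k=0}^{N} \| y_k - H_k(x_k) \|^2_{R_k^{-1}} + \tfrac12 \sum_{k=1}^{N} \| x_k - M_k(x_{k-1}) \|^2_{Q_k^{-1}}, \]
where $M_k$ is the model, $H_k$ the observation operator, and the last sum penalizes model error. Keeping the whole trajectory $x_0,\dots,x_N$ in memory, rather than just the initial state, is what a low-memory scheme must address. The notation here is the standard one and not necessarily the paper's.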

Smoothing SQP Algorithm for Non-Lipschitz Optimization with Complexity Analysis

In this paper, we propose a smoothing sequential quadratic programming (SSQP) algorithm for solving a class of nonsmooth, nonconvex, perhaps even non-Lipschitz minimization problems, which have wide applications in statistics and sparse reconstruction. At each step, the SSQP algorithm solves a strongly convex quadratic minimization problem with a diagonal Hessian matrix, which has a simple … Read more
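
To see why a diagonal Hessian matters: a strongly convex quadratic subproblem of the schematic form
\[ \min_{d}\; g^\top d + \tfrac12\, d^\top D d \;=\; \min_{d}\; \sum_i \Big( g_i d_i + \tfrac12 D_{ii} d_i^2 \Big), \qquad D_{ii} > 0, \]
decouples coordinate-wise, so each component is available in closed form ($d_i = -g_i / D_{ii}$ in the unconstrained case, with a simple projection when bounds are present). This is a generic observation; the exact subproblem solved by the SSQP algorithm is the one given in the paper.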

Interior Point Methods for Optimal Experimental Designs

In this paper, we propose a primal interior point (IP) method for solving the optimal experimental design problem with a large class of smooth convex optimality criteria, including the A-, D-, and $p$th mean criteria, and establish its global convergence. We also show that the Newton direction can be computed efficiently when the size of the moment … Read more
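
In the standard continuous-design formulation these criteria act on the information matrix $M(w) = \sum_i w_i x_i x_i^\top$, with design weights $w$ in the simplex:
\[ \text{D-criterion: } -\log\det M(w), \qquad \text{A-criterion: } \operatorname{tr}\big(M(w)^{-1}\big), \qquad \Phi_p \text{ (}p\text{th mean): } \Big( \tfrac1k \operatorname{tr}\big(M(w)^{-p}\big) \Big)^{1/p}, \]
where $k$ is the number of model parameters; A-optimality is the case $p = 1$ and D-optimality the limit $p \to 0$. This is the textbook setup, stated here only for orientation; the paper's precise class of criteria may be broader.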

Robust and Trend-following Student’s t Kalman Smoothers

Two nonlinear Kalman smoothers based on the Student’s t distribution are proposed. The first, which we call the T-Robust smoother, finds the maximum a posteriori (MAP) solution for Gaussian process noise and Student’s t observation noise. It is extremely robust against outliers, outperforming the recently proposed L1-Laplace smoother in extreme situations with data containing 20% or … Read more
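
For intuition about the robustness claim: with Gaussian process noise and Student’s t observation noise (degrees of freedom $\nu$, observation dimension $m$), the MAP objective contains, up to constants, terms of the form
\[ \tfrac12 \| x_{k+1} - g_k(x_k) \|^2_{Q_k^{-1}} \;+\; \tfrac{\nu + m}{2} \log\!\Big( 1 + \tfrac1\nu \| y_k - h_k(x_k) \|^2_{R_k^{-1}} \Big), \]
so measurement residuals enter only logarithmically, and isolated large outliers are downweighted far more than under a quadratic (Gaussian) penalty. The notation here is generic state-space notation, not necessarily the paper's.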

A Matrix-Free Approach For Solving The Gaussian Process Maximum Likelihood Problem

Gaussian processes are the cornerstone of statistical analysis in many application areas. Nevertheless, most of the applications are limited by their need to use the Cholesky factorization in the computation of the likelihood. In this work, we present a matrix-free approach for computing the solution of the maximum likelihood problem involving Gaussian processes. … Read more
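
To fix ideas, for a zero-mean Gaussian process observed at $n$ points with covariance matrix $K(\theta)$, the log-likelihood is
\[ \ell(\theta) = -\tfrac12\, y^\top K(\theta)^{-1} y \;-\; \tfrac12 \log\det K(\theta) \;-\; \tfrac{n}{2} \log 2\pi . \]
A Cholesky factorization $K = LL^\top$ delivers both the linear solve and $\log\det K = 2\sum_i \log L_{ii}$, but at $O(n^3)$ time and $O(n^2)$ memory; a matrix-free approach must obtain or approximate these two quantities without forming the factorization. The zero-mean form is used here only for brevity.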

Difference Filter Preconditioning for Large Covariance Matrices

In many statistical applications one must solve linear systems corresponding to large, dense, and possibly irregularly structured covariance matrices. These matrices are often ill-conditioned; for example, the condition number increases at least linearly with respect to the size of the matrix when observations of a random process are obtained from a fixed domain. This … Read more
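
As a quick numerical illustration of the fixed-domain ill-conditioning mentioned above, the snippet below (an illustrative example, not taken from the paper; the exponential kernel and the range parameter rho are assumptions) builds covariance matrices for increasingly dense observations on [0, 1] and prints their condition numbers, which grow with the matrix size.

    import numpy as np

    def exp_cov(n, rho=0.2):
        # Covariance matrix of a process with exponential covariance,
        # observed at n equispaced points on the fixed domain [0, 1].
        t = np.linspace(0.0, 1.0, n)
        return np.exp(-np.abs(t[:, None] - t[None, :]) / rho)

    for n in (50, 100, 200, 400):
        print(n, np.linalg.cond(exp_cov(n)))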
