Interior Point Methods for Optimal Experimental Designs

In this paper, we propose a primal interior point (IP) method for solving the optimal experimental design problem with a large class of smooth convex optimality criteria, including the A-, D-, and pth mean criteria, and establish its global convergence. We also show that the Newton direction can be computed efficiently when the size of the moment …
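
For reference, a minimal sketch of the standard setting these criteria live in, in standard notation (the paper's exact formulation may differ): given candidate regression vectors a_1, …, a_n in R^m, an approximate design is a weight vector w on the probability simplex, and each criterion is a smooth convex function of the moment (information) matrix M(w).

```latex
\[
  M(w) = \sum_{i=1}^{n} w_i\, a_i a_i^{\top}, \qquad
  w \ge 0, \quad \sum_{i=1}^{n} w_i = 1,
\]
\[
  \Phi_D(w) = -\log\det M(w), \qquad
  \Phi_A(w) = \operatorname{tr}\bigl(M(w)^{-1}\bigr), \qquad
  \Phi_p(w) = \Bigl(\tfrac{1}{m}\operatorname{tr} M(w)^{-p}\Bigr)^{1/p}.
\]
```

The A-criterion is the case p = 1 of the pth mean (Kiefer Φ_p) family, and the D-criterion arises as the limit p → 0 up to a monotone transformation.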

Robust and Trend-following Student’s t Kalman Smoothers

Two nonlinear Kalman smoothers are proposed using the Student’s t distribution. The first, which we call the T-Robust smoother, finds the maximum a posteriori (MAP) solution for Gaussian process noise and Student’s t observation noise. It is extremely robust against outliers, outperforming the recently proposed L1-Laplace smoother in extreme situations with data containing 20% or …
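
As a toy illustration of this MAP formulation (a sketch under assumed model parameters, not the authors' T-Robust smoother): for a 1-D random-walk state with Gaussian process noise of variance q and Student's t observation noise with nu degrees of freedom and scale sig2, the MAP estimate minimizes a quadratic process term plus a heavy-tailed log penalty on the measurement residuals.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
N, q, sig2, nu = 200, 0.05, 0.25, 4.0               # assumed model parameters
x_true = np.cumsum(rng.normal(0.0, np.sqrt(q), N))  # random-walk state
y = x_true + np.sqrt(sig2) * rng.standard_t(nu, N)  # heavy-tailed measurements
y[::25] += 10.0                                     # inject gross outliers

def neg_log_posterior(x):
    proc = 0.5 * np.sum(np.diff(x) ** 2) / q        # Gaussian process-noise term
    meas = 0.5 * (nu + 1) * np.sum(np.log1p((y - x) ** 2 / (nu * sig2)))  # Student's t term
    return proc + meas

x_map = minimize(neg_log_posterior, y, method="L-BFGS-B").x
```

Because the Student's t penalty grows only logarithmically in the residual, the injected outliers barely pull the smoothed trajectory, which is the robustness property the abstract refers to.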

A Matrix-Free Approach For Solving The Gaussian Process Maximum Likelihood Problem

Gaussian processes are the cornerstone of statistical analysis in many application areas. Nevertheless, most of the applications are limited by their need to use the Cholesky factorization in the computation of the likelihood. In this work, we present a matrix-free approach for computing the solution of the maximum likelihood problem involving Gaussian processes. …
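
A hedged sketch of the flavor of a matrix-free computation (assumed squared-exponential kernel with a small nugget; only the data-fit term is shown, since matrix-free treatments of the log-determinant typically require iterative or stochastic estimators omitted here): the Cholesky solve of K alpha = y is replaced by conjugate gradients driven by a LinearOperator, so only matrix-vector products with K are needed.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(2)
n = 500
t = np.sort(rng.uniform(0.0, 10.0, n))          # hypothetical 1-D inputs
y = np.sin(t) + 0.1 * rng.standard_normal(n)    # hypothetical observations
ell, tau2 = 1.0, 1e-2                           # assumed length-scale and nugget

def K_matvec(v):
    # Apply K v for K_ij = exp(-(t_i - t_j)^2 / (2 ell^2)) + tau2 * delta_ij.
    # K is built densely here only for brevity; a genuinely matrix-free matvec would
    # evaluate the kernel on the fly or exploit structure, and no factorization is formed.
    K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / ell**2)
    return K @ v + tau2 * v

K_op = LinearOperator((n, n), matvec=K_matvec)
alpha, info = cg(K_op, y)                       # solve K alpha = y by conjugate gradients
data_fit = 0.5 * y @ alpha                      # quadratic term of the negative log-likelihood
```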

Difference Filter Preconditioning for Large Covariance Matrices

In many statistical applications one must solve linear systems corresponding to large, dense, and possibly irregularly structured covariance matrices. These matrices are often ill-conditioned; for example, the condition number increases at least linearly with respect to the size of the matrix when observations of a random process are obtained from a fixed domain. This …
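
A small numerical illustration of why a difference filter helps, using a Brownian-motion covariance on a fixed domain as a stand-in example (not the paper's preconditioner): applying first differences to the observations turns a badly conditioned covariance into a nearly perfectly conditioned one.

```python
import numpy as np

n = 400
t = np.linspace(1.0 / n, 1.0, n)
K = np.minimum.outer(t, t)          # Brownian-motion covariance on the fixed domain (0, 1]

# First-order difference filter: rows of D are e_{i+1} - e_i, so (D x)_i = x_{i+1} - x_i.
D = np.diff(np.eye(n), axis=0)

print(np.linalg.cond(K))            # grows rapidly with n: the raw system is ill-conditioned
print(np.linalg.cond(D @ K @ D.T))  # ~1: Brownian increments are independent, so D K D^T is diagonal
```

The filtered matrix is exactly diagonal in this toy case; for smoother processes the filtered system is no longer diagonal but is far better conditioned, which is the effect a difference-filter preconditioner exploits inside an iterative solver.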

Accuracy guarantees for ℓ1-recovery

We discuss two new methods of recovery of sparse signals from noisy observations based on ℓ1-minimization. They are closely related to well-known techniques such as the Lasso and the Dantzig selector. However, these estimators come with efficiently verifiable guarantees of performance. By optimizing these bounds with respect to the method parameters we are able to …
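
To fix ideas, here is a generic proximal-gradient (ISTA) solve of the Lasso on synthetic data, as a minimal sketch of the kind of ℓ1-minimization estimator discussed (it does not implement the paper's verifiable performance bounds):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, k = 80, 200, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)    # sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # k-sparse signal
y = A @ x_true + 0.01 * rng.standard_normal(m)  # noisy observations

lam = 0.02                                      # l1 penalty weight (assumed)
L = np.linalg.norm(A, 2) ** 2                   # Lipschitz constant of the smooth part's gradient
x = np.zeros(n)
for _ in range(500):                            # ISTA: gradient step + soft-thresholding
    z = x - A.T @ (A @ x - y) / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
```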

Interior Point Methods for Computing Optimal Designs

In this paper we study interior point (IP) methods for solving optimal design problems. In particular, we propose a primal IP method for solving the problems with general convex optimality criteria and establish its global convergence. In addition, we reformulate the problems with the A-, D-, and E-criteria into linear or log-determinant semidefinite programs (SDPs) and …
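
A hedged sketch of the D-criterion case as a log-determinant convex program, solved here with the off-the-shelf CVXPY modeling layer rather than the specialized IP method proposed in the paper (the candidate design points X are synthetic):

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m = 30, 3                                    # candidate points, regression parameters
X = rng.standard_normal((n, m))                 # hypothetical candidate design points (rows)

w = cp.Variable(n, nonneg=True)                 # design weights on the probability simplex
M = sum(w[i] * np.outer(X[i], X[i]) for i in range(n))   # moment matrix M(w), affine in w
prob = cp.Problem(cp.Maximize(cp.log_det(M)), [cp.sum(w) == 1])
prob.solve()                                    # D-optimal design: maximize log det M(w)
print(np.round(w.value, 3))                     # optimal weights; most end up (numerically) zero
```

The A- and E-criteria admit analogous semidefinite formulations, e.g. via Schur-complement epigraph constraints for tr(M(w)^{-1}) and an eigenvalue bound M(w) ⪰ t I for the E-criterion.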

Accelerated Block-Coordinate Relaxation for Regularized Optimization

We discuss minimization of a smooth function to which is added a separable regularization function that induces structure in the solution. A block-coordinate relaxation approach with proximal linearized subproblems yields convergence to critical points, while identification of the optimal manifold (under a nondegeneracy condition) allows acceleration techniques to be applied on a reduced space. The …
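
A minimal sketch of block-coordinate proximal-linearized steps on a toy ℓ1-regularized least-squares instance (the quadratic loss, the ℓ1 regularizer, and the cyclic block rule are assumptions for illustration; the manifold identification and acceleration discussed in the paper are not shown):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n, B = 60, 120, 6                            # problem size and number of blocks
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, 10, replace=False)] = rng.standard_normal(10)
y = A @ x_true + 0.01 * rng.standard_normal(m)

lam, x = 0.05, np.zeros(n)
blocks = np.array_split(np.arange(n), B)

for it in range(300):
    b = blocks[it % B]                          # cyclic block selection
    Ab = A[:, b]
    L = np.linalg.norm(Ab, 2) ** 2              # Lipschitz constant of the block gradient
    g = Ab.T @ (A @ x - y)                      # partial gradient of the smooth term
    z = x[b] - g / L                            # proximal linearized subproblem for the l1 term
    x[b] = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # block soft-threshold
```

The separable regularizer is what makes each block subproblem a cheap closed-form soft-threshold; roughly speaking, the zero pattern that emerges after enough sweeps is the optimal manifold on which reduced-space acceleration can then operate.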