Algorithms for Block Tridiagonal Systems: Foundations and New Results for Generalized Kalman Smoothing

Block tridiagonal systems appear in classic Kalman smoothing problems, as well as in generalized Kalman smoothing, where problems may have nonsmooth terms, singular covariance, constraints, nonlinear models, and unknown parameters. In this paper, we first interpret all the classic smoothing algorithms as different approaches to solving positive definite block tridiagonal linear systems. Then, we obtain new …
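For illustration, a symmetric positive definite block tridiagonal system of the kind referred to above can be solved by block forward elimination followed by back substitution (a block Thomas / Cholesky-type sweep). The sketch below is a minimal NumPy version under an assumed block layout and naming (A for diagonal blocks, B for sub-diagonal blocks); it is not the authors' implementation.

```python
import numpy as np

def solve_block_tridiag(A, B, r):
    """Solve an SPD block tridiagonal system by block forward
    elimination and back substitution (block Thomas algorithm).

    A : list of N diagonal blocks (n x n)
    B : list of N-1 sub-diagonal blocks, so block row k reads
        B[k-1] x[k-1] + A[k] x[k] + B[k].T x[k+1] = r[k]
    r : list of N right-hand side vectors (length n)
    """
    N = len(A)
    S = [None] * N          # eliminated pivot blocks (Schur complements)
    y = [None] * N          # transformed right-hand side
    S[0], y[0] = A[0], r[0]
    for k in range(1, N):
        # elimination multiplier C = B[k-1] S[k-1]^{-1}
        C = np.linalg.solve(S[k - 1], B[k - 1].T).T
        S[k] = A[k] - C @ B[k - 1].T
        y[k] = r[k] - C @ y[k - 1]
    # back substitution
    x = [None] * N
    x[N - 1] = np.linalg.solve(S[N - 1], y[N - 1])
    for k in range(N - 2, -1, -1):
        x[k] = np.linalg.solve(S[k], y[k] - B[k].T @ x[k + 1])
    return x
```

The forward sweep followed by the backward sweep mirrors the two-pass structure of classic fixed-interval smoothers, which is the connection the abstract points to.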

LQR Design under Stability Constraints

The solution of the classic discrete-time, finite-horizon linear quadratic regulator (LQR) problem is well known in the literature. By casting the solution as a static state feedback, we propose a new method that trades off a low LQR objective value against closed-loop stability. Citation: To appear in the special issue on the 21st IFAC World Congress 2020, IFAC …
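For context, the classic finite-horizon LQR solution mentioned in the abstract is the backward Riccati recursion, which yields time-varying feedback gains; the static-feedback formulation the paper proposes departs from this. A minimal NumPy sketch of the classic recursion (A, B, Q, R, Qf, and the horizon T are generic placeholders, not the paper's notation):

```python
import numpy as np

def finite_horizon_lqr(A, B, Q, R, Qf, T):
    """Backward Riccati recursion for the discrete-time LQR cost
        sum_k x_k' Q x_k + u_k' R u_k  +  x_T' Qf x_T
    with dynamics x_{k+1} = A x_k + B u_k.
    Returns the time-varying gains K_k with u_k = -K_k x_k.
    """
    P = Qf
    gains = []
    for _ in range(T):
        # gain at the current stage, then propagate the cost-to-go
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ A - A.T @ P @ B @ K
        gains.append(K)
    return gains[::-1]   # ordered K_0, ..., K_{T-1}
```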

Fast Robust Methods for Singular State-Space Models

State-space models are used in a wide range of time series analysis applications. Kalman filtering and smoothing are workhorse algorithms in these settings. While classic algorithms assume Gaussian errors to simplify estimation, recent advances use a broad range of optimization formulations to allow outlier-robust estimation, as well as constraints to capture prior information. Here we …
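For context, a minimal NumPy sketch of the standard (Gaussian) Kalman filter that these methods build on is given below; the model matrices and noise covariances are generic placeholders, and the singular-covariance and robust extensions discussed in the paper are not shown.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """Standard Kalman filter for the linear Gaussian model
        x_{k+1} = A x_k + w_k,  w_k ~ N(0, Q)
        y_k     = C x_k + v_k,  v_k ~ N(0, R)
    Returns the filtered state means; a smoothed estimate would
    require an additional backward (RTS) pass.
    """
    x, P = x0, P0
    filtered = []
    for yk in y:
        # measurement update
        S = C @ P @ C.T + R
        K = np.linalg.solve(S, C @ P).T      # Kalman gain P C' S^{-1}
        x = x + K @ (yk - C @ x)
        P = P - K @ C @ P
        filtered.append(x)
        # time update
        x = A @ x
        P = A @ P @ A.T + Q
    return filtered
```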

Sparse/Robust Estimation and Kalman Smoothing with Nonsmooth Log-Concave Densities: Modeling, Computation, and Theory

Piecewise linear quadratic (PLQ) penalties play a crucial role in many applications, including machine learning, robust statistical inference, sparsity promotion, and inverse problems such as Kalman smoothing. Well-known examples of PLQ penalties include the l2, Huber, l1, and Vapnik losses. This paper builds on a dual representation for PLQ penalties known from convex analysis. …
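As a concrete instance, the Huber loss mentioned above is a PLQ penalty, and it can be written either in its familiar piecewise form or through a dual (conjugate) representation of the kind the paper builds on. The NumPy sketch below checks that the two forms agree; the threshold kappa and the function names are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def huber_primal(r, kappa=1.0):
    """Huber penalty: quadratic for |r| <= kappa, linear in the tails."""
    a = np.abs(r)
    return np.where(a <= kappa, 0.5 * r**2, kappa * a - 0.5 * kappa**2)

def huber_dual(r, kappa=1.0):
    """Same penalty written in a PLQ dual form:
       rho(r) = sup_{|u| <= kappa}  u*r - u**2 / 2,
    i.e. a quadratic conjugate restricted to a box."""
    u = np.clip(r, -kappa, kappa)      # maximizer of the concave objective
    return u * r - 0.5 * u**2

r = np.linspace(-3.0, 3.0, 7)
assert np.allclose(huber_primal(r), huber_dual(r))
```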

Robust and Trend-following Student’s t Kalman Smoothers

Two nonlinear Kalman smoothers are proposed using the Student’s t distribution. The first, which we call the T-Robust smoother, finds the maximum a posteriori (MAP) solution for Gaussian process noise and Student’s t observation noise. It is extremely robust against outliers, outperforming the recently proposed L1-Laplace smoother in extreme situations with data containing 20% or …
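The robustness comes from the heavy tails of the Student’s t negative log-likelihood, which grows only logarithmically in the residual rather than quadratically. A small illustrative comparison in NumPy (the degrees of freedom nu is an assumed value, not taken from the paper):

```python
import numpy as np

def gaussian_penalty(r):
    """Negative log-likelihood (up to constants) of a standard Gaussian
    residual: grows quadratically, so one outlier can dominate the fit."""
    return 0.5 * r**2

def student_t_penalty(r, nu=4.0):
    """Negative log-likelihood (up to constants) of a Student's t residual:
    grows only logarithmically in |r|, which is what makes a MAP smoother
    with this observation loss insensitive to outliers."""
    return 0.5 * (nu + 1.0) * np.log1p(r**2 / nu)

# a gross outlier is penalized far less under the Student's t model
print(gaussian_penalty(10.0), student_t_penalty(10.0))
```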