Fast Robust Methods for Singular State-Space Models

State-space models are used in a wide range of time series analysis applications, and Kalman filtering and smoothing are the workhorse algorithms in these settings. While classic algorithms assume Gaussian errors to simplify estimation, recent advances use a broad range of optimization formulations to allow outlier-robust estimation, as well as constraints to capture prior information. Here we develop methods for state-space models in which either the transition or the error covariances may be singular. These models frequently arise in navigation (e.g., for 'colored noise' models or deterministic integrals) and are ubiquitous in autocorrelated time series models such as ARMA. We reformulate all state-space models (singular as well as nonsingular) as constrained convex optimization problems, and develop an efficient algorithm for this reformulation. The convergence rate is locally linear, with constants that do not depend on the conditioning of the problem. Numerical comparisons show that the new approach outperforms competing approaches for nonsingular models, including state-of-the-art interior point (IP) methods. IP methods converge at superlinear rates, so one might expect them to dominate; however, the fast linear rate of the proposed approach (independent of problem conditioning), combined with cheap iterations, wins against IP in run-time comparisons. This suggests that the proposed approach can serve as a default choice for estimating both singular and nonsingular state-space models outside of the Gaussian context. To highlight the capabilities of the new framework, we focus on navigation applications that use singular process covariance models. The methods are implemented in open-source Python code.
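To illustrate the kind of constrained reformulation described above, here is a sketch in standard Kalman smoothing notation (z_k measurements, H_k and A_k measurement and transition matrices, R_k and Q_k the corresponding covariances); it shows one common way to remove the inverse process covariance, not necessarily the exact objective used in the paper. The classic Gaussian smoother solves

\[
\min_{x_{1:N}} \; \sum_{k=1}^{N} \tfrac{1}{2}\,\|z_k - H_k x_k\|_{R_k^{-1}}^{2} \;+\; \tfrac{1}{2}\,\|x_k - A_k x_{k-1}\|_{Q_k^{-1}}^{2},
\]

which is undefined when Q_k is singular. Introducing process-noise variables u_k with Q_k = B_k B_k^\top and enforcing the dynamics as equality constraints yields a convex problem that never requires Q_k^{-1} and accommodates robust (e.g., Huber) measurement losses \rho_k:

\[
\min_{x_{1:N},\,u_{1:N}} \; \sum_{k=1}^{N} \rho_k\!\left(R_k^{-1/2}(z_k - H_k x_k)\right) \;+\; \tfrac{1}{2}\,\|u_k\|^{2}
\quad \text{subject to} \quad x_k = A_k x_{k-1} + B_k u_k .
\]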
