This paper is concerned with information-theoretic "metrics" for comparing two dynamical systems. Following the recent work of Tryphon Georgiou, we outline a prediction (filtering) based approach to this comparison. Central to the considerations of this paper is the notion of uncertainty. In particular, we compare systems in terms of the additional uncertainty that arises in the prediction problem when an incorrect model is used. Whereas earlier work used the variance of the prediction error, we quantify the additional uncertainty in terms of the Kullback-Leibler rate. This pseudometric is closely related to the classical Bode formula in control theory, and we provide a detailed comparison with the variance-based metric. We present three applications that illustrate the utility of the Kullback-Leibler rate for a range of model reduction and model selection problems. First, we show that model reduction with this metric leads to the so-called optimal prediction model. Second, for the particular case of linear systems, we describe an algorithm to obtain optimal prediction autoregressive (AR) models. Third, we use the metric to obtain a formula for the stochastic linearization of a nonlinear dynamical system.
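To make the comparison concrete, the sketch below evaluates a Kullback-Leibler divergence rate between two stationary Gaussian AR(1) processes via their power spectral densities. This spectral-domain expression is a standard result for stationary Gaussian processes and is offered only as an illustration of the kind of quantity the paper works with; the function names (`ar1_spectrum`, `kl_rate`) and the AR(1) parameterization are this sketch's own assumptions, not the paper's notation.

```python
import numpy as np

def ar1_spectrum(a, q, omega):
    """Power spectral density of the AR(1) process x_{t+1} = a*x_t + w_t,
    with i.i.d. noise w_t ~ N(0, q) and |a| < 1 (stationarity)."""
    return q / np.abs(1.0 - a * np.exp(-1j * omega)) ** 2

def kl_rate(a0, q0, a1, q1, n=200_000):
    """KL divergence rate from the process with parameters (a0, q0) to the
    one with (a1, q1), using the spectral formula for zero-mean stationary
    Gaussian processes:
        D = (1 / 4*pi) * integral_{-pi}^{pi} [S0/S1 - 1 - log(S0/S1)] d(omega).
    The integral is approximated by a simple Riemann sum."""
    omega = np.linspace(-np.pi, np.pi, n)
    s0 = ar1_spectrum(a0, q0, omega)
    s1 = ar1_spectrum(a1, q1, omega)
    integrand = s0 / s1 - 1.0 - np.log(s0 / s1)
    # Mean over [-pi, pi] times 2*pi, divided by 4*pi, reduces to mean/2.
    return integrand.mean() / 2.0

# The rate vanishes when the two models coincide and is positive otherwise,
# reflecting the extra prediction uncertainty from an incorrect model choice.
print(kl_rate(0.9, 1.0, 0.9, 1.0))  # essentially zero
print(kl_rate(0.9, 1.0, 0.5, 1.0))  # strictly positive
```

Note that the quantity is asymmetric in its two arguments, which is why the paper refers to it as a pseudometric rather than a metric.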