We characterize the stability and achievable performance of networked estimation under correlated packet losses described by the Gilbert-Elliott model. For scalar continuous-time linear systems, we derive closed-form expressions for the mean-square distortion of the optimal estimator. The conditions for a stable mean-square estimation error are equivalent to those obtained previously for the stability of 'peak distortions'. We study how the estimator performance depends on the loss probability and the loss-burst length, and show that the mean-square distortion remains bounded if the average burst length does not exceed a computed bound. The main new finding is that, once the mean length of loss bursts is fixed, the average packet loss rate influences the estimator's performance but not its stability.
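As a rough numerical illustration of the setting (a hedged sketch, not the paper's derivation or parameter values), the following simulates a two-state Gilbert-Elliott loss channel acting on measurements of a scalar unstable system, with a naive estimator that resets on a received sample and propagates the model during a loss burst. All parameters (`a`, `dt`, `p_gb`, `p_bg`) are illustrative assumptions.

```python
import random

def simulate(a=0.5, dt=0.01, T=50.0, p_gb=0.02, p_bg=0.2, seed=0):
    """Return the empirical mean-square estimation error.

    Gilbert-Elliott channel: p_gb is the per-step probability of
    Good -> Bad (a loss burst starts); p_bg is Bad -> Good, so the
    average burst length is 1 / p_bg channel steps.
    Illustrative values only -- not taken from the paper.
    """
    rng = random.Random(seed)
    x, xhat = 1.0, 0.0
    bad = False                      # Bad state: packets are lost
    sq_err_sum, n = 0.0, 0
    for _ in range(int(T / dt)):
        # Plant: Euler-Maruyama step of dx = a*x dt + dw
        w = rng.gauss(0.0, 1.0)
        x = x + dt * a * x + (dt ** 0.5) * w
        # Channel state transition
        if bad:
            if rng.random() < p_bg:
                bad = False
        else:
            if rng.random() < p_gb:
                bad = True
        # Estimator: receives the state when the channel is Good,
        # otherwise propagates the model open-loop
        if not bad:
            xhat = x
        else:
            xhat = xhat + dt * a * xhat
        e = x - xhat
        sq_err_sum += e * e
        n += 1
    return sq_err_sum / n

mse = simulate()
print(mse)
```

Sweeping `p_bg` (and hence the average burst length) while rescaling `p_gb` to hold the long-run loss rate fixed is one way to probe, empirically, the paper's claim that burst length rather than loss rate governs stability.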