The paper studies distributed static parameter (vector) estimation in sensor networks with nonlinear observation models and noisy intersensor communication. It introduces separably estimable observation models, which generalize the observability condition of linear centralized estimation to nonlinear distributed estimation, and studies three distributed estimation algorithms for such models: NU, its linear counterpart LU, and NLU. Each algorithm's update rule combines a consensus step, in which every sensor averages its state with those of its neighbors, and an innovation step, in which every sensor processes its current local observation; the two steps are weighted by appropriately chosen decaying weight sequences. The three algorithms are therefore of the consensus + innovations type, quite different from standard consensus algorithms. The paper proves consistency (all sensors reach consensus almost surely and converge to the true parameter value), efficiency, and asymptotic unbiasedness; for LU and NU, it also proves asymptotic normality and provides convergence rate guarantees. LU and NU are analyzed in the framework of stochastic approximation theory, whereas NLU exhibits mixed time-scale behavior and biased perturbations, so its analysis requires a different approach, which is developed in this paper.
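The consensus + innovations structure described above can be illustrated with a minimal numerical sketch. The following is not the paper's exact algorithm or weight choices; it is a hypothetical LU-style linear example on a ring network, where each sensor observes only a noisy scalar projection of a 2-dimensional parameter and the decaying weights (beta for consensus, alpha for innovations, with alpha decaying faster) are chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): N sensors on a ring graph,
# each with a scalar linear observation z_n = H_n * theta + noise. No single
# sensor can estimate theta alone; the network is collectively observable.
N, d = 10, 2
theta = np.array([1.0, -2.0])                       # true (unknown) parameter
H = [rng.standard_normal((1, d)) for n in range(N)]  # local observation matrices
neighbors = {n: [(n - 1) % N, (n + 1) % N] for n in range(N)}  # ring graph

x = np.zeros((N, d))  # each sensor's running estimate
for i in range(1, 20001):
    beta = 0.2 / i**0.6   # consensus weight: decays, but sum diverges
    alpha = 1.0 / i       # innovation weight: decays faster than beta
    # fresh noisy local observations at iteration i
    z = np.array([H[n] @ theta + 0.1 * rng.standard_normal(1) for n in range(N)])
    x_new = x.copy()
    for n in range(N):
        # consensus step: disagreement with neighbors' states
        consensus = sum(x[n] - x[l] for l in neighbors[n])
        # innovation step: residual of the current local observation
        innovation = H[n].T @ (z[n] - H[n] @ x[n])
        x_new[n] = x[n] - beta * consensus + alpha * innovation
    x = x_new

# After many iterations, the sensors' estimates agree and approach theta.
```

The key design choice mirrored here is the two time scales: the consensus weight decays slowly enough to force agreement, while the faster-decaying innovation weight suppresses observation noise as the estimates converge.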
Date of Publication: June 2012