While peer-to-peer consensus algorithms have enviable robustness and locality for distributed estimation and computation problems, they scale poorly with network diameter. We show how deterministic multi-scale consensus algorithms overcome this limitation and provide optimal scaling with network size, but at the cost of requiring global knowledge of the network topology. To obtain the benefits of both single- and multi-scale consensus methods, we introduce a class of stochastic message-passing schemes that require no topology information and yet transmit information on several scales, achieving scalability. The algorithm is described by a sequence of random Markov chains, allowing us to prove convergence for general topologies.
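To make the diameter bottleneck concrete, here is a minimal sketch of the standard single-scale baseline the abstract contrasts against: linear average consensus on a path graph, where each node repeatedly averages with its immediate neighbors. This is an illustration only, not the paper's multi-scale or stochastic scheme; the path topology, node count, and Metropolis-style weight 1/3 are assumptions chosen for the example.

```python
import numpy as np

def consensus_step(x):
    """One round of single-scale average consensus on a path graph.

    Each node moves toward its neighbors using symmetric Metropolis
    weights (1/3 per edge on a path), which preserves the global sum.
    """
    n = len(x)
    new = x.copy()
    for i in range(n):
        neighbors = [j for j in (i - 1, i + 1) if 0 <= j < n]
        new[i] = x[i] + sum((x[j] - x[i]) / 3.0 for j in neighbors)
    return new

rng = np.random.default_rng(0)
x = rng.normal(size=20)          # initial node values
target = x.mean()                # the consensus value
for _ in range(500):
    x = consensus_step(x)
residual = np.abs(x - target).max()
```

On a path of n nodes the diameter is n - 1, and the per-round contraction factor approaches 1 as n grows, so single-scale schemes like this need on the order of n^2 rounds to converge, which is the poor diameter scaling the multi-scale approach is designed to avoid.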