Almost sure convergence to consensus in Markovian random graphs

Authors:

I. Matei, N. Martins, and J. S. Baras, Institute for Systems Research and Department of Electrical and Computer Engineering, University of Maryland, MD, USA

Abstract:

In this paper we discuss the consensus problem for a network of dynamic agents with undirected information flow and random switching topologies. The switching is determined by a Markov chain, with each topology corresponding to a state of the Markov chain. We show that, in order to achieve consensus almost surely and from any initial state, the sets of graphs corresponding to the closed positive recurrent sets of states of the Markov chain must be jointly connected. The analysis relies on tools from matrix theory, Markovian jump linear systems theory, and the theory of random processes. The distinctive feature of this work is that it addresses the consensus problem with "Markovian switching" topologies.
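The setting described in the abstract can be illustrated with a small simulation: each agent repeatedly averages its value with its neighbors in the current graph, x(k+1) = W_{theta(k)} x(k), where theta(k) is a Markov chain whose states index the topologies. The sketch below is not the authors' code; the two example graphs, the Metropolis-style weights, and the transition matrix are illustrative assumptions, chosen so that the single recurrent class of the chain corresponds to a jointly connected set of graphs.

    import numpy as np

    # Illustrative sketch only: consensus iterations x(k+1) = W[theta(k)] x(k),
    # with theta(k) a Markov chain over undirected graph topologies.
    rng = np.random.default_rng(0)
    n = 4  # number of agents

    # Two example undirected topologies (edge lists); their union is connected.
    topologies = [
        [(0, 1), (1, 2)],   # graph for Markov state 0
        [(2, 3), (0, 3)],   # graph for Markov state 1
    ]

    def metropolis_weights(edges, n):
        """Symmetric, stochastic averaging matrix for one topology."""
        deg = np.zeros(n)
        for i, j in edges:
            deg[i] += 1
            deg[j] += 1
        W = np.zeros((n, n))
        for i, j in edges:
            w = 1.0 / (1.0 + max(deg[i], deg[j]))
            W[i, j] = W[j, i] = w
        np.fill_diagonal(W, 1.0 - W.sum(axis=1))
        return W

    W = [metropolis_weights(e, n) for e in topologies]

    # Assumed transition matrix of the Markov chain over the two topologies.
    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])

    x = rng.standard_normal(n)   # arbitrary initial state
    state = 0
    for k in range(200):
        x = W[state] @ x                         # average over current graph
        state = rng.choice(len(P), p=P[state])   # sample next topology

    print("spread after 200 steps:", x.max() - x.min())  # near 0 => consensus

Because the union of the two topologies is connected and both states belong to one closed positive recurrent class, the spread of the agents' values shrinks toward zero, consistent with the almost-sure consensus condition stated in the abstract.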

Published in:

2008 47th IEEE Conference on Decision and Control (CDC 2008)

Date of Conference:

9-11 Dec. 2008