Networks of queues are basic models for the analysis and design of computer networks and represent, in their own right, an important research field that originated with the seminal work of J. R. Jackson (see Operations Research, vol. 5, no. 4, pp. 518-521, 1957; vol. 50, no. 1, pp. 112-113, 2002). The various networks of queues proposed after Jackson are generalizations or variations of a class of fundamental models referred to as Jackson networks of queues. For this reason, Jackson's classical result, known as Jackson's theorem, is considered the cornerstone of the mathematical theory of networks of queues. However, Jackson's theorem does not hold. After revisiting the theorem, we disprove it with simple counterexamples. We show that a limitation of the existing theory of stochastic modeling may explain why Jackson's original proof, and all subsequent proofs of the theorem, are flawed. We conclude by pointing out the implications of our result for networking studies.
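For readers unfamiliar with the result under dispute, the following sketch illustrates what Jackson's theorem asserts for an open network: the stationary distribution factors into a product of per-node geometric terms, with each node's utilization computed from the traffic equations. The two-node network, its rates, and its routing probabilities below are illustrative assumptions, not taken from the paper.

```python
def traffic_rates(gamma, P, iters=1000):
    """Solve the traffic equations lam_j = gamma_j + sum_i lam_i * P[i][j]
    by fixed-point iteration (converges for an open network)."""
    n = len(gamma)
    lam = list(gamma)
    for _ in range(iters):
        lam = [gamma[j] + sum(lam[i] * P[i][j] for i in range(n))
               for j in range(n)]
    return lam

def product_form(state, lam, mu):
    """Jackson's asserted product form:
    pi(n_1, ..., n_K) = prod_i (1 - rho_i) * rho_i**n_i,
    where rho_i = lam_i / mu_i is node i's utilization."""
    p = 1.0
    for n_i, l, m in zip(state, lam, mu):
        rho = l / m
        p *= (1 - rho) * rho ** n_i
    return p

# Illustrative two-node network: external arrivals only at node 1;
# a job finishing at node 1 moves to node 2 with probability 0.5,
# otherwise leaves the network.
gamma = [1.0, 0.0]          # external arrival rates
P = [[0.0, 0.5],            # routing matrix P[i][j]
     [0.0, 0.0]]
mu = [2.0, 1.0]             # service rates

lam = traffic_rates(gamma, P)          # effective arrival rates [1.0, 0.5]
print(product_form((0, 0), lam, mu))   # (1 - 0.5) * (1 - 0.5) = 0.25
```

Here both nodes have utilization 0.5, so the product form assigns probability 0.25 to the empty state. The paper's counterexamples target precisely this factorization claim.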