In this paper, we study the queue-overflow probability of wireless scheduling algorithms. In wireless networks operated under queue-length-based scheduling algorithms, there often exists a tight coupling between the service-rate process, the system backlog process, the arrival process, and the stochastic process governing channel variations. Although one can use sample-path large-deviation techniques to form an estimate of the queue-overflow probability, the formulation leads to a difficult multidimensional calculus-of-variations problem. In this paper, we present a new technique to address this complexity issue. Using ideas from the Lyapunov function approach in control theory, this technique maps the complex multidimensional calculus-of-variations problem to a 1-D calculus-of-variations problem, and the latter is often much easier to solve. Further, under appropriate conditions, we show that when a scheduling algorithm minimizes the drift of a Lyapunov function at each point of every fluid sample path, the algorithm will be optimal in the sense that it maximizes the asymptotic decay rate of the probability that the Lyapunov function value exceeds a given threshold. We believe that these results can potentially be used to study the queue-overflow probability of a large class of wireless scheduling algorithms and to design new scheduling algorithms with optimal overflow probabilities.
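To make the drift-minimization idea concrete, the sketch below simulates a toy downlink with a max-weight scheduler, which is the classic example of an algorithm that greedily minimizes the drift of the quadratic Lyapunov function V(q) = ½ Σ qᵢ². The abstract does not name a specific algorithm or Lyapunov function; the three-link setup, arrival probabilities, and overflow threshold here are all illustrative assumptions, not parameters from the paper.

```python
import random

def max_weight_schedule(queues, rates):
    """Pick the link maximizing queue_length * current_service_rate.

    For the quadratic Lyapunov function V(q) = 0.5 * sum(q_i ** 2),
    this choice minimizes the one-step Lyapunov drift given the
    current channel state -- the property the abstract's optimality
    result is built around.
    """
    return max(range(len(queues)), key=lambda i: queues[i] * rates[i])

def simulate(steps=10_000, seed=0):
    """Estimate the fraction of time any queue exceeds a threshold.

    Toy model: 3 links, Bernoulli arrivals, i.i.d. random channel
    rates standing in for the channel-variation process.
    """
    rng = random.Random(seed)
    queues = [0, 0, 0]
    arrival_prob = [0.3, 0.3, 0.3]   # per-slot arrival probability per link
    overflow_threshold = 50          # illustrative overflow level
    overflow_slots = 0
    for _ in range(steps):
        # Channel variations: each link's rate changes every slot.
        rates = [rng.choice([0, 1, 2]) for _ in queues]
        i = max_weight_schedule(queues, rates)
        queues[i] = max(queues[i] - rates[i], 0)  # serve the chosen link
        for j, p in enumerate(arrival_prob):
            if rng.random() < p:
                queues[j] += 1                    # new arrival
        if max(queues) > overflow_threshold:
            overflow_slots += 1
    return overflow_slots / steps

print(simulate())
```

In this simulation the empirical overflow frequency plays the role of the queue-overflow probability whose asymptotic decay rate the paper analyzes; the paper's contribution is to characterize that decay rate analytically, via a 1-D calculus-of-variations problem, rather than by Monte Carlo estimation.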