Server-side resource configuration and allocation with QoS guarantees is a challenge for performance-critical Internet applications. To overcome the difficulties caused by the high variability and burstiness of Internet traffic, this paper presents a decay function model of request scheduling algorithms for the resource configuration and allocation problem. Under the decay function model, request scheduling is modelled as a transfer-function based filter system whose input is the request process and whose output is the server load process. Unlike conventional queueing network models, which rely on mean-value analysis of renewal or Markovian input processes, the decay function model works for general time-series based or measurement-based processes and hence facilitates the study of statistical correlations between request traffic, server load, and the QoS of requests. Based on the model, we apply filter design theory from signal processing to the optimality analysis of various scheduling algorithms. We formally characterize the relationship between server capacity, scheduling policy, service deadline, and other request properties, and present the optimality condition, with respect to the second moments of request properties, for an important class of fixed-time scheduling policies. Simulation results verify the relationship and show that optimal fixed-time scheduling can effectively reduce server workload variance and guarantee service deadlines with high robustness on the Internet.
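The filter view of scheduling described above can be sketched numerically. The following is a minimal, hypothetical illustration, not the paper's actual model: it assumes a discrete-time slotted system, a uniform decay function that spreads each request's work evenly over a fixed-time service window `D` (one simple instance of a fixed-time policy), and synthetic bursty traffic; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000  # number of time slots

# Synthetic bursty request traffic: Poisson arrivals per slot,
# each slot's total work scaled by a heavy-tailed (Pareto) factor.
arrivals = rng.poisson(2, size=T).astype(float)
work = arrivals * rng.pareto(2.5, size=T)

# Immediate service: the server load equals the incoming work itself,
# i.e. the decay function is a unit impulse.
immediate_load = work

# Fixed-time scheduling with a uniform decay function: each request's
# work is served at a constant rate over D slots (its service deadline).
D = 20
decay = np.full(D, 1.0 / D)  # integrates to 1: all work is completed

# Server load is the convolution of the request process with the
# decay function -- the "filter" acting on the input traffic.
scheduled_load = np.convolve(work, decay)[:T]

print(f"variance, immediate service:  {immediate_load.var():.2f}")
print(f"variance, fixed-time (D={D}): {scheduled_load.var():.2f}")
```

Because the uniform decay function acts as a low-pass (moving-average) filter, the scheduled load is a smoothed version of the bursty input, which is the intuition behind the workload-variance reduction the abstract reports for optimal fixed-time scheduling.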