In this paper, the resource access control process is modelled as a Markov decision process. The stationary optimal resource access control policy is determined by solving the associated linear programming problem. The proposed policy allows the system to make admission decisions that maximize the system reward according to the current traffic conditions and QoS metrics. The salient contribution of this research is to determine the range of traffic conditions over which each optimal resource access control policy remains optimal; a sensitivity analysis technique makes it possible to find this range for each optimal policy under specific traffic parameters. The proposed model and sensitivity analysis technique can easily be extended to a richer service model with multiple traffic types. The performance, in terms of weighted system utilization, is analyzed via simulation, and the results are compared with those of the complete sharing scheme. The ranges for the optimal policies are illustrated to show that a table-lookup scheme is a practical approach to real-time resource access control.
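The abstract's solution pipeline (model admission control as an MDP, solve the associated linear program, read off a stationary policy usable as a lookup table) can be sketched as follows. This is a minimal illustration, not the paper's model: the three-state occupancy chain, the transition probabilities, the rewards, and the discount factor are all invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Toy admission-control MDP (illustrative numbers, not from the paper).
# States: number of busy channels (0, 1, 2). Actions: 0 = reject, 1 = admit.
gamma = 0.95
S, A = 3, 2

# P[a, s, s'] : transition probabilities under each action.
P = np.zeros((A, S, S))
P[0] = [[1.00, 0.00, 0.00],   # reject: occupancy can only decay
        [0.30, 0.70, 0.00],
        [0.00, 0.30, 0.70]]
P[1] = [[0.40, 0.60, 0.00],   # admit: a new call may be accepted
        [0.12, 0.58, 0.30],
        [0.00, 0.30, 0.70]]   # system full: admit behaves like reject

# R[s, a] : expected immediate reward (revenue for a successful admission).
R = np.array([[0.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Primal LP for the discounted optimal value function v*:
#   minimize sum_s v(s)  subject to  v(s) >= R[s,a] + gamma * P[a,s] . v
# i.e. (gamma * P[a] - I) v <= -R[:, a] for every action a.
I = np.eye(S)
A_ub = np.vstack([gamma * P[a] - I for a in range(A)])
b_ub = np.concatenate([-R[:, a] for a in range(A)])
res = linprog(c=np.ones(S), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * S)
v = res.x

# Stationary optimal policy: greedy with respect to v* -- this array is
# exactly the table a real-time table-lookup controller would store.
Q = R + gamma * np.einsum('ast,t->sa', P, v)
policy = Q.argmax(axis=1)
print("v* =", np.round(v, 3), " policy =", policy)
```

Sensitivity analysis in this setting asks how far the entries of `P` and `R` (the traffic parameters) can move before `policy` changes, which yields the validity range for each optimal policy.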