In many application areas, including control systems, careful management of system resources is key to providing the best application performance. Most traditional resource management techniques for real-time systems with multiple control loops are based on open-loop strategies that statically allocate a constant CPU share to each controller, independent of its current resource needs. This yields acceptable average control performance with minimal overhead, but in general fails to achieve the best performance possible within the available resources. We show that by using feedback to dynamically allocate resources to controllers as a function of the current state of their controlled systems, control performance can be significantly improved. We present an optimal resource allocation policy that maximizes control performance within the available resources, and provide experimental results showing that the optimal policy 1) significantly increases control performance compared to traditional control system implementations (by more than 20% in our experiments), 2) outperforms other feedback-based policies, 3) saves resources when perturbations occur infrequently, and 4) incurs negligible overhead.
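To make the contrast concrete, the following sketch (our own illustration, not the paper's policy) compares a static open-loop split of a CPU budget against a simple state-dependent scheme that shifts shares toward the control loops whose plants currently show the largest state error. The function names, the error-proportional rule, and the `floor` parameter are all assumptions for illustration.

```python
# Illustrative sketch, not the paper's optimal policy: divide a fixed CPU
# budget across control loops either statically (equal shares) or as a
# function of each plant's current state error.

def static_allocation(n_loops: int, budget: float) -> list[float]:
    """Open-loop policy: a constant, equal CPU share per controller."""
    return [budget / n_loops] * n_loops

def feedback_allocation(errors: list[float], budget: float,
                        floor: float = 0.05) -> list[float]:
    """State-dependent policy: shares track the current state errors.

    `floor` reserves a minimal fraction of the budget for every loop so
    that no controller starves; the remaining budget is split in
    proportion to each loop's absolute state error.
    """
    n = len(errors)
    reserved = floor * budget * n      # guaranteed minimum across all loops
    free = budget - reserved           # portion allocated by feedback
    total = sum(abs(e) for e in errors)
    if total == 0.0:                   # all plants at rest: equal split
        return static_allocation(n, budget)
    return [floor * budget + free * abs(e) / total for e in errors]
```

With three loops and a unit budget, a large disturbance on the first plant (errors `[2.0, 0.0, 0.0]`) concentrates most of the budget on loop 0 while the idle loops keep only their reserved floor; when no plant is perturbed, the policy falls back to the static split, which is one way such a scheme can save resources under infrequent perturbations.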