The main contribution of this paper is a novel hierarchical scheme for adaptive dynamic power management (DPM) under nonstationary service requests. We model the nonstationary arrival process of service requests as a Markov-modulated stochastic process in which the stochastic process for each modulation state captures a particular stationary mode of the arrival process. The bottom layer of our hierarchical architecture is a set of stationary optimal DPM policies, pre-computed off-line for selected modes via policy optimization in Markov decision processes. The supervisory power manager at the top layer adaptively and optimally switches among these stationary policies on-line to accommodate the actual mode-switching arrival dynamics. Simulation results show that, under highly nonstationary requests, our approach can yield significant power savings compared to previously proposed heuristic approaches.
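The two-layer structure described above can be illustrated with a minimal sketch. The code below is not the paper's implementation: the two modes ("busy"/"idle"), their Poisson arrival rates, the sleep-timeout values standing in for the pre-computed per-mode DPM policies, the mode-switching probabilities, and the exponentially weighted rate estimator used by the supervisor are all illustrative assumptions chosen for a runnable example.

```python
import random

# Illustrative modulation states: each defines a stationary Poisson
# arrival rate (requests per time unit) and a stand-in "pre-computed"
# DPM policy, here reduced to a single sleep-timeout parameter.
MODES = {
    "busy": {"rate": 5.0, "timeout": 2.0},   # frequent requests -> wait longer before sleeping
    "idle": {"rate": 0.2, "timeout": 0.1},   # rare requests -> sleep almost immediately
}
# Per-step switching probability of the modulating Markov chain (assumed).
SWITCH_PROB = {"busy": 0.05, "idle": 0.05}

def simulate(steps, seed=0):
    """Generate inter-arrival gaps from the Markov-modulated source and let a
    supervisory layer pick the per-mode policy from a smoothed gap estimate.
    Returns a list of (hidden mode, mode selected by the supervisor) pairs."""
    rng = random.Random(seed)
    mode = "busy"
    est_gap = 1.0 / MODES[mode]["rate"]  # supervisor's running mean-gap estimate
    trace = []
    for _ in range(steps):
        # Modulating chain: the hidden stationary mode may switch.
        if rng.random() < SWITCH_PROB[mode]:
            mode = "idle" if mode == "busy" else "busy"
        # Stationary process for the current mode: exponential inter-arrival gap.
        gap = rng.expovariate(MODES[mode]["rate"])
        # Supervisory layer: smooth the observed gaps, then switch on-line to
        # the pre-computed policy of the mode whose mean gap is closest.
        est_gap = 0.9 * est_gap + 0.1 * gap
        selected = min(MODES, key=lambda m: abs(1.0 / MODES[m]["rate"] - est_gap))
        trace.append((mode, selected))
    return trace

if __name__ == "__main__":
    trace = simulate(2000)
    agree = sum(1 for hidden, selected in trace if hidden == selected)
    print(f"supervisor matched the hidden mode in {agree / len(trace):.0%} of steps")
```

The sketch keeps only the structural idea: the bottom layer is a fixed table of per-mode policies, and the top layer does no policy optimization on-line, only detection of the current mode and selection among the pre-computed entries.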