Abstract:
In dynamic power management (DPM), it is important to switch from a high-power-consuming state to a low-power-consuming one at a suitable time. In the past literature, this problem has been formulated as a Markov decision process (MDP) with state-dependent control. However, this approach is often infeasible in practice, because a state-dependent policy requires that all states of the request arrival process be observed through online monitoring. To overcome this problem, we develop a simple time-out policy for DPM, which can be regarded as the optimal timing for taking a GO-SLEEP action during an idle period of the transaction system. Under the assumption that the request arrival process is a Markovian arrival process (MAP) with an arbitrary number of phases, we analytically derive the optimal time-out policy minimizing the expected power consumption per unit time in the steady state. In numerical experiments with real read/write data for a hard disk unit and with CPU utilization data, we estimate the expected power consumption per unit time under two DPM policies, namely, the time-out policy and the MDP-based policy, and quantitatively compare their effectiveness in power reduction.
Published in: IEEE Systems Journal ( Volume: 11, Issue: 2, June 2017)
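To illustrate the trade-off a time-out policy optimizes, the sketch below is a minimal Monte Carlo estimate of expected power per unit time as a function of the time-out. It is not the paper's analytical method: it assumes Poisson arrivals (the one-phase special case of a MAP), and all power, energy, and service-time parameters are hypothetical.

```python
import random

# Sketch (not the paper's derivation): Monte Carlo estimate of expected power
# per unit time for a time-out DPM policy. GO-SLEEP fires after `timeout`
# seconds of idleness; waking up on the next request costs extra energy.
# Poisson arrivals stand in for the paper's MAP; all constants are hypothetical.

P_ACTIVE = 2.5     # W, power while busy or idle-but-awake (hypothetical)
P_SLEEP  = 0.1     # W, power in the low-power state (hypothetical)
E_WAKE   = 3.0     # J, energy cost of a sleep -> active transition (hypothetical)
SERVICE  = 0.5     # s, fixed service time per request (hypothetical)
RATE     = 0.2     # requests/s, Poisson arrival rate (hypothetical)

def expected_power(timeout, n_requests=100_000, seed=0):
    """Average power (W) in steady state for a given time-out value."""
    rng = random.Random(seed)
    energy = time = 0.0
    for _ in range(n_requests):
        idle = rng.expovariate(RATE)              # idle period before next request
        if idle < timeout:
            energy += P_ACTIVE * idle             # stayed awake the whole gap
        else:
            energy += P_ACTIVE * timeout          # awake until the time-out...
            energy += P_SLEEP * (idle - timeout)  # ...then asleep until arrival
            energy += E_WAKE                      # wake-up penalty on arrival
        energy += P_ACTIVE * SERVICE              # serve the request
        time += idle + SERVICE
    return energy / time

# Sweeping the time-out exposes the trade-off the paper optimizes analytically:
# too short a time-out wastes wake-up energy, too long wastes idle power.
for t in (0.0, 1.0, 5.0, 20.0):
    print(f"timeout={t:5.1f}s  avg power={expected_power(t):.3f} W")
```

Under these assumed parameters the sweep shows a minimum at an intermediate time-out; the paper derives that optimum in closed form for a general MAP rather than by simulation.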