We consider a dynamic dispatching problem where jobs are assigned to parallel queues upon arrival. Jobs have an arbitrary size distribution, and each queue has its own service rate, queueing discipline, and operating power while serving jobs. Our goal is to minimize a weighted sum of delay and energy consumption under the assumption that the dispatcher knows the remaining service time of each job in the system, including that of the arriving job. We devise efficient dispatching heuristics based on the first policy iteration procedure of Markov Decision Processes, and we illustrate the resulting policies with numerical examples. Direct control over the trade-off between performance and energy consumption will be increasingly important in future ICT equipment designs wherever dynamic queue assignment is needed.
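To make the setting concrete, the following is a minimal sketch of a size-aware dispatcher, not the paper's policy-iteration method: each arriving job is sent to the queue minimizing a myopic weighted cost of its FCFS completion time plus the energy its service consumes. The `Queue` class, the cost model, and the weights `w_delay` and `w_energy` are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Queue:
    rate: float    # service rate (work units per second); assumed, for illustration
    power: float   # operating power while serving (watts); assumed, for illustration
    remaining: list = field(default_factory=list)  # remaining work of jobs in queue

def dispatch(job_size, queues, w_delay=1.0, w_energy=0.1):
    """Assign the arriving job to the queue with the lowest myopic cost.

    Cost = w_delay * (job's FCFS completion time)
         + w_energy * (energy added by serving this job: power * service time).
    This greedy rule is only a baseline; a first-policy-iteration heuristic
    would instead improve on a fixed basic policy using MDP value estimates.
    """
    def cost(q):
        backlog = sum(q.remaining)          # remaining work ahead of the new job
        service = job_size / q.rate         # the job's own service time
        delay = backlog / q.rate + service  # FCFS sojourn time of the new job
        energy = q.power * service          # energy attributable to this job
        return w_delay * delay + w_energy * energy

    best = min(range(len(queues)), key=lambda i: cost(queues[i]))
    queues[best].remaining.append(job_size)
    return best
```

For example, with a slow low-power queue and a fast higher-power queue, the rule initially prefers the fast queue, then balances load onto the slow one as backlog accumulates.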