Providers of Web-based services can take advantage of many convenient features of cloud computing infrastructures, but they must still implement request management algorithms that can cope with sudden peaks of requests. We consider distributed algorithms implemented by front-end servers to dispatch and redirect requests among application servers. Current solutions, based on load-blind algorithms or on simple server-load thresholds, are inadequate to cope with the demand patterns reaching modern Internet application servers. In this paper, we propose and evaluate a request management algorithm, namely Performance Gain Prediction, that combines several pieces of information (server load, computational cost of a request, user session migration, and redirection delay) to predict whether redirecting a request to another server may yield a shorter response time. To the best of our knowledge, no other study combines information about infrastructure status, user request characteristics, and redirection overhead for dynamic request management in cloud computing. Our results show that the proposed algorithm reduces response time with respect to existing threshold-based request management algorithms.
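The core decision described above can be sketched as follows. This is a minimal illustrative model, not the paper's actual algorithm: the function names, the linear load-scaling cost model, and all parameter values are assumptions chosen only to show how the four pieces of information (local load, remote load, request cost, and redirection/migration overhead) could be combined into a redirect-or-serve-locally decision.

```python
# Hypothetical sketch of a Performance-Gain-Prediction-style decision rule.
# The cost model (service time grows linearly with load) is an assumption
# for illustration; the paper's prediction model may differ.

def predicted_response_time(server_load: float, request_cost: float) -> float:
    """Estimate response time as the request's computational cost
    inflated by the current server load (assumed linear model)."""
    return request_cost * (1.0 + server_load)

def should_redirect(local_load: float,
                    remote_load: float,
                    request_cost: float,
                    redirection_delay: float,
                    session_migration_cost: float = 0.0) -> bool:
    """Redirect only when the predicted remote response time, including
    the redirection delay and any user-session migration cost,
    beats the predicted local response time."""
    local = predicted_response_time(local_load, request_cost)
    remote = (predicted_response_time(remote_load, request_cost)
              + redirection_delay + session_migration_cost)
    return remote < local

# Example: a heavily loaded local server vs. a lightly loaded remote one.
# Local estimate: 100 * 1.9 = 190; remote: 100 * 1.2 + 20 + 10 = 150.
print(should_redirect(local_load=0.9, remote_load=0.2,
                      request_cost=100.0, redirection_delay=20.0,
                      session_migration_cost=10.0))
```

Note how a threshold-only scheme would redirect whenever `local_load` exceeds a fixed value, whereas this rule can decline a redirection when the redirection delay and session migration cost outweigh the predicted gain.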