In parallel cluster computing, an unscalable or unstable load balancing algorithm can severely degrade computing performance. To address this problem, this paper proposes a linear dynamic load balancing model and analyzes its stability in the presence of time delay. Based on the analysis, a load balancing gain is used to keep the model stable as the system scale increases. Finally, a more practical nonlinear model is proposed, and simulation results are presented for comparison with the analytical results and with other load balancing methods.
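The abstract does not reproduce the paper's equations, so as a rough sketch only: the snippet below simulates a generic delayed linear load-balancing update of the form x_i(t+1) = x_i(t) + K·Σ_j (x_j(t−d) − x_i(t−d)), where the gain K and delay d are illustrative parameters, not the paper's. It shows the qualitative effect the abstract describes: a gain that is stable without delay can become unstable once feedback is delayed, and reducing the gain restores stability.

```python
import numpy as np

def simulate(n=8, gain=0.1, delay=0, steps=200, seed=0):
    """Simulate a delayed linear load-balancing update (illustrative model).

    Each node moves its load toward the average observed `delay` steps ago:
        x_i(t+1) = x_i(t) + gain * sum_j (x_j(t-delay) - x_i(t-delay))
    Returns the load spread max(x) - min(x) at each step.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 100.0, size=n)        # initial per-node load
    history = [x.copy()] * (delay + 1)          # buffer of delayed states
    spread = np.empty(steps)
    for t in range(steps):
        x_old = history[0]                      # state from `delay` steps ago
        # Delayed diffusion step toward the (delayed) mean load;
        # vectorized form of gain * sum_j (x_j_old - x_i_old).
        x = x + gain * (x_old.sum() - n * x_old)
        history = history[1:] + [x.copy()]
        spread[t] = x.max() - x.min()
    return spread

# Same gain, no delay: imbalance decays monotonically.
print(f"gain=0.10, delay=0: final spread = {simulate(gain=0.10, delay=0)[-1]:.3g}")
# Same gain with delayed feedback: the exchange overshoots and diverges.
print(f"gain=0.10, delay=3: final spread = {simulate(gain=0.10, delay=3)[-1]:.3g}")
# Smaller gain compensates for the delay and restores stability.
print(f"gain=0.04, delay=3: final spread = {simulate(gain=0.04, delay=3)[-1]:.3g}")
```

In this linear model the imbalance evolves as e(t+1) = e(t) − nK·e(t−d), so stability depends on the product nK: as the number of nodes n grows, the gain K must shrink, which is the kind of gain-scheduling the abstract alludes to. The specific thresholds here follow from this toy model, not from the paper.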