We propose a hierarchical framework and new parallel algorithms for stochastic function optimization under conditions where the function to be optimized is subject to random noise whose variance decreases with sampling time. This situation is expected in many real-world and simulation applications where results are obtained from sampling and contain experimental error or random noise. Our new optimization algorithms are based on the downhill simplex algorithm, with extensions that alter the timing of simplex operations according to the level of noise in the function evaluations. Three proposed optimization methods, which we term maxnoise, point-to-point comparison, and a combination of the two, are tested on the Rosenbrock function and found to outperform previous stochastic optimization methods. The parallel framework implementing the optimization algorithms is also new and is based on a master-worker architecture in which each worker runs a massively parallel program. The parallel implementation allows the sampling to proceed independently on multiple processors and is demonstrated to scale well to over 100 vertices, making it highly suitable for clusters with an ever-increasing number of cores per node. The new methods have been applied successfully to the reparameterization of the TIP4P water model, achieving thermodynamic and structural results for liquid water that are as good as or better than those of the original model, with the advantage of a fully automated parameterization process.
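For orientation, the sketch below shows the baseline technique the abstract builds on: a standard (noise-free) downhill simplex (Nelder-Mead) minimizing the 2-D Rosenbrock test function. The paper's noise-adaptive extensions (maxnoise, point-to-point comparison) and the master-worker parallelization are not reproduced here, and all function and parameter names are our own illustrative choices, not the authors'.

```python
# Minimal downhill simplex (Nelder-Mead) sketch on the 2-D Rosenbrock
# function. Deterministic version only; the paper's extensions for
# noisy evaluations are not implemented here.

def rosenbrock(p):
    x, y = p
    return (1.0 - x) ** 2 + 100.0 * (y - x * x) ** 2

def nelder_mead(f, start, step=0.5, iters=500,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimize f by repeatedly reflecting/contracting a simplex of n+1 vertices."""
    n = len(start)
    # Initial simplex: the start point plus one offset vertex per coordinate axis.
    simplex = [list(start)]
    for i in range(n):
        v = list(start)
        v[i] += step
        simplex.append(v)

    for _ in range(iters):
        simplex.sort(key=f)                      # best vertex first, worst last
        best, worst = simplex[0], simplex[-1]
        # Centroid of all vertices except the worst.
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        # Reflect the worst vertex through the centroid.
        refl = [centroid[i] + alpha * (centroid[i] - worst[i]) for i in range(n)]
        if f(simplex[0]) <= f(refl) < f(simplex[-2]):
            simplex[-1] = refl                   # accept the reflection
        elif f(refl) < f(simplex[0]):
            # Reflection is the new best: try expanding further.
            exp = [centroid[i] + gamma * (refl[i] - centroid[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        else:
            # Contract the worst vertex toward the centroid.
            con = [centroid[i] + rho * (worst[i] - centroid[i]) for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:
                # Shrink the whole simplex toward the best vertex.
                simplex = [best] + [
                    [best[i] + sigma * (v[i] - best[i]) for i in range(n)]
                    for v in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

xmin = nelder_mead(rosenbrock, [-1.2, 1.0])
print(xmin)  # converges toward the global minimum at (1, 1)
```

The noise-aware variants described in the abstract would modify when these reflection, expansion, and contraction steps are allowed to fire, based on the estimated noise level of each vertex's sampled function value.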