
HyperSpace: Distributed Bayesian Hyperparameter Optimization


Abstract:

As machine learning models continue to increase in complexity, so does the potential number of free model parameters, commonly known as hyperparameters. While there has been considerable progress toward finding optimal configurations of these hyperparameters, many optimization procedures are treated as black boxes. We believe optimization methods should not only return a set of optimized hyperparameters, but also give insight into the effects of model hyperparameter settings. To this end, we present HyperSpace, a parallel implementation of Bayesian sequential model-based optimization. HyperSpace leverages high performance computing (HPC) resources to better understand unknown, potentially non-convex hyperparameter search spaces. We show that it is possible to learn the dependencies between model hyperparameters through the optimization process. By partitioning large search spaces and running many optimization procedures in parallel, we also show that it is possible to discover families of good hyperparameter settings across a variety of tasks, including unsupervised clustering, regression, and classification.
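The abstract's core idea is to partition a large search space into overlapping subspaces and run an independent Bayesian sequential model-based optimization on each in parallel. Below is a minimal sketch of that idea, assuming scikit-optimize's gp_minimize as the per-subspace optimizer and plain multiprocessing in place of the HPC scheduling the paper targets; the toy objective, the 50% overlap fraction, and the helper names (split_dimension, optimize_subspace) are invented for illustration and are not the paper's exact implementation.

```python
# Sketch: partition a search space into overlapping subspaces and run an
# independent Bayesian optimization (scikit-optimize's gp_minimize) on each.
from itertools import product
from multiprocessing import Pool

from skopt import gp_minimize


def objective(params):
    """Toy non-convex objective over two hyperparameters (illustrative only)."""
    x, y = params
    return (x - 2.0) ** 2 + (y + 1.0) ** 2 + 0.5 * (x * y)


def split_dimension(low, high, overlap=0.5):
    """Split one dimension into two halves that overlap by the given fraction."""
    mid = (low + high) / 2.0
    pad = (high - low) * overlap / 2.0
    return [(low, mid + pad), (mid - pad, high)]


def optimize_subspace(bounds):
    """Run one independent Gaussian-process optimization over a subspace."""
    res = gp_minimize(objective, list(bounds), n_calls=30, random_state=0)
    return res.x, res.fun


if __name__ == "__main__":
    space = [(-5.0, 5.0), (-5.0, 5.0)]  # full search space, one tuple per dim
    # Cartesian product of per-dimension halves -> 2^D overlapping subspaces.
    subspaces = list(product(*(split_dimension(lo, hi) for lo, hi in space)))
    with Pool() as pool:  # on a cluster, each subspace would go to its own worker
        results = pool.map(optimize_subspace, subspaces)
    for (best_x, best_fun), bounds in zip(results, subspaces):
        print(f"subspace {bounds}: best={best_x}, loss={best_fun:.4f}")
```

Because each subspace search is independent, scaling out amounts to assigning one subspace per worker, and comparing the per-subspace optima surfaces families of good hyperparameter settings rather than a single point, which is the kind of insight into hyperparameter effects the abstract argues for.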
Date of Conference: 24-27 September 2018
Date Added to IEEE Xplore: 21 February 2019
Print on Demand (PoD) ISSN: 1550-6533
Conference Location: Lyon, France
