Hyperparameter Tuning using Gaussian Process Multi-Arm Bandits

semanticscholar(2020)

Abstract
Learning useful models from data generally requires fixing hyperparameters that define either the model class or the optimization procedure. The choice of hyperparameters can have a huge impact on model performance, but hyperparameter tuning is often labor-intensive, costly, and sub-optimal. Hyperparameter tuning can be automated by using a surrogate model to regress hyperparameter choices onto model score, then searching with some heuristic. In this paper, we use a Gaussian process surrogate guided by expected-improvement exploration to efficiently perform Bayesian hyperparameter optimization. Our framework can be easily integrated with a wide range of problems requiring high-dimensional hyperparameter optimization, including problems with discrete parameters. We showcase our framework on a number of examples, including choosing optimal regularization coefficients for regression and optimizing neural network architecture for image classification.
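The loop the abstract describes (fit a surrogate to observed hyperparameter-score pairs, then pick the next candidate by maximizing expected improvement) can be sketched concisely. The snippet below is a minimal illustration, not the paper's implementation: the `objective` function is a hypothetical stand-in for training a model and returning a validation score, and the single-dimensional search space and grid-based acquisition maximization are simplifications.

```python
# Minimal sketch of Bayesian hyperparameter optimization with a Gaussian
# process surrogate and expected-improvement (EI) acquisition.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(log_lambda):
    # Hypothetical validation score as a function of a regularization
    # coefficient (log scale); in practice this would train a model with
    # the candidate hyperparameter and return its validation score.
    return -(log_lambda - 1.5) ** 2 + 0.1 * rng.normal()

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    # EI for maximization: expected gain over the best score seen so far.
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Search space for a single continuous hyperparameter (log regularization).
bounds = (-4.0, 4.0)
X = rng.uniform(*bounds, size=(3, 1))          # initial random evaluations
y = np.array([objective(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):
    gp.fit(X, y)                               # refit surrogate to all data
    # Maximize the acquisition over a dense candidate grid (cheap in 1-D);
    # higher-dimensional spaces would use a proper inner optimizer.
    X_cand = np.linspace(*bounds, 1000).reshape(-1, 1)
    ei = expected_improvement(X_cand, gp, y.max())
    x_next = X_cand[np.argmax(ei)]
    y_next = objective(x_next[0])
    X = np.vstack([X, x_next])
    y = np.append(y, y_next)

print("best hyperparameter:", X[np.argmax(y)][0], "score:", y.max())
```

Discrete hyperparameters (e.g., numbers of layers in an architecture search) are typically handled by rounding candidates or by using a kernel defined over the discrete space before evaluating the objective.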