arXiv Analytics

arXiv:2004.01628 [cs.LG]

Weighted Random Search for Hyperparameter Optimization

Adrian-Catalin Florea, Razvan Andonie

Published 2020-04-03 (Version 1)

We introduce an improved version of Random Search (RS), used here for the hyperparameter optimization of machine learning algorithms. Unlike standard RS, which generates new values for all hyperparameters at each trial, we generate a new value for each hyperparameter only with a certain probability of change. The intuition behind our approach is that a value which has already produced a good result is a good candidate for the next step and should be tested in new combinations of hyperparameter values. Within the same computational budget, our method yields better results than standard RS, and our theoretical results prove this statement. We test our method on a variation of one of the most commonly used objective functions for this class of problems (the Griewank function) and on the hyperparameter optimization of a deep learning CNN architecture. Our results can be generalized to any optimization problem defined on a discrete domain.
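
To make the resampling scheme concrete, here is a minimal Python sketch of the idea described in the abstract. The uniform probability of change p_change, the function and parameter names, and the higher-is-better convention are illustrative assumptions, not the authors' exact algorithm, which may, for instance, assign each hyperparameter its own probability of change.

```python
import random

def weighted_random_search(param_domains, objective, n_trials, p_change=0.5):
    """Sketch of the WRS idea: instead of resampling every hyperparameter
    at each trial (standard Random Search), resample each one only with
    probability p_change, otherwise keeping its best-known value."""
    # Start from a fully random configuration, as in standard Random Search.
    best = {name: random.choice(domain) for name, domain in param_domains.items()}
    best_score = objective(best)
    for _ in range(n_trials - 1):
        candidate = dict(best)
        for name, domain in param_domains.items():
            # Change this hyperparameter only with probability p_change;
            # otherwise reuse the value from the best trial so far.
            if random.random() < p_change:
                candidate[name] = random.choice(domain)
        score = objective(candidate)
        if score > best_score:  # convention here: higher objective is better
            best, best_score = candidate, score
    return best, best_score

# Toy usage on a discrete domain (domains and objective are illustrative):
domains = {"lr": [1e-4, 1e-3, 1e-2],
           "batch_size": [16, 32, 64],
           "dropout": [0.0, 0.25, 0.5]}
config, score = weighted_random_search(
    domains, lambda p: -p["dropout"] - abs(p["lr"] - 1e-3), n_trials=50)
```

With p_change = 1 every hyperparameter is resampled on every trial, recovering standard RS; smaller values increasingly reuse components of the best configuration found so far.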

Comments: 14 pages, 5 figures, journal paper
Journal: International Journal of Computers Communications & Control, Vol. 14, No. 2 (2019)
Categories: cs.LG, stat.ML
Related articles:
arXiv:2003.13300 [cs.LG] (Published 2020-03-30)
Weighted Random Search for CNN Hyperparameter Optimization
arXiv:1905.12982 [cs.LG] (Published 2019-05-30)
Meta-Surrogate Benchmarking for Hyperparameter Optimization
arXiv:2410.10417 [cs.LG] (Published 2024-10-14)
A Stochastic Approach to Bi-Level Optimization for Hyperparameter Optimization and Meta Learning