arXiv:2107.07343 [cs.LG]

Mutation is all you need

Lennart Schneider, Florian Pfisterer, Martin Binder, Bernd Bischl

Published 2021-07-04 (Version 1)

Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating the architecture engineering of deep neural networks. BANANAS is a state-of-the-art NAS method embedded within the Bayesian optimization framework. Recent experimental findings have demonstrated that the strong performance of BANANAS on the NAS-Bench-101 benchmark is determined by its path encoding rather than its choice of surrogate model. We present experimental results suggesting that the performance of BANANAS on the NAS-Bench-301 benchmark is determined by its acquisition function optimizer, which minimally mutates the incumbent.
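To make the mutation-based acquisition function optimization concrete, the following is a minimal Python sketch of the idea described in the abstract: candidate architectures are generated by minimally mutating the incumbent (the best architecture found so far), and the candidate with the highest acquisition value is proposed for the next evaluation. The operation set, list encoding, and stand-in acquisition function are illustrative assumptions, not the authors' implementation or the BANANAS codebase.

```python
# A minimal sketch (assumed encoding and acquisition, not the authors' code)
# of mutation-based acquisition function optimization: new candidates come
# from small mutations of the incumbent rather than a global search.

import random
from typing import Callable, List

OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]  # hypothetical operation set

def mutate(arch: List[str], n_edits: int = 1) -> List[str]:
    """Return a copy of `arch` with `n_edits` positions changed to a different op."""
    child = list(arch)
    for pos in random.sample(range(len(child)), n_edits):
        child[pos] = random.choice([op for op in OPS if op != child[pos]])
    return child

def optimize_acquisition(
    incumbent: List[str],
    acquisition: Callable[[List[str]], float],
    n_candidates: int = 100,
) -> List[str]:
    """Generate candidates by minimally mutating the incumbent and
    return the one with the highest acquisition value."""
    candidates = [mutate(incumbent) for _ in range(n_candidates)]
    return max(candidates, key=acquisition)

if __name__ == "__main__":
    random.seed(0)
    incumbent = ["conv3x3", "skip", "maxpool", "conv1x1"]
    # Stand-in acquisition: prefers 3x3 convolutions, with noise (illustrative only;
    # in BANANAS this would come from a surrogate model's predictions).
    acq = lambda a: a.count("conv3x3") + 0.1 * random.random()
    print(optimize_acquisition(incumbent, acq))
```

Because every candidate differs from the incumbent in only a few positions, this optimizer explores a tight neighborhood of the current best architecture, which is the behavior the abstract identifies as driving performance on NAS-Bench-301.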

Comments: Accepted for the 8th ICML Workshop on Automated Machine Learning (2021). 10 pages, 1 table, 3 figures
Categories: cs.LG, cs.NE
Related articles:
arXiv:1601.00917 [cs.LG] (Published 2016-01-05)
Distilling Reverse-Mode Automatic Differentiation (DrMAD) for Optimizing Hyperparameters of Deep Neural Networks
arXiv:1611.01639 [cs.LG] (Published 2016-11-05)
Representation of uncertainty in deep neural networks through sampling
arXiv:1605.04639 [cs.LG] (Published 2016-05-16)
Alternating optimization method based on nonnegative matrix factorizations for deep neural networks