arXiv:2107.07343 [cs.LG]
Mutation is all you need
Lennart Schneider, Florian Pfisterer, Martin Binder, Bernd Bischl
Published 2021-07-04 (Version 1)
Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating the architecture engineering of deep neural networks. BANANAS is a state-of-the-art NAS method embedded within the Bayesian optimization framework. Recent experimental findings have demonstrated that the strong performance of BANANAS on the NAS-Bench-101 benchmark is determined by its path encoding rather than its choice of surrogate model. We present experimental results suggesting that the performance of BANANAS on the NAS-Bench-301 benchmark is determined by its acquisition function optimizer, which minimally mutates the incumbent.
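The core idea of the acquisition function optimizer discussed above can be illustrated with a minimal sketch: candidate architectures are generated by mutating a single component of the incumbent, and the candidate that scores best under a surrogate model is selected. This is a hypothetical illustration only; the architecture encoding, the `toy_surrogate` function, and all names here are stand-ins, not the actual BANANAS implementation or its learned neural predictor.

```python
import random

# Toy architecture: a fixed-length list of operation choices.
OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def toy_surrogate(arch):
    # Placeholder score standing in for a learned surrogate model.
    return sum(op == "conv3x3" for op in arch)

def mutate(arch):
    # Minimal mutation: change exactly one randomly chosen position.
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice([op for op in OPS if op != child[i]])
    return child

def optimize_acquisition(incumbent, n_candidates=100):
    # Propose candidates by minimally mutating the incumbent and
    # keep the one with the best surrogate score.
    best, best_score = incumbent, toy_surrogate(incumbent)
    for _ in range(n_candidates):
        cand = mutate(incumbent)
        score = toy_surrogate(cand)
        if score > best_score:
            best, best_score = cand, score
    return best

incumbent = ["skip", "maxpool", "conv1x1", "skip"]
print(optimize_acquisition(incumbent))
```

Because every candidate differs from the incumbent in exactly one position, the search stays local to the current best architecture, which is the behavior the abstract identifies as driving performance on NAS-Bench-301.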