arXiv:2101.00336 [cs.LG]

Neural Architecture Search via Combinatorial Multi-Armed Bandit

Hanxun Huang, Xingjun Ma, Sarah M. Erfani, James Bailey

Published 2021-01-01 (Version 1)

Neural Architecture Search (NAS) has gained significant popularity as an effective tool for designing high-performance deep neural networks (DNNs). NAS can be performed via policy gradient, evolutionary algorithms, differentiable architecture search, or tree-search methods. While significant progress has been made for both policy gradient and differentiable architecture search, tree-search methods have so far failed to achieve comparable accuracy or search efficiency. In this paper, we formulate NAS as a Combinatorial Multi-Armed Bandit (CMAB) problem (CMAB-NAS). This allows the decomposition of a large search space into smaller blocks where tree-search methods can be applied more effectively and efficiently. We further leverage a tree-based method called Nested Monte-Carlo Search to tackle the CMAB-NAS problem. On CIFAR-10, our approach discovers a cell structure that achieves a low error rate comparable to the state-of-the-art, using only 0.58 GPU days, which is 20 times faster than current tree-search methods. Moreover, the discovered structure transfers well to large-scale datasets such as ImageNet.
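To make the abstract's idea concrete, the sketch below shows how a cell search decomposed into per-node decisions (the combinatorial arms) can be explored with Nested Monte-Carlo Search. It is a minimal illustration, not the authors' implementation: the operation set `OPS`, the node count `NUM_NODES`, the `legal_moves` encoding, and especially `proxy_reward` (a toy stand-in for training and validating a candidate network on CIFAR-10) are all assumptions made for the example.

```python
# Minimal sketch: Nested Monte-Carlo Search over a decomposed cell search space,
# in the spirit of CMAB-NAS. All names and the reward function are illustrative
# assumptions, not the paper's code.
import random

OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect"]
NUM_NODES = 4  # intermediate nodes in the cell; each node is one "local" decision


def legal_moves(partial_cell):
    """Candidate (input_index, operation) pairs for the next node."""
    node_id = len(partial_cell)
    inputs = range(node_id + 2)  # two cell inputs plus previously placed nodes
    return [(i, op) for i in inputs for op in OPS]


def proxy_reward(cell):
    """Toy stand-in for the validation accuracy of the assembled cell."""
    rng = random.Random(hash(tuple(cell)))
    return rng.random()


def playout(partial_cell):
    """Level-0 search: complete the cell with uniformly random choices."""
    cell = list(partial_cell)
    while len(cell) < NUM_NODES:
        cell.append(random.choice(legal_moves(cell)))
    return cell, proxy_reward(cell)


def nmcs(partial_cell, level):
    """Nested Monte-Carlo Search: score each candidate move with a
    level-(n-1) search and follow the best complete sequence found so far."""
    if level == 0 or len(partial_cell) == NUM_NODES:
        return playout(partial_cell)
    cell = list(partial_cell)
    best_cell, best_reward = None, float("-inf")
    while len(cell) < NUM_NODES:
        for move in legal_moves(cell):
            cand_cell, cand_reward = nmcs(cell + [move], level - 1)
            if cand_reward > best_reward:
                best_cell, best_reward = cand_cell, cand_reward
        cell.append(best_cell[len(cell)])  # commit the next move of the best sequence
    return best_cell, best_reward


if __name__ == "__main__":
    cell, reward = nmcs([], level=2)
    print("discovered cell:", cell)
    print("proxy reward: %.3f" % reward)
```

In the actual method, the reward for a completed cell would come from training and evaluating the corresponding network, which is where the reported 0.58 GPU days of search cost is spent; the nesting level trades search quality against the number of such evaluations.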

Related articles:
arXiv:2201.11679 [cs.LG] (Published 2022-01-27)
DropNAS: Grouped Operation Dropout for Differentiable Architecture Search
arXiv:1905.01786 [cs.LG] (Published 2019-05-06)
Differentiable Architecture Search with Ensemble Gumbel-Softmax
arXiv:2001.01431 [cs.LG] (Published 2020-01-06)
Deeper Insights into Weight Sharing in Neural Architecture Search
Yuge Zhang et al.