arXiv:1906.02869 [cs.LG]

One-Shot Neural Architecture Search via Compressive Sensing

Minsu Cho, Mohammadreza Soltani, Chinmay Hegde

Published 2019-06-07 (Version 1)

Neural architecture search (NAS), or automated design of neural network models, remains a very challenging meta-learning problem. Several recent works (called "one-shot" approaches) have focused on dramatically reducing NAS running time by leveraging proxy models that still provide architectures with competitive performance. In our work, we propose a new meta-learning algorithm that we call CoNAS, or Compressive sensing-based Neural Architecture Search. Our approach merges ideas from one-shot approaches with iterative techniques for learning low-degree sparse Boolean polynomial functions. We validate our approach on several standard test datasets, discover novel architectures hitherto unreported, and achieve competitive (or better) results in both performance and search time compared to existing NAS approaches. Further, we support our algorithm with a theoretical analysis, providing upper bounds on the number of measurements needed to perform reliable meta-learning; to our knowledge, these analysis tools are novel to the NAS literature and may be of independent interest.
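To make the core idea concrete, here is a minimal numerical sketch of the sparse-recovery step the abstract alludes to, not the paper's implementation: architectures are encoded as ±1 vectors, the one-shot proxy's accuracy is modeled as a sparse, low-degree polynomial in the parity (Fourier) basis, and its few nonzero coefficients are recovered from random evaluations via iterative hard thresholding, a standard compressive-sensing solver. The ground-truth polynomial, problem sizes, and the stand-in "accuracy" function below are all hypothetical toy choices.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, sparsity, m = 8, 4, 256   # toy sizes; the paper's settings differ

# Parity (Fourier) basis of degree <= 2 over {-1,+1}^n: chi_S(x) = prod_{i in S} x_i.
subsets = [()] + [(i,) for i in range(n)] + list(itertools.combinations(range(n), 2))

def design_matrix(X):
    # One column per monomial chi_S, evaluated at each sampled architecture.
    return np.stack([np.prod(X[:, list(S)], axis=1) for S in subsets], axis=1)

# Hypothetical ground truth: a 4-sparse low-degree polynomial standing in for
# the one-shot model's validation accuracy as a function of architecture bits.
true_coef = np.zeros(len(subsets))
true_support = [3, 11, 20, 30]               # arbitrary fixed support
true_coef[true_support] = [2.0, -2.5, 1.8, -2.2]

X = rng.choice([-1.0, 1.0], size=(m, n))     # random sub-architecture "measurements"
A = design_matrix(X)
y = A @ true_coef                            # noise-free proxy evaluations

# Sparse recovery by iterative hard thresholding (IHT): random parity columns
# are near-orthogonal, so A^T A / m is close to the identity and the
# thresholded gradient iteration contracts toward the true coefficients.
coef = np.zeros(len(subsets))
for _ in range(200):
    coef = coef + A.T @ (y - A @ coef) / m   # gradient step on the residual
    keep = np.argsort(np.abs(coef))[-sparsity:]
    pruned = np.zeros_like(coef)
    pruned[keep] = coef[keep]                # keep only the s largest coefficients
    coef = pruned

print(np.flatnonzero(coef), np.round(coef[np.flatnonzero(coef)], 3))
```

In a NAS setting, the recovered monomials indicate which architecture bits (and pairwise interactions) drive the proxy's accuracy, so promising candidates can be read off from far fewer proxy evaluations than exhaustive search would need.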

Related articles:
arXiv:1906.09557 [cs.LG] (Published 2019-06-23)
One-Shot Neural Architecture Search Through A Posteriori Distribution Guided Sampling
arXiv:2008.06808 [cs.LG] (Published 2020-08-15)
Finding Fast Transformers: One-Shot Neural Architecture Search by Component Composition
arXiv:2006.09264 [cs.LG] (Published 2020-06-12)
Bonsai-Net: One-Shot Neural Architecture Search via Differentiable Pruners