arXiv:2106.02479 [cs.LG]
Neural Architecture Search via Bregman Iterations
Leon Bungert, Tim Roith, Daniel Tenbrinck, Martin Burger
Published 2021-06-04 (Version 1)
We propose a novel strategy for Neural Architecture Search (NAS) based on Bregman iterations. Starting from a sparse neural network, our gradient-based one-shot algorithm gradually adds relevant parameters in an inverse scale space manner. This allows the network to choose the architecture in the search space best suited to a given task, e.g., by adding neurons or skip connections. We demonstrate that our approach can unveil, for instance, residual autoencoders for denoising, deblurring, and classification tasks. Code is available at https://github.com/TimRoith/BregmanLearning.
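The inverse scale space mechanism described in the abstract can be illustrated with a linearized Bregman iteration. The following is a minimal PyTorch sketch, not the API of the linked BregmanLearning repository: the class name `LinearizedBregman`, the hyperparameters `lr`, `lam`, and `delta`, and the plain soft-thresholding prox are illustrative assumptions.

```python
import torch


def soft_shrink(v, lam):
    # proximal operator of lam * ||.||_1 (component-wise soft thresholding)
    return torch.sign(v) * torch.clamp(v.abs() - lam, min=0.0)


class LinearizedBregman:
    """Minimal sketch of a linearized Bregman update for sparse training.

    A subgradient variable v is kept per parameter; a weight stays exactly
    zero until its accumulated gradient information |v| exceeds lam, so
    parameters activate gradually (inverse scale space behavior).
    """

    def __init__(self, params, lr=0.1, lam=1.0, delta=1.0):
        self.params = list(params)
        self.lr, self.lam, self.delta = lr, lam, delta
        # v is initialized so that the primal update below reproduces the
        # (sparse) initial weights when they are zero
        self.v = [p.detach().clone() / delta for p in self.params]

    @torch.no_grad()
    def step(self):
        for p, v in zip(self.params, self.v):
            if p.grad is None:
                continue
            v.sub_(self.lr * p.grad)                        # dual (subgradient) step
            p.copy_(self.delta * soft_shrink(v, self.lam))  # primal prox step


# usage sketch: start from a fully sparse layer and let weights activate
model = torch.nn.Linear(784, 10)
torch.nn.init.zeros_(model.weight)   # sparse initialization
opt = LinearizedBregman(model.parameters())

x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()
```

Because the primal weights are obtained by soft thresholding an accumulated dual variable, they stay exactly zero until enough gradient evidence has built up; this is the property that lets a method of this kind grow an architecture from a sparse start rather than prune a dense one.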
Related articles:
arXiv:2004.07802 [cs.LG] (Published 2020-04-16)
Geometry-Aware Gradient Algorithms for Neural Architecture Search
arXiv:2107.07343 [cs.LG] (Published 2021-07-04)
Mutation is all you need
arXiv:2104.01177 [cs.LG] (Published 2021-04-02)
How Powerful are Performance Predictors in Neural Architecture Search?