arXiv Analytics

arXiv:2102.12985 [cs.LG]

A Novel Framework for Neural Architecture Search in the Hill Climbing Domain

Mudit Verma, Pradyumna Sinha, Karan Goyal, Apoorva Verma, Seba Susan

Published 2021-02-22 (Version 1)

Neural networks have long been used to solve complex problems in the image domain, yet designing them still requires manual expertise. Moreover, techniques that automatically generate a suitable deep learning architecture for a given dataset frequently rely on reinforcement learning or evolutionary methods, which demand extensive computational resources and time. We propose a new framework for neural architecture search based on a hill-climbing procedure using morphism operators, combined with a novel gradient update scheme. The update is based on the aging of neural network layers and reduces the overall training time. This technique can search a broader search space, which in turn yields competitive results. We achieve a 4.96% error rate on the CIFAR-10 dataset in 19.4 hours of training on a single GPU.
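The abstract describes a greedy hill-climbing loop over architectures: apply morphism operators to the current best network, briefly fine-tune each child with per-layer updates that shrink as a layer "ages", and keep the best-scoring candidate. The sketch below illustrates only that loop structure; the Layer class, the widen/insert morphisms, the aging formula, and the toy score function are illustrative assumptions, not the authors' implementation.

import copy
import random

class Layer:
    def __init__(self, width):
        self.width = width
        self.age = 0          # training epochs this layer has already seen

def aging_scale(age, decay=0.5):
    # Hypothetical aging rule: older layers receive smaller updates, so a
    # short fine-tuning budget concentrates on newly morphed layers.
    return 1.0 / (1.0 + decay * age)

def morph(net):
    # Function-preserving morphisms: widen a random layer or insert a new one.
    child = copy.deepcopy(net)
    if random.random() < 0.5:
        random.choice(child).width *= 2
    else:
        new = Layer(width=random.choice(child).width)
        child.insert(random.randrange(len(child) + 1), new)
    return child

def train(net, epochs=1, base_lr=0.05):
    # Stand-in for SGD fine-tuning; only the aging bookkeeping is shown.
    for _ in range(epochs):
        for layer in net:
            lr = base_lr * aging_scale(layer.age)  # per-layer update scale
            _ = lr                                 # real code would step an optimizer here
            layer.age += 1

def score(net):
    # Placeholder for validation accuracy; favors a moderate total width.
    total = sum(layer.width for layer in net)
    return -abs(total - 256) + len(net)

def hill_climb(steps=10, children=4):
    best = [Layer(16), Layer(16)]
    train(best, epochs=2)
    for _ in range(steps):
        candidates = [morph(best) for _ in range(children)]
        for cand in candidates:
            train(cand, epochs=1)
        top = max(candidates, key=score)
        if score(top) > score(best):
            best = top                             # greedy hill-climbing step
    return best

if __name__ == "__main__":
    random.seed(0)
    final = hill_climb()
    print([layer.width for layer in final])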

Related articles:
arXiv:1909.03615 [cs.LG] (Published 2019-09-09)
Neural Architecture Search in Embedding Space
arXiv:2007.16149 [cs.LG] (Published 2020-07-31)
HMCNAS: Neural Architecture Search using Hidden Markov Chains and Bayesian Optimization
arXiv:2203.14577 [cs.LG] (Published 2022-03-28)
Demystifying the Neural Tangent Kernel from a Practical Perspective: Can it be trusted for Neural Architecture Search without training?