arXiv:2402.19102 [cs.LG]

FlatNAS: optimizing Flatness in Neural Architecture Search for Out-of-Distribution Robustness

Matteo Gambella, Fabrizio Pittorino, Manuel Roveri

Published 2024-02-29, Version 1

Neural Architecture Search (NAS) paves the way for the automatic definition of Neural Network (NN) architectures, attracting increasing research attention and offering solutions in various scenarios. This study introduces a novel NAS solution, called Flat Neural Architecture Search (FlatNAS), which explores the interplay between a novel figure of merit based on robustness to weight perturbations and single-NN optimization with Sharpness-Aware Minimization (SAM). FlatNAS is the first work in the literature to systematically explore flat regions in the loss landscape of NNs within a NAS procedure, jointly optimizing their performance on in-distribution data and their out-of-distribution (OOD) robustness while constraining the number of parameters in their architecture. Unlike current studies that concentrate primarily on OOD algorithms, FlatNAS evaluates the impact of NN architectures on OOD robustness, a crucial aspect in real-world applications of machine and deep learning. FlatNAS achieves a good trade-off between performance, OOD generalization, and the number of parameters, using only in-distribution data during the NAS exploration. The OOD robustness of the NAS-designed models is evaluated by focusing on robustness to input data corruptions, using popular benchmark datasets from the literature.
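
The abstract combines two ingredients: SAM-based training of candidate networks and a flatness-oriented figure of merit based on robustness to weight perturbations. As a rough illustration only, the sketch below shows a minimal SAM update step and a simple flatness proxy that measures the average loss increase under random Gaussian weight perturbations. The function names, the Gaussian-probe estimator, and the hyper-parameters (rho, sigma, n_probes) are assumptions made for this illustration, not the paper's actual implementation.

```python
# Illustrative sketch, not FlatNAS itself: a SAM-style update and a crude
# flatness score from random weight perturbations (assumed estimator).
import copy
import torch


def sam_step(model, loss_fn, x, y, optimizer, rho=0.05):
    """One Sharpness-Aware Minimization step: ascend to a nearby worst-case
    point in weight space, then descend using the gradient computed there."""
    # First pass: gradient at the current weights.
    loss_fn(model(x), y).backward()

    # Ascent: perturb weights along the normalized gradient direction.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm() for g in grads])) + 1e-12
    perturbations = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            e = rho * p.grad / grad_norm
            p.add_(e)
            perturbations.append((p, e))
    optimizer.zero_grad()

    # Second pass: gradient at the perturbed weights, then restore and step.
    loss_fn(model(x), y).backward()
    with torch.no_grad():
        for p, e in perturbations:
            p.sub_(e)
    optimizer.step()
    optimizer.zero_grad()


def flatness_score(model, loss_fn, x, y, sigma=0.01, n_probes=8):
    """Crude flatness proxy (an assumption, not the paper's figure of merit):
    average loss increase under random Gaussian weight perturbations;
    lower values indicate a flatter region of the loss landscape."""
    with torch.no_grad():
        base = loss_fn(model(x), y).item()
        deltas = []
        for _ in range(n_probes):
            probe = copy.deepcopy(model)
            for p in probe.parameters():
                p.add_(sigma * torch.randn_like(p))
            deltas.append(loss_fn(probe(x), y).item() - base)
    return sum(deltas) / n_probes
```

In a NAS loop, a score of this kind could be combined with in-distribution accuracy and a parameter-count constraint to rank candidate architectures; the exact weighting and search strategy used by FlatNAS are described in the paper, not here.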

Journal: 2024 International Joint Conference on Neural Networks (IJCNN), Yokohama, Japan, 2024, pp. 1-8
Categories: cs.LG, cs.AI, cs.CV
Related articles:
arXiv:2408.01872 [cs.LG] (Published 2024-08-03)
Safe Semi-Supervised Contrastive Learning Using In-Distribution Data as Positive Examples
arXiv:2206.09385 [cs.LG] (Published 2022-06-19)
Out-of-distribution Detection by Cross-class Vicinity Distribution of In-distribution Data
arXiv:2012.04550 [cs.LG] (Published 2020-12-08)
In-N-Out: Pre-Training and Self-Training using Auxiliary Information for Out-of-Distribution Robustness