arXiv:2312.08083 [stat.ML]

Training of Neural Networks with Uncertain Data: A Mixture of Experts Approach

Lucas Luttner

Published 2023-12-13 (Version 1)

This paper presents the "Uncertainty-aware Mixture of Experts" (uMoE), a novel approach designed to address aleatoric uncertainty in the training of predictive models based on Neural Networks (NNs). While existing methods primarily focus on managing uncertainty during inference, uMoE integrates uncertainty directly into the training process. The uMoE approach adopts a "Divide and Conquer" paradigm to partition the uncertain input space into more manageable subspaces. It consists of Expert components, each trained solely on the portion of input uncertainty corresponding to its subspace. On top of the Experts, a Gating Unit, guided by additional information about the distribution of uncertain inputs across these subspaces, learns to weight the Experts so as to minimize deviations from the ground truth. Our results highlight that uMoE significantly outperforms baseline methods in handling data uncertainty. Furthermore, we conducted a robustness analysis, illustrating its capability to adapt to varying levels of uncertainty and suggesting optimal threshold parameters. This innovative approach holds wide applicability across diverse data-driven domains, including biomedical signal processing, autonomous driving, and production quality control.
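To make the mechanism concrete, the following is a minimal, illustrative Python sketch of the idea as summarized above; it is not the author's reference implementation, and all names (kmeans, assign, the synthetic data, ridge-regression experts) are hypothetical stand-ins. Uncertain inputs are represented by Monte Carlo samples of their value, a k-means partition plays the role of the subspaces, small per-subspace regressors play the role of the Experts, and the Gating Unit is approximated by the fraction of each input's samples that falls into each subspace.

import numpy as np

rng = np.random.default_rng(0)

def kmeans(points, k, iters=20):
    # Plain k-means to partition the input space into k subspaces.
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids

def assign(points, centroids):
    # Index of the nearest centroid (subspace) for each point.
    return np.argmin(((points[:, None] - centroids[None]) ** 2).sum(-1), axis=1)

# Synthetic regression task: y = sin(x0) + x1, observed only through noisy samples.
n, d, k, s = 200, 2, 4, 30          # inputs, dims, experts, samples per input
x_true = rng.uniform(-3, 3, size=(n, d))
y = np.sin(x_true[:, 0]) + x_true[:, 1]
x_samples = x_true[:, None, :] + rng.normal(0, 0.5, size=(n, s, d))  # aleatoric noise

# Partition the uncertain input space into subspaces.
centroids = kmeans(x_samples.reshape(-1, d), k)

# Gate weights: share of each input's samples landing in each subspace.
labels = assign(x_samples.reshape(-1, d), centroids).reshape(n, s)
gate = np.stack([(labels == j).mean(axis=1) for j in range(k)], axis=1)  # (n, k)

# Experts: one ridge regressor per subspace, fit only on the samples routed to it.
flat_x = x_samples.reshape(-1, d)
flat_y = np.repeat(y, s)
flat_lbl = labels.reshape(-1)
experts = []
for j in range(k):
    mask = flat_lbl == j
    Xj = np.c_[flat_x[mask], np.ones(mask.sum())]
    wj = np.linalg.solve(Xj.T @ Xj + 1e-3 * np.eye(d + 1), Xj.T @ flat_y[mask])
    experts.append(wj)

# Prediction: gate-weighted combination of expert outputs on the sample mean.
x_mean = x_samples.mean(axis=1)
expert_out = np.stack([np.c_[x_mean, np.ones(n)] @ w for w in experts], axis=1)  # (n, k)
pred = (gate * expert_out).sum(axis=1)
print("RMSE:", np.sqrt(((pred - y) ** 2).mean()))

In this toy setup the gate is computed directly from the sample distribution rather than learned, and the experts are linear; the paper's formulation uses Neural Network Experts and a trained Gating Unit, but the routing-and-weighting structure is the same.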

Related articles:
arXiv:2007.12826 [stat.ML] (Published 2020-07-25)
The Interpolation Phase Transition in Neural Networks: Memorization and Generalization under Lazy Training
arXiv:2403.12187 [stat.ML] (Published 2024-03-18)
Approximation of RKHS Functionals by Neural Networks
arXiv:2211.08654 [stat.ML] (Published 2022-11-16)
Prediction and Uncertainty Quantification of SAFARI-1 Axial Neutron Flux Profiles with Neural Networks