Statistics > Machine Learning
Title: Training of Neural Networks with Uncertain Data: A Mixture of Experts Approach
(Submitted on 13 Dec 2023 (v1), last revised 25 Apr 2024 (this version, v4))
Abstract: This paper introduces the "Uncertainty-aware Mixture of Experts" (uMoE), a novel approach to handling aleatoric uncertainty in Neural Network (NN) based predictive models. While existing methodologies primarily concentrate on managing uncertainty during inference, uMoE uniquely embeds uncertainty into the training phase. Employing a "Divide and Conquer" strategy, uMoE partitions the uncertain input space into more manageable subspaces. It comprises Expert components, each trained on the uncertainty of its respective subspace. Above the Experts, a Gating Unit, leveraging additional information about the distribution of uncertain inputs across these subspaces, dynamically adjusts the weighting to minimize deviations from the ground truth. Our findings demonstrate the superior performance of uMoE over baseline methods in effectively managing data uncertainty. Furthermore, through a comprehensive robustness analysis, we showcase its adaptability to varying uncertainty levels and propose optimal threshold parameters. This approach offers broad applicability across diverse data-driven domains, including but not limited to biomedical signal processing, autonomous driving, and production quality control.
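The abstract's core idea (partitioning an uncertain input into subspaces, routing it to per-subspace Experts, and combining their outputs via a Gating Unit weighted by the input's distribution over the subspaces) can be sketched roughly as follows. This is a hypothetical, minimal illustration under our own assumptions, not the authors' implementation: the uncertain input is represented by Monte Carlo samples of its noise distribution, subspaces are nearest-centroid regions, experts are plain linear models, and the gating weights are the fraction of samples landing in each subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts, dim = 3, 2
centroids = rng.normal(size=(n_experts, dim))       # subspace centers (assumed fixed)
expert_weights = rng.normal(size=(n_experts, dim))  # one linear expert per subspace
expert_bias = rng.normal(size=n_experts)


def umoe_predict(x_mean, x_std, n_samples=1000):
    """Combine per-subspace expert predictions for one uncertain input,
    weighted by how much of the input's probability mass falls in each
    subspace (a stand-in for the paper's Gating Unit)."""
    # Sample plausible realizations of the uncertain input.
    samples = rng.normal(x_mean, x_std, size=(n_samples, dim))
    # Assign each sample to its nearest subspace centroid.
    dists = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=-1)
    assign = dists.argmin(axis=1)
    # Gating weights: empirical probability mass per subspace.
    gate = np.bincount(assign, minlength=n_experts) / n_samples
    # Each expert predicts from the mean of the samples routed to it.
    preds = np.zeros(n_experts)
    for k in range(n_experts):
        if (assign == k).any():
            xk = samples[assign == k].mean(axis=0)
            preds[k] = expert_weights[k] @ xk + expert_bias[k]
    # Weighted combination of expert outputs.
    return float(gate @ preds), gate


y, gate = umoe_predict(np.zeros(dim), 0.5 * np.ones(dim))
```

In the paper, the Experts are trained on their subspace uncertainties and the Gating Unit is learned to minimize deviation from the ground truth; here both are frozen placeholders purely to show the routing-and-weighting data flow.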
Submission history
From: Lucas Luttner
[v1] Wed, 13 Dec 2023 11:57:15 GMT (1120kb)
[v2] Mon, 22 Apr 2024 05:49:58 GMT (670kb)
[v3] Tue, 23 Apr 2024 07:00:21 GMT (978kb)
[v4] Thu, 25 Apr 2024 02:10:56 GMT (978kb)