Statistics > Machine Learning
Title: Training of Neural Networks with Uncertain Data, A Mixture of Experts Approach
(Submitted on 13 Dec 2023 (this version), latest version 25 Apr 2024 (v4))
Abstract: This paper presents the "Uncertainty-aware Mixture of Experts" (uMoE), a novel approach designed to address aleatoric uncertainty in the training of predictive models based on Neural Networks (NNs). While existing methods primarily focus on managing uncertainty during inference, uMoE integrates uncertainty directly into the training process. The uMoE approach adopts a "Divide and Conquer" paradigm to partition the uncertain input space into more manageable subspaces. It consists of Expert components, each trained solely on the portion of input uncertainty corresponding to its subspace. On top of the Experts, a Gating Unit, guided by additional information about the distribution of uncertain inputs across these subspaces, learns to weight the Experts so as to minimize deviations from the ground truth. Our results show that uMoE significantly outperforms baseline methods in handling data uncertainty. A robustness analysis further illustrates its ability to adapt to varying levels of uncertainty and suggests optimal threshold parameters. The approach is broadly applicable across data-driven domains, including biomedical signal processing, autonomous driving, and production quality control.
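The abstract's core idea — experts trained per subspace, combined by a gate that reflects how the uncertain input distributes over those subspaces — can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the function names (`umoe_predict`), the use of cluster centroids to define subspaces, and the Monte-Carlo Gaussian sampling of the uncertain input are all assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def umoe_predict(x_mean, x_std, centroids, experts, n_samples=500):
    """Sketch of a uMoE-style forward pass for one uncertain input.

    x_mean, x_std : mean and std. dev. of the uncertain input (shape (d,))
    centroids     : subspace centers, shape (k, d) -- one per Expert
    experts       : list of k callables, each mapping an input to a prediction
    """
    # Monte-Carlo estimate of how the input distribution spreads over subspaces
    samples = rng.normal(x_mean, x_std, size=(n_samples, x_mean.shape[0]))
    dists = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=-1)
    assignments = dists.argmin(axis=1)
    # Gate weight for each Expert = fraction of probability mass in its subspace
    gate = np.bincount(assignments, minlength=len(centroids)) / n_samples
    # Each Expert predicts; the gate combines their outputs into one prediction
    preds = np.array([f(x_mean) for f in experts])
    return gate @ preds
```

In this sketch the gate is computed from the input's distribution alone, mirroring the abstract's "additional information about the distribution of uncertain inputs across these subspaces"; in a trained system the Gating Unit would be a learned component.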
Submission history
From: Lucas Luttner [view email]
[v1] Wed, 13 Dec 2023 11:57:15 GMT (1120kb)
[v2] Mon, 22 Apr 2024 05:49:58 GMT (670kb)
[v3] Tue, 23 Apr 2024 07:00:21 GMT (978kb)
[v4] Thu, 25 Apr 2024 02:10:56 GMT (978kb)