
Title: Training of Neural Networks with Uncertain Data: A Mixture of Experts Approach

Authors: Lucas Luttner
Abstract: This paper introduces the "Uncertainty-aware Mixture of Experts" (uMoE), a novel approach for addressing aleatoric uncertainty in Neural Network (NN) based predictive models. While existing methodologies primarily manage uncertainty at inference time, uMoE embeds uncertainty directly into the training phase. Following a "Divide and Conquer" strategy, uMoE partitions the uncertain input space into more manageable subspaces. It comprises Expert components, each trained on the uncertainty of its respective subspace. Above the Experts, a Gating Unit, leveraging additional information about the distribution of uncertain inputs across these subspaces, dynamically adjusts the weighting to minimize deviations from the ground truth. Our findings demonstrate that uMoE outperforms baseline methods in managing data uncertainty. Furthermore, through a comprehensive robustness analysis, we show its adaptability to varying uncertainty levels and propose optimal threshold parameters. This approach is broadly applicable across data-driven domains, including but not limited to biomedical signal processing, autonomous driving, and production quality control.
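The architecture outlined in the abstract (per-subspace Experts combined by a Gating Unit that weights them according to where an uncertain input likely falls) can be sketched roughly as below. This is a minimal illustrative reconstruction, not the paper's exact formulation: the `Expert` linear models, the `umoe_predict` helper, and the distance-to-center soft gating are all assumptions introduced here for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class Expert:
    """One small linear model responsible for a single input subspace
    (stands in for the NN Experts described in the paper)."""
    def __init__(self, dim):
        self.w = rng.normal(scale=0.1, size=(dim,))
        self.b = 0.0

    def predict(self, x):
        # x: (n, dim) -> (n,)
        return x @ self.w + self.b

def umoe_predict(x, experts, centers, temperature=1.0):
    """Combine Expert outputs with gating weights derived from how
    likely each input belongs to each subspace (soft assignment by
    distance to an assumed subspace center)."""
    # squared distance from each input to each subspace center: (n, k)
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    # closer subspace -> larger gate weight; rows sum to 1
    gates = softmax(-d2 / temperature, axis=1)            # (n, k)
    preds = np.stack([e.predict(x) for e in experts], axis=1)  # (n, k)
    return (gates * preds).sum(axis=1)                    # (n,)
```

In a full implementation, the gating probabilities would come from the distribution of each uncertain input over the learned subspaces (e.g., cluster responsibilities), and both Experts and gate would be trained to minimize deviation from the ground truth; the fixed centers and untrained weights here only illustrate the forward pass.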
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG)
DOI: 10.5281/zenodo.10050097
Cite as: arXiv:2312.08083 [stat.ML]
  (or arXiv:2312.08083v4 [stat.ML] for this version)

Submission history

From: Lucas Luttner [view email]
[v1] Wed, 13 Dec 2023 11:57:15 GMT (1120kb)
[v2] Mon, 22 Apr 2024 05:49:58 GMT (670kb)
[v3] Tue, 23 Apr 2024 07:00:21 GMT (978kb)
[v4] Thu, 25 Apr 2024 02:10:56 GMT (978kb)
