Statistics > Machine Learning

Title: Training of Neural Networks with Uncertain Data: A Mixture of Experts Approach

Authors: Lucas Luttner
Abstract: This paper presents the "Uncertainty-aware Mixture of Experts" (uMoE), a novel approach designed to address aleatoric uncertainty in the training of predictive models based on Neural Networks (NNs). While existing methods primarily focus on managing uncertainty during inference, uMoE integrates uncertainty directly into the training process. The uMoE approach adopts a "Divide and Conquer" paradigm to partition the uncertain input space into more manageable subspaces. It consists of Expert components, each trained solely on the portion of input uncertainty corresponding to its subspace. On top of the Experts, a Gating Unit, guided by additional information about the distribution of uncertain inputs across these subspaces, learns to weight the Experts so as to minimize deviations from the ground truth. Our results show that uMoE significantly outperforms baseline methods in handling data uncertainty. Furthermore, we conducted a robustness analysis, illustrating its capability to adapt to varying levels of uncertainty and suggesting optimal threshold parameters. This approach holds wide applicability across diverse data-driven domains, including biomedical signal processing, autonomous driving, and production quality control.
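To make the abstract's "Divide and Conquer" idea concrete, the following is a minimal sketch of an uncertainty-aware mixture of experts in plain numpy. It is an illustrative reconstruction based only on the abstract, not the authors' implementation: the subspace partitioning (here k-means), the expert model class (here linear least squares), and the gating rule (here the fraction of Monte Carlo samples of the uncertain input that fall in each subspace) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    """Simple k-means: partitions the input space into k subspaces."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers, labels

def fit_expert(X, y):
    """Linear least-squares expert with a bias term (stand-in for an NN)."""
    A = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def expert_predict(w, X):
    A = np.hstack([X, np.ones((len(X), 1))])
    return A @ w

# Toy data: y = sin(x), with inputs later observed under Gaussian noise.
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0])

k = 4
centers, labels = kmeans(X, k)
# Each expert is trained only on the points assigned to its subspace.
experts = [fit_expert(X[labels == j], y[labels == j]) for j in range(k)]

def umoe_predict(x_mean, x_std, n_samples=200):
    """Gate experts by where the uncertain input's probability mass lands.

    Samples plausible inputs from the noise model, counts the fraction
    falling in each subspace, and uses those fractions as expert weights.
    """
    samples = x_mean + x_std * rng.standard_normal((n_samples, x_mean.shape[0]))
    assign = np.argmin(((samples[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    gate = np.bincount(assign, minlength=k) / n_samples  # weights sum to 1
    preds = np.array([expert_predict(w, x_mean[None])[0] for w in experts])
    return float(gate @ preds), gate

y_hat, gate = umoe_predict(np.array([1.0]), np.array([0.3]))
```

An input near a subspace boundary spreads its gating weight over several experts, which is the mechanism the abstract describes for handling aleatoric input uncertainty at training and prediction time.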
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG)
DOI: 10.5281/zenodo.10050097
Cite as: arXiv:2312.08083 [stat.ML]
  (or arXiv:2312.08083v1 [stat.ML] for this version)

Submission history

From: Lucas Luttner
[v1] Wed, 13 Dec 2023 11:57:15 GMT (1120kb)
[v2] Mon, 22 Apr 2024 05:49:58 GMT (670kb)
[v3] Tue, 23 Apr 2024 07:00:21 GMT (978kb)
[v4] Thu, 25 Apr 2024 02:10:56 GMT (978kb)
