
Nuclear Theory

Title: Model orthogonalization and Bayesian forecast mixing via Principal Component Analysis

Abstract: One can improve predictability in an unknown domain by combining the forecasts of imperfect, complex computational models within a Bayesian statistical machine learning framework. In many cases, however, the models used in the mixing process are similar. In addition to contaminating the model space, the presence of such similar, or even redundant, models during the multimodeling process can lead to misinterpretation of results and a deterioration of predictive performance. In this work, we describe a method based on Principal Component Analysis (PCA) that eliminates model redundancy. We show that by adding model orthogonalization to the proposed Bayesian Model Combination framework, one can achieve better prediction accuracy and excellent uncertainty quantification performance.
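
The sketch below illustrates the general idea on synthetic data: orthogonalize a set of similar model predictions via PCA (an SVD of the mean-centred prediction matrix), then mix the resulting orthogonal components with a simple conjugate Gaussian weighting. It is a minimal illustration only; the model forms, the noise and prior variances (sigma2, tau2), and the mixing step are assumptions and do not reproduce the paper's Bayesian Model Combination framework.

```python
# Minimal sketch: PCA-based model orthogonalization + Gaussian forecast mixing.
# All quantities below are synthetic/illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: K imperfect models evaluated at N training inputs.
# Rows = models, columns = input points; two models are nearly redundant.
x = np.linspace(0.0, 1.0, 25)
truth = np.sin(2.0 * np.pi * x)
F = np.vstack([
    truth + 0.30 * x,             # model 1: linear bias
    truth + 0.31 * x + 0.01,      # model 2: nearly redundant with model 1
    truth - 0.20 * x**2,          # model 3: quadratic bias
])
y = truth + rng.normal(scale=0.05, size=x.size)   # noisy training data

# --- Model orthogonalization via PCA (SVD of mean-centred predictions) ---
F_mean = F.mean(axis=0)
U, S, Vt = np.linalg.svd(F - F_mean, full_matrices=False)

# Keep only components carrying non-negligible variance; redundant models
# collapse into the discarded directions.
keep = S > 1e-8 * S[0]
Z = S[keep][:, None] * Vt[keep]       # orthogonal "principal-component models"

# --- Bayesian linear mixing of the orthogonalized components ---
# Gaussian prior on the mixing weights and Gaussian likelihood give a
# conjugate (analytic) posterior; sigma2 and tau2 are assumed values.
sigma2, tau2 = 0.05**2, 1.0
A = Z.T                               # design matrix, shape (N, n_components)
post_cov = np.linalg.inv(A.T @ A / sigma2 + np.eye(A.shape[1]) / tau2)
post_mean = post_cov @ A.T @ (y - F_mean) / sigma2

# Mixed forecast and its predictive variance at the training inputs.
mix = F_mean + A @ post_mean
var = sigma2 + np.einsum("ij,jk,ik->i", A, post_cov, A)

print("RMSE of mixed forecast:", np.sqrt(np.mean((mix - truth) ** 2)))
print("mean predictive std:   ", np.sqrt(var).mean())
```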
Comments: 12 pages, 4 figures
Subjects: Nuclear Theory (nucl-th); Data Analysis, Statistics and Probability (physics.data-an); Machine Learning (stat.ML)
Cite as: arXiv:2405.10839 [nucl-th]
  (or arXiv:2405.10839v1 [nucl-th] for this version)

Submission history

From: Kyle Godbey [view email]
[v1] Fri, 17 May 2024 15:01:29 GMT (535kb,D)
