Title: Making Better Use of Unlabelled Data in Bayesian Active Learning

Abstract: Fully supervised models are predominant in Bayesian active learning. We argue that their neglect of the information present in unlabelled data harms not just predictive performance but also decisions about what data to acquire. Our proposed solution is a simple framework for semi-supervised Bayesian active learning. We find it produces better-performing models than either conventional Bayesian active learning or semi-supervised learning with randomly acquired data. It is also easier to scale up than the conventional approach. As well as supporting a shift towards semi-supervised models, our findings highlight the importance of studying models and acquisition methods in conjunction.
Comments: Published at AISTATS 2024
Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML)
Cite as: arXiv:2404.17249 [cs.LG]
  (or arXiv:2404.17249v1 [cs.LG] for this version)

Submission history

From: Freddie Bickford Smith [view email]
[v1] Fri, 26 Apr 2024 08:41:55 GMT (3210kb,D)