
Title: Fast and Simple Natural-Gradient Variational Inference with Mixture of Exponential-family Approximations

Abstract: Natural-gradient methods enable fast and simple algorithms for variational inference, but due to computational difficulties, their use is mostly limited to \emph{minimal} exponential-family (EF) approximations. In this paper, we extend their application to estimate \emph{structured} approximations such as mixtures of EF distributions. Such approximations can fit complex, multimodal posterior distributions and are generally more accurate than unimodal EF approximations. By using a \emph{minimal conditional-EF} representation of such approximations, we derive simple natural-gradient updates. Our empirical results demonstrate that our natural-gradient method converges faster than black-box gradient-based methods that use reparameterization gradients. Our work expands the scope of natural gradients for Bayesian inference, making them more widely applicable than before.
Comments: Corrected some typos and updated the appendix (ICML 2019)
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG)
Cite as: arXiv:1906.02914 [stat.ML]
  (or arXiv:1906.02914v3 [stat.ML] for this version)

Submission history

From: Wu Lin [view email]
[v1] Fri, 7 Jun 2019 06:16:04 GMT (3334kb,D)
[v2] Wed, 30 Oct 2019 18:59:48 GMT (3342kb,D)
[v3] Fri, 6 Nov 2020 06:40:05 GMT (3343kb,D)
