Statistics > Machine Learning

Title: Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?

Abstract: We prove that black-box variational inference (BBVI) with control variates, particularly the sticking-the-landing (STL) estimator, converges at a geometric (traditionally called "linear") rate under perfect variational family specification. In particular, we prove a quadratic bound on the gradient variance of the STL estimator, one which also covers misspecified variational families. Combined with previous work on the quadratic variance condition, this directly implies convergence of BBVI with projected stochastic gradient descent. We also improve the existing analysis of the regular closed-form entropy gradient estimators, which enables comparison against the STL estimator and provides explicit non-asymptotic complexity guarantees for both.
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG); Computation (stat.CO)
Cite as: arXiv:2307.14642 [stat.ML]
  (or arXiv:2307.14642v2 [stat.ML] for this version)
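The "sticking the landing" property behind the geometric rate can be illustrated concretely: the STL estimator keeps only the path derivative of log p(z) − log q(z), treating the variational parameters inside log q as constants, so when the variational family matches the target exactly the per-sample gradient vanishes and the estimator has zero variance at the optimum. The following minimal NumPy sketch (the 1-D Gaussian setup and all names are our illustrative assumptions, not taken from the paper) demonstrates this for a mean-field Gaussian fitting a standard normal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: p = N(0, 1). Variational family: q = N(mu, sigma^2),
# reparameterised as z = mu + sigma * eps with eps ~ N(0, 1).
mu_p, sigma_p = 0.0, 1.0

def grad_log_p(z):
    # Score of the target density w.r.t. z.
    return -(z - mu_p) / sigma_p**2

def stl_grad(mu, sigma, eps):
    """STL gradient estimator of the ELBO w.r.t. (mu, sigma).

    The entropy term uses the score of q with its parameters "stopped":
    gradients flow only through the sample z, not through log q's params.
    """
    z = mu + sigma * eps
    score_q = -(z - mu) / sigma**2       # d/dz log q(z; mu, sigma), params held fixed
    path = grad_log_p(z) - score_q       # path derivative of log p(z) - log q(z)
    return np.array([path * 1.0,         # dz/dmu    = 1
                     path * eps])        # dz/dsigma = eps

# At a perfect fit (q == p, i.e. mu = 0, sigma = 1) the two scores cancel
# for every draw of eps, so the STL gradient is identically zero.
eps = rng.standard_normal(10_000)
grads = np.stack([stl_grad(0.0, 1.0, e) for e in eps])
print(np.abs(grads).max())  # → 0.0: zero gradient variance at the optimum
```

The standard reparameterised estimator with an analytic entropy term does not enjoy this cancellation, which is why the abstract's quadratic variance bound (variance shrinking with the distance to the optimum) is the key ingredient for the linear rate.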

Submission history

From: Kyurae Kim [view email]
[v1] Thu, 27 Jul 2023 06:32:43 GMT (263kb)
[v2] Mon, 23 Oct 2023 19:19:11 GMT (298kb)
[v3] Wed, 21 Feb 2024 23:03:37 GMT (13054kb)
[v4] Sat, 9 Mar 2024 01:10:21 GMT (327kb)
[v5] Tue, 23 Apr 2024 01:58:11 GMT (313kb)
