Computer Science > Machine Learning

Title: Convergence Analysis of Probability Flow ODE for Score-based Generative Models

Abstract: Score-based generative models have emerged as a powerful approach for sampling high-dimensional probability distributions. Despite their effectiveness, their theoretical underpinnings remain relatively underdeveloped. In this work, we study the convergence properties of deterministic samplers based on probability flow ODEs from both theoretical and numerical perspectives. Assuming access to $L^2$-accurate estimates of the score function, we prove that the total variation distance between the target and the generated data distributions can be bounded above by $\mathcal{O}(d\sqrt{\delta})$ at the continuous-time level, where $d$ denotes the data dimension and $\delta$ represents the $L^2$ score matching error. For practical implementations using a $p$-th order Runge-Kutta integrator with step size $h$, we establish error bounds of $\mathcal{O}(d(\sqrt{\delta} + (dh)^p))$ at the discrete level. Finally, we present numerical studies on problems of up to $128$ dimensions to verify our theory; the results indicate better dependence on the score matching error and the dimension than the theoretical bounds suggest.
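For illustration only (this sketch is not from the paper), the setting the abstract describes can be made concrete with the standard variance-preserving probability flow ODE, $\dot{x}_t = -\tfrac{1}{2}\beta(t)\left(x_t + \nabla \log p_t(x_t)\right)$, integrated backward from $t = T$ to $t = 0$ with Heun's method, a second-order Runge-Kutta scheme (the $p = 2$ case). The linear noise schedule, the Gaussian target, and its closed-form score below are illustrative assumptions standing in for the $L^2$-accurate learned score estimate the paper assumes.

```python
import numpy as np

d = 8          # data dimension
T = 1.0        # integration horizon
h = 1e-3       # step size (the paper's discretization parameter)
sigma0 = 0.5   # std of the illustrative Gaussian target N(0, sigma0^2 I)

def beta(t):
    # Linear noise schedule (an assumption, not from the paper).
    return 0.1 + 19.9 * t

def alpha2(t):
    # alpha(t)^2 = exp(-int_0^t beta(s) ds) for the linear schedule above.
    return np.exp(-(0.1 * t + 0.5 * 19.9 * t * t))

def var(t):
    # Marginal variance of p_t under the VP forward process.
    return alpha2(t) * sigma0**2 + (1.0 - alpha2(t))

def score(x, t):
    # Exact score of p_t = N(0, var(t) I); a learned score would replace this.
    return -x / var(t)

def drift(x, t):
    # Probability flow ODE drift: dx/dt = -0.5 * beta(t) * (x + score(x, t)).
    return -0.5 * beta(t) * (x + score(x, t))

def sample(n, rng):
    x = rng.standard_normal((n, d))  # start from the prior N(0, I) at t = T
    for i in range(int(T / h)):
        t = T - i * h
        k1 = drift(x, t)
        k2 = drift(x - h * k1, t - h)  # predictor step, backward in time
        x = x - 0.5 * h * (k1 + k2)    # Heun (2nd-order RK) update
    return x

samples = sample(2000, np.random.default_rng(0))
print(samples.var())  # should be close to sigma0**2 = 0.25
```

With the exact score, the only error is the $\mathcal{O}(h^p)$ discretization error of the integrator; with an $L^2$-accurate score estimate in place of `score`, the abstract's $\mathcal{O}(d(\sqrt{\delta} + (dh)^p))$ bound also picks up the score matching term.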
Comments: 33 pages, 7 figures
Subjects: Machine Learning (cs.LG); Classical Analysis and ODEs (math.CA); Numerical Analysis (math.NA)
Cite as: arXiv:2404.09730 [cs.LG]
  (or arXiv:2404.09730v1 [cs.LG] for this version)

Submission history

From: Zhengjiang Lin [view email]
[v1] Mon, 15 Apr 2024 12:29:28 GMT (302kb,D)