Statistics > Machine Learning
Title: Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood
(Submitted on 10 Sep 2023 (v1), last revised 18 Apr 2024 (this version, v4))
Abstract: Training energy-based models (EBMs) on high-dimensional data can be both challenging and time-consuming, and there exists a noticeable gap in sample quality between EBMs and other generative frameworks like GANs and diffusion models. To close this gap, inspired by recent efforts to learn EBMs by maximizing diffusion recovery likelihood (DRL), we propose cooperative diffusion recovery likelihood (CDRL), an effective approach to tractably learn and sample from a series of EBMs defined on increasingly noisy versions of a dataset, paired with an initializer model for each EBM. At each noise level, the two models are jointly estimated within a cooperative training framework: samples from the initializer serve as starting points that are refined by a few MCMC sampling steps from the EBM. The EBM is then optimized by maximizing recovery likelihood, while the initializer model is optimized by learning from the difference between the refined samples and the initial samples. In addition, we make several practical design choices for EBM training to further improve sample quality. Combining these advances, our approach significantly boosts generation performance over existing EBM methods on the CIFAR-10 and ImageNet datasets. We also demonstrate the effectiveness of our models on several downstream tasks, including classifier-free guided generation, compositional generation, image inpainting, and out-of-distribution detection.
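The cooperative loop described in the abstract (initializer proposes, EBM refines via a few MCMC steps, then both models are updated) can be illustrated with a minimal toy sketch. Everything below is an assumption for illustration only: a 1-D quadratic energy stands in for the EBM network, a Gaussian stands in for the initializer, and a single noise level is shown; none of this reproduces the paper's actual architectures or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (hypothetical, not the paper's models):
# EBM energy E_theta(x) = 0.5 * theta * x^2, initializer = Gaussian(mu, std).
theta = 2.0
init_mu, init_std = 1.5, 1.0

def energy_grad(x, theta):
    """Gradient in x of E_theta(x) = 0.5 * theta * x^2."""
    return theta * x

def langevin_refine(x0, theta, steps=5, step_size=0.05):
    """A few Langevin MCMC steps, initialized from the initializer's samples."""
    x = x0.copy()
    for _ in range(steps):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * energy_grad(x, theta) + np.sqrt(2 * step_size) * noise
    return x

# One cooperative step at a single (illustrative) noise level:
data = rng.standard_normal(512)                         # stand-in observed samples
x_init = init_mu + init_std * rng.standard_normal(512)  # initializer proposals
x_ref = langevin_refine(x_init, theta)                  # refined by the EBM's MCMC

# EBM update: contrastive gradient of the log-likelihood objective,
# d/dtheta [ mean E(refined) - mean E(data) ] for E = 0.5 * theta * x^2.
grad_theta = 0.5 * (x_ref ** 2).mean() - 0.5 * (data ** 2).mean()
theta -= 0.1 * grad_theta

# Initializer update: learn from the difference between the refined
# samples and its own initial samples (here, a simple mean shift).
init_mu += 0.5 * (x_ref.mean() - x_init.mean())
```

At every noise level of the actual method, an analogous pair of updates is applied, with neural networks in place of the toy parameters above.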
Submission history
From: Yaxuan Zhu [view email][v1] Sun, 10 Sep 2023 22:05:24 GMT (32398kb,D)
[v2] Tue, 12 Sep 2023 20:23:34 GMT (32398kb,D)
[v3] Sun, 24 Mar 2024 07:31:23 GMT (28235kb,D)
[v4] Thu, 18 Apr 2024 04:02:03 GMT (27131kb,D)