Fast Sampling of Score-based Models with Cyclical Diffusion Sampling

Karimul Makhtidi, Alhadi Bustamam, Risman Adnan, Hanif Amal Robbani, Wibowo Mangunwardoyo, Mohammad Asif Khan

Research output: Contribution to journal › Article › peer-review

Abstract

Diffusion models have recently exhibited significant potential in generative modeling, surpassing generative adversarial networks in perceptual quality and autoregressive models in density estimation. However, a notable drawback of these models is their slow sampling, which requires numerous model evaluations to generate high-quality samples. This research proposes a technique that combines cyclical stochastic gradient Langevin dynamics (SGLD) with the elucidated diffusion models (EDM) sampler to enhance stability when only a limited number of sampling steps is used. Cyclical step-size scheduling has been shown to improve the effectiveness of SGLD in learning complex multimodal distributions. We evaluate our method on well-established image generation benchmarks, namely CIFAR-10, AFHQv2, FFHQ, and ImageNet, where state-of-the-art samplers require up to 4000 steps. Our approach reduces the number of steps to as few as 18 to 40 while preserving considerable perceptual quality. For instance, on CIFAR-10, our method achieves a Fréchet inception distance of 1.79 in just 18 steps.
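The core ingredient named in the abstract, a cyclical step-size schedule for SGLD, can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the paper's implementation: it uses the cosine cyclical schedule from the cyclical SGMCMC literature and a plain SGLD update on a standard Gaussian target (whose score is simply -x), rather than a learned score network or the EDM noise parameterization.

```python
import numpy as np

def cyclical_step_sizes(alpha0, total_steps, num_cycles):
    """Cosine cyclical schedule: the step size restarts at alpha0 at the
    start of each cycle and decays toward zero within the cycle."""
    cycle_len = int(np.ceil(total_steps / num_cycles))
    k = np.arange(total_steps)
    return (alpha0 / 2.0) * (np.cos(np.pi * (k % cycle_len) / cycle_len) + 1.0)

def sgld_sample(score_fn, x0, alphas, rng):
    """Plain SGLD: x <- x + a * score(x) + sqrt(2a) * noise,
    with a per-step step size taken from the schedule."""
    x = np.array(x0, dtype=float)
    for a in alphas:
        x = x + a * score_fn(x) + np.sqrt(2.0 * a) * rng.standard_normal(x.shape)
    return x

# Toy target: standard Gaussian, whose score is -x (no learned model here).
rng = np.random.default_rng(0)
alphas = cyclical_step_sizes(alpha0=0.1, total_steps=40, num_cycles=4)
samples = np.stack([
    sgld_sample(lambda x: -x, rng.standard_normal(2), alphas, rng)
    for _ in range(2000)
])
```

Large steps at the start of each cycle encourage exploration between modes, while the decaying steps within a cycle refine samples locally; this is the intuition behind pairing such a schedule with a few-step diffusion sampler.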

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Access
Volume: 12
DOIs
Publication status: Accepted/In press - 2024

Keywords

  • Computational modeling
  • Denoising diffusion probabilistic models
  • Langevin dynamics
  • Mathematical models
  • Noise reduction
  • Perturbation methods
  • Probabilistic logic
  • Score-based generative models
  • Solid modeling
  • Stochastic differential equation
  • Stochastic processes
  • Training
