
Convergence Analysis of Discrete Diffusion Model: Exact Implementation Through Uniformization

Year:    2025

Author:    Hongrui Chen, Lexing Ying

Journal of Machine Learning, Vol. 4 (2025), Iss. 2 : pp. 108–127

Abstract

Diffusion models have achieved remarkable empirical success in data generation tasks. Recently, efforts have been made to adapt the diffusion framework to discrete state spaces, providing a more natural approach for modeling intrinsically discrete data such as language and graphs. This is achieved by formulating both the forward noising process and the corresponding reverse process as continuous-time Markov chains. In this paper, we investigate the theoretical properties of the discrete diffusion model. Specifically, we introduce an algorithm that leverages the uniformization of continuous-time Markov chains, implementing transitions at random time points. Under reasonable assumptions on the learning of the discrete score function, we derive total variation distance and Kullback–Leibler divergence guarantees for sampling from any distribution on a hypercube. Our results align with state-of-the-art guarantees for diffusion models in $\mathbb{R}^d$ and further underscore the advantages of discrete diffusion models over the $\mathbb{R}^d$ setting.
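The uniformization technique mentioned in the abstract simulates a continuous-time Markov chain exactly by drawing a Poisson number of candidate jump times and applying a discrete transition kernel at each one. The sketch below is an illustrative implementation of generic CTMC uniformization (not the paper's algorithm itself, which couples this with a learned discrete score); `Q` is the generator matrix, and the uniformization rate `lam` must dominate every diagonal exit rate.

```python
import numpy as np

def uniformization_sample(Q, x0, T, rng):
    """Simulate a CTMC with generator matrix Q over [0, T] via
    uniformization, returning the state at time T.

    Illustrative sketch only: the paper applies this idea to the
    time-reversed denoising chain with a learned score; here Q is
    any fixed generator on a finite state space.
    """
    lam = np.max(-np.diag(Q))            # uniformization rate: lam >= max_i |Q_ii|
    P = np.eye(Q.shape[0]) + Q / lam     # stochastic kernel of the embedded jump chain
    n_events = rng.poisson(lam * T)      # number of candidate transition times on [0, T]
    x = x0
    for _ in range(n_events):
        # At each Poisson event, jump according to P (self-loops allowed);
        # self-loops are what make the simulation exact rather than approximate.
        x = rng.choice(Q.shape[0], p=P[x])
    return x
```

Because no discretization of time is involved, the returned state is an exact draw from the chain's law at time $T$, which is what enables the paper's exact-implementation guarantees.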

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/jml.240812


Published online:    2025-01


Copyright:    © Global Science Press

Pages:    20

Keywords:    Diffusion model, Sampling, Machine learning theory.

Author Details

Hongrui Chen

Lexing Ying