Year: 2025
Author: Yubin Lu, Zhongjian Wang, Guillaume Bal
East Asian Journal on Applied Mathematics, Vol. 15 (2025), Iss. 4 : pp. 669–700
Abstract
This paper concerns the mathematical analysis of diffusion models in machine learning. The drift term of the backward sampling process is represented as a conditional expectation involving the data distribution and the forward diffusion. The training process aims to find this drift function by minimizing the mean-squared residual associated with the conditional expectation. Using small-time approximations of the Green's function of the forward diffusion, we show that the analytical mean drift function in denoising diffusion probabilistic models (DDPM) and the score function in score-based generative models (SGM) asymptotically blow up in the final stages of the sampling process for singular data distributions, such as those concentrated on lower-dimensional manifolds, and are therefore difficult to approximate by a neural network. To overcome this difficulty, we derive a new target function and an associated loss which remain bounded even for singular data distributions. We validate the theoretical findings with several numerical examples.
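To make the blow-up mechanism concrete, the following is a minimal worked example (compilable LaTeX) for the most singular case, data concentrated at a single point x_0. The notation alpha_t, sigma_t is introduced here only for illustration and is not necessarily the paper's; the rescaled target shown at the end is one standard bounded reparameterization, not the paper's own construction.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Toy setting: the data distribution is a point mass at $x_0$, so the
% forward marginals are Gaussian with mean $\alpha_t x_0$ and variance
% $\sigma_t^2$, where $\sigma_t \to 0$ as the forward time $t \to 0^+$.
Let $p_t(x) = \mathcal{N}\!\left(x;\,\alpha_t x_0,\,\sigma_t^2 I\right)$.
The score is
\[
  \nabla_x \log p_t(x) \;=\; -\frac{x - \alpha_t x_0}{\sigma_t^{2}}.
\]
Writing $x = \alpha_t x_0 + \sigma_t z$ with $z \sim \mathcal{N}(0,I)$ gives
$\nabla_x \log p_t(x) = -z/\sigma_t$, whose magnitude diverges like
$\sigma_t^{-1}$ as $t \to 0^+$, i.e.\ in the final stage of backward
sampling. By contrast, the rescaled quantity
$\sigma_t \nabla_x \log p_t(x) = -z$ remains bounded in distribution,
which illustrates why a suitably rescaled target can avoid the blow-up.
\end{document}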
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/eajam.2024-158.280924
Published online: 2025-01
Copyright: © Global Science Press
Pages: 32
Keywords: Generative model, singularities, Green's kernel, adaptive time-stepping, low-dimensional manifold.