| title | From Score Matching to Diffusion: a fine-grained error analysis in the Gaussian setting |
|---|---|
| start_date | 2025/12/02 |
| schedule | 14h-16h |
| online | no |
| location_info | room Yvette Cauchoix (Perrin Building) |
| summary | Sampling from an unknown distribution, accessible only through discrete samples, is a fundamental problem at the core of generative AI. Current state-of-the-art methods follow a two-step process: first estimating the score function (the gradient of a smoothed log-density), then applying a diffusion-based sampling algorithm. The accuracy of the resulting distribution is affected by three main error sources: the generalization and optimization errors in score matching, and the discretization error in the diffusion. In this paper we provide, in the Gaussian setting, the exact Wasserstein sampling error arising from these three error sources. This allows us to rigorously track how the anisotropy of the data distribution (encoded by its power spectrum) interacts with the key parameters of the end-to-end sampling method. |
| responsibles | Leclaire |
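In the Gaussian setting the score of the noised distribution is available in closed form (it is linear in x), which is what makes an exact error analysis tractable. As a minimal illustration of the two-step pipeline described in the summary (not the paper's method or notation), the sketch below samples from an anisotropic Gaussian by running an Euler-Maruyama discretization of the time-reversed Ornstein-Uhlenbeck diffusion with the exact score; the covariance `Sigma` and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: centered Gaussian with an anisotropic covariance
# (its eigenvalues play the role of the power spectrum).
Sigma = np.diag([4.0, 1.0, 0.25])
d = Sigma.shape[0]

def score(x, t):
    # For the forward OU process dX = -X dt + sqrt(2) dW started at
    # N(0, Sigma), the marginal at time t is
    # N(0, exp(-2t) Sigma + (1 - exp(-2t)) I), so the score
    # grad log p_t(x) = -Sigma_t^{-1} x is exactly linear in x.
    Sigma_t = np.exp(-2 * t) * Sigma + (1 - np.exp(-2 * t)) * np.eye(d)
    return -np.linalg.solve(Sigma_t, x.T).T

def sample(T=5.0, n_steps=500, n_samples=2000):
    # Euler-Maruyama discretization of the time-reversed SDE
    # dX = (X + 2 * score(X, t)) dt + sqrt(2) dW, run from t = T to 0.
    dt = T / n_steps
    x = rng.standard_normal((n_samples, d))  # prior N(0, I) ~ p_T
    for k in range(n_steps):
        t = T - k * dt
        drift = x + 2 * score(x, t)
        x = x + drift * dt + np.sqrt(2 * dt) * rng.standard_normal(x.shape)
    return x

samples = sample()
# The empirical covariance should be close to Sigma, up to the
# initialization and discretization errors discussed in the summary.
print(np.round(np.cov(samples.T), 2))
```

With the exact score, the only remaining error sources in this sketch are the initialization of the reverse diffusion at N(0, I) instead of the true p_T, and the Euler-Maruyama discretization; score-matching errors would enter through an approximate `score` function.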
Workflow history

| from state | to state | comment | date |
|---|---|---|---|
| submitted | published | | 2025/11/27 11:01 UTC |