arXiv Analytics

arXiv:2407.11133 [cond-mat.stat-mech]

Discrete generative diffusion models without stochastic differential equations: a tensor network approach

Luke Causer, Grant M. Rotskoff, Juan P. Garrahan

Published 2024-07-15 (Version 1)

Diffusion models (DMs) are a class of generative machine learning methods that sample a target distribution by transforming samples from a trivial (often Gaussian) distribution via a learned stochastic differential equation. In standard DMs this is done by learning a ``score function'' that reverses the effect of adding diffusive noise to the distribution of interest. Here we consider the generalisation of DMs to lattice systems with discrete degrees of freedom, where noise is added via Markov-chain jump dynamics. We show how to use tensor networks (TNs) to efficiently define and sample such ``discrete diffusion models'' (DDMs) without explicitly solving a stochastic differential equation. We show the following: (i) by parametrising the data and the evolution operators as TNs, the denoising dynamics can be represented exactly; (ii) the auto-regressive nature of TNs allows samples to be generated efficiently and without bias; (iii) for sampling Boltzmann-like distributions, TNs allow one to construct an efficient learning scheme that integrates well with Monte Carlo methods. We illustrate this approach by studying the equilibrium of two models with non-trivial thermodynamics: the $d=1$ constrained Fredkin chain and the $d=2$ Ising model.
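To make point (ii) concrete, the sketch below shows exact auto-regressive (sequential) sampling from a tensor network in its simplest setting: a non-negative matrix product state (MPS) whose left-to-right contraction gives an unnormalised probability over lattice configurations. This is an illustrative assumption, not the paper's actual DDM construction (which works with Born-rule TN parametrisations and learned denoising operators); the function name `sample_mps` is hypothetical.

```python
import numpy as np

def sample_mps(tensors, rng):
    """Draw one exact sample from a non-negative MPS.

    tensors: list of arrays A_k of shape (D_left, d, D_right), with
    boundary bond dimensions equal to 1, so that the unnormalised
    probability of a configuration (s_1, ..., s_n) is the product
    A_1[:, s_1, :] @ A_2[:, s_2, :] @ ... @ A_n[:, s_n, :].
    """
    n = len(tensors)
    # Right environments: R[k] marginalises all sites to the right of site k.
    R = [None] * (n + 1)
    R[n] = np.ones(1)
    for k in range(n - 1, -1, -1):
        M = tensors[k].sum(axis=1)        # sum out the physical index of site k
        R[k] = M @ R[k + 1]
    L = np.ones(1)                        # left boundary vector, grows with the sample
    sample = []
    for k in range(n):
        A = tensors[k]
        # Unnormalised conditional weights p(s_k | s_1, ..., s_{k-1}).
        w = np.einsum('i,isj,j->s', L, A, R[k + 1])
        w = w / w.sum()
        s = int(rng.choice(len(w), p=w))
        sample.append(s)
        L = L @ A[:, s, :]                # absorb the chosen local state
    return sample

# Example: three independent sites that are deterministically in state 1.
site = np.array([0.0, 1.0]).reshape(1, 2, 1)
print(sample_mps([site, site, site], np.random.default_rng(0)))  # → [1, 1, 1]
```

Because each site is drawn from its exact conditional distribution given the sites already fixed, the resulting samples are unbiased draws from the TN distribution, with no Markov-chain burn-in or rejection step.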
