A high-performance PyTorch library that makes Energy-Based Models accessible and efficient for researchers and practitioners alike.
Overview
Energy-based models assign a scalar energy to each input, implicitly defining a probability distribution where lower energy means higher probability. TorchEBM gives you composable PyTorch building blocks that span this landscape, from energy functions and MCMC samplers to flow matching and diffusion-based generation.
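As a minimal, library-agnostic illustration of that definition (the `MLPEnergy` class below is hypothetical, not TorchEBM's API), an energy function is just a network that returns one scalar per input, and the negated energy is the unnormalized log-probability:

```python
import torch
import torch.nn as nn

# Hypothetical minimal energy network: maps each input point to a single
# scalar energy; lower energy corresponds to higher probability.
class MLPEnergy(nn.Module):
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # shape (batch,): one energy per sample

energy = MLPEnergy()
x = torch.randn(8, 2)
e = energy(x)              # scalar energies
log_unnorm_prob = -e       # p(x) ∝ exp(-E(x))
```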
In Action
Equilibrium matching with different interpolants transforming noise into structured distributions.
Core Components

- Core: Base classes, energy models, schedulers, and the device/dtype management layer shared across all components.
- Samplers: Draw samples from energy landscapes via MCMC methods, gradient-based optimization, or learned flow/diffusion dynamics.
- Loss Functions: Training objectives including contrastive divergence, score matching variants, and equilibrium matching.
- Interpolants: Define how noise and data are mixed along a continuous time path for flow matching and diffusion.
- Integrators: Numerical solvers for SDEs, ODEs, and Hamiltonian systems. Pluggable into samplers and generation pipelines.
- Models: Neural architectures for energy functions and velocity fields, including vision transformers and guidance wrappers.
- Datasets: Synthetic 2D distributions for rapid prototyping and visual evaluation. All are PyTorch Dataset objects.
- CUDA: CUDA-accelerated kernels and mixed precision support for performance-critical sampling and training.
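To make the sampler component concrete, here is a library-agnostic sketch of unadjusted Langevin dynamics, one of the MCMC methods mentioned above, written in plain PyTorch rather than TorchEBM's own API (the function name `langevin_sample` is illustrative):

```python
import torch

def langevin_sample(energy_fn, x, n_steps=100, step_size=1e-2):
    """Unadjusted Langevin dynamics:
    x <- x - (step/2) * grad E(x) + sqrt(step) * noise."""
    for _ in range(n_steps):
        x = x.detach().requires_grad_(True)
        grad = torch.autograd.grad(energy_fn(x).sum(), x)[0]
        x = x - 0.5 * step_size * grad + step_size ** 0.5 * torch.randn_like(x)
    return x.detach()

# Quadratic energy E(x) = ||x||^2 / 2, so the target is a standard Gaussian.
quad_energy = lambda x: 0.5 * (x ** 2).sum(dim=-1)
samples = langevin_sample(quad_energy, torch.randn(512, 2),
                          n_steps=500, step_size=0.05)
```

For the quadratic energy the chain's stationary distribution is close to a standard Gaussian, so the sample mean and standard deviation should land near 0 and 1 respectively.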
Quick Start
Train a generative model with Equilibrium Matching, then sample with both a flow solver and an energy-based sampler.
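TorchEBM's quick-start API isn't reproduced here; as a library-agnostic sketch of the related flow-matching objective in plain PyTorch (all names below are illustrative, not TorchEBM's), a velocity network is trained to match the constant velocity of a linear interpolant between noise and data, then sampled with Euler integration:

```python
import torch
import torch.nn as nn

# Velocity net v(x_t, t): input is 2D point plus time, output is a 2D velocity.
velocity = nn.Sequential(nn.Linear(3, 64), nn.SiLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(velocity.parameters(), lr=1e-3)

for step in range(200):
    x1 = torch.randn(256, 2) * 0.3 + 1.0     # toy "data": a Gaussian blob at (1, 1)
    x0 = torch.randn_like(x1)                # noise
    t = torch.rand(256, 1)
    xt = (1 - t) * x0 + t * x1               # linear interpolant
    pred = velocity(torch.cat([xt, t], dim=-1))
    loss = ((pred - (x1 - x0)) ** 2).mean()  # match the interpolant's velocity
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sampling: integrate dx/dt = v(x, t) from noise (t=0) to data (t=1).
x = torch.randn(256, 2)
n = 50
for i in range(n):
    t = torch.full((256, 1), i / n)
    x = x + (1 / n) * velocity(torch.cat([x, t], dim=-1))
```

The same structure carries over to equilibrium matching, which replaces the time-conditional velocity with a time-independent equilibrium gradient field; see the tutorials for the library's actual training and sampling calls.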
See the tutorials and examples for CIFAR-10 generation, score matching, and more.
Enjoying TorchEBM? A GitHub star helps others discover the project and motivates continued development.
Citation
If TorchEBM is useful in your research, please cite it:

