-
Adjoint matching and stochastic optimal control for diffusion models
How optimal control theory provides a principled framework for fine-tuning flow-based generative models, from the Hamilton-Jacobi-Bellman equation to the adjoint matching algorithm.
-
The linear SDE viewpoint on flow matching and diffusion
How linear SDEs provide a unified framework that covers the entire space of flow matching and diffusion models.
-
A differential geometry toolkit for machine learning
The essential machinery of Riemannian geometry, from metrics and connections to curvature tensors, with an eye toward applications in representation learning and generative modeling.
-
Representation learning, generative models, and the manifold hypothesis
How the geometry of learned coordinate systems connects generative modeling to representation learning, and what this means for finding good low-dimensional descriptions of data.