Remember how, when the lockdowns started, every organization said "we only have two weeks of cash on hand and will shut down if we don't get assistance"? That's basically happening to every single lab and NGO right now, except for no actual reason.
28.01.2025 11:11
YouTube video by Mathématiques et informatique - Collège de France
Génération de données en IA par transport et débruitage (1) - Stéphane Mallat (2024-2025)
For the French-speaking audience, S. Mallat's courses at the Collège de France on data generation in AI by transport and denoising have just started. I highly recommend them; I've learned a lot from the overall vision of his courses.
Recordings are also available: www.youtube.com/watch?v=5zFh...
20.01.2025 17:49
Slides for a general introduction to the use of optimal transport methods in learning, with an emphasis on diffusion models, flow matching, training two-layer neural networks, and deep transformers. speakerdeck.com/gpeyre/optim...
15.01.2025 19:08
lmbp.uca.fr/stflour/
05.01.2025 04:48
I'm delighted to note that our paper InDI has been selected as one of two Outstanding Paper awardees by Transactions on Machine Learning Research @tmlr-pub.bsky.social.
We sincerely thank the expert reviewers, Action Editors, the Outstanding Paper Committee, and the Editors for this honor.
1/3
19.12.2024 23:41
BreimanLectureNeurIPS2024_Doucet.pdf
The slides of my NeurIPS lecture "From Diffusion Models to Schrödinger Bridges - Generative Modeling meets Optimal Transport" can be found here
drive.google.com/file/d/1eLa3...
15.12.2024 18:40
I love a good illustration
15.12.2024 00:40
After watching this beautiful keynote by @arnauddoucet.bsky.social , I *had* to give these Schrödinger bridges a try! Very interesting to be able to "straighten" a basic flow-matching approach. Super cool work by @vdebortoli.bsky.social & co-author!
14.12.2024 11:57
SciForDL'24
Speaking at this #NeurIPS2024 workshop on a new analytic theory of creativity in diffusion models that predicts what new images they will create and explains how these images are constructed as patch mosaics of the training data. Great work by @masonkamb.bsky.social
scienceofdlworkshop.github.io
14.12.2024 17:01
Don't miss it!
14.12.2024 04:40
NeurIPS 2024 Workshop on Adaptive Foundation Models
I've been getting a lot of questions about autoregression vs diffusion at #NeurIPS2024 this week! I'm speaking at the adaptive foundation models workshop at 9AM tomorrow (West Hall A), about what happens when we combine modalities and modelling paradigms.
adaptive-foundation-models.org
14.12.2024 04:02
Fantastic #neurips keynote by Arnaud Doucet! I really like this slide tracing many of the modern flow-matching / stochastic interpolant ideas back to a 1986 result by the probabilist István Gyöngy describing how to "Markovianize" a diffusion process (e.g. one whose coefficients depend on all of the past).
13.12.2024 15:45
You enjoyed @arnauddoucet.bsky.social's talk but want even more Schrödinger bridges? Come talk to me at our poster!
Schrödinger Bridge Flow for Unpaired Data Translation
East Exhibit Hall A-C #2504
Work done with my amazing collaborators Ira Korshunova,
Andriy Mnih, and @arnauddoucet.bsky.social
13.12.2024 17:52
It's located near the west entrance, on the west side of the conference center, on the first floor, in case that helps!
When a bunch of diffusers sit down and talk shop, their flow cannot be matched!
It's time for the #NeurIPS2024 diffusion circle!
Join us at 3PM on Friday, December 13. We'll meet near this thing, and venture out from there to find a good spot to sit. Tell your friends!
12.12.2024 01:15
100% agree. OT is not (or rarely) a goal in itself but rather a means of enforcing useful properties
05.12.2024 08:15
Have you ever wondered why diffusion models memorize, and why all initializations lead to the same training sample? As we show, this is because, as in dynamical systems, the memorized sample acts as an attractor, and a corresponding basin of attraction forms along the denoising trajectory.
04.12.2024 21:03
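The attractor claim can be illustrated with a toy fixed-point iteration (a sketch, not the paper's construction: the 1-D dataset, noise level, and iteration count below are made up). For a finite training set under Gaussian noise, the MMSE denoiser is a softmax-weighted average of the training points; iterating it pulls different initializations to the training sample whose basin they started in.

```python
import numpy as np

def optimal_denoiser(x, data, sigma):
    # Posterior mean E[x0 | x] when x = x0 + sigma * noise and x0 is
    # uniform over a finite training set: a softmax-weighted average.
    d2 = (x - data) ** 2
    w = np.exp(-(d2 - d2.min()) / (2 * sigma ** 2))
    w /= w.sum()
    return float(w @ data)

data = np.array([-2.0, 0.5, 3.0])  # toy 1-D "training set"

def denoise_trajectory(x_init, sigma=0.8, n_iters=50):
    # Repeatedly apply the denoiser at a fixed noise level: a crude
    # stand-in for following the denoising trajectory.
    x = x_init
    for _ in range(n_iters):
        x = optimal_denoiser(x, data, sigma)
    return x

# Initializations in the same basin are pulled to the same training sample.
print(denoise_trajectory(2.0), denoise_trajectory(4.0))  # both land near 3.0
print(denoise_trajectory(-1.0))                          # lands near -2.0
```

The memorized samples behave as attractors: every start point in a basin converges to (almost exactly) the same training point.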
Optimal transport, convolution, and averaging each define an interpolation between probability distributions. One can find vector fields advecting particles that match these interpolations: the Benamou-Brenier, flow-matching, and Dacorogna-Moser fields, respectively.
04.12.2024 13:55
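For the first of these, here is a minimal 1-D sketch (Gaussians with arbitrarily chosen parameters, no libraries beyond numpy): in the Benamou-Brenier / displacement picture, each particle moves along the straight line from x0 to the monotone OT map T(x0) with constant velocity T(x0) - x0, and the time-t cloud is again Gaussian with linearly interpolated mean and standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)
m0, s0, m1, s1 = -1.0, 0.5, 2.0, 1.5  # source and target Gaussians

def T(x):
    # Monotone (OT) map between two 1-D Gaussians.
    return m1 + (s1 / s0) * (x - m0)

x0 = m0 + s0 * rng.standard_normal(5000)  # particles from the source

def advect(x0, t):
    # Displacement interpolation: constant-velocity straight-line motion
    # from x0 to T(x0); this is exactly the flow of the Benamou-Brenier field.
    return (1 - t) * x0 + t * T(x0)

xt = advect(x0, 0.5)
# The time-t marginal has linearly interpolated mean and std.
print(xt.mean(), xt.std())  # ≈ 0.5*(m0+m1) = 0.5 and 0.5*(s0+s1) = 1.0
```

Averaging the two densities instead (the mixture 0.5*p0 + 0.5*p1) gives a different interpolation: no particle moves, mass is just reweighted, which is what the Dacorogna-Moser-type field has to reproduce.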
Hellinger and Wasserstein are the two main geodesic distances on probability distributions. While both minimize the same energy, they differ in their interpolation methods: Hellinger focuses on density, whereas Wasserstein emphasizes position displacements.
03.12.2024 17:16
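A grid-based sketch of the contrast (two unit-width 1-D Gaussians at ±3; the renormalized square-root interpolation below stands in for the Hellinger geodesic): the Wasserstein midpoint is a single displaced bump, while interpolating densities in place keeps two bumps with a valley between them.

```python
import numpy as np

x = np.linspace(-8, 8, 2001)
dx = x[1] - x[0]

def gauss(m, s):
    # Discretized Gaussian density, normalized on the grid.
    p = np.exp(-((x - m) ** 2) / (2 * s ** 2))
    return p / (p.sum() * dx)

p0, p1 = gauss(-3, 1.0), gauss(3, 1.0)
t = 0.5

# Wasserstein geodesic between 1-D Gaussians: displace the mass, so the
# midpoint is again one Gaussian with interpolated mean and std.
w = gauss((1 - t) * (-3) + t * 3, 1.0)

# Hellinger-style interpolation: act on densities pointwise by
# interpolating square-root densities and renormalizing. Mass is created
# and destroyed in place, so the midpoint stays bimodal.
h = ((1 - t) * np.sqrt(p0) + t * np.sqrt(p1)) ** 2
h /= h.sum() * dx

print((x * w).sum() * dx)  # Wasserstein midpoint mean ≈ 0: the bump moved
print(h[np.abs(x) < 0.5].max() < h[np.abs(x - 3) < 0.5].max())  # valley at 0
```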
This is a really nice blog post by
@RuiqiGao and team that I enjoyed being a part of. My favorite takeaways:
- DDIM sampler == flow matching sampling
- (Not) straight?
- The SD3 weighting (Esser, Rombach, et al.) is very similar to the EDM weighting (Karras, et al.)
03.12.2024 13:26
ahah yeah, apologies for this, I am slowly learning how to write for a non-theoretical probability crowd, but it's a process
02.12.2024 23:36
Yeah, I was referring to the coupling obtained after the flow matching operation (or "Reflow"). It's an interesting object in itself: not exactly OT, but it still exhibits *some* level of straightness.
02.12.2024 23:29
New Datasets Will Train AI Models To Think Like Scientists
New datasets from @polymathicai.bsky.social available on @hf.co will train AI models to think like scientists. Read more: www.simonsfoundation.org/2024/12/02/n... #science #AI #machinelearning
02.12.2024 21:28
Augmented Bridge Matching
Flow and bridge matching are a novel class of processes which encompass diffusion models. One of the main aspects of their increased flexibility is that these models can interpolate between arbitrary d...
I am a broken record, but yeah, totally agree. If you iterate FM on that coupling you do get OT, though (if you add a bit of noise). In the case of noisy FM we showed that the only coupling left invariant by noisy FM is the EOT one, in arxiv.org/abs/2311.06978
02.12.2024 19:22
A common question nowadays: which is better, diffusion or flow matching?
Our answer: They're two sides of the same coin. We wrote a blog post to show how diffusion models and Gaussian flow matching are equivalent. That's great: it means you can use them interchangeably.
02.12.2024 18:45
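The equivalence can be checked in a few lines on one Gaussian path (a toy sketch with an oracle x0-prediction and the linear schedule x_t = (1-t)*x0 + t*eps; real DDIM uses a variance-preserving schedule, but the correspondence survives time reparametrization): an exact flow-matching ODE step and the DDIM-style "re-noise the predicted x0" step land on the same point.

```python
import numpy as np

rng = np.random.default_rng(1)
x0 = rng.standard_normal(4)   # "data" sample
eps = rng.standard_normal(4)  # noise sample

def xt(t):
    # One Gaussian probability path, written in flow-matching form:
    # t = 0 is data, t = 1 is noise.
    return (1 - t) * x0 + t * eps

t, s = 0.8, 0.5  # step from noise level t down to s

# Flow-matching view: the conditional velocity is dx/dt = eps - x0,
# so an exact ODE step is x_s = x_t + (s - t) * (eps - x0).
x_fm = xt(t) + (s - t) * (eps - x0)

# DDIM-style view (with an oracle x0-prediction): infer the noise,
# then re-noise the predicted x0 to the new level.
x0_hat = x0
eps_hat = (xt(t) - (1 - t) * x0_hat) / t
x_ddim = (1 - s) * x0_hat + s * eps_hat

print(np.allclose(x_fm, x_ddim))  # True: same update, two parametrizations
```

With a learned (non-oracle) denoiser the two samplers likewise coincide step by step, since both updates are functions of the same prediction.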
What you are showing is the coupling *before* the flow matching procedure, though, right? After the flow matching procedure the coupling is modified (image from arxiv.org/abs/2209.03003)
02.12.2024 14:41
(Specific to diffusion models, but it goes in the direction of what Sander was suggesting: i.e. these models learn a somewhat robust data/Gaussian coupling)
30.11.2024 22:52
What about arxiv.org/abs/2310.05264?
30.11.2024 22:50
Diffusion Schrödinger Bridge Matching
Solving transport problems, i.e. finding a map transporting one given distribution to another, has numerous applications in machine learning. Novel mass transport methods motivated by generative model...
Yeah, in the sense of RF. Although RF won't get you to OT (Qiang Liu himself has a counterexample). But if you consider noisy flow matching (à la stochastic interpolants), then this procedure converges to EOT. Shameless plug + concurrent paper: arxiv.org/abs/2303.16852 + arxiv.org/abs/2304.00917
30.11.2024 12:35