
Arnaud Doucet

@arnauddoucet.bsky.social

Senior Staff Research Scientist @Google DeepMind, previously Stats Prof @Oxford Uni - interested in Computational Statistics, Generative Modeling, Monte Carlo methods, Optimal Transport.

857 Followers  |  217 Following  |  10 Posts  |  Joined: 04.12.2024

Latest posts by arnauddoucet.bsky.social on Bluesky


(1/n) 🚨 Train a model solving DFT for any geometry with almost no training data
Introducing Self-Refining Training for Amortized DFT: a variational method that predicts ground-state solutions across geometries and generates its own training data!
📜 arxiv.org/abs/2506.01225
💻 github.com/majhas/self-...

10.06.2025 19:49 · 👍 12  🔁 4  💬 1  📌 1
Honorary Researcher Shun-ichi Amari receives the Kyoto Prize: Shun-ichi Amari, Honorary Researcher (primary affiliation: Project Professor, Advanced Comprehensive Research Organization, Teikyo University), has been awarded the 40th (2025) Kyoto Prize (Advanced Technology category; field: Information Science) in recognition of his pioneering research in artificial neural networks, machine learning, and information geometry.

Shunichi Amari has been awarded the 40th (2025) Kyoto Prize in recognition of his pioneering research in the fields of artificial neural networks, machine learning, and information geometry

www.riken.jp/pr/news/2025...

20.06.2025 13:26 · 👍 35  🔁 12  💬 2  📌 0
LOGML 2025 London Geometry and Machine Learning Summer School, July 7-11 2025

🌟 Applications open – LOGML 2025 🌟

👥 Mentor-led projects, expert talks, tutorials, socials, and a networking night
✍️ Application form: logml.ai
🔬 Projects: www.logml.ai/projects.html
📅 Apply by 6th April 2025
✉️ Questions? logml.committee@gmail.com

#MachineLearning #SummerSchool #LOGML #Geometry

11.03.2025 15:24 · 👍 20  🔁 9  💬 2  📌 1

Just write a short informal email. If the person needs a long-winded polite email to answer, then perhaps you don't want to have to interact with them.

09.03.2025 13:42 · 👍 1  🔁 0  💬 1  📌 0

SuperDiff goes super big!
- Spotlight at #ICLR2025! 🥳
- Stable Diffusion XL pipeline on HuggingFace huggingface.co/superdiff/su... made by Viktor Ohanesian
- New results for molecules in the camera-ready arxiv.org/abs/2412.17762
Let's celebrate with a prompt-guessing game in the thread 👇

06.03.2025 21:06 · 👍 14  🔁 4  💬 1  📌 1

Why academia is sleepwalking into self-destruction. My editorial in @brain1878.bsky.social. If you agree with the sentiments, please repost. It's important for all our sakes to stop the madness.
academic.oup.com/brain/articl...

06.03.2025 19:15 · 👍 534  🔁 309  💬 51  📌 104

Excited to see our paper “Computing Nonequilibrium Responses with Score-Shifted Stochastic Differential Equations” in Physical Review Letters this morning as an Editor's Suggestion! It uses ideas from generative modeling to unravel a rather technical problem. 🧵 journals.aps.org/prl/abstract...

04.03.2025 18:45 · 👍 10  🔁 2  💬 1  📌 0

Great intro to PAC-Bayes bounds. Highly recommended!

05.03.2025 09:55 · 👍 12  🔁 2  💬 0  📌 0

Well, you can do it, but we don't have any proof. We actually also ran alpha-DSBM with zero-variance noise, i.e. really an "alpha-rectified flow": experimentally it does "work", but we have no proof of convergence for the procedure.

08.02.2025 17:54 · 👍 3  🔁 0  💬 0  📌 0

Yes, the trajectories are not quite smooth, as they correspond to a Brownian bridge; as the variance of the reference Brownian motion of your SB goes to zero, you recover the deterministic, straight paths of OT.

08.02.2025 12:02 · 👍 4  🔁 1  💬 1  📌 0
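The behaviour described in that reply is easy to see numerically. A minimal numpy sketch (illustrative only, not code from any of the papers): simulate a Brownian bridge between two endpoints with reference noise level sigma; the drift pulls each point towards the endpoint, and at sigma = 0 the path collapses onto the straight-line interpolation.

```python
import numpy as np

def brownian_bridge(x0, x1, sigma, n_steps=100, rng=None):
    # Sample a 1D Brownian bridge from x0 (at t=0) to x1 (at t=1)
    # with reference noise level sigma. As sigma -> 0 the sampled
    # path becomes the deterministic straight line from x0 to x1.
    rng = np.random.default_rng() if rng is None else rng
    ts = np.linspace(0.0, 1.0, n_steps + 1)
    path = [x0]
    x = x0
    for i in range(n_steps):
        t, s = ts[i], ts[i + 1]
        dt = s - t
        # Bridge drift: pull towards the endpoint x1.
        mean = x + (x1 - x) * dt / (1.0 - t)
        # Conditional variance of the bridge increment.
        var = sigma**2 * dt * (1.0 - s) / (1.0 - t)
        x = mean + np.sqrt(var) * rng.standard_normal()
        path.append(x)
    return np.array(path)
```

With sigma = 0 the returned path coincides with `np.linspace(x0, x1, n_steps + 1)`; with sigma > 0 the interior wiggles while both endpoints stay pinned.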

Better diffusions with scoring rules!

Fewer, larger denoising steps using distributional losses; learn the posterior distribution of clean samples given the noisy versions.

arxiv.org/pdf/2502.02483

@vdebortoli.bsky.social Galashov Guntupalli Zhou @sirbayes.bsky.social @arnauddoucet.bsky.social

05.02.2025 14:23 · 👍 29  🔁 8  💬 1  📌 3
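For intuition on "distributional losses": one proper scoring rule in this family is the energy score, sketched below in numpy (an illustrative example only; the paper's exact loss and parameterization may differ). Unlike an MSE loss, it is minimized by matching the whole posterior distribution of the clean sample given the noisy one, not just its mean.

```python
import numpy as np

def energy_score_loss(model_samples, target):
    # Empirical energy score for one observation.
    # model_samples: (m, d) draws from the model's predicted posterior
    # over clean data; target: (d,) the observed clean sample.
    # ES = E||X - y|| - 0.5 * E||X - X'||, a proper scoring rule:
    # the second term rewards spread, so the minimizer matches the
    # full conditional distribution rather than collapsing to its mean.
    term1 = np.mean(np.linalg.norm(model_samples - target, axis=1))
    diffs = model_samples[:, None, :] - model_samples[None, :, :]
    term2 = np.mean(np.linalg.norm(diffs, axis=-1))
    return term1 - 0.5 * term2
```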
On the Asymptotics of Importance Weighted Variational Inference For complex latent variable models, the likelihood function is not available in closed form. In this context, a popular method to perform parameter estimation is Importance Weighted Variational Infere...

A standard ML approach for parameter estimation in latent variable models is to maximize the expectation of the logarithm of an importance sampling estimate of the intractable likelihood. We provide consistency/efficiency results for the resulting estimate: arxiv.org/abs/2501.08477

16.01.2025 17:43 · 👍 30  🔁 2  💬 0  📌 0
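The objective in that post is the importance-weighted ELBO: the log of the average importance weight p(x, z_k)/q(z_k) over K proposal draws. A toy numpy sketch of the estimator (hypothetical helper names, illustrative only):

```python
import numpy as np

def iwelbo(log_joint, log_q, z_samples):
    # Importance-weighted ELBO estimate: log (1/K) sum_k w_k with
    # importance weights w_k = p(x, z_k) / q(z_k), z_k ~ q.
    # Maximizing its expectation over the proposal draws is the
    # parameter-estimation objective discussed above.
    log_w = log_joint(z_samples) - log_q(z_samples)
    # Log-sum-exp for numerical stability.
    m = np.max(log_w)
    return m + np.log(np.mean(np.exp(log_w - m)))
```

Sanity check: if log_joint differs from log_q by a constant c (i.e. the weights are constant), the estimator returns exactly c, with zero variance.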

Speculative sampling accelerates inference in LLMs by drafting future tokens which are verified in parallel. With @vdebortoli.bsky.social, A. Galashov & @arthurgretton.bsky.social, we extend this approach to (continuous-space) diffusion models: arxiv.org/abs/2501.05370

10.01.2025 16:30 · 👍 46  🔁 10  💬 1  📌 0
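For readers unfamiliar with the baseline being extended: the standard discrete-token verification step looks like the sketch below (this is the usual LLM recipe, not the continuous-space diffusion version in the paper). A draft sample from the cheap proposal q is accepted with probability min(1, p/q); on rejection one resamples from the renormalized residual max(p - q, 0), so the output is exactly distributed according to the target p.

```python
import numpy as np

def speculative_step(p, q, rng):
    # One speculative-sampling verification step for a discrete
    # distribution. p: target probabilities, q: draft (proposal)
    # probabilities, both 1D arrays summing to 1.
    x = rng.choice(len(q), p=q)          # cheap draft sample
    if rng.random() < min(1.0, p[x] / q[x]):
        return x                          # draft accepted
    # Rejected: resample from the residual distribution,
    # which corrects the draft so the output is exactly ~ p.
    residual = np.maximum(p - q, 0.0)
    residual /= residual.sum()
    return rng.choice(len(p), p=residual)
```

Averaged over many calls, the empirical output frequencies match p even though most samples come from the cheap draft distribution q.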

I personally read at least a couple of hours per day. It is not particularly focused and I might "waste" time but I just enjoy it.

08.01.2025 08:05 · 👍 14  🔁 1  💬 1  📌 1

Very nice paper indeed. I like it.

27.12.2024 16:38 · 👍 2  🔁 0  💬 0  📌 0

🔊 Super excited to announce the first-ever Frontiers of Probabilistic Inference: Learning meets Sampling workshop at #ICLR2025 @iclr-conf.bsky.social!

🔗 Website: sites.google.com/view/fpiwork...

🔥 Call for papers: sites.google.com/view/fpiwork...

More details in the thread below 👇🧵

18.12.2024 19:09 · 👍 84  🔁 19  💬 2  📌 3

Schrödinger Bridge Flow for Unpaired Data Translation (by @vdebortoli.bsky.social et al.)

It will take me some time to digest this article fully, but it's important to follow the authors' advice and read the appendices, as the examples are helpful and well-illustrated.

📄 arxiv.org/abs/2409.09347

17.12.2024 16:53 · 👍 22  🔁 10  💬 0  📌 0
BreimanLectureNeurIPS2024_Doucet.pdf

The slides of my NeurIPS lecture "From Diffusion Models to Schrödinger Bridges - Generative Modeling meets Optimal Transport" can be found here:
drive.google.com/file/d/1eLa3...

15.12.2024 18:40 · 👍 327  🔁 67  💬 9  📌 6

One #postdoc position is still available at the National University of Singapore (NUS) to work on sampling, high-dimensional data-assimilation, and diffusion/flow models. Applications are open until the end of January. Details:

alexxthiery.github.io/jobs/2024_di...

15.12.2024 14:46 · 👍 41  🔁 18  💬 0  📌 0

I couldn't speak for the following 3 days :-)

14.12.2024 22:13 · 👍 2  🔁 0  💬 0  📌 0

I have updated my course notes on Optimal Transport with a new Chapter 9 on Wasserstein flows. It includes 3 illustrative applications: training a 2-layer MLP, deep transformers, and flow-matching generative models. You can access it here: mathematical-tours.github.io/book-sources...

04.12.2024 08:11 · 👍 103  🔁 19  💬 2  📌 0
On the optimality of coin-betting for mean estimation Confidence sequences are sequences of confidence sets that adapt to incoming data while maintaining validity. Recent advances have introduced an algorithmic formulation for constructing some of the ti...

Exciting new work by my truly brilliant postdoc Eugenio Clerico on the optimality of coin-betting strategies for mean estimation!

For fans of: mean estimation, online learning with log loss, optimal portfolios, hypothesis testing with E-values, etc.

Dig in:
arxiv.org/abs/2412.02640

04.12.2024 08:13 · 👍 41  🔁 8  💬 2  📌 0
