@mathurinmassias.bsky.social
Tenured Researcher @INRIA, Ockham team. Teacher @Polytechnique and @ENSdeLyon: Machine Learning, Python and Optimization
With @quentinbertrand.bsky.social we have one offer for a Marie Sklodowska-Curie postdoctoral fellowship at Inria, to work on generative models: www.inria.fr/en/marie-skl...
Contact me if interested! RT appreciated ❤️
We are recruiting for four positions connected to Machine Learning, Statistical Learning, and AI for Science in the Applied Mathematics department at École polytechnique. Join our vibrant community at IP Paris and the Hi! Paris AI center. List below 🧵 tinyurl.com/3jpw9t26
06.02.2026 07:56 — 👍 11 🔁 19 💬 1 📌 0
There is an Associate Professor position in CS at ENS Lyon, with potential integration into my team, starting in September 2026: DM me if interested!
Details at www.ens-lyon.fr/LIP/images/P...
[Image: three snippets of Python code showing how to use skrub Data Ops with the Optuna optimization library. The first snippet shows a standard randomized search with the Data Ops; the second adds the parameter "backend", set to "optuna"; the third uses the Optuna visualization API to plot information from the study.]
Did you know that the skrub Data Ops support Optuna as a backend for hyperparameter search?
It's as easy as writing "backend='optuna'": this sets up a default Optuna study (with the TPE sampler) in place of the standard random sampler.
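A minimal sketch of what that looks like, following the snippets described in the image above. The plan construction (skrub.var, .skb.mark_as_X / .skb.mark_as_y, choose_float, .skb.make_randomized_search) follows skrub's Data Ops API, but argument names may vary across versions; only the backend="optuna" keyword comes from the post itself.

```python
import skrub
from sklearn.datasets import fetch_california_housing
from sklearn.linear_model import Ridge

# Build a small Data Ops plan with one tunable hyperparameter.
df = fetch_california_housing(as_frame=True).frame
data = skrub.var("data", df)
X = data.drop(columns="MedHouseVal").skb.mark_as_X()
y = data["MedHouseVal"].skb.mark_as_y()
pred = X.skb.apply(
    Ridge(alpha=skrub.choose_float(1e-3, 1e2, log=True, name="alpha")), y=y
)

# First snippet: standard randomized search over the declared choices.
search = pred.skb.make_randomized_search(n_iter=20, fitted=True)

# Second snippet: the same call, delegating sampling to Optuna
# (TPE sampler by default) -- the extra keyword is the only change.
search_optuna = pred.skb.make_randomized_search(
    n_iter=20, fitted=True, backend="optuna"
)
```

The third snippet in the image then passes the resulting Optuna study to the optuna.visualization API; how the study object is exposed on the search result is not shown here.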
Registration for the MODE 2026 days in Nice is now open. They will take place March 18–20 at the Hôtel Saint-Paul.
Registration is open until March 1 (late fee after February 9). The deadline for submitting a contribution is **January 15**.
The #Peyresq2026 #GRETSI summer school will be devoted to "Generative models and optimal transport", June 21–27, 2026
gretsi.fr/peyresq2026
Registration open from December 18, 2025 to February 27, 2026
Peyresq summer school on generative models and optimal transport: www.gretsi.fr/peyresq2026 (courses in French). Application deadline: February 27
12.12.2025 13:22 — 👍 9 🔁 6 💬 0 📌 0
OpenReview opened the door to continuous and major revisions that nobody has time to check properly.
I think we should go back to short, one-page PDF replies to reviews. It would mean getting decisions faster, so that we actually have time to work on papers before resubmitting them.
[CNRS competition] In case, like me, you were waiting for this and hadn't seen it go by.
Competition opens: (in principle) today, December 8
Applications accepted until January 7
Some advice I've gathered for these competitions (especially sections 1, 2 and 3): mathurinm.github.io/cnrs_inria_a...
02.12.2025 13:51 — 👍 9 🔁 7 💬 0 📌 1
One more reason to subscribe to the #GRETSI conference YouTube channel:
"Optimal transport, from Monge to deep learning", Julie Delon's plenary talk at #GRETSI2025
www.youtube.com/watch?v=ujYS...
I have several offers for Master's internships / PhDs on graph ML, funded by the ERC project MALAGA, for 2026. Don't hesitate to contact me to apply!
All info here: nkeriven.github.io/malaga/
The JMLR story and operating model should be widely known in academia as a clear success story for full open access. I have friends in the humanities and pure sciences who have no clue this is even possible.
05.11.2025 01:16 — 👍 14 🔁 7 💬 1 📌 0
To understand these phenomena, we study the spatial regularity of the velocity/denoiser over time: we observe a gap between the closed-form and the trained model.
Applying Jacobian regularization, we recover effects seen previously on perturbed denoisers (drift vs noise).
Different loss weightings favor different times: which temporal regime drives generation quality? Controlled perturbations reveal drift-type effects at early times (& good FID) and noise-type effects at late times (& bad FID).
05.11.2025 09:04 — 👍 0 🔁 0 💬 1 📌 0
In practice, training a denoiser involves design choices: the parametrization (velocity as in FM, residual ε as in diffusion, or a standard denoiser?) and the loss weighting, each influencing generation quality.
05.11.2025 09:04 — 👍 0 🔁 0 💬 1 📌 0
🌀🌀🌀 New paper on the generation phases of Flow Matching: arxiv.org/abs/2510.24830
Are FM & diffusion models nothing but denoisers at every noise level?
In theory yes, *if trained optimally*. But in practice, do all noise levels matter equally?
with @annegnx.bsky.social, S Martin & R Gribonval
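To make the "denoisers at every noise level" statement concrete, here is a minimal NumPy sketch, assuming the linear interpolant x_t = (1 - t) x_0 + t x_1 with x_0 the Gaussian noise and x_1 the data (the paper's conventions and loss weightings may differ): the velocity, noise and data-prediction targets are affine reparametrizations of one another given x_t and t, so the corresponding optimal (conditional-expectation) models are equivalent in theory.

```python
import numpy as np

rng = np.random.default_rng(0)
t = 0.3                                # a single noise level in (0, 1)
x1 = rng.normal(2.0, 0.5, size=1000)   # "data" samples
x0 = rng.normal(0.0, 1.0, size=1000)   # Gaussian noise samples
xt = (1 - t) * x0 + t * x1             # linear interpolant (FM-style path)

# Three possible regression targets for a network evaluated at (xt, t):
velocity = x1 - x0          # flow-matching velocity target
noise = x0                  # diffusion-style "epsilon" target
data = x1                   # denoiser / data-prediction target

# Given xt and t, they are affine reparametrizations of one another,
# hence so are the optimal models (conditional expectations given xt):
assert np.allclose(velocity, (x1 - xt) / (1 - t))   # velocity from data target
assert np.allclose(velocity, (xt - noise) / t)      # velocity from noise target
assert np.allclose(data, xt + (1 - t) * velocity)   # data target from velocity
```

What differs in practice is which target the network regresses and how the loss is weighted across t, which is exactly the design space the thread above explores.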
We dig into this equivalence in our latest preprint with @annegnx.bsky.social! arxiv.org/abs/2510.24830
30.10.2025 07:24 — 👍 1 🔁 1 💬 0 📌 0
Strong afternoon session: Ségolène Martin on how to go from flow matching to denoisers (and hopefully come back?), and Claire Boyer on how the learning rate and working in latent spaces affect diffusion models
24.10.2025 15:03 — 👍 3 🔁 1 💬 1 📌 0
Followed by Scott Pesme on how to use diffusion/flow-matching-based MMSE to compute a MAP (and nice examples!), and Thibaut Issenhuth on new ways to learn consistency models
@skate-the-apple.bsky.social
Next is @annegnx.bsky.social presenting our NeurIPS paper on why flow matching generalizes, while it shouldn't!
arxiv.org/abs/2506.03719
Kickstarting our workshop on Flow Matching and Diffusion with a talk by Eric Vanden-Eijnden on how to optimize learning and sampling in Stochastic Interpolants!
Broadcast available at gdr-iasis.cnrs.fr/reunions/mod...
My paper on Generalized Gradient Norm Clipping & Non-Euclidean (L0, L1)-Smoothness (together with collaborators from EPFL) was accepted as an oral at NeurIPS! We extend the theory for our Scion algorithm to include gradient clipping. Read about it here arxiv.org/abs/2506.01913
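For context, the base operation being generalized is classical gradient norm clipping: rescale the gradient whenever its norm exceeds a threshold. Below is a minimal NumPy sketch of the textbook Euclidean version only; it is not the paper's non-Euclidean generalization, nor the Scion algorithm itself.

```python
import numpy as np

def clip_gradient(grad, max_norm):
    """Classical gradient norm clipping: rescale grad so that
    ||grad||_2 <= max_norm, leaving its direction unchanged."""
    norm = np.linalg.norm(grad)
    scale = min(1.0, max_norm / (norm + 1e-12))
    return scale * grad

def clipped_gradient_descent(grad_fn, x0, lr=0.1, max_norm=1.0, n_steps=100):
    """Plain gradient descent with per-step norm clipping."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - lr * clip_gradient(grad_fn(x), max_norm)
    return x

# Toy usage: descend on f(x) = ||x||^2 / 2, whose gradient is x.
x_out = clipped_gradient_descent(lambda x: x, x0=[10.0, -10.0])
```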
19.09.2025 16:48 — 👍 16 🔁 3 💬 1 📌 0
Thanks David!
19.09.2025 16:34 — 👍 0 🔁 0 💬 0 📌 0
Thanks!
19.09.2025 16:33 — 👍 0 🔁 0 💬 0 📌 0
Our work on the generalization of Flow Matching got an oral at NeurIPS!
Go see @quentinbertrand.bsky.social present it there :)
🔥 Excited to announce the Workshop on the Principles of Generative Models at @euripsconf.bsky.social (the European conference parallel to NeurIPS 2025)
🇩🇰 Dec 6–7, Copenhagen
📝 Deadline for contributions: Oct 17
🔗 Website: sites.google.com/view/prigm-e...
Congratulations Anna!!
09.09.2025 11:36 — 👍 1 🔁 0 💬 1 📌 0
Yes, everything will be in English!
04.09.2025 12:12 — 👍 1 🔁 0 💬 0 📌 0
Yes!
04.09.2025 12:11 — 👍 1 🔁 0 💬 0 📌 0