We explore how to train conditional generative models to sample molecular conformations from their Boltzmann distribution, using only a reward signal.
16.07.2025 14:03
@lucascimeca.bsky.social
AI Research @ Mila | Harvard | Cambridge | Edinburgh
GenBio Workshop
Torsional-GFN: A Conditional Conformation Generator for Small Molecules
Authors:
Lena Néhale Ezzine*, Alexandra Volokhova*, Piotr Gaiński, Luca Scimeca, Emmanuel Bengio, Prudencio Tossou, Yoshua Bengio, and Alex Hernández-García
(* equal contribution)
Read the paper here:
arxiv.org/pdf/2502.06999
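For intuition on what "training from a reward signal alone" can look like, here is a minimal toy sketch of a GFlowNet-style trajectory-balance objective over discretized torsion angles. The tiny MLP policy, the toy energy, and the discretization are illustrative assumptions, not the paper's actual code:

```python
# Toy sketch of reward-only sampling in the spirit of a GFlowNet
# trajectory-balance objective; the energy function and angle
# discretization here are illustrative, not the paper's code.
import torch
import torch.nn as nn

T, K = 3, 36                                  # 3 torsion angles, 36 bins each
kT = 1.0                                      # temperature of the target

policy = nn.Sequential(nn.Linear(T * K, 128), nn.ReLU(), nn.Linear(128, K))
log_Z = nn.Parameter(torch.zeros(()))         # learned log partition function
opt = torch.optim.Adam([*policy.parameters(), log_Z], lr=1e-3)

def energy(angles):                           # toy stand-in for a force field
    return (1.0 - torch.cos(angles)).sum()

for step in range(2000):
    state = torch.zeros(T, K)                 # one-hot of angles set so far
    log_pf, angles = 0.0, []
    for t in range(T):                        # set one torsion per step
        dist = torch.distributions.Categorical(logits=policy(state.flatten()))
        a = dist.sample()
        log_pf = log_pf + dist.log_prob(a)
        state = state.clone()
        state[t, a] = 1.0
        angles.append(a * 2 * torch.pi / K)
    log_reward = -energy(torch.stack(angles)) / kT   # R(x) = exp(-E(x)/kT)
    # Trajectory balance; construction order is fixed here, so log P_B = 0.
    loss = (log_Z + log_pf - log_reward) ** 2
    opt.zero_grad(); loss.backward(); opt.step()
```

After training, sampling from the policy yields conformations distributed approximately as exp(-E(x)/kT), with no conformation dataset needed.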
• Works out-of-the-box with large priors like StyleGAN3, NVAE, Stable Diffusion 3, and FoldFlow 2.
• Unifies constrained generation, RL with human feedback, and protein design in a single framework.
• Outperforms both amortized data-space samplers and traditional MCMC across tasks.
• We show how to turn any pretrained generator (GAN, VAE, flow) into a conditional sampler by training a diffusion model directly in noise space (see the sketch after this list).
• The diffusion sampler is trained with RL.
• Noise-space posteriors are smoother, giving faster, more stable inference.
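As a rough illustration of the noise-space idea in these bullets, the sketch below writes down the unnormalized posterior that a noise-space sampler would target; `G` and `log_likelihood` are placeholder names assumed for illustration, not the paper's API:

```python
# Hypothetical sketch of the noise-space target; G is any frozen pretrained
# generator (GAN/VAE/flow) and log_likelihood is a task-specific reward,
# e.g. a classifier or measurement model. Both names are assumptions.
import torch

def posterior_log_reward(z, y, G, log_likelihood):
    # Unnormalized target: log p(z | y) = log p(z) + log p(y | G(z)) + const.
    log_prior = -0.5 * (z ** 2).sum(-1)       # standard Gaussian noise prior
    return log_prior + log_likelihood(G(z), y)

# A diffusion sampler over z is then trained (here, with off-policy RL) so
# that its samples match this density; final samples are decoded as x = G(z).
```

Working in z-space rather than x-space is what makes the posterior smoother: the frozen generator absorbs the hard structure of the data manifold.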
Where you'll find our work:
Main Track
Outsourced Diffusion Sampling: Efficient Posterior Inference in Latent Spaces of Generative Models
Authors:
Siddarth Venkatraman, Mohsin Hasan, Minsu Kim, Luca Scimeca, Marcin Sendera, Yoshua Bengio, Glen Berseth, Nikolay Malkin
I'm attending ICML in Vancouver this week!
It's already been great to connect, chat, and hear about the amazing work happening across the community.
If you're attending and would like to meet up, feel free to reach out!
(More details below)
#ICML2025 #MachineLearning #AI #DiffusionModels #GenAI
• Outsourced Diffusion Sampling: Efficient Posterior Inference in Latent Spaces of Generative Models.
Authors: Siddarth Venkatraman, Mohsin Hasan, Minsu Kim, Luca Scimeca, …, Yoshua Bengio, Nikolay Malkin
paper: arxiv.org/pdf/2502.06999
To be presented at FPI-ICLR2025 & ICLR 2025 DeLTa Workshops
• Solving Bayesian Inverse Problems with Diffusion Priors and Off-Policy RL.
Authors: Luca Scimeca, Siddarth Venkatraman, Moksh Jain, Minsu Kim, Marcin Sendera, Mohsin Hasan, …, Yoshua Bengio, Glen Berseth, Nikolay Malkin
To be presented at ICLR 2025 DeLTa Workshop
• Mitigating Shortcut Learning with Diffusion Counterfactuals and Diverse Ensembles.
Authors: Luca Scimeca, Alexander Rubinstein, Damien Teney, Seong Joon Oh, Yoshua Bengio
paper: arxiv.org/pdf/2311.16176
To be presented at SCSL @ ICLR 2025 Workshop
• Shaping Inductive Bias in Diffusion Models through Frequency-Based Noise Control.
Authors: Thomas Jiralerspong, Berton Earnshaw, Jason Hartford, Yoshua Bengio, Luca Scimeca
paper: arxiv.org/pdf/2502.10236
To be presented at FPI-ICLR2025 & ICLR 2025 DeLTa Workshops
Thrilled to share that we will be presenting 4 papers across 3 workshops at #ICLR2025 in Singapore this week!
If you're attending, let's connect! Feel free to DM me for more details about the work or potential collaborations.
See you at the venue! 🇸🇬
(More info to follow)
@mila-quebec.bsky.social
Thanks to Alex for his great efforts and work ethic, and to @damienteney.bsky.social and @lucascimeca.bsky.social for their continued help with this paper. We'll humbly address the criticisms to improve it further for future opportunities.
23.01.2025 22:21
Come check out our NeurIPS poster today! We will be at West Ballroom #7101 from 4:30pm to 7:30pm.
Website: github.com/gfnorg/diffu...
If you're attending, come check out our posters or feel free to reach out to connect during the conference!
Looking forward to insightful conversations and connecting with everyone; See you all at NeurIPS!
#NeurIPS2024 #NIPS24 #MachineLearning #DiffusionModels #Research #AI
Amortizing Intractable Inference in Diffusion Models for Bayesian Inverse Problems. Venkatraman, S., Jain, M., Scimeca, L., Kim, M., Sendera, M., …, Bengio, Y., Malkin, N.
12.12.2024 06:28
On Diffusion Models for Amortized Inference: Benchmarking and Improving Stochastic Control and Sampling. Sendera, M., Kim, M., Mittal, S., Lemos, P., Scimeca, L., Rector-Brooks, J., Adam, A., Bengio, Y., and Malkin, N.
arxiv.org/abs/2402.05098
Amortizing Intractable Inference in Diffusion Models for Vision, Language, and Control. Venkatraman, S., Jain, M., Scimeca, L., Kim, M., Sendera, M., …, Bengio, Y., Malkin, N.
arxiv.org/abs/2405.20971
Excited to share that we will be presenting three papers at #NeurIPS2024 this week in Vancouver, pushing forward our work on Diffusion Models!
12.12.2024 06:23
Hi, can I be added to the pack? :)
12.12.2024 06:19