
Machine Learning in Science

@mackelab.bsky.social

We build probabilistic #MachineLearning and #AI tools for scientific discovery, especially in neuroscience. Probably not posted by @jakhmack.bsky.social. 📍 @ml4science.bsky.social, Tübingen, Germany

2,321 Followers  |  197 Following  |  60 Posts  |  Joined: 14.11.2024

Latest posts by mackelab.bsky.social on Bluesky


Congrats to Dr Michael Deistler @deismic.bsky.social, who defended his PhD!

Michael worked on "Machine Learning for Inference in Biophysical Neuroscience Simulations", focusing on simulation-based inference and differentiable simulation.

We wish him all the best for the next chapter! πŸ‘πŸŽ“

02.10.2025 11:28 β€” πŸ‘ 30    πŸ” 1    πŸ’¬ 0    πŸ“Œ 0

It goes without saying, but all posters are of course joint work with @jakhmack.bsky.social as well! 👨‍🔬 9/9

30.09.2025 14:06 β€” πŸ‘ 0    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

IV-21. @byoungsookim.bsky.social will present: Seeing in 3D: Compound eye integration in connectome-constrained models of the fruit fly (joint work with @srinituraga.bsky.social) 8/9

30.09.2025 14:06 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

IV-14. Zinovia Stefanidi will present: Progress on building connectome-constrained models of the whole fly optic lobe (joint work with @srinituraga.bsky.social) 7/9

30.09.2025 14:06 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

Session 4 (Wednesday 14:00):
IV-9: @stewah.bsky.social presents joint work with @danielged.bsky.social: A new perspective on LLM-based model discovery with applications in neuroscience 6/9

30.09.2025 14:06 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

III-9. @raeesmk.bsky.social will present: Modeling Spatial Hearing with Cochlear Implants Using Deep Neural Networks (joint work with @stefanieliebe.bsky.social) 5/9

30.09.2025 14:06 β€” πŸ‘ 4    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0

Session 3 (Wednesday 12:30):
III-6. @matthijspals.bsky.social will present: Sequence memory in distinct subspaces in data-constrained RNNs of human working memory (joint work with @stefanieliebe.bsky.social) 4/9

30.09.2025 14:06 β€” πŸ‘ 2    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

II-9. @lulmer.bsky.social will present: Integrating neural activity measurements into connectome-constrained models (joint work with @srinituraga.bsky.social) 3/9

30.09.2025 14:06 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

Session 2 (Tuesday 18:00):
II-4. Isaac Omolayo will present: Contrastive Learning for Predicting Neural Activity in Connectome Constrained Deep Mechanistic Networks 2/9

30.09.2025 14:06 β€” πŸ‘ 2    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

The Macke lab is well-represented at the @bernsteinneuro.bsky.social conference in Frankfurt this year! We have lots of exciting new work to present with 7 posters (detailsπŸ‘‡) 1/9

30.09.2025 14:06 β€” πŸ‘ 29    πŸ” 9    πŸ’¬ 1    πŸ“Œ 0

From hackathon to release: sbi v0.25 is here! πŸŽ‰

What happens when dozens of SBI researchers and practitioners collaborate for a week? New inference methods, new documentation, lots of new embedding networks, a bridge to pyro and a bridge between flow matching and score-based methods 🀯

1/7 🧡

09.09.2025 15:00 β€” πŸ‘ 28    πŸ” 16    πŸ’¬ 1    πŸ“Œ 0

Joint work of @vetterj.bsky.social, Manuel Gloeckler, @danielged.bsky.social, and @jakhmack.bsky.social

@ml4science.bsky.social, @tuebingen-ai.bsky.social, @unituebingen.bsky.social

23.07.2025 14:27 β€” πŸ‘ 3    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0
Effortless, Simulation-Efficient Bayesian Inference using Tabular Foundation Models Simulation-based inference (SBI) offers a flexible and general approach to performing Bayesian inference: In SBI, a neural network is trained on synthetic data simulated from a model and used to rapid...

πŸ“„ Check out the full paper for methods, experiments and more cool stuff: arxiv.org/abs/2504.17660
πŸ’» Code is available: github.com/mackelab/npe...

23.07.2025 14:27 β€” πŸ‘ 2    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

πŸ’‘ Takeaway:
By leveraging foundation models like TabPFN, we can make SBI training-free, simulation-efficient, and easy to use.
This work is another step toward user-friendly Bayesian inference for a broader science and engineering community.

23.07.2025 14:27 β€” πŸ‘ 0    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0
Results on the pyloric simulator.
(a) Voltage traces from the experimental measurement (top) and a posterior predictive simulated using the posterior mean from TSNPE-PF as the parameter (bottom). 
(b) Average distance (energy scoring rule) to the observation and percentage of valid simulations from posterior samples, compared to experimental results obtained in Glaser et al.


But does it scale to complex real-world problems? We tested it on two challenging Hodgkin-Huxley-type models:
🧠 single-compartment neuron
πŸ¦€ 31-parameter crab pyloric network

NPE-PF delivers tight posteriors & accurate predictions with far fewer simulations than previous methods.

23.07.2025 14:27 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0
SBI benchmark results for amortized and sequential NPE-PF. 
C2ST for NPE, NLE, and NPE-PF across ten reference posteriors (lower is better); dots indicate averages and bars show 95% confidence intervals over five independent runs.


What you get with NPE-PF:
🚫 No need to train inference nets or tune hyperparameters.
🌟 Competitive or superior performance vs. standard SBI methods.
πŸš€ Especially strong performance for smaller simulation budgets.
πŸ”„ Filtering to handle large datasets + support for sequential inference.

23.07.2025 14:27 β€” πŸ‘ 0    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0
Illustration of NPE and NPE-PF: Both approaches use simulations sampled from the prior and simulator. In (standard) NPE, a neural density estimator is trained to obtain the posterior. In NPE-PF, the posterior is evaluated by autoregressively passing the simulation dataset and observations to TabPFN.


The key idea:
TabPFN, originally trained for tabular regression and classification, can estimate posteriors by autoregressively modeling one parameter dimension after the other.
It’s remarkably effective, even though TabPFN was not designed for SBI.
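The autoregressive factorization p(θ₁, θ₂ | x) = p(θ₁ | x) · p(θ₂ | θ₁, x) can be illustrated with a deliberately simple, training-free toy in plain Python. This is not TabPFN or NPE-PF itself, just a nearest-neighbor stand-in that shows how a simulation dataset can be conditioned on one parameter dimension at a time; the simulator, thresholds, and sample sizes are all made up for illustration.

```python
import random

random.seed(0)

# Toy simulator (hypothetical stand-in): x = theta1 + theta2 + noise
def simulate(t1, t2):
    return t1 + t2 + random.gauss(0, 0.1)

# Prior: theta ~ Uniform(-1, 1)^2; build an in-context "dataset" of simulations
data = []
for _ in range(5000):
    t1, t2 = random.uniform(-1, 1), random.uniform(-1, 1)
    data.append((t1, t2, simulate(t1, t2)))

x_o = 0.5  # the observation we condition on

def posterior_sample(eps_x=0.05, eps_t=0.1):
    # Step 1: draw theta1 from simulations whose output is close to x_o,
    # approximating p(theta1 | x_o)
    near_x = [(t1, t2) for t1, t2, x in data if abs(x - x_o) < eps_x]
    t1, _ = random.choice(near_x)
    # Step 2: draw theta2 from simulations close in BOTH x and the sampled
    # theta1, approximating p(theta2 | theta1, x_o)
    near_xt = [(a, b) for a, b in near_x if abs(a - t1) < eps_t]
    _, t2 = random.choice(near_xt)
    return t1, t2

samples = [posterior_sample() for _ in range(200)]
mean_sum = sum(t1 + t2 for t1, t2 in samples) / len(samples)
print(round(mean_sum, 2))  # posterior mean of theta1 + theta2 should be near x_o
```

NPE-PF replaces this crude neighborhood conditioning with TabPFN's in-context predictions, which is what makes the idea work for realistic, higher-dimensional problems.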

23.07.2025 14:27 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

SBI usually relies on training neural nets on simulated data to approximate posteriors. But:
⚠️ Simulators can be expensive
⚠️ Training & tuning neural nets can be tedious
Our method NPE-PF repurposes TabPFN as an in-context density estimator for training-free, simulation-efficient Bayesian inference.

23.07.2025 14:27 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

New preprint: SBI with foundation models!
Tired of training or tuning your inference network, or waiting for your simulations to finish? Our method NPE-PF can help: It provides training-free simulation-based inference, achieving competitive performance with orders of magnitude fewer simulations! ⚑️

23.07.2025 14:27 β€” πŸ‘ 23    πŸ” 9    πŸ’¬ 1    πŸ“Œ 2
Null and Noteworthy: Neurons tracking sequences don’t fire in order Instead, neurons encode the position of sequential items in working memory based on when they fire during ongoing brain wave oscillationsβ€”a finding that challenges a long-standing theory.

The neurons that encode sequential information into working memory do not fire in that same order during recall, a finding that is at odds with a long-standing theory. Read more in this month’s Null and Noteworthy.

By @ldattaro.bsky.social

#neuroskyence

www.thetransmitter.org/null-and-not...

30.06.2025 16:08 β€” πŸ‘ 42    πŸ” 19    πŸ’¬ 1    πŸ“Œ 0

Many people in our lab use Scholar Inbox regularly -- highly recommended!

02.07.2025 07:05 β€” πŸ‘ 7    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

This work was enabled and funded by an innovation project of @ml4science.bsky.social

11.06.2025 18:17 β€” πŸ‘ 1    πŸ” 1    πŸ’¬ 0    πŸ“Œ 0

Congrats to @gmoss13.bsky.social, @coschroeder.bsky.social, and @jakhmack.bsky.social, together with our great collaborators Vjeran Višnjević, Olaf Eisen, @oraschewski.bsky.social, and Reinhard Drews. Thank you to @tuebingen-ai.bsky.social and @awi.de for making this work possible.

11.06.2025 11:47 β€” πŸ‘ 2    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0

If you’re interested in learning more, check out the paper and code, or get in touch with @gmoss13.bsky.social

Code: github.com/mackelab/sbi...
Paper: www.cambridge.org/core/journal...

11.06.2025 11:47 β€” πŸ‘ 1    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0

We obtain posterior distributions over ice accumulation and melting rates for EkstrΓΆm Ice Shelf over the past hundreds of years. This allows us to make quantitative statements about the history of the atmospheric and oceanic conditions.

11.06.2025 11:47 β€” πŸ‘ 2    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0

Thanks to great data collection efforts from @geophys-tuebingen.bsky.social and @awi.de, we can apply this approach to EkstrΓΆm Ice Shelf, Antarctica.

11.06.2025 11:47 β€” πŸ‘ 0    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

We develop a simulation-based inference workflow for inferring the accumulation and melting rates from measurements of the internal layers.

11.06.2025 11:47 β€” πŸ‘ 0    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

Radar measurements have long been used for measuring the internal layer structure of Antarctic ice shelves. This structure contains information about the history of the ice shelf. This includes the past rate of snow accumulation at the surface, as well as ice melting at the base.

11.06.2025 11:47 β€” πŸ‘ 0    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0
Simulation-based inference of surface accumulation and basal melt rates of an Antarctic ice shelf from isochronal layers | Journal of Glaciology | Cambridge Core

Thrilled to share that our paper on using simulation-based inference for inferring ice accumulation and melting rates for Antarctic ice shelves is now published in Journal of Glaciology!

www.cambridge.org/core/journal...

11.06.2025 11:47 β€” πŸ‘ 16    πŸ” 0    πŸ’¬ 1    πŸ“Œ 3
A wide shot of approximately 30 individuals standing in a line, posing for a group photograph outdoors. The background shows a clear blue sky, trees, and a distant cityscape or hills.


Great news! Our March SBI hackathon in TΓΌbingen was a huge success, with 40+ participants (30 onsite!). Expect significant updates soon: awesome new features & a revamped documentation you'll love! Huge thanks to our amazing SBI community! Release details coming soon. πŸ₯ πŸŽ‰

12.05.2025 14:29 β€” πŸ‘ 26    πŸ” 7    πŸ’¬ 0    πŸ“Œ 1
