It goes without saying, but all posters are of course with @jakhmack.bsky.social as well! 👨‍🔬 9/9
30.09.2025 14:06 – 0 likes · 0 reposts · 0 replies · 0 quotes
IV-21. @byoungsookim.bsky.social will present: Seeing in 3D: Compound eye integration in connectome-constrained models of the fruit fly (joint work with @srinituraga.bsky.social) 8/9
30.09.2025 14:06 – 1 like · 0 reposts · 1 reply · 0 quotes
IV-14. Zinovia Stefanidi will present: Progress on building connectome-constrained models of the whole fly optic lobe (joint work with @srinituraga.bsky.social) 7/9
30.09.2025 14:06 – 1 like · 0 reposts · 1 reply · 0 quotes
Session 4 (Wednesday 14:00):
IV-9: @stewah.bsky.social presents joint work with @danielged.bsky.social: A new perspective on LLM-based model discovery with applications in neuroscience 6/9
30.09.2025 14:06 – 1 like · 0 reposts · 1 reply · 0 quotes
III-9. @raeesmk.bsky.social will present: Modeling Spatial Hearing with Cochlear Implants Using Deep Neural Networks (joint work with @stefanieliebe.bsky.social) 5/9
30.09.2025 14:06 – 4 likes · 1 repost · 1 reply · 0 quotes
Session 3 (Wednesday 12:30):
III-6. @matthijspals.bsky.social will present: Sequence memory in distinct subspaces in data-constrained RNNs of human working memory (joint work with @stefanieliebe.bsky.social) 4/9
30.09.2025 14:06 – 2 likes · 0 reposts · 1 reply · 0 quotes
II-9. @lulmer.bsky.social will present: Integrating neural activity measurements into connectome-constrained models (joint work with @srinituraga.bsky.social) 3/9
30.09.2025 14:06 – 1 like · 0 reposts · 1 reply · 0 quotes
Session 2 (Tuesday 18:00):
II-4. Isaac Omolayo will present: Contrastive Learning for Predicting Neural Activity in Connectome Constrained Deep Mechanistic Networks 2/9
30.09.2025 14:06 – 2 likes · 0 reposts · 1 reply · 0 quotes
The Macke lab is well represented at the @bernsteinneuro.bsky.social conference in Frankfurt this year! We have lots of exciting new work to present, with 7 posters (details below) 1/9
30.09.2025 14:06 – 29 likes · 9 reposts · 1 reply · 0 quotes
From hackathon to release: sbi v0.25 is here!
What happens when dozens of SBI researchers and practitioners collaborate for a week? New inference methods, new documentation, lots of new embedding networks, a bridge to pyro, and a bridge between flow matching and score-based methods 🤯
1/7 🧵
09.09.2025 15:00 – 28 likes · 16 reposts · 1 reply · 0 quotes
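For readers new to the package, here is a minimal sketch of the core sbi workflow behind the release above (simulate from the prior, fit a neural posterior estimator, sample given an observation). The toy prior and simulator are invented for illustration, and class names have shifted across releases (SNPE vs. NPE), so check the v0.25 documentation for the current spelling.

```python
# Minimal sbi workflow sketch (illustrative toy prior/simulator; API names may
# differ slightly across sbi releases -- consult the v0.25 docs).
import torch
from sbi.inference import NPE          # exposed as SNPE in older releases
from sbi.utils import BoxUniform

# Toy setup: 3-d parameters, simulator adds Gaussian noise.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

theta = prior.sample((1000,))           # draw parameters from the prior
x = simulator(theta)                    # run the simulator on them

inference = NPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

x_o = torch.zeros(3)                    # an "observation"
samples = posterior.sample((1000,), x=x_o)
```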
Joint work of @vetterj.bsky.social, Manuel Gloeckler, @danielged.bsky.social, and @jakhmack.bsky.social
@ml4science.bsky.social, @tuebingen-ai.bsky.social, @unituebingen.bsky.social
23.07.2025 14:27 – 3 likes · 0 reposts · 0 replies · 0 quotes
💡 Takeaway:
By leveraging foundation models like TabPFN, we can make SBI training-free, simulation-efficient, and easy to use.
This work is another step toward user-friendly Bayesian inference for a broader science and engineering community.
23.07.2025 14:27 – 0 likes · 0 reposts · 1 reply · 0 quotes
Results on the pyloric simulator.
(a) Voltage traces from the experimental measurement (top) and a posterior predictive simulated using the posterior mean from TSNPE-PF as the parameter (bottom).
(b) Average distance (energy scoring rule) to the observation and percentage of valid simulations from posterior samples, compared to the results obtained in Glaser et al.
But does it scale to complex real-world problems? We tested it on two challenging Hodgkin-Huxley-type models:
🧠 single-compartment neuron
🦀 31-parameter crab pyloric network
NPE-PF delivers tight posteriors & accurate predictions with far fewer simulations than previous methods.
23.07.2025 14:27 – 1 like · 0 reposts · 1 reply · 0 quotes
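For context (not from the thread itself): Hodgkin-Huxley-type models like the two above describe the membrane voltage of a neuron or compartment with a current-balance ODE of the general form below; the parameters inferred by SBI are typically the maximal conductances and related kinetic constants.

```latex
% Generic Hodgkin-Huxley current balance (sodium, potassium, leak, external drive)
C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^3 h \,(V - E_{\mathrm{Na}})
                    - \bar{g}_{\mathrm{K}}\, n^4 \,(V - E_{\mathrm{K}})
                    - \bar{g}_{\mathrm{L}}\,(V - E_{\mathrm{L}}) + I_{\mathrm{ext}}
```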
SBI benchmark results for amortized and sequential NPE-PF.
C2ST for NPE, NLE, and NPE-PF across ten reference posteriors (lower is better); dots indicate averages and bars show 95% confidence intervals over five independent runs.
What you get with NPE-PF:
🚫 No need to train inference nets or tune hyperparameters.
• Competitive or superior performance vs. standard SBI methods.
• Especially strong performance for smaller simulation budgets.
• Filtering to handle large datasets + support for sequential inference.
23.07.2025 14:27 – 0 likes · 0 reposts · 1 reply · 0 quotes
Illustration of NPE and NPE-PF: Both approaches use simulations sampled from the prior and simulator. In (standard) NPE, a neural density estimator is trained to obtain the posterior. In NPE-PF, the posterior is evaluated by autoregressively passing the simulation dataset and observations to TabPFN.
The key idea:
TabPFN, originally trained for tabular regression and classification, can estimate posteriors by autoregressively modeling one parameter dimension after the other.
It's remarkably effective, even though TabPFN was not designed for SBI.
23.07.2025 14:27 – 1 like · 0 reposts · 1 reply · 0 quotes
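In equation form, the autoregressive scheme described above factors the posterior into one-dimensional conditionals, each of which TabPFN estimates in-context from the table of (parameter, simulation) pairs:

```latex
% Autoregressive factorization over the D parameter dimensions
p(\theta \mid x_o) \;=\; \prod_{i=1}^{D} p\left(\theta_i \mid \theta_{1:i-1},\, x_o\right)
```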
SBI usually relies on training neural nets on simulated data to approximate posteriors. But:
⚠️ Simulators can be expensive
⚠️ Training & tuning neural nets can be tedious
Our method NPE-PF repurposes TabPFN as an in-context density estimator for training-free, simulation-efficient Bayesian inference.
23.07.2025 14:27 – 1 like · 0 reposts · 1 reply · 0 quotes
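To make the "in-context density estimator" idea in the post above concrete, here is a deliberately simplified sketch of the autoregressive sampling loop. The kernel-weighted resampler is only a hypothetical stand-in for TabPFN (which is a pretrained transformer conditioning on the whole simulation table); the toy simulator and all names are invented for illustration.

```python
# Conceptual sketch of in-context autoregressive posterior sampling, the idea
# behind NPE-PF. "in_context_conditional_sample" is a hypothetical stand-in
# for TabPFN: it just resamples context targets with kernel weights.
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    """Toy simulator: x = theta + observation noise."""
    return theta + 0.1 * rng.normal(size=theta.shape)

def in_context_conditional_sample(context_inputs, context_targets, query, bandwidth=0.2):
    """Sample target | inputs by kernel-weighted resampling of the context set."""
    dist = np.linalg.norm(context_inputs - query, axis=1)
    weights = np.exp(-0.5 * (dist / bandwidth) ** 2)
    weights /= weights.sum()
    idx = rng.choice(len(context_targets), p=weights)
    return context_targets[idx] + 0.05 * rng.normal()

# "Context" table of prior draws and matching simulations (no training anywhere).
theta = rng.uniform(-1.0, 1.0, size=(2000, 2))
x = simulator(theta)
x_o = np.array([0.3, -0.5])  # the observation we condition on

# Autoregressive sampling: theta_1 | x_o, then theta_2 | x_o, theta_1.
samples = []
for _ in range(500):
    t1 = in_context_conditional_sample(x, theta[:, 0], x_o)
    t2 = in_context_conditional_sample(
        np.column_stack([x, theta[:, 0]]), theta[:, 1], np.append(x_o, t1)
    )
    samples.append([t1, t2])
samples = np.array(samples)
print("posterior mean estimate:", samples.mean(axis=0))
```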
New preprint: SBI with foundation models!
Tired of training or tuning your inference network, or of waiting for your simulations to finish? Our method NPE-PF can help: it provides training-free simulation-based inference, achieving competitive performance with orders of magnitude fewer simulations!
23.07.2025 14:27 – 23 likes · 9 reposts · 1 reply · 2 quotes
Null and Noteworthy: Neurons tracking sequences don't fire in order
Instead, neurons encode the position of sequential items in working memory based on when they fire during ongoing brain wave oscillations, a finding that challenges a long-standing theory.
The neurons that encode sequential information into working memory do not fire in that same order during recall, a finding that is at odds with a long-standing theory. Read more in this month's Null and Noteworthy.
By @ldattaro.bsky.social
#neuroskyence
www.thetransmitter.org/null-and-not...
30.06.2025 16:08 – 42 likes · 19 reposts · 1 reply · 0 quotes
Many people in our lab use Scholar Inbox regularly -- highly recommended!
02.07.2025 07:05 – 7 likes · 0 reposts · 0 replies · 0 quotes
This work was enabled and funded by an innovation project of @ml4science.bsky.social
11.06.2025 18:17 – 1 like · 1 repost · 0 replies · 0 quotes
Congrats to @gmoss13.bsky.social, @coschroeder.bsky.social, and @jakhmack.bsky.social, together with our great collaborators Vjeran Višnjević, Olaf Eisen, @oraschewski.bsky.social, and Reinhard Drews. Thank you to @tuebingen-ai.bsky.social and @awi.de for making this work possible.
11.06.2025 11:47 – 2 likes · 1 repost · 1 reply · 0 quotes
If you're interested in learning more, check out the paper and code, or get in touch with @gmoss13.bsky.social
Code: github.com/mackelab/sbi...
Paper: www.cambridge.org/core/journal...
11.06.2025 11:47 – 1 like · 1 repost · 1 reply · 0 quotes
We obtain posterior distributions over ice accumulation and melting rates for Ekström Ice Shelf over the past several hundred years. This allows us to make quantitative statements about the history of atmospheric and oceanic conditions.
11.06.2025 11:47 – 2 likes · 1 repost · 1 reply · 0 quotes
Thanks to great data collection efforts from @geophys-tuebingen.bsky.social and @awi.de, we can apply this approach to Ekström Ice Shelf, Antarctica.
11.06.2025 11:47 – 0 likes · 0 reposts · 1 reply · 0 quotes
We develop a simulation-based inference workflow for inferring the accumulation and melting rates from measurements of the internal layers.
11.06.2025 11:47 – 0 likes · 0 reposts · 1 reply · 0 quotes
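In outline, such a workflow plugs a forward model of internal-layer geometry into the same sbi pattern sketched earlier. The sketch below is only illustrative: the simulate_layers function, parameter ranges, and summary statistics are hypothetical placeholders, not the actual model in the linked repository.

```python
# Illustrative-only sketch: hypothetical layer simulator + generic sbi pattern.
import torch
from sbi.inference import NPE
from sbi.utils import BoxUniform

# Parameters: [accumulation rate, basal melt rate]; ranges are placeholders.
prior = BoxUniform(low=torch.tensor([0.0, 0.0]), high=torch.tensor([2.0, 5.0]))

def simulate_layers(params):
    """Hypothetical forward model: maps rates to a summary of internal-layer depths."""
    accumulation, melt = params[..., 0:1], params[..., 1:2]
    depths = torch.linspace(0.1, 1.0, 10)  # normalized layer depths
    return accumulation * depths - melt * depths**2 + 0.01 * torch.randn(*params.shape[:-1], 10)

theta = prior.sample((5000,))
x = simulate_layers(theta)

inference = NPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

x_o = simulate_layers(prior.sample((1,)))[0]   # stand-in for radar-derived layer data
samples = posterior.sample((1000,), x=x_o)     # posterior over accumulation & melt
```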
Radar measurements have long been used to measure the internal layer structure of Antarctic ice shelves. This structure contains information about the history of the ice shelf, including the past rate of snow accumulation at the surface as well as of ice melting at the base.
11.06.2025 11:47 – 0 likes · 0 reposts · 1 reply · 0 quotes
A wide shot of approximately 30 individuals standing in a line, posing for a group photograph outdoors. The background shows a clear blue sky, trees, and a distant cityscape or hills.
Great news! Our March SBI hackathon in Tübingen was a huge success, with 40+ participants (30 onsite!). Expect significant updates soon: awesome new features & revamped documentation you'll love! Huge thanks to our amazing SBI community! Release details coming soon. 🔥
12.05.2025 14:29 – 26 likes · 7 reposts · 0 replies · 1 quote
Neuro-AI PhD at @c3neuro.bsky.social and @mackelab.bsky.social, Tübingen AI Center
Across many scientific disciplines, researchers in the Bernstein Network connect experimental approaches with theoretical models to explore brain function.
Master's student in Computational Neuroscience @gtc-tuebingen.bsky.social | Research Assistant at @mackelab.bsky.social | Formerly RA at RoLiLab @mpicybernetics.bsky.social & NeLy @epfl-brainmind.bsky.social
Research scientist at Google in Zurich
http://research.google/teams/connectomics
PhD from @mackelab.bsky.social
We are a joint partnership of the University of Tübingen and the Max Planck Institute for Intelligent Systems. We aim to develop robust learning systems and societally responsible AI. https://tuebingen.ai/imprint
Dedicated to helping neuroscientists stay current and build connections. Subscribe to receive the latest news and perspectives on neuroscience: www.thetransmitter.org/newsletters/
Understanding life. Advancing health.
Looking at protists with the eyes of a theoretical neuroscientist.
Looking at brains with the eyes of a protistologist.
(I also like axon initial segments)
Forthcoming book: The Brain, in Theory.
http://romainbrette.fr/
Neuroscientist and physician specializing in epilepsy. Interested in brain function and translating discoveries to improve patient care. https://liebelab.github.io
Led by PI @stefanieliebe.bsky.social: We use #AI tools to analyze neural activity and behavior in humans, bridging basic research on cognition and clinical applications, with a focus on #epilepsy. Based in Tübingen, Germany.
https://liebelab.github.io
PhD student at @mackelab.bsky.social
Full Professor of Computational Statistics at TU Dortmund University
Scientist | Statistician | Bayesian | Author of brms | Member of the Stan and BayesFlow development teams
Website: https://paulbuerkner.com
Opinions are my own
Asst. Prof. University of Amsterdam, rock climber, husband, father. Model-based (Mathematical) Cognitive Neuroscientist. I study decision-making, EEG, 🧠, statistical methods, etc.
🇺🇦🇪🇺
Professor of Statistics and Machine Learning at UCL Statistical Science. Interested in computational statistics, machine learning and applications in the sciences & engineering.
Full Professor at @deptmathgothenburg.bsky.social | simulation-based inference | Bayes | stochastic dynamical systems | https://umbertopicchini.github.io/
Machine learner & physicist. At CuspAI, I teach machines to discover materials for carbon capture. Previously Qualcomm AI Research, NYU, Heidelberg U.
Posting about the One World Approximate Bayesian Inference (ABI) Seminar, details at https://warwick.ac.uk/fac/sci/statistics/news/upcoming-seminars/abcworldseminar/