
Matthijs Pals

@matthijspals.bsky.social

Using deep learning to study neural dynamics @mackelab.bsky.social

893 Followers  |  594 Following  |  15 Posts  |  Joined: 14.11.2024

Latest posts by matthijspals.bsky.social on Bluesky


Got provisional approval for two major grants in Neuro-AI & Dynamical Systems Reconstruction, on learning & inference in non-stationary environments, out-of-domain generalization, and DS foundation models. To all AI/math/DS enthusiasts: expect job announcements (PhD/PostDoc) soon! Feel free to get in touch.

13.07.2025 06:23 · 👍 34 · 🔁 8 · 💬 0 · 📌 0
Job vacancies at the RUG

Jelmer Borst and I are looking for a PhD candidate to build an EEG-based model of human working memory! This is a really cool project that I've wanted to kick off for a while, and I can't wait to see it happen. Please share and I'm happy to answer any Qs about the project!
www.rug.nl/about-ug/wor...

03.07.2025 13:29 · 👍 12 · 🔁 21 · 💬 0 · 📌 0
Null and Noteworthy: Neurons tracking sequences don't fire in order. Instead, neurons encode the position of sequential items in working memory based on when they fire during ongoing brain wave oscillations, a finding that challenges a long-standing theory.

The neurons that encode sequential information into working memory do not fire in that same order during recall, a finding that is at odds with a long-standing theory. Read more in this month's Null and Noteworthy.

By @ldattaro.bsky.social

#neuroskyence

www.thetransmitter.org/null-and-not...

30.06.2025 16:08 · 👍 42 · 🔁 19 · 💬 1 · 📌 0
Abstract rule learning promotes cognitive flexibility in complex environments across species Nature Communications - Whether neurocomputational mechanisms that speed up human learning in changing environments also exist in other species remains unclear. Here, the authors show that both...

How do animals learn new rules? By systematically testing different behavioral strategies, guided by selective attention to rule-relevant cues: rdcu.be/etlRV
Akin to in-context learning in AI, strategy selection depends on the animals' "training set" (prior experience), with similar representations in rats & humans.

26.06.2025 15:30 · 👍 8 · 🔁 2 · 💬 1 · 📌 0

Out today in @nature.com: we show that individual neurons have diverse tuning to a decision variable computed by the entire population, revealing a unifying geometric principle for the encoding of sensory and dynamic cognitive variables.
www.nature.com/articles/s41...

25.06.2025 22:38 · 👍 204 · 🔁 50 · 💬 4 · 📌 4

Our new preprint 👀

09.06.2025 19:32 · 👍 29 · 🔁 6 · 💬 0 · 📌 0
Memory by a thousand rules: Automated discovery of functional multi-type plasticity rules reveals variety & degeneracy at the heart of learning Synaptic plasticity is the basis of learning and memory, but the link between synaptic changes and neural function remains elusive. Here, we used automated search algorithms to obtain thousands of str...

We just pushed "Memory by a 1000 rules" onto bioRxiv, where we use clever #ML to find #plasticity quadruplets (EE, EI, IE, II) that learn basic stability in spiking nets. Why is it cool? We find 1000s!! of solutions, and they don't just stabilise. They #memorise! www.biorxiv.org/content/10.1...

02.06.2025 18:50 · 👍 133 · 🔁 46 · 💬 3 · 📌 5
A wide shot of approximately 30 individuals standing in a line, posing for a group photograph outdoors. The background shows a clear blue sky, trees, and a distant cityscape or hills.

Great news! Our March SBI hackathon in Tübingen was a huge success, with 40+ participants (30 onsite!). Expect significant updates soon: awesome new features & revamped documentation you'll love! Huge thanks to our amazing SBI community! Release details coming soon. 🥁🎉

12.05.2025 14:29 · 👍 25 · 🔁 7 · 💬 0 · 📌 1

Please RT 🙏

Reach out if you want to help understand cognition by modelling, analyzing and/or collecting large-scale intracortical data from 👩🐒🐁

We're a friendly, diverse group (n>25) w/ this terrace 😎 in the center of Paris! See 👇 for more info about the lab

We have funding to support your application!

10.05.2025 14:23 · 👍 39 · 🔁 21 · 💬 1 · 📌 0
Jobs - mackelab: The MackeLab is a research group at the Excellence Cluster Machine Learning at Tübingen University!

🎓 Hiring now! 🧠 Join us at the exciting intersection of ML and Neuroscience! #AI4science
We're looking for PhDs, Postdocs and Scientific Programmers who want to use deep learning to build, optimize and study mechanistic models of neural computations. Full details: www.mackelab.org/jobs/ 1/5

30.04.2025 13:43 · 👍 23 · 🔁 13 · 💬 1 · 📌 0

Re-posting is appreciated: We have a fully funded PhD position in CMC lab @cmc-lab.bsky.social (at @tudresden_de). You can use forms.gle/qiAv5NZ871kv... to send your application and find more information. Deadline is April 30. Find more about CMC lab: cmclab.org and email me if you have questions.

20.02.2025 14:50 · 👍 77 · 🔁 89 · 💬 3 · 📌 8
Compositional simulation-based inference for time series Amortized simulation-based inference (SBI) methods train neural networks on simulated data to perform Bayesian inference. While this strategy avoids the need for tractable likelihoods, it often requir...

Excited to present our work on compositional SBI for time series at #ICLR2025 tomorrow!

If you're interested in simulation-based inference for time series, come chat with Manuel Gloeckler or Shoji Toyota

at Poster #420, Saturday 10:00โ€“12:00 in Hall 3.

📰: arxiv.org/abs/2411.02728

25.04.2025 08:53 · 👍 25 · 🔁 4 · 💬 2 · 📌 1
ICLR 2025 (Oral): Comparing noisy neural population dynamics using optimal transport distances

Excited to announce that our paper "Comparing noisy neural population dynamics using optimal transport distances" has been selected for an oral presentation at #ICLR2025 (top 1.8% of papers). Check the thread for paper details (0/n).

Presentation info: iclr.cc/virtual/2025....

22.04.2025 18:06 · 👍 23 · 🔁 7 · 💬 1 · 📌 0

Happening tomorrow morning :).

28.03.2025 20:30 · 👍 7 · 🔁 0 · 💬 0 · 📌 0

The @mackelab.bsky.social is represented at @cosynemeeting.bsky.social #cosyne2025 in Montreal

with 3 posters, 2 workshop talks, and a main conference contributed talk (for the very first time in Mackelab history 🎉)!

27.03.2025 14:03 · 👍 16 · 🔁 4 · 💬 1 · 📌 2

Out now 'in print': a true labor of love, in more ways than one. See the paper and press release below!

Also, go and see @matthijspals.bsky.social's talk at #Cosyne, where he will talk about related/follow up work!

uni-tuebingen.de/en/research/...

24.03.2025 15:06 · 👍 35 · 🔁 12 · 💬 0 · 📌 0

Our study on sequence working memory, using human spiking data and RNNs, is finally published :). Check it out! 👇

24.03.2025 13:28 · 👍 8 · 🔁 1 · 💬 0 · 📌 0

In the physical world, almost all information is transmitted through traveling waves -- why should it be any different in your neural network?

Super excited to share recent work with the brilliant @mozesjacobs.bsky.social: "Traveling Waves Integrate Spatial Information Through Time"

1/14

10.03.2025 15:33 · 👍 150 · 🔁 43 · 💬 4 · 📌 6

Together with @dendritesgr.bsky.social, we'll be hosting a tutorial on constructing and optimizing biophysical models (via Jaxley & DendroTweaks) 🚀

Join us in Florence if you like dendrites, biophysics, or optimization!

28.02.2025 08:08 · 👍 23 · 🔁 9 · 💬 0 · 📌 0

1) Some exciting science in turbulent times:

How do mice distinguish self-generated vs. object-generated looming stimuli? Our new study combines VR and neural recordings from superior colliculus (SC) 🧠🐭 to explore this question.

Check out our preprint doi.org/10.1101/2024... 🧵

03.02.2025 19:19 · 👍 44 · 🔁 17 · 💬 1 · 📌 1

🚨 Excited to share my first @biorxivpreprint.bsky.social 🚨 with the amazing Smith lab, @jbarbosa.org and Albert Compte, who made this work possible.

We show that 🐒 prefrontal hemispheres combine redundancy (for precision) & weak connections (for capacity) to support spatial working memory (WM). 1/🧵

20.01.2025 12:10 · 👍 30 · 🔁 11 · 💬 1 · 📌 1

Hi, could you add me? :D

15.01.2025 16:21 · 👍 1 · 🔁 0 · 💬 1 · 📌 0

Pre-print 🧠🧪
Is mechanistic modeling dead in the AI era?

ML models trained to predict neural activity fail to generalize to unseen optogenetic perturbations, but mechanistic modeling can solve that.

We argue that "perturbation testing" is the right way to evaluate mechanisms in data-constrained models

1/8

08.01.2025 16:33 · 👍 115 · 🔁 46 · 💬 4 · 📌 2

Talk to @vetterj.bsky.social and @gmoss13.bsky.social about Sourcerer at #NeurIPS2024 today!
📍 Poster #4006 (East; 11 am PT)

13.12.2024 16:49 · 👍 13 · 🔁 1 · 💬 0 · 📌 0

That Eq. 3 can be low-D while the observations (Eq. 5) can be very high-dimensional is a nice way to reconcile computational models of low-D dynamics with high-D experimental recordings, I think; @bio-emergent.bsky.social has some nice work on this :).

12.12.2024 18:43 · 👍 3 · 🔁 0 · 💬 1 · 📌 0
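
The specific Eqs. 3 and 5 aren't reproduced in this thread; as a hedged sketch, a latent-variable model of this kind typically has the shape

    \dot{z}(t) = F(z(t), u(t)), \quad z(t) \in \mathbb{R}^d          (low-dimensional latent dynamics, cf. Eq. 3)
    x(t) = C\,z(t) + \varepsilon(t), \quad x(t) \in \mathbb{R}^N,\ N \gg d   (high-dimensional observations, cf. Eq. 5)

where the recorded activity x(t) of N neurons is a (possibly noisy) readout of a latent state z(t) of much lower dimension d.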

I agree that when very concretely linking the RNNs to data, this is important! There is a large line of work making the comparison more on the level of low-D "population" dynamics, where the exact biological equivalent of the neuron states is less important.

12.12.2024 18:37 · 👍 1 · 🔁 0 · 💬 0 · 📌 0
Mathematical Equivalence of Two Common Forms of Firing-Rate Models of Neural Networks We demonstrate the mathematical equivalence of two commonly used forms of firing-rate model equations for neural networks. In addition, we show that what is commonly interpreted as the firing rate in ...

Unless I'm missing something, you could still see Eq. 1 as describing a low-pass filtered version of the spike dynamics; see the last paragraph of: pmc.ncbi.nlm.nih.gov/articles/PMC...

12.12.2024 17:09 · 👍 1 · 🔁 0 · 💬 2 · 📌 0
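
For context, the two forms treated in the linked paper are, roughly (notation here is a sketch, not copied from the paper or from the post being replied to):

    \tau\,\dot{x}_i = -x_i + \sum_j W_{ij} f(x_j) + I_i, \qquad r_i = f(x_i)      (current/voltage-based form)
    \tau\,\dot{r}_i = -r_i + f\Big(\sum_j W_{ij} r_j + I_i\Big)                    (rate-based form)

In both, the left-hand variable relaxes toward its drive with time constant \tau, so it can be read as a low-pass filtered version of its input, which is the interpretation referred to above.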

We reduce the cost of finding all fixed points in piecewise-linear low-rank RNNs from O(2^N) to O(N^R)! This is part of our recent NeurIPS paper, presented tomorrow:
bsky.app/profile/mack...
Special thanks to Manuel Gloeckler for helping write out a proof! 7/7

11.12.2024 01:32 · 👍 2 · 🔁 0 · 💬 0 · 📌 0

But not all of the 2^N regions do! Each neuron partitions the R-dim subspace of dynamics with a hyperplane (here a line). N hyperplanes can partition R-dim space into at most O(N^R) regions. We can thus reduce our search space by only solving for fixed points in these!
6/7

11.12.2024 01:32 · 👍 1 · 🔁 0 · 💬 1 · 📌 0

Some of the 2^N regions with linear dynamics intersect the subspace in which the dynamics unfold (span U), like the one here: 5/7

11.12.2024 01:32 · 👍 1 · 🔁 0 · 💬 1 · 📌 0
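
A minimal sketch of the idea in this thread, not the authors' code: for a ReLU low-rank RNN of the assumed form dx/dt = -x + U V^T relu(x) + I0 (the exact model in the paper may differ), every fixed point lies on the R-dimensional affine subspace x = U k + I0, so only the activation patterns actually realized on that subspace need to be checked, and there are at most O(N^R) of them rather than 2^N.

import numpy as np

# Hedged sketch (hypothetical names, random weights), not the paper's implementation.
rng = np.random.default_rng(0)
N, R = 50, 2                                   # neurons, rank
U = rng.normal(size=(N, R)) / np.sqrt(N)       # left low-rank factor
V = rng.normal(size=(N, R))                    # right low-rank factor
I0 = 0.1 * rng.normal(size=N)                  # constant input (assumed, for nontrivial fixed points)

def patterns_on_subspace(n_samples=20000, scale=5.0):
    # Sample points x = U k + I0 on the R-dim subspace and record which neurons are active.
    # This approximates enumerating the O(N^R) regions of the hyperplane arrangement.
    ks = rng.uniform(-scale, scale, size=(n_samples, R))
    xs = ks @ U.T + I0
    return {tuple((x > 0).astype(int)) for x in xs}

def fixed_points(patterns):
    fps = []
    for s in patterns:
        D = np.diag(np.array(s, dtype=float))   # which neurons are active in this region
        A = np.eye(R) - V.T @ D @ U             # fixed point: (I - V^T D U) k = V^T D I0
        b = V.T @ D @ I0
        try:
            k = np.linalg.solve(A, b)           # one small R x R solve per region
        except np.linalg.LinAlgError:
            continue                            # singular region (e.g. a continuous attractor)
        x = U @ k + I0
        if tuple((x > 0).astype(int)) == s:     # activation pattern must be self-consistent
            fps.append(x)
    return fps

pats = patterns_on_subspace()
fps = fixed_points(pats)
print(f"checked {len(pats)} regions instead of 2^{N}; found {len(fps)} fixed point(s)")

The random sampling only approximates the region enumeration (small regions can be missed); an exact version would enumerate the arrangement of the N hyperplanes {k : (U k + I0)_i = 0} directly.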
