Could be a cell type difference (Gardner 2022 focused on grid cells) or a region difference? Methods difference? Lots of interesting possibilities!
04.08.2025 20:35 — 👍 0 🔁 0 💬 0 📌 0
Though I will say that some evidence suggests it's not always going to be 1:1 with environment. E.g. the 2022 Gardner ERC place cell paper we mention in the article has place cells mapping even square environments into a toroid shape, though the Guo 2024 CA1 paper has environment-shaped manifolds.
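For intuition on what "toroid-shaped" means here, a minimal sketch (entirely toy code; the cosine tuning model and all numbers are my illustrative assumptions, not either paper's analysis): if tuning is periodic along both spatial axes, the population state depends on position only through two circular phases, i.e. it lives on a torus even when the box is square.

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells, period = 60, 0.5

# Toy grid-cell-like population: each cell's tuning is periodic in both
# spatial axes, so firing depends on position only modulo the grid period.
phases = rng.uniform(0, 2 * np.pi, size=(n_cells, 2))
pos = rng.uniform(0, 1.0, size=(2000, 2))       # positions in a square box
theta = 2 * np.pi * pos / period                # two circular phase variables

def population_rates(angles):
    # Rate = sum of cosine tuning along each axis (purely illustrative).
    return (np.cos(angles[:, None, 0] - phases[None, :, 0])
            + np.cos(angles[:, None, 1] - phases[None, :, 1]))

rates = population_rates(theta)
# Shifting either phase by a full cycle returns the exact same population
# state: the state space is the product of two circles -- a torus -- even
# though the environment is a square.
print(np.allclose(rates, population_rates(theta + 2 * np.pi)))  # True
```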
04.08.2025 20:34 — 👍 0 🔁 0 💬 2 📌 0
There's a long and fun conversation to be had here 🙂. But I agree, the "many-to-few" nature of neurons to manifolds allows considerable drift in single-neuron activity without changing the manifold. An important next step is to find out when, and how, neural drift changes manifold-level properties!
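To make the "many-to-few" point concrete, a minimal sketch (toy data; the fixed loading matrix and pseudoinverse readout are my simplifying assumptions): drift confined to the null space of the neuron-to-latent mapping changes single-neuron activity substantially while leaving the manifold trajectory untouched.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_latents, n_time = 100, 3, 500

# A fixed "manifold": low-dimensional latent trajectories mapped to many
# neurons through a loading matrix W (the many-to-few mapping, reversed).
W = rng.normal(size=(n_neurons, n_latents))
latents = np.cumsum(rng.normal(size=(n_time, n_latents)), axis=0)
activity = latents @ W.T

# Drift each neuron within the null space of W's readout: an orthonormal
# basis for directions that project to zero on the latent axes.
U, _, _ = np.linalg.svd(W)
null_basis = U[:, n_latents:]                  # (n_neurons, n_neurons - n_latents)
drift = rng.normal(size=(n_time, n_neurons - n_latents)) @ null_basis.T
drifted = activity + 5.0 * drift

# Recover latents with the pseudoinverse readout: identical before and
# after drift, even though single-neuron activity changed a lot.
readout = np.linalg.pinv(W).T
print(np.allclose(activity @ readout, drifted @ readout))   # True
print(np.abs(drifted - activity).mean())                    # large per-neuron change
```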
04.08.2025 20:27 — 👍 3 🔁 0 💬 0 📌 0
Indeed, IMO behavior (and environment, etc.) are inextricably linked to manifold properties. For this reason, comparative (e.g., for the same behavior, are manifolds different in different regions?) and causal (e.g., move activity on the manifold and predict behavioral changes) experiments are essential.
04.08.2025 20:24 — 👍 0 🔁 0 💬 1 📌 0
Thanks for the kind words and really glad you enjoyed the article!
04.08.2025 20:23 — 👍 3 🔁 0 💬 0 📌 0
Thanks to @emilysingerneuro.bsky.social for another opportunity to work with The Transmitter (which is an awesome publication), and of course the many, many long conversations on manifolds with @juangallego.bsky.social that shaped these articles 🙂
04.08.2025 18:47 — 👍 4 🔁 0 💬 0 📌 0
📰 I really enjoyed writing this article with @thetransmitter.bsky.social! In it, I summarize parts of our recent perspective article on neural manifolds (www.nature.com/articles/s41...), with a focus on highlighting just a few cool insights into the brain we've already seen at the population level.
04.08.2025 18:45 — 👍 36 🔁 9 💬 1 📌 0
🚨New paper🚨
Neural manifolds went from a niche-y word to a ubiquitous term in systems neuro thanks to many interesting findings across fields. But like with any emerging term, people use it very differently.
Here, we clarify our take on the term, and review key findings & challenges: rdcu.be/ex8hW
'Manifolds', and the overall conception of the brain through a dynamical systems framework, have come a long way.
30.07.2025 05:36 — 👍 20 🔁 5 💬 0 📌 0
A lot has changed since we wrote our last perspective piece in 2017 (www.cell.com/neuron/fullt...), both in how we think about neural manifolds and in their prevalence in the field. We hope this paper provides a good primer for the ideas, and points towards some big open questions in this space.
29.07.2025 19:07 — 👍 5 🔁 0 💬 0 📌 0
Check out our new review/perspective (w/ @juangallego.bsky.social & Devika Narain) on neural manifolds in the brain! It was a lot of fun to think through these ideas over the past couple of years, and I'm excited it's finally out in the world!
🔗: www.nature.com/articles/s41...
📄: rdcu.be/ex8hW
I guess I wouldn't think of babies as "learning" a foundation model. IMO a better analogy is that evolution learned the foundation model, and babies during development are fine-tuning it (albeit in a "multi-task" way) for their bodies/experiences.
01.07.2025 19:02 — 👍 3 🔁 0 💬 2 📌 0
Finally out!
Eight years after initiating this study with Simon Borgognon, Nicolo Macellari, and Gregoire Courtine, we have uncovered neural population dynamics shared among premotor, motor, and somatosensory cortices during various types of locomotor tasks.
www.nature.com/articles/s41...
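A generic sketch of how "shared dynamics" across areas is often quantified in this literature (my illustrative toy version using PCA + CCA, not necessarily this paper's exact pipeline): reduce each area to its own latents, then ask how well the two sets of latents align over time.

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_time = 300

# Toy data: two "areas" driven in part by the same latent dynamics.
shared = np.cumsum(rng.normal(size=(n_time, 2)), axis=0)
area_a = shared @ rng.normal(size=(2, 40)) + rng.normal(size=(n_time, 40))
area_b = shared @ rng.normal(size=(2, 60)) + rng.normal(size=(n_time, 60))

# Reduce each area to its own manifold, then align the latent trajectories.
lat_a = PCA(n_components=5).fit_transform(area_a)
lat_b = PCA(n_components=5).fit_transform(area_b)
cca = CCA(n_components=2).fit(lat_a, lat_b)
u, v = cca.transform(lat_a, lat_b)

# High canonical correlations indicate dynamics shared between the areas.
corrs = [np.corrcoef(u[:, i], v[:, i])[0, 1] for i in range(2)]
print(np.round(corrs, 2))
```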
Our new approach for scalable, generalizable, and efficient neural population decoding is now online! Here we focus on real-time BCI but I'm excited about all of our next steps building on this. Awesome work led by @averyryoo.bsky.social @nandahkrishna.bsky.social @ximengmao.bsky.social
11.06.2025 14:44 — 👍 25 🔁 1 💬 0 📌 0
BluePrints, naturally
20.05.2025 20:20 — 👍 21 🔁 0 💬 2 📌 0
Subjectivity is hard to investigate, indeed. But if this is true then all "non-aphantasics" are equally misdescribing their subjective experience too... (said as someone with what I believe to be quite extreme aphantasia, and who cannot square my experiences with those described by others)
16.04.2025 17:46 — 👍 1 🔁 0 💬 1 📌 0
Very late, but had a 🔥 time at my first Cosyne presenting my work with @nandahkrishna.bsky.social, Ximeng Mao, @mattperich.bsky.social, and @glajoie.bsky.social on real-time neural decoding with hybrid SSMs. Keep an eye out for a preprint (hopefully) soon 👀
#Cosyne2025 @cosynemeeting.bsky.social
Many apparent disagreements over the utility of neural manifolds come from a lack of clarity on what the term really encompasses, argues @mattperich.bsky.social
#neuroskyence
www.thetransmitter.org/neural-dynam...
🧠 Our new (NIH funded!) paper reveals how the brain creates internal dynamics during both real and imagined navigation. We recorded directly from the human hippocampus as participants moved through physical space and when they mentally navigated imagined routes.
www.nature.com/articles/s41...
Yeah, and let's not forget that in the US that college degree could easily put you $200,000+ in debt… it's hard to justify knowledge for knowledge's sake at that extreme personal cost, which IMO justifies the voiced regrets among American grads
04.03.2025 15:49 — 👍 1 🔁 0 💬 0 📌 0
Want to hear more about how feedback can guide learning in RNNs for motor adaptation? Here is our new paper in Nat. Commun. with Barbara Feulner and @juangallego.bsky.social: www.nature.com/articles/s41...
21.02.2025 16:51 — 👍 67 🔁 19 💬 0 📌 0
That's the earliest one I know of, at least on neural trajectories. There's a paper from ~the 70s (I think?) that I've seen that used dim reduction to look at coactivation of motoneurons, which starts to look slightly trajectory-like but not fully the same. Can't recall the authors though...
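For anyone following along, the core recipe behind those trajectory plots is simple; a minimal sketch (toy data and all parameter choices are my illustrative assumptions): project the population's firing rates onto their top principal components and follow the projection over time.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_neurons, n_time = 80, 200

# Toy "population activity": each neuron is a random mixture of a few
# smooth latent signals, plus noise (stand-in for trial-averaged rates).
t = np.linspace(0, 2 * np.pi, n_time)
latents = np.stack([np.sin(t), np.cos(2 * t), t / t.max()], axis=1)   # (time, 3)
rates = latents @ rng.normal(size=(3, n_neurons)) \
        + 0.1 * rng.normal(size=(n_time, n_neurons))

# Dimensionality reduction: the top PCs define the manifold axes, and the
# projected time series is the "neural trajectory".
pca = PCA(n_components=3)
trajectory = pca.fit_transform(rates)   # (time, 3), ready to plot over time
print(trajectory.shape, pca.explained_variance_ratio_.round(3))
```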
21.02.2025 20:01 — 👍 2 🔁 0 💬 0 📌 0
💯 I'm not convinced the most fun and innovative science is actually all that predictable from existing literature/things for model training. I could see it someday being useful to ask "here's my study, tell me all of the necessary controls and alternative explanations you can think of"
20.02.2025 16:19 — 👍 11 🔁 0 💬 2 📌 0
Amazing, congrats!
18.02.2025 17:14 — 👍 1 🔁 0 💬 0 📌 0
Our paper from Junchol Park and collaborators that has been brewing for a while. Trying to capture our thinking about what action specification in striatum means and what would constitute evidence for such a model. Longer thread soon, but it's online now. www.cell.com/neuron/fullt...
24.01.2025 01:34 — 👍 52 🔁 18 💬 1 📌 0
This one from Fig 2 of POYO is a bit clearer; still a relatively small effect, but this is in a simple task where single-session models are pushing saturation, with 75%+ performance. I would hypothesize that if we subsampled single sessions to even smaller sets, we'd get more gains from pre-training.
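That hypothesis is cheap to pilot. A minimal sketch of the kind of subsampling comparison I mean (everything here is an illustrative assumption — ridge decoders and synthetic data standing in for POYO and real recordings): shrink the single-session training set and watch where a fixed "pre-trained" decoder pulls ahead.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
n_neurons, n_trials = 50, 400

# Toy session: behavior linearly read out from noisy population activity.
true_w = rng.normal(size=n_neurons)
X = rng.normal(size=(n_trials, n_neurons))
y = X @ true_w + rng.normal(scale=2.0, size=n_trials)
X_test, y_test = X[300:], y[300:]

# Stand-in for a pre-trained model: a decoder fit on plentiful "other" data
# with the same underlying structure (a crude proxy for transfer).
X_pre = rng.normal(size=(5000, n_neurons))
y_pre = X_pre @ true_w + rng.normal(scale=2.0, size=5000)
pretrained = Ridge().fit(X_pre, y_pre)

# Subsample the single-session training set: pre-training should matter
# most when within-session data is scarce.
for n in [10, 25, 50, 100, 300]:
    single = Ridge().fit(X[:n], y[:n])
    print(f"n={n:3d}  single-session R2={r2_score(y_test, single.predict(X_test)):.2f}"
          f"  pretrained R2={r2_score(y_test, pretrained.predict(X_test)):.2f}")
```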
06.01.2025 21:14 — 👍 8 🔁 0 💬 0 📌 0
Yeah, there's a huge variance with that one. A Honeycrisp at peak is a beautiful thing; almost as beautiful as an average Pink Lady! A Honeycrisp off peak...
19.12.2024 03:38 — 👍 6 🔁 0 💬 0 📌 0
Yeah, I don't know nearly all of the apple varieties/hybrids here, but something is off when Empire is lower than Gala and Pink Ladies are lower than Honeycrisp...
19.12.2024 03:31 — 👍 4 🔁 0 💬 3 📌 0
Great initiative! The Transmitter is awesome.
18.12.2024 18:49 — 👍 7 🔁 0 💬 0 📌 0
100% in agreement.
Though I'd take a slightly bolder step and say there's non-zero value for understanding too. If a big model shows that a thing is possible, we now have places to look (and potentially phenomena to explain), which helps progress towards insight/understanding long-term.