Penguin Classics cover for Crash by J. G. Ballard, but the cover image is the blue screen of death from older Windows systems
05.12.2025 17:13 · @dimma.bsky.social
Theor/Comp Neuroscientist (postdoc). Prev @TU Munich. Stochastic & nonlin. dynamics @TU Berlin & @MPIDS. Learning dynamics, plasticity & geometry of representations. https://dimitra-maoutsa.github.io https://dimitra-maoutsa.github.io/M-Dims-Blog
Penguin Classics cover of A Good Man Is Hard to Find and Other Stories by Flannery O'Connor. The cover image is a page of Where's Waldo (on the beach).
18.10.2024 00:24
The NeurIPS 'celebrity tagging' posts are landing a bit strange with me. "I saw Hinton! ZOMG!!!" You... saw him? Made a saccade, and your retina captured light that bounced off him? Vicarious clout via a band of the electromagnetic spectrum? You are an academic, at an academic conference.
06.12.2025 18:57
Fantastic thread on voltage vs calcium imaging. In terms of photon flux change per molecule per action potential, they're similar.
What makes voltage imaging so much harder, then?
Integration time (EPSPs/APs are short), limited ability to jam GEVIs into the membrane, and the small size of subthreshold Vm changes.
🧪🧠🤖
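The thread's shot-noise reasoning can be made concrete: detected photons scale with flux times integration time, and shot-noise-limited SNR scales with the fractional signal change times the square root of the photon count. A minimal sketch, with purely illustrative numbers (the photon rates, ΔF/F values, and time windows below are hypothetical, not from the thread):

```python
import math

def shot_noise_snr(photon_rate, integration_time, dff):
    """Shot-noise-limited SNR of a fluorescence transient:
    SNR = (dF/F) * sqrt(N), where N = photon_rate * integration_time."""
    n_photons = photon_rate * integration_time
    return dff * math.sqrt(n_photons)

# Hypothetical numbers for illustration only: a calcium transient
# integrates over ~100 ms, an action potential over ~1 ms, and the
# fractional voltage signal is assumed smaller than the calcium one.
snr_ca = shot_noise_snr(photon_rate=1e6, integration_time=0.100, dff=0.20)
snr_v = shot_noise_snr(photon_rate=1e6, integration_time=0.001, dff=0.05)

print(snr_ca, snr_v)
```

Even at equal photon flux per molecule, voltage imaging loses on both factors at once: a ~100x shorter integration window and a smaller fractional change.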
I've been exposed to enough peer review that I think we can look at two classes of reviewers:
1. Constructive peer review
2. Adversarial peer review
I'm looking at this mostly from a psychology/methodology perspective (but wonder what other fields experience)
🧵 1/
Could cognition emerge from matter at scales far below neurons? This @nature.com paper explores whether molecular self-assembly can perform neural-like classification. The work suggests that even physical processes may carry out sophisticated information processing.
www.nature.com/articles/s41...
There are way too many people in the world that have not heard 'fuck off' enough.
06.12.2025 06:14
really unfortunate*
*that we don't get to watch her break them
LMAOOOOO
05.12.2025 01:29
🧵 Excited to present our latest work at #NeurIPS25! Together with @avm.bsky.social, we discover channels to infinity: regions in neural network loss landscapes where parameters diverge to infinity (in regression settings!)
We find that MLPs in these channels can take derivatives and compute GLUs 🤯
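The "diverging parameters compute derivatives" idea has a classical finite-difference intuition behind it (a sketch of that intuition only, not the paper's construction): scale an output weight up while shrinking the gap between two neurons' input weights, and their difference converges to a derivative of the activation.

```python
import numpy as np

def finite_diff_unit(x, s):
    """Difference of two tanh neurons whose output weight (s) diverges
    while their input-weight gap (1/s) shrinks:
    s * (tanh(x + 1/s) - tanh(x)) -> d/dx tanh(x) as s -> infinity."""
    return s * (np.tanh(x + 1.0 / s) - np.tanh(x))

x = np.linspace(-3, 3, 201)
true_deriv = 1.0 - np.tanh(x) ** 2  # exact derivative of tanh

for s in (10.0, 100.0, 1000.0):
    err = np.max(np.abs(finite_diff_unit(x, s) - true_deriv))
    print(f"s={s:7.1f}  max error {err:.5f}")  # error shrinks as s grows
```

The approximation error falls roughly as 1/s, so exact derivative computation sits "at infinity", consistent with the channels the post describes.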
Thoughtful review with some good recent historical perspective on the ongoing paradigm shift that is radically changing the way we think about what brain areas do.
www.nature.com/articles/s41...
Tomorrow at #NeurIPS2025! Oral at 10 am in UL Ballroom 20D and poster #2016 at 11 am. @haydari.bsky.social and I are looking forward to hearing your thoughts.
05.12.2025 00:35
Authors greeting their reviewers after the OpenReview leak
04.12.2025 11:56
Day 3 of the Brains Blog symposium on 'The Idealized Mind'. Attached are links to Frances Egan's commentary and my response:
philosophyofbrains.com/2025/12/03/s...
philosophyofbrains.com/2025/12/03/a...
Should we think that nervous systems compute? That's the topic of Day 3!
Some good news for non-native English speakers who manage to speak English well
02.12.2025 17:15
Exciting new work from @lindenmp.bsky.social and friends!
Inferring intrinsic neural timescales using optimal control theory
www.nature.com/articles/s41...
"I will die on the hill that population coding is the relevant level of encoding information in the brain." In the latest "This paper changed my life," Nancy Padilla-Coreano discusses a paper on mixed selectivity neurons.
#neuroskyence
www.thetransmitter.org/this-paper-c...
0/10 Thanks for the interest in our preprint. Some takes say it negates or fully supports the "manifold hypothesis"; neither is quite right. Our results show that if you focus only on the manifold capturing most of the task-related variance, you can miss important dynamics that actually drive behavior.
02.12.2025 07:48
The whole paper is thought-provoking, and congrats to @ulisespereirao.bsky.social. I recommend reading it in full before claiming it argues against the manifold hypothesis: out of ~thousands of cells, 10 dimensions explained most of the variance.
02.12.2025 14:14
1/3 How reward prediction errors shape memory: when people gamble and cues signal unexpectedly high reward probability, incidental images from those trials are remembered better than ones from safe trials, linking RL computations to episodic encoding. #RewardSignals #neuroskyence www.nature.com/articles/s41...
30.11.2025 11:12
New(ish) paper!
It's often said that hippocampal replay, which helps to build up a model of the world, is biased by reward. But canonical temporal-difference learning requires updates proportional to the reward-prediction error (RPE), not to reward magnitude.
1/4
rdcu.be/eRxNz
Fig. 6: Mathematical model
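The distinction the post draws, updates scaled by reward-prediction error rather than by reward itself, is the core of temporal-difference learning. A minimal tabular TD(0) sketch (illustrative only, not the paper's model):

```python
def td0_update(V, s, r, s_next, alpha=0.1, gamma=0.9):
    """Tabular TD(0): the update is proportional to the reward-prediction
    error delta = r + gamma * V[s_next] - V[s], not to the reward r."""
    delta = r + gamma * V[s_next] - V[s]
    V[s] += alpha * delta
    return delta

# A big reward that is fully predicted yields zero RPE, hence no update:
V = {"A": 10.9, "B": 1.0}
d1 = td0_update(V, "A", r=10.0, s_next="B")  # 10 + 0.9*1 - 10.9 = 0

# A small reward that is unexpected yields a nonzero RPE and an update:
V = {"A": 0.0, "B": 0.0}
d2 = td0_update(V, "A", r=1.0, s_next="B")   # 1 + 0 - 0 = 1

print(d1, d2)
```

So under TD learning a large fully predicted reward should drive no learning at all, which is why "replay biased by reward magnitude" and "replay driven by RPE" make different predictions.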
29.11.2025 08:13
The basal forebrain plays the cortex like a piano.
28.11.2025 17:51
A study finds that cats meow harder to greet their male caregivers than their female ones. Women are more verbally interactive and more skilled at interpreting cat meows; apparently men require more meows before they notice and respond to their cats' needs.
onlinelibrary.wiley.com/doi/10.1111/...
But do we know how long this vulnerability was there? Has it existed for years, with people exploiting this route to get their papers accepted (I wouldn't be surprised at all), or is it a newish bug?
28.11.2025 15:28
I won't look up the names of the reviewers. But I would look up the names of the people who looked up the names of the reviewers.
27.11.2025 22:50
Interesting remark. I think there's a difference between looking up the names of past reviewers out of curiosity, without any consequence (e.g., not bullying them), and looking up the names of current reviewers in order to bias the process. The latter would be a big integrity failure.
28.11.2025 00:00
OpenReview was breached. The names of authors, reviewers, ACs, etc., for all past and current conferences were visible for a time, making nothing anonymous anymore. These data have been released for this year's ICLR, but I fear the same is true for the past 10 years of conferences.
28.11.2025 08:11
In case anyone missed it, an account called OpenReviewers has started posting public comments on ICLR submissions revealing the identities of reviewers. We are in a crazy time.
28.11.2025 10:13
"An old saying about such follies is that 'six months in the lab can save you an afternoon in the library'; here we may have wasted a trillion dollars and several years to rediscover what cognitive science already knew."
garymarcus.substack.com/p/a-trillion...