Thank you! That means a lot to hear!
30.10.2025 20:24
30.10.2025 15:08
9/9 Lastly, we thank our colleagues at @alleninstitute.org and @cosynemeeting.bsky.social for their insightful feedback on an early version of this work! Happy to chat: evayixie@princeton.edu; lukasz.kusmierz@alleninstitute.org.
30.10.2025 15:01
8/ @tyrellturing.bsky.social's group recently showed that brain-like learning with exponentiated gradients naturally gives rise to log-normal connectivity distributions; our results offer a theoretical perspective that elucidates the dynamical consequences of these heavy-tailed structures.
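As a toy illustration of the general intuition behind that connection (this is not the cited group's model; the step size, number of updates, and noisy "gradients" below are placeholders): multiplicative, exponentiated-gradient-style updates add up in log space, so each log-weight tends toward a normal distribution and the weights themselves toward a log-normal.

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.ones(100_000)                      # start all weights at 1
for _ in range(500):
    # multiplicative (exponentiated-gradient-style) update with noisy placeholder "gradients"
    w *= np.exp(0.02 * rng.standard_normal(w.size))

log_w = np.log(w)
# log-weights are approximately normal, so the weights are approximately log-normal
print("log-weight mean/std:", log_w.mean(), log_w.std())
```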
30.10.2025 14:59
7/ For more details, the implications of our results for neuroscience and machine learning, and exciting future directions, please check out our full paper or visit our poster at #NeurIPS2025:
OpenReview: openreview.net/forum?id=J0S...
Code: github.com/AllenInstitu...
6/ Conclusion: Our results reveal a biologically aligned tradeoff between the robustness of dynamics and the richness of neural activity, and they provide a tractable framework for understanding dynamics in realistically sized, heavy-tailed neural circuits.
30.10.2025 14:57
5/ Result 3: However, the robustness afforded by the slow transition comes with a tradeoff: heavier tails reduce the Lyapunov dimension of the network attractor, indicating lower effective dimensionality.
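The Lyapunov dimension referenced here is conventionally computed from the ordered Lyapunov spectrum via the Kaplan-Yorke formula. A minimal sketch of just that formula (not the paper's code; the example spectrum is made up, and estimating a real spectrum from an RNN requires a QR-based method not shown here):

```python
import numpy as np

def kaplan_yorke_dimension(lyap):
    """Lyapunov (Kaplan-Yorke) dimension from a spectrum of Lyapunov exponents."""
    lyap = np.sort(np.asarray(lyap, dtype=float))[::-1]   # decreasing order
    csum = np.cumsum(lyap)
    nonneg = np.nonzero(csum >= 0)[0]
    if nonneg.size == 0:                  # even the largest exponent is negative
        return 0.0
    k = int(nonneg.max())                 # largest k with sum of first k+1 exponents >= 0
    if k == lyap.size - 1:                # whole spectrum sums to >= 0
        return float(lyap.size)
    return (k + 1) + csum[k] / abs(lyap[k + 1])

# Made-up example spectrum: two expanding directions, the rest contracting.
print(kaplan_yorke_dimension([0.3, 0.1, -0.05, -0.4, -1.0]))   # -> 3.875
```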
30.10.2025 14:57
4/ (Side note: The computational benefit of operating near the edge of chaos is well established for both feedforward and recurrent neural networks. In Appendix L we validate that this indeed translates into improved information processing on simple reservoir-computing tasks.)
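For context, here is a generic reservoir-computing sketch of the kind of task meant here (not the Appendix L setup; the discrete-time tanh reservoir, the delayed-recall task, and all hyperparameters are placeholder choices): a fixed recurrent network is driven by an input stream, and only a linear readout is trained, via ridge regression, to recall the input from a few steps back.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, delay, g = 300, 3000, 5, 1.2
W = g * rng.standard_normal((N, N)) / np.sqrt(N)   # Gaussian reservoir; swap in
                                                   # heavy-tailed weights to compare
w_in = rng.standard_normal(N)

u = rng.uniform(-1.0, 1.0, size=T)                 # random input stream
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):                                 # drive the fixed reservoir
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Train only a linear readout (ridge regression) to recall u(t - delay).
X, y = states[delay:], u[:-delay]
lam = 1e-3
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
pred = X @ w_out
print("memory-task R^2:", 1.0 - np.mean((pred - y) ** 2) / np.var(y))
```

Swapping the Gaussian reservoir for heavy-tailed weights at a matched gain is one simple way to probe how the connectivity statistics affect this kind of readout.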
30.10.2025 14:56
3/ Result 2: Compared to Gaussian networks, we find that finite heavy-tailed RNNs exhibit a broader gain regime near the edge of chaos: a *slow* transition to chaos.
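One rough way to see a broad versus sharp onset in simulation (a sketch under assumed conventions, not the paper's analysis: tanh rate units, Euler integration, and an N^(-1/alpha) weight scaling): sweep the gain g for Gaussian (alpha = 2) and heavy-tailed (here alpha = 1.5) weights and track a simple activity measure such as the time-averaged variance of the rates.

```python
import numpy as np
from scipy.stats import levy_stable

def mean_activity_vs_gain(alpha, gains, N=300, T=200.0, dt=0.1, seed=0):
    """Time-averaged rate variance after a transient, for each gain g."""
    rng = np.random.default_rng(seed)
    # scale = 2**-0.5 so that alpha = 2 matches a standard normal; the
    # N**(-1/alpha) factor is the stable-law analogue of 1/sqrt(N) (assumption).
    W0 = levy_stable.rvs(alpha, 0.0, scale=2 ** -0.5, size=(N, N),
                         random_state=rng) * N ** (-1.0 / alpha)
    out = []
    for g in gains:
        x = 0.1 * rng.standard_normal(N)
        acts = []
        for step in range(int(T / dt)):
            x = x + dt * (-x + g * (W0 @ np.tanh(x)))   # Euler step of dx/dt = -x + gW tanh(x)
            if step * dt > T / 2:                       # discard the transient
                acts.append(np.var(np.tanh(x)))
        out.append(np.mean(acts))
    return np.array(out)

gains = np.linspace(0.5, 2.0, 7)
print("alpha = 2.0:", np.round(mean_activity_vs_gain(2.0, gains), 4))
print("alpha = 1.5:", np.round(mean_activity_vs_gain(1.5, gains), 4))
```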
30.10.2025 14:56
2/ Result 1: While mean-field theory for the infinite system predicts ubiquitous chaos, our analysis reveals that *finite-size* RNNs have a sharp transition between quiescent and chaotic dynamics.
We theoretically predict the transition gain and validate it through simulations.
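A standard way to locate such a transition numerically (again a generic sketch, not the paper's method; the integration scheme, weight scaling, and parameters are assumptions): estimate the largest Lyapunov exponent of a single finite network by evolving two nearby trajectories with periodic renormalization, then scan the gain g for where the estimate crosses zero.

```python
import numpy as np
from scipy.stats import levy_stable

def max_lyapunov(W, T=400.0, dt=0.1, eps=1e-7, burn=100.0, seed=1):
    """Largest Lyapunov exponent via two nearby trajectories (Benettin-style)."""
    rng = np.random.default_rng(seed)
    N = W.shape[0]
    x = 0.1 * rng.standard_normal(N)
    for _ in range(int(burn / dt)):                 # settle onto the attractor first
        x = x + dt * (-x + W @ np.tanh(x))
    u = rng.standard_normal(N)
    y = x + eps * u / np.linalg.norm(u)             # perturbed copy at distance eps
    log_growth, steps = 0.0, int(T / dt)
    for _ in range(steps):
        x = x + dt * (-x + W @ np.tanh(x))
        y = y + dt * (-y + W @ np.tanh(y))
        d = y - x
        dist = np.linalg.norm(d)
        log_growth += np.log(dist / eps)
        y = x + d * (eps / dist)                    # renormalize the separation
    return log_growth / (steps * dt)

N, alpha = 300, 1.5
rng = np.random.default_rng(0)
W0 = levy_stable.rvs(alpha, 0.0, scale=2 ** -0.5, size=(N, N),
                     random_state=rng) * N ** (-1.0 / alpha)
for g in [0.8, 1.2, 1.6, 2.0]:
    print(f"g = {g}: lambda_max ~ {max_lyapunov(g * W0):+.3f}")
```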
1/ Setup: With @mihalas.bsky.social and Lukasz Kusmierz, we study RNNs with weights drawn from biologically plausible Lévy alpha-stable distributions, which generalize the Gaussian distribution to heavy tails.
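A minimal sketch of this setup for readers who want to experiment (not the released code; the tanh rate dynamics and the N^(-1/alpha) scaling of the stable weights are assumptions, so the paper and the linked repo are the reference for the exact model):

```python
import numpy as np
from scipy.stats import levy_stable

def heavy_tailed_weights(N, alpha, g, seed=0):
    """i.i.d. symmetric alpha-stable weights; alpha = 2 recovers the Gaussian case."""
    rng = np.random.default_rng(seed)
    # scale = 2**-0.5 makes alpha = 2 coincide with a standard normal;
    # N**(-1/alpha) is the stable-law analogue of the usual 1/sqrt(N) scaling.
    W = levy_stable.rvs(alpha, 0.0, scale=2 ** -0.5, size=(N, N), random_state=rng)
    return g * W * N ** (-1.0 / alpha)

def simulate(W, T=200.0, dt=0.1, seed=1):
    """Euler integration of dx/dt = -x + W tanh(x)."""
    rng = np.random.default_rng(seed)
    x = 0.1 * rng.standard_normal(W.shape[0])
    traj = np.empty((int(T / dt), W.shape[0]))
    for t in range(traj.shape[0]):
        x = x + dt * (-x + W @ np.tanh(x))
        traj[t] = x
    return traj

traj = simulate(heavy_tailed_weights(N=500, alpha=1.5, g=1.2))
print(traj.shape, np.tanh(traj).std())
```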
30.10.2025 14:55
https://tinyurl.com/heavyrnn
Connectome data suggest that the brain's synaptic weights follow heavy-tailed distributions, yet most analyses of RNNs assume Gaussian connectivity.
Our @alleninstitute.org #NeurIPS2025 paper shows that heavy-tailed weights can strongly affect dynamics, trading off robustness against attractor dimension.
Slow Transition to Low-Dimensional Chaos in Heavy-Tailed Recurrent Neural Networks https://www.biorxiv.org/content/10.1101/2025.10.24.684386v1
25.10.2025 06:15
Only 4 days left to submit to Data on the Brain and Mind!
Don't miss your chance to contribute to our Findings or Tutorial tracks.
We're excited to feature oral presentations in both tracks!
Deadline Extended
The submission deadline for the Data on the Brain & Mind Workshop (NeurIPS 2025) has been extended to Sep 8 (AoE)!
We invite you to submit your findings or tutorials via the OpenReview portal:
openreview.net/group?id=Neu...
10 days left to submit to the Data on the Brain & Mind Workshop at #NeurIPS2025!
Call for:
• Findings (4 or 8 pages)
• Tutorials
If you're submitting to ICLR or NeurIPS, consider submitting here too, and highlight how to use a cog neuro dataset in our tutorial track!
data-brain-mind.github.io
Excited to announce our #NeurIPS2025 Workshop: Data on the Brain & Mind
Call for: Findings (4- or 8-page) + Tutorials tracks
Speakers include @dyamins.bsky.social @lauragwilliams.bsky.social @cpehlevan.bsky.social
Learn more: data-brain-mind.github.io