Alessandro Ingrosso

@aingrosso.bsky.social

Theoretical neuroscience, machine learning and spin glasses. Assistant professor at Donders Institute, Nijmegen, The Netherlands. Website: https://aleingrosso.github.io/

239 Followers  |  514 Following  |  7 Posts  |  Joined: 25.11.2024

Latest posts by aingrosso.bsky.social on Bluesky

Interested in doing a Ph.D. to work on building models of the brain/behavior? Consider applying to graduate schools at CU Anschutz:
1. Neuroscience www.cuanschutz.edu/graduate-pro...
2. Bioengineering engineering.ucdenver.edu/bioengineeri...

You could work with several comp neuro PIs, including me.

27.09.2025 20:30 — 👍 52    🔁 30    💬 1    📌 4
PhD Position: Theory of Learning in Artificial and Biologically Inspired Neural Networks | Radboud University. Do you want to work as a PhD candidate on the theory of learning in artificial and biologically inspired neural networks? Check our vacancy!

Please RT - Open PhD position in my group at the Donders Center for Neuroscience, Radboud University.

We're looking for a PhD candidate interested in developing theories of learning in neural networks.

Applications are open until October 20th.

For more info: www.ru.nl/en/working-a...

22.09.2025 17:17 — 👍 14    🔁 13    💬 2    📌 1
Home The school will open the thematic period on Data Science and will be dedicated to the mathematical foundations and methods for high-dimensional data analysis. It will provide an in-depth introduction ...

Just got back from a great summer school at Sapienza University sites.google.com/view/math-hi... where I gave a short course on Dynamics and Learning in RNNs. I compiled a (very biased) list of recommended readings on the subject, for anyone interested: aleingrosso.github.io/_pages/2025_...

15.09.2025 11:57 — 👍 15    🔁 2    💬 0    📌 1

With all the sad developments in the US - go study in the Netherlands: relatively low tuition (and an adequate job-search visa after graduating) for high-quality programs like this one in Neurophysics or Cognitive Neuroscience at the Donders neuroscience hub

27.05.2025 19:57 — 👍 6    🔁 2    💬 0    📌 1

Fantastic. Congrats Will.

15.05.2025 15:54 — 👍 1    🔁 0    💬 1    📌 0
Statistical Mechanics of Transfer Learning in Fully Connected Networks in the Proportional Limit Tools from spin glass theory such as the replica method help explain the efficacy of transfer learning.

Our paper on the statistical mechanics of transfer learning is now published in PRL. Franz-Parisi meets Kernel Renormalization in this nice collaboration with friends in Bologna (F. Gerace) and Parma (P. Rotondo, R. Pacelli).
journals.aps.org/prl/abstract...

01.05.2025 16:13 — 👍 7    🔁 2    💬 0    📌 0
External seminar - Alessandro Ingrosso (Radboud University, NL) | QBio:

🇳🇱 For the next FRESK seminar, Alessandro Ingrosso (Radboud University, NL) will give a lecture on "Statistical mechanics of transfer learning in the proportional limit"

@aingrosso.bsky.social

More info on QBio's website! ⬇️
qbio.ens.psl.eu/en/events/ex...

28.03.2025 18:51 — 👍 0    🔁 1    💬 0    📌 0
Announcing our StatPhys29 Satellite Workshop "Molecular biophysics at the transition state: from statistical mechanics to AI" to be held in Trento, Italy, from July 7th to 11th, 2025: indico.ectstar.eu/event/252/.
Co-organized with Raffaello Potestio and his lab in Trento.

11.03.2025 13:38 — 👍 2    🔁 0    💬 0    📌 0
On How Iterative Magnitude Pruning Discovers Local Receptive Fields in Fully Connected Neural Networks Since its use in the Lottery Ticket Hypothesis, iterative magnitude pruning (IMP) has become a popular method for extracting sparse subnetworks that can be trained to high performance. Despite this, t...

New paper with @aingrosso.bsky.social @sebgoldt.bsky.social and Zhangyang Wang, "On How Iterative Magnitude Pruning Discovers Local Receptive Fields in Fully Connected Neural Networks", accepted at the Conference on Parsimony and Learning (CPAL) arxiv.org/abs/2412.06545 1/

27.02.2025 14:44 — 👍 2    🔁 1    💬 1    📌 0
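For readers unfamiliar with the method named in the title: iterative magnitude pruning (IMP) with weight rewinding, as popularized by the Lottery Ticket Hypothesis, alternates training, pruning the smallest-magnitude weights, and rewinding the survivors to their initial values. A minimal generic sketch on a toy linear model — this is an illustration of the general recipe, not the paper's actual setup; all hyperparameters and helper names here are made up:

```python
import numpy as np

def train(w, mask, X, y, lr=0.1, steps=200):
    """Gradient descent on squared loss, keeping pruned weights at zero."""
    for _ in range(steps):
        grad = X.T @ (X @ (w * mask) - y) / len(y)
        w = (w - lr * grad) * mask
    return w

def imp(X, y, rounds=3, prune_frac=0.5, seed=0):
    """Iterative magnitude pruning with rewinding to the initial weights."""
    rng = np.random.default_rng(seed)
    w_init = rng.normal(size=X.shape[1])
    mask = np.ones_like(w_init)
    for _ in range(rounds):
        w = train(w_init.copy(), mask, X, y)
        # prune the smallest-magnitude surviving weights
        alive = np.flatnonzero(mask)
        k = int(len(alive) * prune_frac)
        drop = alive[np.argsort(np.abs(w[alive]))[:k]]
        mask[drop] = 0.0
    # final retrain of the "winning ticket" from the original init
    return train(w_init.copy(), mask, X, y), mask
```

With `prune_frac=0.5` and three rounds, the surviving subnetwork keeps 1/8 of the weights; on data generated by a single-feature teacher, the pruning rounds isolate that feature.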
Density of states in neural networks: an in-depth exploration of... Learning in neural networks critically hinges on the intricate geometry of the loss landscape associated with a given task. Traditionally, most research has focused on finding specific weight...

Our paper on density of states in NNs is now published in TMLR. We show how the loss landscape in simple learning problems can be characterized by Wang-Landau sampling. A nice collaboration with the Potestio Lab in Trento, at the interface between ML and soft-matter.
openreview.net/forum?id=BLD...

18.02.2025 13:20 — 👍 3    🔁 0    💬 0    📌 0
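Wang-Landau sampling, mentioned in the post above, estimates the density of states g(E) by a random walk whose acceptance rule is biased to visit all energy levels equally often, accumulating ln g(E) on the fly. A minimal generic sketch on a toy system with integer energy levels — not the paper's neural-network setting; the flatness criterion and schedule below are illustrative defaults:

```python
import numpy as np

def wang_landau(energy, propose, s0, n_levels,
                sweeps=20000, f_init=1.0, f_min=1e-3, flat=0.8, seed=0):
    """Estimate ln g(E) over integer energy levels 0..n_levels-1."""
    rng = np.random.default_rng(seed)
    ln_g = np.zeros(n_levels)   # running estimate of ln g(E)
    hist = np.zeros(n_levels)   # visit histogram for the flatness check
    s, e = s0.copy(), energy(s0)
    f = f_init                  # modification factor for ln g updates
    while f > f_min:
        for _ in range(sweeps):
            s_new = propose(s, rng)
            e_new = energy(s_new)
            # accept with prob min(1, g(e)/g(e_new)): flattens visits in E
            if np.log(rng.random()) < ln_g[e] - ln_g[e_new]:
                s, e = s_new, e_new
            ln_g[e] += f
            hist[e] += 1
        visited = hist > 0
        if hist[visited].min() > flat * hist[visited].mean():
            f /= 2.0            # histogram flat enough: refine f
            hist[:] = 0
    return ln_g - ln_g.min()    # ln g up to an additive constant
```

On a toy system of N independent binary spins with E = number of up spins, the estimate should approach the exact ln g(E) = ln C(N, E).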
Event-based backpropagation on the neuromorphic platform SpiNNaker2 Neuromorphic computing aims to replicate the brain's capabilities for energy efficient and parallel information processing, promising a solution to the increasing demand for faster and more efficient ...

🤖 🧠 🧪 New #Preprint Alert! Imagine AI systems that can learn and adapt on-chip with minimal energy usage. We've just made a step towards unlocking the final piece of the puzzle needed to deploy neuromorphic hardware at scale using SpiNNaker2! (1/8)
arxiv.org/abs/2412.15021

28.01.2025 20:06 — 👍 28    🔁 12    💬 1    📌 1

New paper with @leonlufkin.bsky.social and @eringrant.bsky.social!

Why do we see localized receptive fields so often, even in models without sparsity regularization?

We present a theory in the minimal setting from @aingrosso.bsky.social and @sebgoldt.bsky.social

13.12.2024 10:49 — 👍 28    🔁 8    💬 0    📌 0

Excess kurtosis strikes back.

13.12.2024 08:56 — 👍 5    🔁 0    💬 0    📌 0