Flavio Martinelli

@flavioh.bsky.social

I like brains 🧟‍♂️🧠. PhD student in computational neuroscience supervised by Wulfram Gerstner and Johanni Brea. https://flavio-martinelli.github.io/

122 Followers 170 Following 10 Posts Joined Oct 2023
1 month ago
Measuring and Controlling Solution Degeneracy Across Task-Trained Recurrent Neural Networks - Kempner Institute Despite reaching equal performance success when trained on the same task, artificial neural networks can develop dramatically different internal solutions, much like different students solving the sam...

πŸ€–πŸ“Š NEW in the Deeper Learning blog: @annhuang42.bsky.social & @kanakarajanphd.bsky.social break down their recent work examining how #RNNs solve the same task in different ways, and why that matters. Joint work with @satpreetsingh.bsky.social & @flavioh.bsky.social bit.ly/4kj4fVd #NeuroAI

3 months ago

[bonus] Here's a function that two neurons in a channel can implement

3 months ago
Flat Channels to Infinity in Neural Loss Landscapes The loss landscapes of neural networks contain minima and saddle points that may be connected in flat regions or appear in isolation. We identify and characterize a special structure in the loss lands...

More interesting details can be found in the paper: arxiv.org/abs/2506.14951

Or come by our poster if you're at NeurIPS (Session 3, poster #4200)

Wonderful team with Alex Van Meegen @avm.bsky.social, Berfin Simsek, Wulfram Gerstner @gerstnerlab.bsky.social and Johanni Brea

3 months ago

But what happens with standard gradient descent?

Channels to infinity get sharper as O(γ²); this is a clear example of the edge-of-stability phenomenon:
gradient descent does not converge to a minimum (at infinity) but gets stuck where the sharpness of the channel is 2/η (η: learning rate)
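The 2/η threshold can be seen in a minimal toy setting (my sketch, not the paper's experiment): on a quadratic with curvature a, the gradient-descent update multiplies the parameter by (1 − ηa) each step, so iterates shrink exactly when the sharpness a stays below 2/η.

```python
# Toy illustration (assumed setup): gradient descent on f(x) = a*x**2/2,
# whose sharpness (second derivative) is a. The update x <- x*(1 - eta*a)
# is stable iff |1 - eta*a| < 1, i.e. iff a < 2/eta -- the edge of stability.
def gd_converges(a, eta, x0=1.0, steps=200):
    x = x0
    for _ in range(steps):
        x -= eta * a * x          # gradient step on f(x) = a*x^2/2
    return abs(x) < abs(x0)       # shrank -> stable, grew -> unstable

eta = 0.1                          # learning rate; threshold 2/eta = 20
print(gd_converges(19.0, eta))     # sharpness just below 2/eta -> True
print(gd_converges(21.0, eta))     # sharpness just above 2/eta -> False
```

In the channel, the sharpness grows along the trajectory, so gradient descent stalls right where it hits this threshold.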

3 months ago

These channels are surprisingly common in MLPs: we find that they account for a significant proportion of all minima reached in our training runs

But they can only be spotted by training for a long time, following the gradient flow with ODE solvers

3 months ago

But what do these pairs of neurons compute?
In the limit of γ→∞ and ε→0 (where ε is the distance between the two neurons' input weights), they compute a directional derivative!

The MLP is learning to implement a Gated Linear Unit, with a non-linearity that is the derivative of the original
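A numerical sketch of this limit (my illustration, with made-up weights): pair two tanh neurons with input weights w ± (ε/2)u and output weights ±γ, keep the product γε fixed, and their summed output converges to a gated, derivative-based term as ε→0.

```python
import math

# Illustrative sketch (not the paper's code): a pair of neurons with input
# weights w +/- (eps/2)*u and output weights +/-gamma. With c = gamma*eps
# held fixed, their sum approaches the directional-derivative term
#   c * tanh'(w.x) * (u.x)
# i.e. a GLU-like unit whose non-linearity is the derivative of tanh.
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def two_neuron_pair(x, w, u, gamma, eps):
    wp = [wi + 0.5 * eps * ui for wi, ui in zip(w, u)]
    wm = [wi - 0.5 * eps * ui for wi, ui in zip(w, u)]
    return gamma * (math.tanh(dot(wp, x)) - math.tanh(dot(wm, x)))

def glu_limit(x, w, u, c):
    z = dot(w, x)                        # tanh'(z) = 1 - tanh(z)**2
    return c * (1.0 - math.tanh(z) ** 2) * dot(u, x)

x, w, u = [0.3, -0.7], [1.0, 0.5], [0.2, -0.4]   # arbitrary example values
c = 2.0
for eps in (1e-1, 1e-3):
    gamma = c / eps                      # gamma -> infinity as eps -> 0
    gap = abs(two_neuron_pair(x, w, u, gamma, eps) - glu_limit(x, w, u, c))
    print(f"eps={eps:g}  |pair - limit| = {gap:.2e}")
```

The gap shrinks as O(ε²), matching the finite-difference picture of the directional derivative.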

3 months ago

Here are some more pictures from different angles

3 months ago

When perturbing networks from their saddle points, gradient trajectories get stuck in nearby channels that run parallel to the saddle line

The gradient dynamics are simple: after a first phase of alignment, trajectories are straight and Ξ³β†’βˆž

3 months ago

These channels are parallel to lines of saddle points arising from permutation symmetries, as described by Fukumizu & Amari in 2000

Saddles can be formed by taking a network at a local minimum and splitting a neuron's contribution into two, with splitting factor Ξ³
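The splitting construction is easy to check numerically (my sketch, with illustrative values): duplicate a hidden neuron, keep its input weight, and share its output weight between the two copies as γa and (1−γ)a; the network function is identical for every γ, giving a whole line of critical points.

```python
import math

# Sketch of the Fukumizu-Amari split (illustrative, not the paper's code):
# replace one hidden neuron (input weight w, output weight a) with two copies
# sharing the same input weight, with output weights gamma*a and (1-gamma)*a.
# The function is unchanged for every gamma -> a line of saddle points.
def one_neuron(x, w, a):
    return a * math.tanh(w * x)

def split_neuron(x, w, a, gamma):
    return gamma * a * math.tanh(w * x) + (1.0 - gamma) * a * math.tanh(w * x)

w, a = 0.8, 1.5                       # arbitrary example weights
for gamma in (-2.0, 0.3, 5.0):
    for x in (-1.0, 0.0, 2.5):
        assert abs(one_neuron(x, w, a) - split_neuron(x, w, a, gamma)) < 1e-12
print("split leaves the network function unchanged for every gamma")
```

The channels described above run parallel to this saddle line, which is why γ is the natural coordinate along them.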

3 months ago

🧡Excited to present our latest work at #Neurips25! Together with @avm.bsky.social, we discover 𝐜𝐡𝐚𝐧𝐧𝐞𝐥𝐬 𝐭𝐨 𝐢𝐧𝐟𝐢𝐧𝐢𝐭𝐲: regions in neural networks' loss landscapes where parameters diverge to infinity (in regression settings!)

We find that MLPs in these channels can take derivatives and compute GLUs 🀯

3 months ago

πŸ“Excited to share that our paper was selected as a Spotlight at #NeurIPS2025!

arxiv.org/pdf/2410.03972

It started from a question I kept running into:

When do RNNs trained on the same task converge/diverge in their solutions?
πŸ§΅β¬‡οΈ

5 months ago
Male CNS Connectome A team of researchers has unveiled the complete connectome of a male fruit fly central nervous system β€”a seamless map of all the neurons in the brain and nerve cord of a single male fruit fly and the ...

Exciting news for #drosophila #connectomics and #neuroscience enthusiasts: the Drosophila male central nervous system connectome is now live for exploration. Find out more at the landing page hosted by our Janelia FlyEM collaborators www.janelia.org/project-team....

5 months ago

Lab members are at the Bernstein conference @bernsteinneuro.bsky.social with 9 posters! Here’s the list:

TUESDAY 16:30 – 18:00

P1 62 β€œMeasuring and controlling solution degeneracy across task-trained recurrent neural networks” by @flavioh.bsky.social

9 months ago

To our fellow researchers at Harvard and elsewhere. πŸ§ͺ🧠

I have funds for visiting PhDs or postdocs at TU in Vienna. For a short stay or a full PhD, email me.

For professors: check, for instance, this tenure-track opening, or ask in private for options
informatics.tuwien.ac.at/news/2909

1 year ago

Isn't NeuroAI a modern rebranding of computational neuroscience?
My take is that NeuroAI just sounds a little broader as a term, incorporating cognition and behaviour into the picture (which were not so accurately modelled before ANNs).
To me the goals of compneuro and NeuroAI are fully overlapping.
