Kosio Beshkov

@kosiob.bsky.social

Postdoc at the University of Oslo. Topology, machine learning, neuroscience and proteins. Homotopy equivalent to a scientist.

38 Followers  |  140 Following  |  7 Posts  |  Joined: 22.11.2024

Latest posts by kosiob.bsky.social on Bluesky

YouTube video by Mikkel Lepperød: Presenting the Active Neural Cellular Automata (ANCA)

🔬 Excited to share our new paper: "Sensor Movement Drives Emergent Attention and Scalability in Active Neural Cellular Automata"

We found that when neural systems can move their sensors (like animals do!), they develop attention-like behaviors without being explicitly programmed to do so. Post 1/6

31.03.2025 13:16 — 👍 4    🔁 5    💬 1    📌 0
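For intuition, here is a minimal, self-contained sketch of the general idea: a grid of cells updated by a small local rule, plus a sensor patch that moves over its input. Everything here (shapes, weights, and the saliency-driven movement rule) is an illustrative assumption, not the paper's ANCA implementation.

```python
# Toy "cellular automaton with a movable sensor" sketch; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
world = rng.random((32, 32))          # environment the agent observes
state = np.zeros((8, 8, 4))           # per-cell hidden state of the CA
pos = np.array([12, 12])              # top-left corner of the sensor patch
W = rng.normal(scale=0.1, size=(5, 4))  # tiny per-cell update weights

for step in range(100):
    patch = world[pos[0]:pos[0]+8, pos[1]:pos[1]+8]           # what the sensor sees
    inp = np.concatenate([patch[..., None], state], axis=-1)  # (8, 8, 5)
    state = np.tanh(inp @ W)                                  # local CA update
    # Movement rule (assumed): drift toward the strongest activation,
    # a crude stand-in for the attention-like behavior described above.
    i, j = np.unravel_index(np.abs(state[..., 0]).argmax(), (8, 8))
    pos = np.clip(pos + np.array([i - 4, j - 4]), 0, 32 - 8)
```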

This method is intrinsic and does not require choosing a metric. It can also be used to track how homology groups change through the layers of a network, and there is much more in the paper. We even discuss some limitations of the method. Let me know what you think!
7/7

06.02.2025 16:57 — 👍 0    🔁 0    💬 0    📌 0
Post image

In other words, we are talking about quotient spaces. Therefore, we can define a relative homology theory for ReLU neural networks! Going back to the (non)circle example: with relative homology we get the right result, and we can even see which points are glued together by the network.
6/7

06.02.2025 16:57 — 👍 0    🔁 0    💬 1    📌 0
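To make the gluing story concrete, here is a toy computation (my own illustration, not code from the paper): simplicial homology over the rationals of a circle modeled as a triangle, before and after identifying two of its vertices. The Betti numbers jump from those of a circle to those of a wedge of two circles, exactly what the quotient picture predicts.

```python
# Homology of a circle vs. the circle with two points glued, via
# boundary-matrix ranks over Q. Illustrative only.
import numpy as np

def betti(d1, n_vertices):
    """Betti numbers b0, b1 of a 1-complex from its vertex-edge boundary matrix."""
    r = np.linalg.matrix_rank(d1)
    b0 = n_vertices - r              # connected components
    b1 = d1.shape[1] - r             # independent cycles (no 2-cells here)
    return b0, b1

# Circle: vertices v0, v1, v2; edges (v0,v1), (v1,v2), (v2,v0)
d1_circle = np.array([[-1,  0,  1],
                      [ 1, -1,  0],
                      [ 0,  1, -1]])
print(betti(d1_circle, 3))   # (1, 1): one component, one loop

# Glue v0 ~ v1: vertices {w, v2}; the edge (v0,v1) becomes a loop at w.
d1_glued = np.array([[0, -1,  1],    # row w
                     [0,  1, -1]])   # row v2
print(betti(d1_glued, 2))    # (1, 2): a wedge of two circles, as H(X, A) predicts
```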
Post image

Even better, if the intersections between the polyhedra and the data manifold are convex, then only the manifold itself has an impact on the homology groups. This is because we can contract such regions and end up with a homotopy-equivalent space.
5/7

06.02.2025 16:57 — 👍 0    🔁 0    💬 1    📌 0
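The contraction argument echoes the nerve theorem; as a reference point, a rough statement in that spirit (my paraphrase, not a quote from the paper):

```latex
% If each U_a = P_a \cap M is convex, then so is every finite intersection
% of the U_a, so the cover is "good" and a nerve-theorem-style argument gives
\[
  M \;\simeq\; \mathcal{N}\bigl(\{\, U_\alpha = P_\alpha \cap M \,\}\bigr),
  \qquad\text{hence}\qquad
  H_n(M) \;\cong\; H_n\bigl(\mathcal{N}(\{U_\alpha\})\bigr),
\]
% i.e. the polyhedral decomposition itself contributes nothing to homology.
```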
Post image

It turns out that all topological changes to a manifold that occur through the layers of a ReLU network happen for one of two reasons:
1. The network is low rank over a polyhedron.
2. The network maps different polyhedra to each other by gluing them.
4/7

06.02.2025 16:57 — 👍 0    🔁 0    💬 1    📌 0
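Mechanism 1 is easy to probe numerically: on each polyhedron a one-hidden-layer ReLU net is affine with linear part W2·diag(d)·W1, where d is that region's on/off pattern, and the rank of this matrix tells you whether the region can collapse the manifold. A small sketch with toy random weights (illustrative, not the paper's code):

```python
# Rank of the local affine map of a tiny ReLU net on each activation region.
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(16, 2)), rng.normal(size=16)
W2 = rng.normal(size=(2, 16))

def local_rank(x):
    """Rank of the affine map the network applies on x's polyhedron."""
    d = (W1 @ x + b1 > 0).astype(float)        # activation pattern = region id
    return np.linalg.matrix_rank(W2 @ np.diag(d) @ W1)

xs = rng.uniform(-3, 3, size=(1000, 2))
ranks = np.array([local_rank(x) for x in xs])
# Regions where the rank drops below the input dimension can collapse (and
# hence change the topology of) the part of the manifold inside them.
print(np.bincount(ranks, minlength=3))         # how many samples see rank 0/1/2
```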
Post image

In the specific case of ReLU neural networks, we know that they are equivalent to continuous piecewise-linear functions and can be split into affine functions over convex polyhedra. This is called a polyhedral decomposition, and it looks really crazy (Cubism is the obvious reference here).
3/7

06.02.2025 16:57 — 👍 1    🔁 0    💬 1    📌 0
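You can reproduce a picture in this spirit with a few lines of NumPy (an illustrative sketch, not the paper's code): label every point of a grid by the ReLU activation pattern of a tiny random net. Points sharing a pattern lie in the same convex polyhedron, where the network is a single affine map.

```python
# Polyhedral decomposition of a one-hidden-layer ReLU net over a 2D grid.
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)

xx, yy = np.meshgrid(np.linspace(-2, 2, 400), np.linspace(-2, 2, 400))
X = np.stack([xx.ravel(), yy.ravel()], axis=1)   # grid of inputs

patterns = (X @ W1.T + b1 > 0)                   # (N, 8) on/off bits per point
region_id = patterns @ (1 << np.arange(8))       # encode each pattern as an int
print("polyhedra hit by the grid:", len(np.unique(region_id)))
# plt.imshow(region_id.reshape(400, 400)) renders the "Cubism" picture.
```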
Post image

This happens because we don't know the real metric underlying the data manifold. So we end up classifying manifolds like the one below as a circle. Looks kind of right, but we know better.
2/7

06.02.2025 16:57 — 👍 0    🔁 0    💬 1    📌 0
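For a concrete sense of the standard pipeline, here is how one would run it with the ripser package (the point cloud is my own illustrative choice): Vietoris-Rips persistence only ever sees the ambient Euclidean metric, so a noisy sampled loop comes back with one dominant H1 class, i.e. "a circle".

```python
# Persistent homology of a sampled loop via Vietoris-Rips (pip install ripser).
# The computation depends entirely on pairwise Euclidean distances, which is
# the post's point: with the wrong metric, "circle" is all it can report.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(3)
t = rng.uniform(0, 2 * np.pi, 400)
X = np.stack([np.cos(t), np.sin(t)], axis=1) + rng.normal(scale=0.05, size=(400, 2))

h1 = ripser(X, maxdim=1)['dgms'][1]                  # H1 persistence diagram
lifetimes = h1[:, 1] - h1[:, 0]
print("top H1 lifetimes:", np.sort(lifetimes)[-3:])  # one bar dominates
```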

New preprint out and my first ever blueprint(?)! Persistent homology is an amazing tool that lets us compute homology groups. But when we have highly nonlinear maps, like those in neural networks, it becomes hard to use. And that is annoying...
arxiv.org/pdf/2502.01360
1/7

06.02.2025 16:57 — 👍 6    🔁 0    💬 1    📌 0
