🧵 7/7
Given our results, these tasks will hopefully also serve as proper benchmarks for the TDL and GeomDL communities.
💻 Code, data, and tutorial coming soon!
@francescadomin8
@clabat9.bsky.social
🧵 6/7
🧠 Based on these theoretical insights, we test SSNs on brain dynamics classification tasks, showing large improvements over existing methods: up to 50% over vanilla message-passing GNNs and up to 27% over the second-best models.
🧵 5/7
✅
We use SSNs to bridge neurotopology and deep learning, marking a first-time connection between the two. In particular, we prove that SSNs can recover several key topological invariants that are critical for characterizing brain activity.
🧵 4/7
✅
SSNs can be implemented with any differentiable approach, not only message passing.
✅
We also introduce Routing-SSNs (R-SSNs), lightweight, scalable variants that dynamically select the most relevant interactions in a learnable way.
🧵 3/7
Contribution:
✅
We present Semi-Simplicial Neural Networks (SSNs), models that operate on semi-simplicial sets and constitute the most comprehensive deep learning framework to date for capturing higher-order directed topological interactions in data.
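For readers meeting the term for the first time: a semi-simplicial set, in the standard textbook sense (the notation below is generic and not taken from the paper), is a graded family of cells with face maps but no degeneracy maps,

\[
X=\{X_n\}_{n\ge 0},\qquad d_i\colon X_n\to X_{n-1}\ \ (0\le i\le n),\qquad d_i\, d_j = d_{j-1}\, d_i \ \ \text{for } i<j.
\]

Unlike in a simplicial complex, a cell is not determined by its vertex set and its faces come with an ordering, which is what makes directed higher-order interactions representable.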
🧵 2/7
Thank you all, especially
@manuel_lecha, who has been the incredibly talented driving force behind this work. I will not forget our endless conversations at every time of day and night! 🤣
🧵 1/7
Our new preprint "Directed Semi-Simplicial Learning with Applications to Brain Activity Decoding" is online at arxiv.org/abs/2505.17939
An amazing project with a dream team, which led to a principled Topological Deep Learning architecture for a unique real-world use case!
Thank you Guillermo Bernárdez, @clabat9.bsky.social @ninamiolane.bsky.social
for making this work possible!
@geometric-intel.bsky.social @ucsb.bsky.social
TopoTune takes any neural network as input and builds the most general TDL model to date, complete with permutation equivariance and unparalleled expressivity.
Thanks to its implementation in TopoBench, defining and training these models requires only a few lines of code.
TopoTune is going to ICML 2025! 🇨🇦
Curious to try topological deep learning with your custom GNN or your specific dataset? We built this for you! Find out how to get started at geometric-intelligence.github.io/topotune/
📢📢📢 "The Relativity of Causal Knowledge"
THANK YOU to @clabat9.bsky.social for working side by side with me on this exciting project and making this article possible. Big thanks also to @hansmriess.bsky.social and Fabio Massimo Zennaro for their valuable feedback on an earlier version of the paper.
When we view causality subjectively, it turns into relative causal knowledge, much like a fact can seem like an opinion when seen subjectively. Still, causality is causality and a fact remains a fact, but it becomes understandable only when viewed within the entire network.
🧵 9/9
Each subject in a network of relations then has its own subjective Causal Knowledge, BUT it can be accessed by the other subjects of the network only through their own perspective. Imagine how important this is in a network of AI agents (or in any human network).
🧵 8/9
Overall, we used these tools to go beyond Structural Causal Models as we usually understand them and to define a broad notion of Causal Knowledge.
🧵 7/9
Although these are sophisticated mathematical frameworks, we invite any ML practitioner to read this work: it is self-contained and has immediate methodological and practical implications.
🧵 6/9
We believe this was a necessary step toward a better understanding of causality, and it will have significant implications for AI.
The moment our conceptual goal was clear, we found the technical tools needed to implement it: network sheaves and category theory.
🧵 5/9
By stripping causality of its oracular and absolute meaning, the relativity of causal knowledge situates it within a different ontological setting, where truth is not monolithic but emerges inevitably and relatively from a set of relationships.
🧵 4/9
We ended up converging on a simple but technically unexplored concept: any causal model is an imperfect and subjective representation of the world, and it cannot be severed from the network of relations the subject is immersed in.
🧵 3/9
One day, Gabriele and I were talking about Grothendieck and his relativism, and we asked ourselves how his approach could be philosophically framed and how it could be used for causal theory.
🧵 2/9
"The Relativity of Causal Knowledge"
arxiv.org/abs/2503.11718
Frankly, this was the most fun paper I have ever worked on. A big thank you to @officiallydac, who was amazing at turning our vision into a technically sound reality.
(Spoiler: more categories and sheaves for you!)
🧵 1/9
🔥 According to the @IPCC_CH, over 40% of the global population (about 3.5 billion people) live in contexts of extreme climate vulnerability.
The IPCC also identified 127 risks that affect every aspect of everyone's private, social, and economic life.
Everyone.
🧵 10/10
These numbers help dismantle the false narrative of a Western world under siege by migrants and highlight that most migration occurs within or around the regions most affected.
🧵 9/10
Contrary to the rhetoric of an "invasion," the Report points out that:
• 76 million people, the majority of those 120 million, are internally displaced in their own countries;
• 69% of recognized refugees are located in countries neighboring those in crisis.
🧵 8/10
🚶‍♂️ It is no coincidence that @Refugees estimates that, in 2024, around 120 million people worldwide are forcibly displaced, and that 75% of them come from countries with high exposure to climate risks.
🧵 7/10
This finding underscores how the effects of global warming (droughts, desertification, extreme weather events, and the depletion of agricultural land) are directly linked to individuals' migratory choices.
🧵 6/10
A survey of 348 migrants interviewed in various reception centers in Italy found that 69% of those classified as "economic migrants" still identify climate change as one of the factors contributing to their decision to move.
🧵 5/10