To enable this, we introduce MUREL, an 85.2M-token multicultural resource, and a comprehensive pipeline to separate language- vs. culture-related neurons and assess their roles via targeted ablations.
Grateful to @lukasgalke.bsky.social for his guidance and support!
05.09.2025 13:20
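The "targeted ablations" mentioned above can be illustrated with a minimal sketch: silencing a chosen subset of hidden units during the forward pass via a hook, then comparing outputs with and without the intervention. The toy model, layer sizes, and neuron indices below are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

# Hedged sketch of a targeted ablation via a PyTorch forward hook.
# Model architecture and neuron indices are made up for illustration.
torch.manual_seed(0)

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
culture_neurons = [2, 5, 11]  # hypothetical indices of culture-related units

def ablate(module, inputs, output):
    out = output.clone()
    out[..., culture_neurons] = 0.0  # silence the selected units
    return out

handle = model[1].register_forward_hook(ablate)  # hook on the ReLU layer

x = torch.randn(3, 8)
with torch.no_grad():
    y_ablated = model(x)   # forward pass with the subset silenced
handle.remove()
with torch.no_grad():
    y_full = model(x)      # unmodified forward pass for comparison

# Comparing y_full and y_ablated quantifies how much the ablated
# units contributed to the model's output on this input.
```

In the paper's setting, the same pattern would be applied to the MLP activations of a multilingual LLM, with the neuron subsets chosen by the localization pipeline rather than by hand.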
We study how cultural information is represented inside multilingual LLMs by localizing and intervening on neuron subsets across four models and six cultures: English, German, Danish, Chinese, Russian, and Persian.
05.09.2025 13:20
Danial Namazifard, Lukas Galke
Isolating Culture Neurons in Multilingual Large Language Models
https://arxiv.org/abs/2508.02241
05.08.2025 07:55
Paria Khoshtab, Danial Namazifard, Mostafa Masoudi, Ali Akhgary, Samin Mahdizadeh Sani, Yadollah Yaghoobzadeh
Comparative Study of Multilingual Idioms and Similes in Large Language Models
https://arxiv.org/abs/2410.16461
23.10.2024 05:30