The best part though: Working with amazing graduate student *Maria Servetnik* from our lab, who did all the heavy lifting. Not to mention lots of inspiration & input from @mjwolff.bsky.social. I am one very lucky PI. n/n
21.01.2026 12:47
Visual representations in the human brain rely on a reference frame that is in between allocentric and retinocentric coordinates
Visual information in our everyday environment is anchored to an allocentric reference frame: a tall building remains upright even when you tilt your head, which changes the projection of the building on your retina from a vertical to a diagonal orientation. Does retinotopic cortex represent visual information in an allocentric or retinocentric reference frame? Here, we investigate which reference frame the brain uses by dissociating allocentric and retinocentric reference frames via a head tilt manipulation combined with electroencephalography (EEG). Nineteen participants completed between 1728 and 2880 trials during which they briefly viewed (150 ms) and then remembered (1500 ms) a randomly oriented target grating. In interleaved blocks of trials, the participant's head was either kept upright or tilted by 45° using a custom rotating chinrest. The target orientation could be decoded throughout the trial (using both voltage and alpha-band signals) when training and testing within head-upright blocks, and within head-tilted blocks. Importantly, we directly addressed the question of reference frames via cross-generalized decoding: If target orientations are represented in a retinocentric reference frame, a decoder trained on head-upright trials would predict a 45° offset in decoded orientation when tested on head-tilted trials (after all, a vertical building becomes diagonal on the retina after head tilt). Conversely, if target representations are allocentric and anchored to the real world, no such offset should be observed. Our analyses reveal that from the earliest stages of perceptual processing all the way throughout the delay, orientations are represented in between an allocentric and retinocentric reference frame.
These results align with previous findings from physiology studies in non-human primates, and are the first to demonstrate that the human brain does not rely on a purely allocentric or retinocentric reference frame when representing visual information.
### Competing Interest Statement
The authors have declared no competing interest.
NIH Common Fund, https://ror.org/001d55x84, NEI R01-EY025872, NIMH R01-MH087214
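The cross-generalized decoding logic described in the abstract can be sketched in a few lines: train a decoder on head-upright patterns, test it on head-tilted patterns, and measure the circular offset between decoded and true orientation. Below is a minimal simulation sketch; the cosine channel tuning, trial counts, and noise level are illustrative assumptions, not the paper's actual EEG pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_patterns(orientations_deg, n_channels=32, noise=0.1):
    """Toy EEG patterns: each channel has fixed cosine tuning to orientation.
    Orientation is circular over 180 deg, so the angle is doubled."""
    phases = np.linspace(0, 2 * np.pi, n_channels, endpoint=False)
    angles = np.deg2rad(orientations_deg * 2)
    X = np.cos(angles[:, None] - phases[None, :])
    return X + noise * rng.standard_normal(X.shape)

def train_decoder(X, orientations_deg):
    """Least-squares mapping from patterns to (sin, cos) of the doubled angle."""
    angles = np.deg2rad(orientations_deg * 2)
    Y = np.column_stack([np.sin(angles), np.cos(angles)])
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W

def decode(X, W):
    """Invert the sin/cos readout back to an orientation in [0, 180)."""
    s, c = (X @ W).T
    return np.rad2deg(np.arctan2(s, c)) / 2 % 180

# Train on "head-upright" trials.
ori_train = rng.uniform(0, 180, 600)
W = train_decoder(simulate_patterns(ori_train), ori_train)

# "Head-tilted" trials under a purely retinocentric code: the measured
# pattern reflects the orientation shifted by the 45-deg head tilt.
ori_test = rng.uniform(0, 180, 600)
decoded = decode(simulate_patterns((ori_test + 45) % 180), W)

# Circular mean of the decoding error reveals the reference frame:
# ~45 deg = retinocentric, ~0 deg = allocentric, in between = mixed.
err = np.deg2rad((decoded - ori_test) * 2)
offset = np.rad2deg(np.angle(np.mean(np.exp(1j * err)))) / 2
print(offset)  # close to 45 for this purely retinocentric simulation
```

On real data, an offset of 0° would indicate an allocentric code, 45° a retinocentric one, and intermediate values a mixed frame, which is what the paper reports.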
Check out our *preprint* for some cool correlations with behavior (for oblique effect fans). For now, I'm just happy that these fun data are out in the world. It's been a minute since Chaipat Chunharas & I first ventured to dissociate allocentric and retinocentric reference frames (7+ years ago?!)... 10/n
21.01.2026 12:45
No matter the exact time point, no matter how we quantified the shift, no matter if we looked at decoding or at representational geometry: the reference frame used by the brain to represent orientations was always smack dab in between retinocentric and allocentric 9/n
21.01.2026 12:41
Well, throughout perception (when the orientation is on the screen) as well as the entire memory delay (the orientation is held in mind), we discovered a reference frame that is in between retinocentric and allocentric coordinates! 8/n
21.01.2026 12:35
two German shepherds lying on the floor in front of a fireplace.
Conversely, if representations are allocentric and anchored to the real world, no such shift should be observed. In other words: Cross-generalized decoding to the rescue! If you had to guess… What reference frame do you think visual cortex uses for visual processing? 7/n
21.01.2026 12:35
The trick? If orientations are represented in a retinocentric reference frame, a decoder trained on head-upright trials would predict a 45° shift in decoded orientation when tested on head-tilted trials (after all, a vertical building becomes diagonal on the retina after head tilt). 6/n
21.01.2026 12:34
Now, even if the pattern *completely shifts* with head tilt, standard (within time point) decoding can only ever infer the exact same label! After all, we as researchers do not know the underlying shift, only the orientation (and hence the label) that was on the screen. 5/n
21.01.2026 12:33
We want to decode visual orientation from the EEG signal to uncover the reference frame used by the brain. But we have a problem… A decoder only learns the association between a label (e.g., 45°) and a pattern of brain activity. Presented with a new pattern of activity, the label is inferred. 4/n
21.01.2026 12:32
Do visual parts of the brain represent visual information in an allocentric or retinocentric reference frame? We used a simple orientation recall task while measuring electroencephalography (EEG) signals from human visual cortex. People had their head upright or tilted! 3/n
21.01.2026 12:31
Visual information in our environment is anchored to an allocentric reference frame: a tall building remains upright even when you tilt your head. But head tilt changes the retinal projection of the building from vertical to diagonal. The building is diagonal in a retinocentric reference frame. 2/n
21.01.2026 12:29
a husky puppy lying on the floor with its tongue out and wearing a blue collar.
Here's a thought that might make you tilt your head in curiosity: With every movement of your eyes, head, or body, the visual input to your eyes shifts! Nevertheless, it doesn't feel like the world suddenly tilts sideways whenever you tilt your head. How can this be? TWEEPRINT ALERT! 1/n
21.01.2026 12:28
As someone who once tried to recruit Natalie, I can of course only recommend hiring this extremely smart scientist!!
16.01.2026 11:44
Applications are open! The IBRO Exchange Fellowships give early career #neuroscientists the opportunity to conduct lab visits with several expenses covered during the exchange.
Apply by 15 Apr: https://ibro.org/grant/exchange-fellowships/
#grant #IBROinAsiaPacific #IBROinUSCanada #IBROinAfrica #IBROinLatAm
15.01.2026 12:01
#BrainMeeting Alert!
This Friday, January 16th, the Brain Meeting speaker will be Janneke Jehee giving a talk entitled "Uncertainty in perceptual decision-making"
In person or online. For more information:
www.fil.ion.ucl.ac.uk/event
12.01.2026 09:14
Please spread the word! My lab is looking to hire two international postdocs. If you want to do comp neuro, combine machine learning and awesome math to understand neural circuit activity, then come work with us! Bonn is such a cool place for neuroscience now, you don't want to miss out.
10.01.2026 17:39
New preprint from the lab!
06.01.2026 21:53
What if we could tell you how well you'll remember your next visit to your local coffee shop?
In our new Nature Human Behaviour paper, we show that the *quality of a spatial representation* can be measured with neuroimaging, and *that score predicts how well new experiences will stick*.
05.01.2026 18:43
This is very cool! The link between spikes and LFPs is something that comes up frequently in our (human neuroimaging) lab. Nice to learn more about it!
05.01.2026 20:47
Research Specialist
The Attention, Distractions, and Memory (ADAM) Lab at Rice University is recruiting a full-time Research Specialist (Research Specialist I). The ADAM Lab (PI: Kirsten Adam) conducts cognitive neurosci...
The ADAM lab is hiring a Research Specialist to join us! This role involves conducting human subjects research (EEG experiments on attention + working memory) and assisting with the execution and administration of ongoing projects.
Job posting: emdz.fa.us2.oraclecloud.com/hcmUI/Candid...
02.01.2026 15:21
Noise in Competing Representations Determines the Direction of Memory Biases
Our memories are reconstructions, prone to errors. Historically treated as a mere nuisance, memory errors have recently gained attention when found to be systematically shifted away from or towards no...
@shansmann-roth.bsky.social and I finally finished our paper confirming a unique prediction of the Demixing Model (DM): inter-item biases in #visualworkingmemory depend on the _relative_ noise of targets and non-targets, potentially going in opposing directions. 1/9
www.biorxiv.org/content/10.6...
26.12.2025 16:39
The neural basis of working memory has been debated. What we like to call โThe Standard Modelโ of working memory posits that persistent discharges generated by neurons in the prefrontal cortex constitute the neural correlate of working memory (2/10)
29.12.2025 14:41
New paper in @pnas.org to end 2025 with a bang!
Behavioral, experiential, and physiological signatures of mind blanking
www.pnas.org/doi/10.1073/...
with Esteban Munoz-Musat, @arthurlecoz.bsky.social @corcorana.bsky.social, Laouen Belloli and Lionel Naccache
Illustration: Ana Yael.
1/n
29.12.2025 10:10
No exciting plans for year-end yet?
Why not gear up for your next grant proposal?
Check out our website for recurring and one-time funding lines, awards, and programs! bernstein-network.de/en/newsroom/...
23.12.2025 08:01
various computational neuroscience / MEEG / LFP short courses and summer schools
updated for 2026!
list of summer schools & short courses in the realm of (computational) neuroscience or data analysis of EEG / MEG / LFP: docs.google.com/spreadsheets...
19.12.2025 16:37
Excited to announce that I'm looking for people (PhD/Postdoc) to join my Cognitive Modelling group @uniosnabrueck.bsky.social.
If you want to join a genuinely curious, welcoming and inclusive community of Coxis, apply here:
tinyurl.com/coxijobs
Please RT - deadline is Jan 4!
18.12.2025 14:52
All in all, we characterize human memory for speed, showing that speed is better recalled for spatiotemporally bound than texture-like stimuli (the added dimension of space helps!). Thanks for reading, and stay tuned for Giuliana's next adventures linking speed memory to motion extrapolation! 9/9
17.12.2025 16:41
We looked at hysteresis effects (yes, they exist in these data!), the role of eye movements (no, they can't explain these findings), and more. But importantly, people are MUCH BETTER at recalling the speed of a single dot moving around fixation than the speed of more texture-like dot motion!! 8/n
17.12.2025 16:40
Another cool finding: The memory target and the probe could either move in congruent (e.g., both clockwise) or incongruent (e.g., target moved clockwise, the probe counterclockwise) directions. Speed recall was better for congruent motion! 7/n
17.12.2025 16:37
The stimulus was presented for 4–6 seconds, and remembered for 1–8 seconds. This mattered not for dot motion (blue), but it *did* matter for the single dot (red), such that errors were lower when people had more time to encode the speed, and higher at longer delays. 6/n
17.12.2025 16:36
Research Group leader @Max Planck Institute of Psychiatry
Systems Neuroscience
Postdoc fellow investigating time processing in the human brain with fMRI @timelab.bsky.social • PhD in Cognitive Neuroscience @SISSA • Neurobiologist @unipv
Professor of Computational Cognitive Science | @AI_Radboud | @Iris@scholar.social on Mastodon | http://cognitionandintractability.com | she/they
Cognitive psychologist | Working in Academia and Industry | Science Officer opensciencetools.org (creators of @psychopy.org) | Post doc multisensorytcd.com (studying Multisensory Perception) | Obsessed with my dog. Trying to handstand.
Professor of Psychology, Brock University. Cycling. Sailing. Food. Beer. Man about town.
Metascience, statistics, psychology, philosophy of science. Eindhoven University of Technology, The Netherlands. Omnia probate.
https://olivia.science
assistant professor of computational cognitive science · she/they · cypriot/kıbrıslı/κυπραία · σὺν Ἀθηνᾷ καὶ χεῖρα κίνει ("along with Athena, move your own hand too")
Professor, Department of Psychology and Center for Brain Science, Harvard University
https://gershmanlab.com/
Science-ing, trying to improve science. Metascience, open access, reforming scholarly authorship practices. Cognitive and perceptual psychologist.
Mastodon: @alexh@fediscience.org
Cognitive neuroscientist @NYU interested in the neural mechanisms underlying multimodal perception and prediction.
head of the Embodied Cognition Group (ECG) at the University of Goettingen
Lecturer at Lancaster University (UK), interested in working memory and children's cognitive development. Views my own. She/her.
Professor of Psychology at NYU (jayvanbavel.com) | Author of The Power of Us Book (powerofus.online) | Director of NYU Center for Conflict & Cooperation | trying to write a new book about collective decisions
Vision scientist at U of T Mississauga and co-director of the APPLY lab. Puns usually intended.
computational cog sci โข problem solving and social cognition โข asst prof at NYU โข https://codec-lab.github.io/
Professor of Psychology & Cognitive Neuroscience at Cambridge, FBA FMedSci FRS. Adolescent brain development. Views my own. Book: http://amazon.co.uk/dp/1784161349
Professor of Psychology at UCSD interested in language & conceptual development.
Prof @UCDavis. Cognitive control - influences, consequences, mechanisms, variations, developments. Mom, boarder, pursuer of Quadrant 2.
Associate Professor of Psychology | University of California, Berkeley.
Interested in Idiographic Science, Group-to-Individual Generalizability, and Personalization. EMA, time series, physiology, methods and statistics.