1/ Why are we so easily distracted? 🧠 In our new EEG preprint w/ Henry Jones, @monicarosenb.bsky.social and @edvogel.bsky.social we show that distractibility is associated w/ reduced neural connectivity, and can be predicted from EEG with ~80% accuracy using machine learning.
28.09.2025 19:14 · 58 likes · 24 reposts · 1 reply · 1 quote
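Not the authors' pipeline, just a minimal sketch of what "predicting distractibility from EEG with machine learning" can look like: a cross-validated linear classifier over connectivity-like features. All data, group labels, and effect sizes below are simulated assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, n_features = 100, 300                 # e.g., pairwise connectivity values
X = rng.normal(size=(n_subjects, n_features))     # simulated connectivity features
y = rng.integers(0, 2, size=n_subjects)           # 0 = low, 1 = high distractibility
X[y == 1] -= 0.25                                 # assume weaker connectivity when distractible

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```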
Very excited to announce my first paper is out in @currentbiology.bsky.social! Using EEG, we identify an item-based measure of storage in working memory that generalizes across auditory and visual items.
authors.elsevier.com/a/1ljFF3QW8S...
#PsychSciSky #neuroskyence #workingmemory
04.09.2025 16:34 · 15 likes · 7 reposts · 1 reply · 2 quotes
Sensory eye dominance varies over the horizontal axis of the visual field: left eye dominance for the right visual field; right eye dominance for the left visual field.
Surely you know about eye dominance. You probably don't know it's not a unitary phenomenon: in this paper I show that sensory eye dominance varies over the visual field. In the Discussion I propose an explanation for why this variation might exist. Curious? Read it here: doi.org/10.1167/jov....
02.07.2025 07:46 · 6 likes · 3 reposts · 1 reply · 0 quotes
Had a blast at last week's symposium! Inspiring to hear all the talks about EEG and attention.
I presented on the neural correlates of saccade preparation and covert spatial attention. Check out the preprint here: doi.org/10.1101/2025...
30.06.2025 09:48 · 6 likes · 0 reposts · 1 reply · 0 quotes
Last week's symposium, "Advances in the Encephalographic Study of Attention", was a great success! Held in the KNAW building in Amsterdam and sponsored by the NWO, it brought together many of (Europe's) leading attention researchers to discuss the latest advances in attention research using M/EEG.
30.06.2025 07:12 · 25 likes · 7 reposts · 4 replies · 3 quotes
In this new preprint, in review at @elife.bsky.social, we show which processing steps make up the reaction time, using single-trial #EEG modelling in a contrast #decision task.
In this 🧵 I tell the story behind it, as I think it is quite interesting and I can't write it like this in the paper...
26.06.2025 07:36 · 25 likes · 11 reposts · 2 replies · 3 quotes
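To make "which processing steps make up the reaction time" concrete, here is a toy illustration, not the preprint's model: each single-trial RT is treated as a sum of latent stage durations, and stimulus contrast is assumed to slow only the decision stage. Every number is invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 5000
encoding = rng.gamma(shape=20, scale=5, size=n_trials)   # ~100 ms visual encoding
motor = rng.gamma(shape=15, scale=4, size=n_trials)      # ~60 ms response execution

for contrast, drift in [("low", 0.5), ("high", 1.0)]:
    # the decision stage is assumed to be slower when contrast (and thus drift) is low
    decision = rng.exponential(scale=300 / drift, size=n_trials)
    rt = encoding + decision + motor
    print(f"{contrast} contrast: mean RT = {rt.mean():.0f} ms")
```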
Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!
The dissertation is available here: doi.org/10.33540/2960
18.06.2025 14:21 · 16 likes · 2 reposts · 3 replies · 1 quote
New paper out in the Journal of Memory and Language! We knew that individual differences in working memory predict source memory, but do they also predict simple item recognition memory, which relies less on attentional resources than source memory? Our answer: YES! (with @edvogel.bsky.social) 1/5
18.06.2025 15:56 · 9 likes · 3 reposts · 1 reply · 0 quotes
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social
Open Access link: doi.org/10.3758/s134...
12.06.2025 07:21 · 14 likes · 8 reposts · 0 replies · 0 quotes
Thanks to the support of the Dutch Research Council (NWO) and @knaw-nl.bsky.social, we're thrilled to announce the international symposium "Advances in the Encephalographic Study of Attention"! 🧠
📅 Date: June 25th & 26th
📍 Location: Trippenhuis, Amsterdam
04.06.2025 20:14 · 9 likes · 8 reposts · 2 replies · 0 quotes
Sensory sensitivity
We study how people react to sensory input like lights and sounds. You can help by completing a short online questionnaire. You can also sign up for an optional 3-hour lab session in Utrecht (€12/hour) involving EEG and hearing tests.
Take the survey here: tinyurl.com/457z89ta
04.06.2025 08:43 · 2 likes · 1 repost · 0 replies · 0 quotes
ALT: a soccer player with the number 10 on his jersey is being shown a red card
Long overdue! I didn't promote this one amid the twitter/X chaos, but now, nearing the end of my PhD, I want to do this project justice and post it here:
Is visual working memory used differently when errors are penalized?
Published over a year ago in JEP:LMC: research-portal.uu.nl/ws/files/258...
🧵 (1/3)
29.05.2025 07:48 · 11 likes · 1 repost · 1 reply · 1 quote
Good morning #VSS2025, if you care for a chat about the role of attention in binding object features (during perceptual encoding and memory maintenance), drop by my poster now (8:30-12:30) in the pavilion (422). Hope to see you there!
19.05.2025 12:36 · 14 likes · 3 reposts · 3 replies · 0 quotes
@vssmtg.bsky.social presentations today!
R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict
R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention
18.05.2025 09:41 · 6 likes · 4 reposts · 1 reply · 0 quotes
Also don't forget @chrispaffen.bsky.social's talk tomorrow in the binocular vision session!
17.05.2025 18:08 · 0 likes · 0 reposts · 0 replies · 0 quotes
Attending @vssmtg.bsky.social? Come check out my talk on EEG decoding of preparatory overt and covert attention!
Tomorrow in the Attention: Neural Mechanisms session at 17:15. You can check out the preprint in the meantime:
17.05.2025 18:06 · 6 likes · 3 reposts · 1 reply · 0 quotes
As always, thanks to all the people involved in the project: @cstrauch.bsky.social, Marnix Naber, @stigchel.bsky.social, and @attentionlab.bsky.social.
16.05.2025 13:36 · 2 likes · 0 reposts · 0 replies · 0 quotes
We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?
In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.
OA paper here:
doi.org/10.3758/s134...
16.05.2025 13:36 · 9 likes · 4 reposts · 1 reply · 0 quotes
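A hypothetical sketch of the salience-versus-cost idea, not the paper's analysis: candidate saccade targets compete through a softmax in which assumed effort costs can outweigh assumed salience. All weights and values are invented for illustration.

```python
import numpy as np

directions = ["left", "right", "up", "down"]
salience = np.array([0.2, 0.9, 0.4, 0.3])    # assumed bottom-up attractiveness
cost = np.array([0.3, 0.8, 0.5, 0.2])        # assumed effort of each eye movement

utility = 1.0 * salience - 1.5 * cost        # costs weighted against salience
p = np.exp(utility) / np.exp(utility).sum()  # softmax over candidate saccades
for d, prob in zip(directions, p):
    print(f"P(saccade {d}) = {prob:.2f}")
```

With these made-up weights, the most salient target ("right") is not the most likely saccade, because it is also the most costly.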
A big thanks to all those involved in the project: Vicky Voet, @henryjones.bsky.social, Edward Awh, @cstrauch.bsky.social, and @stigchel.bsky.social!
13.05.2025 07:51 · 2 likes · 0 reposts · 0 replies · 0 quotes
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?
We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
13.05.2025 07:51 · 16 likes · 8 reposts · 2 replies · 2 quotes
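For readers new to population-level decoding, a minimal sketch under simulated data, not the preprint's pipeline: a cross-validated linear classifier is trained at each time point to separate the two conditions from the multichannel EEG pattern.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_channels, n_times = 200, 64, 50
X = rng.normal(size=(n_trials, n_channels, n_times))  # simulated epoched EEG
y = rng.integers(0, 2, size=n_trials)                 # 0 = covert, 1 = overt (simulated)
X[y == 1, :8, 20:] += 0.3                             # inject a late condition difference

acc = [cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5).mean()
       for t in range(n_times)]
print(f"peak decoding accuracy: {max(acc):.2f} at sample {int(np.argmax(acc))}")
```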
Thrilled to share that, as of May 1st, I have started as a postdoc at The University of Manchester!
I will investigate looked-but-failed-to-see (LBFTS) errors in visual search, under the expert guidance of Johan Hulleman and Jeremy Wolfe. Watch this space!
07.05.2025 12:37 · 16 likes · 2 reposts · 1 reply · 0 quotes
A move you can afford
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
24.04.2025 10:03 · 7 likes · 2 reposts · 0 replies · 0 quotes
If you're interested in this article, here's the link: trebuchet.public.springernature.app/get_content/... Thanks to my supervisors @suryagayet.bsky.social, @chrispaffen.bsky.social, and @Stefan Van der Stigchel for their support!
23.04.2025 03:23 · 11 likes · 4 reposts · 1 reply · 0 quotes
@andresahakian.bsky.social Do you know?
21.04.2025 06:42 · 0 likes · 0 reposts · 0 replies · 0 quotes
Thanks so much to @cstrauch.bsky.social and all other co-authors: Laura Van Zantwijk, @sebastiaanmathot.bsky.social, Marnix Naber, and Stefan Van der Stigchel. @attentionlab.bsky.social. Really happy with how this turned out! Stay tuned for follow-up experiments!
08.04.2025 08:06 · 4 likes · 1 repost · 0 replies · 0 quotes
In our latest paper in @elife.bsky.social, we show that we choose where to move our eyes based on effort minimization. Put simply, we prefer affordable over more costly eye movements.
eLife's digest:
elifesciences.org/digests/9776...
The paper:
elifesciences.org/articles/97760
#VisionScience
08.04.2025 08:06 · 13 likes · 4 reposts · 1 reply · 2 quotes
A move you can afford
Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.
We show that eye movements are selected based on effort minimization, now finally final in @elife.bsky.social
eLife's digest:
elifesciences.org/digests/9776...
& the 'convincing & important' paper:
elifesciences.org/articles/97760
I consider this my coolest ever project!
#VisionScience #Neuroscience
07.04.2025 19:28 · 81 likes · 26 reposts · 4 replies · 3 quotes
Heat map of gaze locations overlaid on top of a feature-rich collage image. There is a seascape with a kitesurfer, mermaid, turtle, and more.
New preprint!
We present two very large eye-tracking datasets of museum visitors (4-81 y.o.!) who free-viewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.
We invite you to (re)use the dataset and provide suggestions for future versions.
osf.io/preprints/os...
28.03.2025 09:34 · 23 likes · 6 reposts · 2 replies · 1 quote
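If you plan to (re)use such data, a hypothetical first step might be a gaze heat map. The fixations below are fabricated for illustration; the real dataset's file and column layout may differ.

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0, 1920, size=5000)          # fabricated fixation x-positions (px)
y = rng.uniform(0, 1080, size=5000)          # fabricated fixation y-positions (px)

# bin fixations into a coarse grid over a 1920x1080 image
heat, _, _ = np.histogram2d(y, x, bins=(108, 192),
                            range=[[0, 1080], [0, 1920]])
plt.imshow(heat, cmap="hot", extent=[0, 1920, 1080, 0])
plt.title("Gaze heat map")
plt.colorbar(label="fixation count")
plt.show()
```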
Psychophysiology | SPR Journal | Wiley Online Library
Previous studies have shown that the pupillary light response (PLR) can physiologically index covert attention, but only with highly simplistic stimuli. With a newly introduced technique that models ....
Out in Psychophysiology (OA):
Typically, pupillometry struggles with complex stimuli. We introduced a method to study covert attention allocation in complex video stimuli:
effects of top-down attention, bottom-up attention, and pseudoneglect could all be recovered.
doi.org/10.1111/psyp.70036
21.03.2025 14:53 · 23 likes · 6 reposts · 2 replies · 0 quotes
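A conceptual sketch of the logic, not the published method: if covert attention rests on one region of a video, pupil size should covary inversely with that region's luminance. With simulated data, the attended region shows the stronger negative correlation. All data below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 600                                           # pupil samples (simulated)
lum_left = rng.normal(size=n)                     # luminance time course, left region
lum_right = rng.normal(size=n)                    # luminance time course, right region
pupil = -0.8 * lum_left + rng.normal(size=n)      # pupil constricts with attended (left) luminance

for name, lum in [("left", lum_left), ("right", lum_right)]:
    r = np.corrcoef(pupil, lum)[0, 1]
    print(f"pupil vs. {name}-region luminance: r = {r:+.2f}")
```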
Check out my longer posts on Substack (mostly detailed tutorials about data science and LLMs). mikexcohen.substack.com
Explore my video-based courses and books at sincxpress.com
Computational Neuroscientist • Associate prof @Donders Institute/Radboud University • Member De Jonge Akademie • founder Dutch Brain Olympiad and BrainHelpDesk • website: fleurzeldenrust.nl • ORCID: 0000-0002-9084-9520
Cognitive neuroscientist studying how we pay attention, associate professor of Psychology at UChicago, cablab.uchicago.edu director
CNRS Researcher (DR) at CerCo, in Toulouse (France). Interested in brain oscillations and visual perception/ awareness.
Assistant Professor at Wake Forest University. I previously did a post doc with Brian Anderson at Texas A&M University and one with Nick Gaspelin (who is now at Mizzou) at SUNY Binghamton. I study visual attention with a focus on distractor suppression.
PhD Candidate in the Visual Cognitive Neuroscience group & Predictive Brain Lab @ Donders Institute. I investigate laminar interactions between visual imagery and perception.
AKA Johnny Foxe & Sean Mac an tSionnaigh
Editor-in-Chief, European Journal of Neuroscience
Director, The Del Monte Institute for Neuroscience, University of Rochester, New York.
working in neuroscience / psychology / cognition @UChicago w/ Awh-Vogel Lab
Cognitive neuroscientist interested in high level vision (faces, scenes etc.), learning and plasticity. All views are my own.
Assistant Professor of Psychology at the University of Victoria studying memory, eye movements, and aging.
wynnlab.org
Welcome to the VAL (also known as the Wolfe Lab)
Our lab specializes in Visual Search and is run by Dr. Jeremy Wolfe
We are affiliated with Brigham and Women's Hospital and Harvard Medical School
Vision scientist. Lecturer in psychology at Queen Mary University of London.
www.emmaemstewart.com
Postdoctoral researcher in the CoCoSys & Temporal Attention Labs, Leiden University
phijoh.github.io
Husband, uncle, son, dog dad, scientist, beer enthusiast, sports nut, skeptic.
esterlabunr.com
Professor of #CogSci and #Stats @UCIrvine; Pursuer of Lofty Undertakings; Purveyor of Articles Odd and Quaint; and Protector of the Realm. #blm #trahr he/him
Computational cognitive scientist at NYU. Founder of Growing up in Science.
PhD student in the CAB Lab at the University of Chicago studying sustained attention fluctuations and their consequences. She/her. https://annacorriveau.github.io/