
Damian Koevoet

@dkoevoet.bsky.social

PhD Candidate interested in visual attention and memory | Utrecht University | AttentionLab @attentionlab.bsky.social

154 Followers  |  211 Following  |  20 Posts  |  Joined: 13.11.2024

Latest posts by dkoevoet.bsky.social on Bluesky

Sensory eye dominance varies over the horizontal axis of the visual field: left eye dominance for the right visual field; right eye dominance for the left visual field.

Surely you know about eye dominance. You probably don't know it's not a unitary phenomenon: in this paper I show that sensory eye dominance varies over the visual field. In the Discussion I propose an explanation for why this variation might exist. Curious? Read it here: doi.org/10.1167/jov....

02.07.2025 07:46 — 👍 6    🔁 3    💬 1    📌 0

Had a blast at last week's symposium! Inspiring to hear all the talks about EEG and attention.

I presented on the neural correlates of saccade preparation and covert spatial attention. Check out the preprint here: doi.org/10.1101/2025...

30.06.2025 09:48 — 👍 6    🔁 0    💬 1    📌 0

Last week's symposium, "Advances in the Encephalographic Study of Attention", was a great success! Held in the KNAW building in Amsterdam and sponsored by NWO, the symposium brought together many of Europe's leading attention researchers to discuss the latest advances in attention research using M/EEG.

30.06.2025 07:12 — 👍 24    🔁 7    💬 4    📌 3

In this new preprint, in review at @elife.bsky.social, we show which processing steps make up the reaction time, using single-trial #EEG modelling in a contrast #decision task.

In this 🧡 I'm telling the story behind it as I think it is quite interesting and I can't write it like this in the paper...

26.06.2025 07:36 — 👍 23    🔁 11    💬 2    📌 2

Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!

The dissertation is available here: doi.org/10.33540/2960

18.06.2025 14:21 — 👍 16    🔁 2    💬 3    📌 1

New paper out in Journal of Memory and Language! We knew that individual differences in working memory predict source memory, but do they also predict simple item recognition memory (which relies on fewer attentional resources than source memory)? Our answer is: YES! (with @edvogel.bsky.social ) 1/5

18.06.2025 15:56 — 👍 9    🔁 3    💬 1    📌 0

Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...

12.06.2025 07:21 — 👍 14    🔁 8    💬 0    📌 0

Thanks to the support of the Dutch Research Council (NWO) and @knaw-nl.bsky.social, we're thrilled to announce the international symposium "Advances in the Encephalographic Study of Attention"! 🧠🔍

📅 Date: June 25th & 26th
📍 Location: Trippenhuis, Amsterdam

04.06.2025 20:14 — 👍 9    🔁 8    💬 2    📌 0

Sensory sensitivity

We study how people react to sensory input like lights and sounds. You can help by completing a short online questionnaire. You can also sign up for an optional 3-hour lab session in Utrecht (€12/hour) involving EEG and hearing tests.
Take the survey here: tinyurl.com/457z89ta

04.06.2025 08:43 — 👍 2    🔁 1    💬 0    📌 0
a soccer player with the number 10 on his jersey is being shown a red card

Long overdue! Didn't promote this one amid twitter/X chaos. But nearing the end of my PhD, I want to do this project justice and post it here:

Is visual working memory used differently when errors are penalized?

Published over a year ago in JEP:LMC: research-portal.uu.nl/ws/files/258...
🧡 (1/3)

29.05.2025 07:48 — 👍 11    🔁 1    💬 1    📌 1

Good morning #VSS2025, if you care for a chat about the role of attention in binding object features (during perceptual encoding and memory maintenance), drop by my poster now (8:30-12:30) in the pavilion (422). Hope to see you there!

19.05.2025 12:36 — 👍 14    🔁 3    💬 3    📌 0

@vssmtg.bsky.social presentations today!

R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict

R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention

18.05.2025 09:41 — 👍 6    🔁 4    💬 1    📌 0

Also don't forget @chrispaffen.bsky.social's talk tomorrow in the binocular vision session!

17.05.2025 18:08 — 👍 0    🔁 0    💬 0    📌 0

Attending @vssmtg.bsky.social? Come check out my talk on EEG decoding of preparatory overt and covert attention!

Tomorrow in the Attention: Neural Mechanisms session at 17:15. You can check out the preprint in the meantime:

17.05.2025 18:06 — 👍 6    🔁 3    💬 1    📌 0

As always, thanks to all people involved in the project: @cstrauch.bsky.social, Marnix Naber, @stigchel.bsky.social, and @attentionlab.bsky.social

16.05.2025 13:36 β€” πŸ‘ 2    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?

In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.

OA paper here:
doi.org/10.3758/s134...

16.05.2025 13:36 β€” πŸ‘ 9    πŸ” 4    πŸ’¬ 1    πŸ“Œ 0

A big thanks to all those involved in the project: Vicky Voet, @henryjones.bsky.social, Edward Awh, @cstrauch.bsky.social, and @stigchel.bsky.social!

13.05.2025 07:51 β€” πŸ‘ 2    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?

We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...

13.05.2025 07:51 β€” πŸ‘ 16    πŸ” 8    πŸ’¬ 2    πŸ“Œ 2

Thrilled to share that, as of May 1st, I have started as a postdoc at The University of Manchester!

I will investigate looked-but-failed-to-see (LBFTS) errors in visual search, under the expert guidance of Johan Hulleman and Jeremy Wolfe. Watch this space!

07.05.2025 12:37 — 👍 16    🔁 2    💬 1    📌 0

Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.

24.04.2025 10:03 — 👍 7    🔁 2    💬 0    📌 0

If you're interested in this article, here's the link: trebuchet.public.springernature.app/get_content/... Thanks to my supervisors @suryagayet.bsky.social @chrispaffen.bsky.social and @Stefan Van der Stigchel for their support! 🎉

23.04.2025 03:23 — 👍 11    🔁 4    💬 1    📌 0

@andresahakian.bsky.social Do you know?

21.04.2025 06:42 — 👍 0    🔁 0    💬 0    📌 0

Thanks so much to @cstrauch.bsky.social and all other co-authors: Laura Van Zantwijk, @sebastiaanmathot.bsky.social, Marnix Naber and Stefan Van der Stigchel. @attentionlab.bsky.social. Really happy with how this turned out! Stay tuned for follow-up experiments!

08.04.2025 08:06 β€” πŸ‘ 4    πŸ” 1    πŸ’¬ 0    πŸ“Œ 0

In our latest paper @elife.bsky.social we show that we choose to move our eyes based on effort minimization. Put simply, we prefer affordable over more costly eye movements.

eLife's digest:
elifesciences.org/digests/9776...

The paper:
elifesciences.org/articles/97760

#VisionScience

08.04.2025 08:06 β€” πŸ‘ 13    πŸ” 4    πŸ’¬ 1    πŸ“Œ 2

We show that eye-movements are selected based on effort minimization - finally final in @elife.bsky.social
eLife's digest:
elifesciences.org/digests/9776...
& the 'convincing & important' paper:
elifesciences.org/articles/97760

I consider this my coolest ever project!

#VisionScience #Neuroscience

07.04.2025 19:28 — 👍 82    🔁 27    💬 4    📌 3
Heat map of gaze locations overlaid on top of a feature-rich collage image. There is a seascape with a kitesurfer, mermaid, turtle, and more.

New preprint!

We present two very large eye tracking datasets of museum visitors (4-81 y.o.!) who freeviewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.

We invite you to (re)use the dataset and provide suggestions for future versions 📋

osf.io/preprints/os...

28.03.2025 09:34 — 👍 23    🔁 6    💬 2    📌 1

Out in Psychophysiology (OA):

Typically, pupillometry struggles with complex stimuli. We introduced a method to study covert attention allocation in complex video stimuli: effects of top-down attention, bottom-up attention, and pseudoneglect could all be recovered.

doi.org/10.1111/psyp.70036

21.03.2025 14:53 — 👍 23    🔁 6    💬 2    📌 0

As always, big thanks to all my co-authors Marnix Naber, @cstrauch.bsky.social, Stefan van der Stigchel, as well as @attentionlab.bsky.social!

19.03.2025 08:28 β€” πŸ‘ 3    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

Presaccadic attention facilitates visual continuity across eye movements. However, recent work suggests that presaccadic attention may not shift upward. What's going on?

Using the pupil light response, our paper shows that presaccadic attention does shift both up- and downward.

doi.org/10.1111/psyp.70047

19.03.2025 08:28 β€” πŸ‘ 8    πŸ” 2    πŸ’¬ 1    πŸ“Œ 0

New paper out now in JEP:G.

"Individual differences in working memory and attentional control continue to predict memory performance despite extensive learning."

psycnet.apa.org/doi/10.1037/xg…

29.01.2025 18:36 — 👍 17    🔁 5    💬 1    📌 0
