Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social
Open Access link: doi.org/10.3758/s134...
@andresahakian.bsky.social
PhD candidate at Utrecht University | Interested in working memory, decision making, Bayesian stats, online experiments, open science | AttentionLab & CAP-Lab
As always, thanks to @chrispaffen.bsky.social, @suryagayet.bsky.social, and @stigchel.bsky.social!
For the APA/JEP fans, here's the DOI: doi.org/10.1037/xlm0...
Also, how cool is the Taverne Amendment? It made the article open access, we didn't even have to ask!
www.openaccess.nl/en/policies/...
Surprisingly, NO! They didn't load their memory more, even though they had capacity to spare (they did load more for another manipulation).
They instead behaved more cautiously and took fewer risks!
We used a copying task: participants (PPs) copied colored shapes from an (always available) example.
Now comes the kicker: we put PPs in a penalty box for 0.5 or 5 whole seconds every time they copied a piece incorrectly.
Did they try to memorize the info better when 5s (vs 0.5s) of their life was at stake?
Long overdue! I didn't promote this one amid the Twitter/X chaos. But nearing the end of my PhD, I want to do this project justice and post it here:
Is visual working memory used differently when errors are penalized?
Out over a year ago in JEP:LMC: research-portal.uu.nl/ws/files/258...
🧵 (1/3)
Good morning #VSS2025, if you care for a chat about the role of attention in binding object features (during perceptual encoding and memory maintenance), drop by my poster now (8:30-12:30) in the pavilion (422). Hope to see you there!
19.05.2025 12:36
If you are at #VSS2025, come to @chrispaffen.bsky.social's talk (room 2 at 3PM), where he discusses how binocular conflict in real-world vision may be resolved in an adaptive manner (favoring either nasal or temporal hemi-fields) to optimally perceive partially occluded objects of interest.
18.05.2025 16:12
Attending @vssmtg.bsky.social? Come check out my talk on EEG decoding of preparatory overt and covert attention!
Tomorrow in the Attention: Neural Mechanisms session at 17:15. You can check out the preprint in the meantime:
Frequentist tests can only reject the null, or fail to. By design, a failure to reject is not evidence for the absence of an effect.
Frequentist equivalence tests are one solution. Another is a Bayesian approach with a well-defined stopping rule (stopping once X amount of evidence has accumulated in favor of OR against the effect).
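For the curious: a common frequentist equivalence test is the two one-sided tests (TOST) procedure, which declares two means equivalent if their difference is significantly above the lower bound AND significantly below the upper bound. Here's a minimal sketch of that logic; the equivalence bounds (±0.3) and the simulated data are made-up illustrations, not values from any of the studies above:

```python
import numpy as np
from scipy import stats

def tost_ind(x1, x2, low, upp):
    """Two one-sided tests (TOST) for equivalence of two independent means.

    Equivalence is supported when the mean difference is significantly
    greater than `low` AND significantly smaller than `upp`.
    Returns the overall equivalence p-value (the larger of the two).
    """
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    n1, n2 = len(x1), len(x2)
    diff = x1.mean() - x2.mean()
    # pooled variance and standard error of the mean difference
    sp2 = ((n1 - 1) * x1.var(ddof=1) + (n2 - 1) * x2.var(ddof=1)) / (n1 + n2 - 2)
    se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
    df = n1 + n2 - 2
    t_low = (diff - low) / se            # H0: diff <= low
    t_upp = (diff - upp) / se            # H0: diff >= upp
    p_low = stats.t.sf(t_low, df)        # one-sided p against the lower bound
    p_upp = stats.t.cdf(t_upp, df)       # one-sided p against the upper bound
    return max(p_low, p_upp)

# Simulated example: two groups whose true means differ by only 0.05,
# tested against equivalence bounds of +/- 0.3.
rng = np.random.default_rng(0)
a = rng.normal(0.00, 1.0, 2000)
b = rng.normal(0.05, 1.0, 2000)
p = tost_ind(a, b, low=-0.3, upp=0.3)
print(f"TOST p = {p:.2g}")  # small p -> evidence the difference lies within the bounds
```

(statsmodels also ships a ready-made version of this test, if you'd rather not roll your own.)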
We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?
In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.
OA paper here:
doi.org/10.3758/s134...
@cstrauch.bsky.social, this one!
15.05.2025 16:30
Huge thanks to advisors/co-authors @suryagayet.bsky.social, @chrispaffen.bsky.social and @stigchel.bsky.social.
15.05.2025 09:56
One of the interesting solutions we discuss is that the unrestricted vs. forced-choice distinction is key.
We argue that incorporating aspects of natural behavior in VWM paradigms can reveal a lot about how humans actually use their VWM.
Now for the less obvious beans: this pattern (longer view -> slower decay) does not show up in the typical (forced-choice) VWM paradigms, where decay rates are independent of viewing time.
What might be up?
To spill the obvious beans first: memory performance got better with longer views, and it got worse with longer delays after viewing.
What's more: the LONGER a view was, the SLOWER performance got worse.
We had a bunch of people recreate example arrangements of funky shapes, however they wanted, while we tracked where they looked and what they did.
15.05.2025 09:56
About time our latest project about time got out!
How do self-paced encoding and retention relate to performance in (working) memory-guided actions?
Find out now in Memory and Cognition: doi.org/10.3758/s134...
(or check the short version below) 🧵
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?
We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
Thrilled to share that, as of May 1st, I have started as a postdoc at The University of Manchester!
I will investigate looked-but-failed-to-see (LBFTS) errors in visual search, under the expert guidance of Johan Hulleman and Jeremy Wolfe. Watch this space!
@suryagayet.bsky.social
01.05.2025 17:21
We show that eye movements are selected based on effort minimization - finally final in @elife.bsky.social
eLife's digest:
elifesciences.org/digests/9776...
& the 'convincing & important' paper:
elifesciences.org/articles/97760
I consider this my coolest ever project!
#VisionScience #Neuroscience
Yeah, I imagine it's not really an issue for most use cases.
Haha so cool to hear you're planning to do copy tasks! And very happy to hear you found the plugin to be useful :)
Feel free to reach out if you have any copy task/plugin related questions. Very curious what you're up to!
Yes! Check out pipe.jspsych.org, if you haven't already. The setup is quite straightforward, and has worked well for me.
Only potential drawback (if I recall correctly) is that your experiment files are accessible (also to participants), unless you have a paid/private GitHub account.
Last Friday the irreplaceable @luzixu.bsky.social successfully defended her PhD (at @utrechtuniversity.bsky.social). This has been an incredibly productive 3+ years, and we are sad to see her leave, but are very proud of her accomplishments (with @attentionlab.bsky.social, @chrispaffen.bsky.social)!
17.03.2025 15:05