
Christoph Strauch

@cstrauch.bsky.social

Assistant professor @utrechtuniversity.bsky.social studying spatial attention, eye-movements, pupillometry, and more. Co-PI @attentionlab.bsky.social

456 Followers  |  443 Following  |  75 Posts  |  Joined: 07.11.2023

Latest posts by cstrauch.bsky.social on Bluesky

I'm still waiting for you to write that package that recovers pupil size from MRI recordings - gotta find a way to make all that fMRI data useful ;-)

25.09.2025 14:55 — 👍 4    🔁 0    💬 0    📌 0

Incredible study by Raut et al.: by tracking a single measure (pupil size), you can model slow, large-scale dynamics in neuronal calcium, metabolism, and brain blood oxygen through a shared latent space! www.nature.com/articles/s41...

25.09.2025 08:53 — 👍 66    🔁 17    💬 1    📌 1

I'll show some (I think) cool stuff about how we can measure the phenomenology of synesthesia in a physiological way at #ECVP - Color II, atrium maximum, 9:15, Thursday.

Say hi and show your colleagues that you're one of the dedicated ones by getting up early on the last day!

27.08.2025 21:13 — 👍 9    🔁 1    💬 0    📌 0

A good reason to make it to the early #visualcognition session of today's #ecvp2025 👉 @anavili.bsky.social will talk about how attending to fuzzy bright/dark patches that have faded from awareness (through adaptation-like processes) still affects pupil size! ⚫👀⚪ Paper: dx.doi.org/10.1016/j.co...

27.08.2025 05:45 — 👍 6    🔁 2    💬 0    📌 0
Anticipated Relevance Prepares Visual Processing for Efficient Memory-Guided Selection - Finding an object typically involves the use of working memory to prioritize relevant visual information at the right time. For example, successfully detecting a highway exit sign is useless when your...

Excited to present at #ECVP2025 - Monday afternoon, Learning & Memory - about how anticipating relevant visual events prepares visual processing for efficient memory-guided visual selection! 🧠🥳
@attentionlab.bsky.social @ecvp.bsky.social

Preprint for more details: www.biorxiv.org/content/10.1...

24.08.2025 13:47 — 👍 10    🔁 3    💬 0    📌 0

#ECVP @ecvp.bsky.social will be so much fun!

24.08.2025 16:31 — 👍 2    🔁 0    💬 0    📌 0

#ECVP2025 starts with a fully packed room!

I'll show data demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning, Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and/or say hi!

24.08.2025 16:28 — 👍 18    🔁 3    💬 1    📌 1
A review of the costs of eye movements (Nature Reviews Psychology) - Eye movements are the most frequent movements that humans make. In this Review, Schütz and Strauch integrate evidence regarding the costs of eye movements and...

Eye movements are cheap, right? Not necessarily! 💰 In our review just out in @natrevpsychol.nature.com, Alex Schütz and I discuss the different costs associated with making an eye movement, how these costs affect behaviour, and the challenges of measuring this… rdcu.be/eAm69 #visionscience #vision

12.08.2025 10:44 — 👍 23    🔁 10    💬 1    📌 0

I think there is a lot one doesn't think of intuitively. Lossy compression of audio files is built directly on psychophysics, for instance (no hardcore experimental psychology, no Spotify!). Or take all the work foundational to artificial neural networks that comes from cognitive psychology and modeling.

01.08.2025 20:12 — 👍 1    🔁 0    💬 0    📌 0

Together with @ajhoogerbrugge.bsky.social, Roy Hessels and Ignace Hooge - thanks all!

29.07.2025 07:39 — 👍 3    🔁 0    💬 0    📌 0
Data saturation for gaze heatmaps: initially, each additional participant brings the total NSS or AUC (as measures of heatmap similarity) much closer to the full sample. However, the returns diminish increasingly at higher n.

Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. How many participants should be tested?
It depends, of course, but our guidelines help navigate this in an informed way.

Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social

29.07.2025 07:37 — 👍 9    🔁 2    💬 1    📌 0

good god

23.07.2025 10:56 — 👍 1    🔁 0    💬 1    📌 0
Vacancies at the RUG

Jelmer Borst and I are looking for a PhD candidate to build an EEG-based model of human working memory! This is a really cool project that I've wanted to kick off for a while, and I can't wait to see it happen. Please share and I'm happy to answer any Qs about the project!
www.rug.nl/about-ug/wor...

03.07.2025 13:29 — 👍 15    🔁 21    💬 1    📌 1
PhD position — Rademaker lab

Curious about the visual human brain, a vibrant and collaborative lab, and pursuing a PhD in the heart of Europe? My lab is recruiting for a 3-year PhD position. More details: www.rademakerlab.com/job-add

01.07.2025 06:43 — 👍 47    🔁 46    💬 1    📌 4

so nice, they are lucky to have you over there!

19.06.2025 10:57 — 👍 2    🔁 0    💬 1    📌 0

We had a splendid day: great weather, got to wear peculiar/special clothes, and then Alex even defended his PhD (and nailed it!).

Congratulations, Dr. Alex, super proud of your achievements!!!

18.06.2025 14:50 — 👍 4    🔁 0    💬 0    📌 0

Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...

12.06.2025 07:21 — 👍 14    🔁 8    💬 0    📌 0

@vssmtg.bsky.social presentations today!

R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict

R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention

18.05.2025 09:41 — 👍 6    🔁 4    💬 1    📌 0

Cool new preprint by Damian. Among other findings: pupillometry, EEG, and IEMs show that the premotor theory of attention can't be the full story: eye movements are associated with an additional, separable, spatially tuned process compared to covert attention, hundreds of ms before shifts happen.

13.05.2025 08:17 — 👍 4    🔁 1    💬 1    📌 0
A move you can afford - Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.

Where a person will look next can be predicted based on how much it costs the brain to move the eyes in that direction.

04.05.2025 08:21 — 👍 15    🔁 1    💬 0    📌 0

Thanks!

26.04.2025 06:19 — 👍 0    🔁 0    💬 0    📌 0

Let me know if you're still unconvinced and, if so, why. I'm also happy to present it in more detail at a lab meeting or online.
Cheers!

25.04.2025 07:46 — 👍 0    🔁 0    💬 0    📌 0

Altogether, it's certainly correct that pupillometry requires care, as it's just two output systems - and in many (but not all) ways just one with multiple inputs. But these are well understood (shameless plug for my TiNS papers here):
doi.org/10.1016/j.ti...
doi.org/10.1016/j.ti...

25.04.2025 07:34 — 👍 1    🔁 0    💬 1    📌 0
The Costs of Paying Overt and Covert Attention Assessed With Pupillometry - Damian Koevoet, Christoph Strauch, Marnix Naber, Stefan Van der Stigchel, 2023. Attention can be shifted with or without an accompanying saccade (i.e., overtly or covertly, respectively). Thus far, it is unknown how cognitively costly these...

With pupil size you can also measure the costs of shifting covert attention, see our paper from 2023 (doi.org/10.1177/0956...).

25.04.2025 07:30 — 👍 1    🔁 0    💬 1    📌 0

Lastly, are there other physiological measures that point to similar effects? Yes. Saccade latencies show similar effects, providing convergent evidence for our bottom line of effort driving saccade selection. Latencies are just not as clean, as they are not separate from the movement itself.

25.04.2025 07:28 — 👍 0    🔁 0    💬 1    📌 0

There are very systematic investigations into pupil size across fixation locations, for methodological reasons (the foreshortening of pupil size relative to the eye tracker). Fixation location itself does not show up/down differences in pupil size.

25.04.2025 07:00 — 👍 1    🔁 0    💬 1    📌 0

The difference in pupil size when preparing upward vs. downward directed saccades (~0.1z) is absolutely meaningless relative to the effect of a PLR (~3-4 is not uncommon, depending on luminance change). If there were a hard-coded map built in, it wouldn't serve any meaningful protective role.

25.04.2025 06:55 — 👍 0    🔁 0    💬 1    📌 0

Both the up/down and the cardinal/oblique differences are highly predictive of participants' free choices in the separate saccade selection task (no delay here, no pupil measurement here). But your point would then be that cardinal/oblique indeed reflects effort minimization, while up/down doesn't?

25.04.2025 06:55 — 👍 1    🔁 0    💬 1    📌 0

It's important to realize that our pupil measures are taken when the eye is perfectly still in the center. The only difference is that participants know which direction they will have to make a saccade to later.
Are you on board with larger pupil size = more effort here when preparing diagonals?

25.04.2025 06:48 — 👍 2    🔁 0    💬 1    📌 0

The PLR has two primary functions, one of which is to protect the retina from excess luminance. The other is to optimize contrast perception. A hard-wired up/down map would jeopardize the latter in many cases. Given how important contrast perception is for vision, I highly doubt that would be smart.

25.04.2025 06:46 — 👍 1    🔁 0    💬 1    📌 0
