🚨 Preprint alert! 🚨
Check out @suzibot.bsky.social's preprint: “Visual search is constrained by the variability of object-category templates.”
Includes some neat findings on individual differences in search!
@dkaiserlab.bsky.social
Studying natural vision at JLU Giessen using psychophysics, neuroimaging, and computational models, PI: Daniel Kaiser. danielkaiser.net
We're happy to announce that Gongting Wang has successfully defended his PhD thesis at Freie Universität Berlin. Congratulations, Dr. Wang!
27.01.2026 10:20
16.12.2025 15:27
Check out @michaengesee.bsky.social's preprint on individual differences in expectations about natural scenes and how they shape how we perceive and neurally represent scenes.
08.12.2025 10:20
2025 Christmas Party Crew!
08.12.2025 10:11
Here's a press release (in German):
www.uni-giessen.de/de/ueber-uns...
And here's tagging some of the great people involved: @kathadobs.bsky.social, @martinhebart.bsky.social, @haplab.bsky.social, @peelen.bsky.social.
Super happy to announce that our Research Training Group "PIMON" is funded by the @dfg.de! Starting in October, we will have exciting opportunities for PhD students who want to explore object and material perception & interaction in Gießen @jlugiessen.bsky.social! Just look at this amazing team!
03.12.2025 12:46
I am very excited to share our new preprint, spearheaded by the brilliant @lunahuestegge.bsky.social, w/ @peterkok.bsky.social and others: “An attempt to push mental imagery over the reality threshold using non-invasive brain stimulation”
doi.org/10.31234/osf...
Decoding the rhythmic representation and communication of visual contents
www.cell.com/trends/neuro...
#neuroscience
From Micha's farewell gathering before his temporary leave - he'll still be missed!
22.10.2025 12:24
A few snapshots from this year's Kaiser Lab retreat!
14.10.2025 10:47
Together with @seeingxie.bsky.social, @singerjohannes.bsky.social, Bati Yilmaz, @dkaiserlab.bsky.social, Radoslaw M. Cichy.
10.10.2025 14:37
Using the visual backward masking paradigm, this study disentangles the contributions of feedforward and recurrent processing, revealing that recurrent processing significantly shapes object representations across the ventral visual stream.
journals.plos.org/plosbiology/...
How can we characterize the contents of our internal models of the world? We highlight participant-driven approaches, from drawings to descriptions, to study how we expect scenes to look!
08.10.2025 09:27
In this paper, we present flexible methods for participants to express their expectations about natural scenes.
08.10.2025 09:13
Time to expand how we study natural scene perception!
w/ @michaengesee.bsky.social, @suzibot.bsky.social, Ilker Duymaz, Gongting Wang, Matthew J. Foxwell, Radoslaw M. Cichy, David Pitcher & @dkaiserlab.bsky.social
From line drawings to scene perception: our new review argues for moving beyond experimenter-driven manipulations toward participant-driven approaches to reveal what's in our internal models of the visual world.
royalsocietypublishing.org/doi/10.1098/...
Kaiser Lab is at ECVP this year! Come check out our studies!
25.08.2025 08:24
Go Lu!! Congratulations!
26.06.2025 18:35
We had an awesome time discussing the role of feedback in perception @unil.bsky.social and exploring the beautiful city of Lausanne! Thanks to @icepfl.bsky.social @davidpascucci.bsky.social
26.06.2025 18:34
All-topographic ANNs! Now out in @nathumbehav.nature.com, led by @zejinlu.bsky.social, in collaboration with @timkietzmann.bsky.social.
See below for a summary.
Don't miss out again! If you are interested in our studies presented at @VSSMtg, you can find our posters here:
drive.google.com/drive/mobile...
Kaiser Lab goes #Exzellenz! The excellence cluster “The Adaptive Mind” got funding for the next 7 years! Proud to be part of this awesome team and looking forward to all the exciting research to come.
23.05.2025 11:44
#VSS2025 was a blast - great science, fun people, beach vibes. We'll be back! @vssmtg.bsky.social
22.05.2025 15:37
How fast do we extract distance cues from scenes and integrate them with retinal size to infer real object size? Our new EEG study in Cortex has answers! w/ @dkaiserlab.bsky.social @suryagayet.bsky.social @peelen.bsky.social
Check it out: www.sciencedirect.com/science/arti...
We are at @vssmtg.bsky.social 2025. Come check out our studies!
15.05.2025 09:04
Great talk on Beauty and the Brain at CMBB Day in Marburg!
11.05.2025 11:30
Time to celebrate a new ANR-DFG funded project in collaboration with the stellar @dkaiserlab.bsky.social
Stay tuned!
Glad to share that our EEG study is now out in JNP. We showed that alpha rhythms automatically interpolate occluded motion based on surrounding contextual cues!
@dkaiserlab.bsky.social
journals.physiology.org/doi/full/10....
Representational shifts from bottom-up gamma to top-down alpha dynamics drive visual integration, highlighting the crucial role of cortical feedback in the construction of seamless perceptual experiences.
@lixiangchen.bsky.social @dkaiserlab.bsky.social
doi.org/10.1038/s42003-025-08011-0