@dkaiserlab.bsky.social
Studying natural vision at JLU Giessen using psychophysics, neuroimaging, and computational models. PI: Daniel Kaiser. danielkaiser.net

Go Lu!! Congratulations!
26.06.2025 18:35
We had an awesome time discussing the role of feedback in perception @unil.bsky.social and exploring the beautiful city of Lausanne! Thanks to @icepfl.bsky.social @davidpascucci.bsky.social
26.06.2025 18:34
All-topographic ANNs! Now out in @nathumbehav.nature.com, led by @zejinlu.bsky.social, in collaboration with @timkietzmann.bsky.social.
See below for a summary.
Don't miss out again! If you are interested in our studies presented at @VSSMtg, you can find our posters here:
drive.google.com/drive/mobile...
Kaiser Lab goes #Exzellenz! The excellence cluster "The Adaptive Mind" got funding for the next 7 years! Proud to be part of this awesome team and looking forward to all the exciting research to come.
23.05.2025 11:44
#VSS2025 was a blast - great science, fun people, beach vibes. We'll be back! @vssmtg.bsky.social
22.05.2025 15:37
How fast do we extract distance cues from scenes and integrate them with retinal size to infer real object size? Our new EEG study in Cortex has answers! w/ @dkaiserlab.bsky.social @suryagayet.bsky.social @peelen.bsky.social
Check it out: www.sciencedirect.com/science/arti...
We are at @vssmtg.bsky.social 2025. Come check out our studies!
15.05.2025 09:04
Great talk on Beauty and the Brain at CMBB Day in Marburg!
11.05.2025 11:30
Time to celebrate a new ANR-DFG funded project in collaboration with the stellar @dkaiserlab.bsky.social
Stay tuned!!!
Glad to share that our EEG study is now out in JNP. We showed that alpha rhythms automatically interpolate occluded motion based on the surrounding contextual cues!
@dkaiserlab.bsky.social
journals.physiology.org/doi/full/10....
Representational shifts from bottom-up gamma to top-down alpha dynamics drive visual integration, highlighting the crucial role of cortical feedback in the construction of seamless perceptual experiences.
@lixiangchen.bsky.social @dkaiserlab.bsky.social
doi.org/10.1038/s42003-025-08011-0
Now out in @royalsocietypublishing.org Proceedings B! Check out Gongting Wang's EEG work on individual differences in scene perception: Scenes that are more typical for individual observers are represented in an enhanced yet more idiosyncratic way. Link:
royalsocietypublishing.org/doi/10.1098/...
Do we search for an object in an array just like we search for a person in a crowd? Here, visual search performance suggests distinct algorithms or implementations in social vs. non-social scene perception. With N. Goupil & @dkaiserlab.bsky.social psycnet.apa.org/record/2025-...
11.03.2025 07:53
TeaP 2025 - It was a pleasure!
16.03.2025 11:23
Hey hey! We're on the cover of @cp-trendsneuro.bsky.social.
Check out our article on rhythmic representations in the visual system below.
We make about 3-4 fast eye movements a second, yet our world appears stable. How is this possible? In a preprint led by @lucakaemmer.bsky.social we test the intriguing idea that anticipatory signals in the fovea may explain visual stability.
www.biorxiv.org/content/10.1...
In this review article, I summarize some of our recent work on the neural basis of visual search in scenes, showing how attention and expectation interactively drive preparatory activity in visual cortex and jointly modulate the visual processing of potential target objects. doi.org/10.1177/0963...
20.02.2025 17:36
NEW #PhD opportunity in #Psychology and #Neuroscience with Prof Neil Roach (Nottingham) and me (Leicester):
This ESRC-funded project aims to use individual differences to study the socially relevant components of time perception.
Application deadline: 24 February
www.uni-giessen.de/de/ueber-uns...
If you're interested in applying, feel free to get in touch beforehand - happy to informally answer any questions you may have.
Feel free to forward, too!
Job alert! We are now looking for a PhD student starting Sept 2025, for a project on visual relations between people and objects, using behavior, EEG, fMRI, and ANNs. The project involves a collaboration with the brilliant @ljubapi.bsky.social. You can find the official advertisement below.
18.02.2025 15:36
Conceptual roadmap of the present study: to examine the relationship between the metabolic costs of visual processing and aesthetic pleasure, we used both computational and physiological measures of metabolic cost: (1) model-derived estimates based on the activations of a deep neural network; (2) metabolic activity of the human brain, specifically in visual processing areas. We found that both measures were inversely related to aesthetic pleasure.
Energy efficiency drives evolution, and humans may have evolved pleasure-based signals to optimize actions. Does this extend to aesthetic pleasure?
Yes!
We find strong evidence in silico and in human observers!
osf.io/preprints/ps...
With Yikai Tang and Wil Cunningham.
@uoftpsychology.bsky.social
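For readers curious what a "model-derived estimate of metabolic cost based on deep neural network activations" could look like in practice, here is a minimal sketch, not the authors' code: it assumes, purely for illustration, a pretrained AlexNet, mean absolute ReLU activation as the cost proxy, hypothetical image file names, and a Spearman correlation against pleasure ratings. The published study may operationalize cost and the model quite differently.

```python
# A minimal, illustrative sketch of a "model-derived metabolic cost" proxy:
# the mean absolute activation a pretrained CNN produces for an image.
# Model choice, layers, and the cost definition are assumptions, not the paper's method.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT).eval()

# Collect activations from every ReLU layer via forward hooks.
activations = []
def save_activation(module, inputs, output):
    activations.append(output.detach())

for layer in model.modules():
    if isinstance(layer, torch.nn.ReLU):
        layer.register_forward_hook(save_activation)

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def metabolic_cost_proxy(image_path: str) -> float:
    """Mean absolute ReLU activation for one image (hypothetical cost proxy)."""
    activations.clear()
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        model(img)
    return float(torch.stack([a.abs().mean() for a in activations]).mean())

# Hypothetical usage: correlate the proxy with aesthetic pleasure ratings.
# from scipy.stats import spearmanr
# costs = [metabolic_cost_proxy(p) for p in ["img1.jpg", "img2.jpg", "img3.jpg"]]
# rho, p = spearmanr(costs, pleasure_ratings)  # the post reports an inverse relationship
```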
A PhD position is open in @peelen.bsky.social Lab at the Donders Institute - please spread the word and consider applying if you are interested in how imagery and perception relate to each other www.ru.nl/en/working-a... - plus we are fun people to work with! :)
25.01.2025 07:52
Now out in @cp-trendsneuro.bsky.social: We discuss how the contents of visual perception, imagery, and prediction can be decoded from rhythmic brain activity and argue that such rhythmic representations offer new insights into neural information propagation. www.sciencedirect.com/science/arti...
16.01.2025 08:46
Our review on the theoretical status of oscillations and field potentials is out! What are their effects, and what can electrophysiological signals reveal about how the brain works?
w/ @dlevenstein.bsky.social @prokraustinator.bsky.social Bradley Voytek @rdgao.bsky.social
www.cell.com/trends/cogni...
Hello Bluesky! This is the Kaiser Lab.
We are cognitive neuroscientists studying the neural computations that support real-world vision using fMRI, EEG, and TMS. Our main research areas include individual differences, categorization, and aesthetics.
For more information: danielkaiser.net