
Dan Wang

@danwang7.bsky.social

PhD candidate, Utrecht University | AttentionLab UU | CAP-Lab | Visual working memory | Attention

214 Followers  |  81 Following  |  20 Posts  |  Joined: 19.11.2024

Latest posts by danwang7.bsky.social on Bluesky

Preview
Sensory reformatting for a working visual memory
A core function of visual working memory (WM) is to sustain mental representations of recent visual inputs, thereby bridging moments of experience. Th…

Sensory reformatting for a working visual memory www.sciencedirect.com/science/arti...

10.10.2025 03:18 · 👍 1    🔁 2    💬 0    📌 0

Many thanks to my co-authors!! @suryagayet.bsky.social @jthee.bsky.social @arora-borealis.bsky.social @Stefan Van der Stigchel @Samson Chota

27.08.2025 21:35 · 👍 1    🔁 0    💬 0    📌 0

In conclusion, we show that the dynamic interplay between top-down control and bottom-up saliency directly impacts early visual responses, thereby illuminating a complete timeline of attentional competition in visual cortex.

27.08.2025 21:33 · 👍 1    🔁 0    💬 0    📌 0
Post image

Finally, the greater the RIFT response to the target relative to the distractor, the faster participants responded to the target, demonstrating that RIFT responses capture behaviorally relevant processes.

27.08.2025 21:32 · 👍 1    🔁 0    💬 1    📌 0
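The brain-behavior link in the post above (a larger RIFT response to the target than to the distractor going with faster responses) is, in essence, a correlation between a neural difference score and reaction time. A minimal sketch of that analysis on simulated data follows; the participant count, effect sizes, and variable names are illustrative assumptions, not values from the study.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 30                                      # hypothetical number of participants
    rift_target = rng.normal(1.0, 0.2, n)       # tagged response to the target (a.u.)
    rift_distractor = rng.normal(0.8, 0.2, n)   # tagged response to the distractor (a.u.)
    rift_advantage = rift_target - rift_distractor

    # Simulate reaction times that shrink as the RIFT advantage grows,
    # mimicking the reported brain-behavior relationship.
    rt = 0.55 - 0.10 * rift_advantage + rng.normal(0, 0.03, n)

    r, p = stats.pearsonr(rift_advantage, rt)
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # negative r: larger advantage, faster responses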
Post image

2) The presence of a distractor attenuated the initial RIFT response to the target, reflecting competition during the initial stages of visual processing.

27.08.2025 21:31 · 👍 1    🔁 0    💬 1    📌 0
Post image

Comparing RIFT responses across conditions, we found that 1) both the target and the distractor evoked stronger initial RIFT responses than nontargets, reflecting top-down and bottom-up attentional effects on early visual processing, and that RIFT responses to the distractor were eventually suppressed.

27.08.2025 21:29 · 👍 1    🔁 0    💬 1    📌 0
Post image

For the tagging manipulation, we tagged the target and the distractor in the distractor-present condition, and the target and one of the nontargets in the distractor-absent condition. The frequency-tagging manipulation successfully elicited corresponding frequency-specific neural responses.

27.08.2025 21:25 · 👍 1    🔁 0    💬 1    📌 0
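For readers new to frequency tagging: because each tagged stimulus flickers at its own frequency, its neural response shows up as a spectral peak at that frequency. The sketch below illustrates this readout on simulated data; the tagging frequencies (60 and 64 Hz), sampling rate, and amplitudes are assumptions for illustration, not the values used in the study.

    import numpy as np

    fs = 1000                        # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1 / fs)    # one 2-s epoch
    f_target, f_distractor = 60, 64  # hypothetical tagging frequencies

    # Simulated single-channel signal: weak oscillations at both tagging
    # frequencies buried in broadband noise.
    rng = np.random.default_rng(1)
    signal = (0.5 * np.sin(2 * np.pi * f_target * t)
              + 0.3 * np.sin(2 * np.pi * f_distractor * t)
              + rng.standard_normal(t.size))

    # Power spectrum via FFT; a peak at each tagging frequency indicates a
    # frequency-specific (tagged) neural response.
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / t.size

    for label, f in [("target", f_target), ("distractor", f_distractor)]:
        idx = np.argmin(np.abs(freqs - f))
        print(f"{label} ({f} Hz): power = {power[idx]:.1f}")

In the RIFT literature, coherence between the tagging signal and the M/EEG recording is another common readout; plain spectral power is used here only to keep the sketch short.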
Post image

We found that the salient distractor captured attention at the behavioral level.

27.08.2025 21:23 · 👍 1    🔁 0    💬 1    📌 0

In this study, to determine how top-down and bottom-up processes unfold over time in early visual cortex, we employed Rapid Invisible Frequency Tagging (RIFT) while participants performed the additional singleton task.

27.08.2025 21:18 · 👍 1    🔁 0    💬 1    📌 0
Preview
Dynamic competition between bottom-up saliency and top-down goals in early visual cortex
Task-irrelevant yet salient stimuli can elicit automatic, bottom-up attentional capture and compete with top-down, goal-directed processes for neural representation. However, the temporal dynamics und...

🧠 Excited to share that our new preprint is out!🧠
In this work, we investigate the dynamic competition between bottom-up saliency and top-down goals in the early visual cortex using rapid invisible frequency tagging (RIFT).

📄 Check it out on bioRxiv: www.biorxiv.org/cgi/content/...

27.08.2025 21:16 · 👍 31    🔁 12    💬 2    📌 1
Overview of CAP-Lab presentations at ECVP 2025:

Lasse Dietz: Anticipated relevance modulates early visual processing
Monday 14:45, talk @ Linke Aula (session: Learning & Memory)

Surya Gayet: Perceptual precedence for expected and dreaded visual events
Tuesday 8:30, talk @ RW 1 (session: Using interocular suppression during consciousness research)

Kabir Arora: Dissociating external and internal attentional selection
Tuesday 9:15, talk @ Atrium Maximum (session: Object Recognition & Visual Attention)

Dan Wang: Unraveling the time course of attentional capture: an EEG-RIFT study
Tuesday 15:30-17:00, poster @ Foyer Philosophicum

Yichen Yuan: Decoding auditory working memory load from alpha oscillations
Wednesday 10:00-11:30, poster @ Foyer Philosophicum


Looking forward to joining #ECVP2025 tomorrow. CAP-Lab is well represented, with 3 talks (@lassedietz.bsky.social on Monday, and @arora-borealis.bsky.social and I on Tuesday), and 2 posters (by @danwang7.bsky.social on Tuesday, and @yichen-yuan.bsky.social on Wednesday). Please come by for a chat! 💜

24.08.2025 20:33 · 👍 7    🔁 1    💬 1    📌 0
Preview
Rapid Invisible Frequency Tagging (RIFT) with a consumer monitor: A proof-of-concept
Rapid Invisible Frequency Tagging (RIFT) enables neural frequency tagging at rates above the flicker fusion threshold, eliciting steady-state responses to flicker that is almost imperceptible. While R...

🚨 New preprint: Invisible neural frequency tagging (RIFT) for the underfunded researcher:
πŸ‘‰ www.biorxiv.org/cgi/content/...

RIFT uses high-frequency flicker to probe attention in M/EEG with minimal stimulus visibility and little distraction. Until now, it required a costly high-speed projector.

22.08.2025 11:52 · 👍 33    🔁 17    💬 4    📌 2
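For intuition about the stimulus itself: RIFT sinusoidally modulates stimulus luminance at a rate above the flicker-fusion threshold, with one luminance value per display frame. A minimal sketch of that frame-by-frame modulation follows; the 480 Hz refresh rate, 64 Hz tagging frequency, and modulation depth are illustrative assumptions, not parameters from the preprint.

    import numpy as np

    refresh_rate = 480   # display refresh rate in Hz (assumed)
    tag_freq = 64        # tagging frequency in Hz (assumed, above flicker fusion)
    duration = 1.0       # seconds of stimulation
    mod_depth = 0.1      # modulation depth around the mean luminance (assumed)

    # One luminance value per display frame: a sinusoid around a mean grey
    # level of 0.5 (normalized luminance in [0, 1]).
    frames = np.arange(int(refresh_rate * duration))
    luminance = 0.5 + mod_depth * np.sin(2 * np.pi * tag_freq * frames / refresh_rate)

    # Sanity check: the refresh rate must exceed twice the tagging frequency
    # (Nyquist), otherwise the intended flicker cannot be rendered.
    assert refresh_rate > 2 * tag_freq
    print(luminance[:8])  # first few per-frame luminance values

At these assumed values there are 7.5 frames per tagging cycle, so the sinusoid is rendered smoothly rather than as a crude on-off flicker.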
Preview
Anticipated Relevance Prepares Visual Processing for Efficient Memory-Guided Selection
Finding an object typically involves the use of working memory to prioritize relevant visual information at the right time. For example, successfully detecting a highway exit sign is useless when your...

Excited to present at #ECVP2025 - Monday afternoon, Learning & Memory - about how anticipating relevant visual events prepares visual processing for efficient memory-guided visual selection! 🧠🥳
@attentionlab.bsky.social @ecvp.bsky.social

Preprint for more details: www.biorxiv.org/content/10.1...

24.08.2025 13:47 · 👍 10    🔁 3    💬 0    📌 0
Post image

Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability the same way it does when attending the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social

24.08.2025 13:13 · 👍 14    🔁 3    💬 0    📌 1

Excited to share that I’ll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!

🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging.

@attentionlab.bsky.social @ecvp.bsky.social

24.08.2025 13:28 · 👍 9    🔁 3    💬 0    📌 0
IRL/CAP-Lab meeting

I had loads of fun today, sharing thoughts and projects during a joint lab-meeting with @nadinedijkstra.bsky.social's Imagine Reality Lab. Two hours were way too short to discuss all the cool projects!

Thanks everyone for your contributions 💜

24.07.2025 21:00 · 👍 17    🔁 4    💬 1    📌 0

Best hunter trainer ever 🫡

15.07.2025 12:39 · 👍 1    🔁 0    💬 0    📌 0
Preview
Unattended working memory items are coded by persistent activity in human medial temporal lobe neurons - Nature Human Behaviour
Paluch et al. show that unattended working memory items, as well as attended ones, are encoded in persistent activity in the medial temporal lobe.

In this Article, Paluch et al. show that unattended working memory items, as well as attended ones, are encoded in persistent activity in the medial temporal lobe. @jankaminski.bsky.social
www.nature.com/articles/s41...

09.07.2025 11:33 · 👍 25    🔁 10    💬 0    📌 0
Preview
Attentional sampling resolves competition along the visual hierarchy
Navigating the environment involves engaging with multiple objects, each activating specific neuronal populations. When objects appear together, these populations compete. Classical attention theories...

Thrilled to share our new opinion piece, hot off the press, on attentional sampling, co-authored with the magnificent Flor Kusnir and Daniele Re. It captures where our thinking has landed on this topic after years of work.

www.cell.com/trends/cogni...

09.07.2025 09:20 · 👍 20    🔁 8    💬 2    📌 1
Post image

Last week's symposium "Advances in the Encephalographic Study of Attention" was a great success! Held in the KNAW building in Amsterdam and sponsored by the NWO, it brought together many of (Europe's) leading attention researchers to discuss the latest advances in attention research using M/EEG.

30.06.2025 07:12 · 👍 25    🔁 7    💬 4    📌 3

Now published in Attention, Perception, & Psychophysics @psychonomicsociety.bsky.social

Open Access link: doi.org/10.3758/s134...

12.06.2025 07:21 · 👍 14    🔁 8    💬 0    📌 0
Preview
Guided visual search is associated with target boosting and distractor suppression in early visual cortex
Communications Biology - Magnetoencephalography in human participants paired with Rapid Invisible Frequency Tagging reveals that excitability in early visual cortex is modulated to boost targets...

In our new MEG/RIFT study from @thechbh.bsky.social by @katduecker.bsky.social, we show that feature guidance in visual search alters neuronal excitability in early visual cortex, supporting a priority-map-based attentional mechanism.
rdcu.be/eqFX7

12.06.2025 10:17 · 👍 43    🔁 13    💬 0    📌 0

Thanks to the support of the Dutch Research Council (NWO) and @knaw-nl.bsky.social, we're thrilled to announce the international symposium "Advances in the Encephalographic Study of Attention"! 🧠🔍

📅 Date: June 25th & 26th
📍 Location: Trippenhuis, Amsterdam

04.06.2025 20:14 · 👍 9    🔁 8    💬 2    📌 0

Through experience, humans can learn to suppress locations that frequently contain distracting stimuli. Using SSVEPs and ERPs, this study shows that such learned suppression modulates early neural responses, indicating it occurs during initial visual processing.
www.jneurosci.org/content/jneu...

26.05.2025 10:13 · 👍 16    🔁 9    💬 0    📌 1
Post image

Good morning #VSS2025! If you care for a chat about the role of attention in binding object features (during perceptual encoding and memory maintenance), drop by my poster now (8:30-12:30) in the pavilion (422). Hope to see you there!

19.05.2025 12:36 · 👍 14    🔁 3    💬 3    📌 0

Thanks for the recommendation! Really nice paper!

21.12.2024 00:21 · 👍 1    🔁 0    💬 0    📌 0

Many thanks to my co-authors! ❤️❤️

20.12.2024 00:28 · 👍 2    🔁 0    💬 0    📌 0

In conclusion, observers can flexibly de-prioritize and re-prioritize VWM contents based on current task demands, allowing them to exert control over the extent to which VWM contents influence concurrent visual processing.

20.12.2024 00:26 · 👍 3    🔁 1    💬 1    📌 0
Post image

In the end, we found no evidence that the influence of non-prioritized memory items on early visual processing differs between the three experimental paradigms. When we combined the data from the three experiments, however, we did find that non-prioritized memory items influence early visual processing.

20.12.2024 00:23 · 👍 1    🔁 0    💬 1    📌 0
Post image

In Experiment 3, we also found that only prioritized memory items influenced early visual processing in terms of the allocation of spatial attention.

20.12.2024 00:15 · 👍 1    🔁 0    💬 1    📌 0
