(from lapidow & @ebonawitz.bsky.social's awesome 2023 explore-exploit paper)
14.10.2025 21:45
@talboger.bsky.social
third-year phd student at jhu psych | perception + cognition https://talboger.github.io/
methods from lapidow & bonawitz, 2023. children are "dropped"
a falling child
can't believe the IRB approved this part - hope the children are ok!
14.10.2025 21:44
What a lovely 'spotlight' of @talboger.bsky.social's work on style perception! Written by @aennebrielmann.bsky.social in @cp-trendscognsci.bsky.social.
See Aenne's paper below, as well as Tal's original work here: www.nature.com/articles/s41...
When a butterfly becomes a bear, perception takes center stage.
Research from @talboger.bsky.social, @chazfirestone.bsky.social and the Perception & Mind Lab.
Out today!
www.cell.com/current-biol...
important question for dev people: when reporting demographics for a paper involving both kids and adults, we want some consistency in how we report that information. so do you call the kids "men" and "women", or do you call the adults "boys" and "girls"?
01.10.2025 15:33
sami is such a creative, thoughtful, and fun mentor. anyone who gets to work with him is so lucky!
15.09.2025 18:23
Visual adaptation is viewed as a test of whether a feature is represented by the visual system.
In a new paper, Sam Clarke and I push the limits of this test. We show spatially selective, putatively "visual" adaptation to a clearly non-visual dimension: Value!
www.sciencedirect.com/science/arti...
It's true: This is the first project from our lab that has a "Merch" page!
Get yours @ www.perceptionresearch.org/anagrams/mer...
The present work thus serves as a "case study" of sorts. It yields concrete discoveries about real-world size, and it also validates a broadly applicable tool for psychology and neuroscience. We hope it catches on!
19.08.2025 16:39
Though we manipulated real-world size, you could generate anagrams of happy faces and sad faces, tools and non-tools, or animate and inanimate objects, overcoming low-level confounds associated with such stimuli. Our approach is perfectly general.
19.08.2025 16:39
Overall, our work confronts the longstanding challenge of disentangling high-level properties from their lower-level covariates. We found that, once you do so, most (but not all) of the relevant effects remain.
19.08.2025 16:39
(Never fear, though: As we say in our paper, that last result is consistent with the original work, which suggested that mid-level features, the sort preserved in "texform" stimuli, may well explain these search advantages.)
19.08.2025 16:39
whereas previous work shows efficient visual search for real-world size, we did not find a similar effect with anagrams. our study included a successful replication of these previous findings with ordinary objects (i.e., non-anagram images).
Finally, visual search. Previous work shows targets are easier to find when they differ from distractors in their real-world size. However, in our experiments with anagrams, this was not the case (even though we easily replicated this effect with ordinary, non-anagram images).
19.08.2025 16:38
people prefer viewing real-world large objects displayed larger than real-world small objects, even with visual anagrams.
Next, aesthetic preferences. People think real-world large objects look better when displayed large, and vice versa for small objects. Our experiments show that this is true with anagrams too!
19.08.2025 16:37
results from the real-world size Stroop effect with anagrams. performance is better when displayed size is congruent with real-world size.
First, the "real-world size Stroop effect". If you have to say which of two images is larger (on the screen, not in real life), it's easier if displayed size is congruent with real-world size. We found this to be true even when the images were perfect anagrams of one another!
19.08.2025 16:36
Then, we placed these images in classic experiments on real-world size, to see if observed effects arise even under such highly controlled conditions.
(Spoiler: Most of these effects *did* arise with anagrams, confirming that real-world size per se drives many of these effects!)
anagrams we generated, where rotating the object changes its real-world size.
We generated images using this technique (see examples). Each pair differs in real-world size but is otherwise identical* in lower-level features, because they're the same image down to the last pixel.
(*avg orientation, aspect-ratio, etc., may still vary. ask me about this!)
depiction of the "visual anagrams" model by Geng et al.
This challenge may seem insurmountable. But maybe it isn't! To overcome it, we used a new technique from Geng et al. called "visual anagrams", which allows you to generate images whose interpretations vary as a function of orientation.
19.08.2025 16:34
the mind encodes differences in real-world size. but differences in size also carry differences in shape, spatial frequency, and contrast.
Take real-world size. Tons of cool work shows that it's encoded automatically, drives aesthetic judgments, and organizes neural responses. But there's an interpretive challenge: Real-world size covaries with other features that may cause these effects independently.
19.08.2025 16:33
The problem: We often study "high-level" image features (animacy, emotion, real-world size) and find cool effects. But high-level properties covary with lower-level features, like shape or spatial frequency. So what seem like high-level effects may have low-level explanations.
19.08.2025 16:33
On the left is a rabbit. On the right is an elephant. But guess what: They're the *same image*, rotated 90°!
In @currentbiology.bsky.social, @chazfirestone.bsky.social & I show how these images, known as "visual anagrams", can help solve a longstanding problem in cognitive science. bit.ly/45BVnCZ
Out today! www.nature.com/articles/s41...
05.08.2025 21:58
Lab-mate got my ass on the lab when2meet
24.07.2025 15:56
Amazing new work from @gabrielwaterhouse.bsky.social and @samiyousif.bsky.social! I'm convinced the crowd size illusion is real, but the rooms full of people watching Gabe give awesome talks at @socphilpsych.bsky.social and @vssmtg.bsky.social were no illusion!
26.06.2025 16:37
@sallyberson.bsky.social in action at @socphilpsych.bsky.social! #SPP2025
21.06.2025 00:10
Susan Carey sitting in the front row of a grad student talk (by @talboger.bsky.social) and going back and forth during Q&A is what makes the @socphilpsych.bsky.social so special! Loved this interaction!
19.06.2025 19:39
Now officially out! psycnet.apa.org/record/2026-...
(Free version here: talboger.github.io/files/Boger_...)
A visual advertisement of VSS projects from the Firestone Lab
It's @vssmtg.bsky.social! So excited to share this year's projects from the lab, including brand new research directions and some deep dives on foundational issues.
More info @ perception.jhu.edu/vss/.
See you there!
#VSS2025
Together, these results demonstrate multiple new phenomena of stylistic perception, and more generally introduce a scientific approach to the study of style. Stay tuned for more projects on this theme, including developmental work analyzing stylistic representation in kids!
14.05.2025 16:45