@talboger.bsky.social
second-year phd student at jhu psych | perception + cognition https://talboger.github.io/
Out today! www.nature.com/articles/s41...
05.08.2025 21:58

Lab-mate got my ass on the lab when2meet
24.07.2025 15:56

Amazing new work from @gabrielwaterhouse.bsky.social and @samiyousif.bsky.social! I'm convinced the crowd size illusion is real, but the rooms full of people watching Gabe give awesome talks at @socphilpsych.bsky.social and @vssmtg.bsky.social were no illusion!
26.06.2025 16:37

@sallyberson.bsky.social in action at @socphilpsych.bsky.social! #SPP2025
21.06.2025 00:10

Susan Carey sitting in the front row of a grad student talk (by @talboger.bsky.social) and going back and forth during Q&A is what makes the @socphilpsych.bsky.social so special! Loved this interaction
19.06.2025 19:39

Now officially out! psycnet.apa.org/record/2026-...
(Free version here: talboger.github.io/files/Boger_...)
A visual advertisement of VSS projects from the Firestone Lab
It's @vssmtg.bsky.social! So excited to share this year's projects from the lab, including brand new research directions and some deep dives on foundational issues.
More info @ perception.jhu.edu/vss/.
See you on the beach!
#VSS2025
Together, these results demonstrate multiple new phenomena of stylistic perception, and more generally introduce a scientific approach to the study of style. Stay tuned for more projects on this theme, including developmental work analyzing stylistic representation in kids!
14.05.2025 16:45

Finally, we sought a computational account of stylistic similarity. We found that an object recognition model (with no explicit knowledge of style) successfully predicts human judgments of similarity across styles.
14.05.2025 16:45
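For a concrete sense of what that kind of analysis can look like, here is a minimal sketch, assuming a generic pretrained object-recognition network (ResNet-50 from torchvision) rather than the paper's actual model; the helper functions and file names are placeholders. The idea is to embed each stylized image in the network's feature space, take pairwise cosine similarities, and compare those to human similarity judgments.

```python
# Minimal sketch, assuming a generic pretrained ResNet-50 from torchvision
# (not necessarily the model or analysis used in the paper): embed images with
# an object-recognition network and measure similarity in its feature space.
import torch
from PIL import Image
from torchvision import models
from torchvision.models import ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
model.fc = torch.nn.Identity()       # drop the classifier head; keep 2048-d features
preprocess = weights.transforms()    # the network's standard input preprocessing

@torch.no_grad()
def embed(path):
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    return model(img).squeeze(0)

def model_similarity(path_a, path_b):
    # Cosine similarity between the two images' feature vectors.
    a, b = embed(path_a), embed(path_b)
    return torch.nn.functional.cosine_similarity(a, b, dim=0).item()

# e.g., compute model_similarity for every pair of stylized images and correlate
# it (say, with scipy.stats.spearmanr) against mean human similarity ratings.
print(model_similarity("cutlery_cubist.png", "cutlery_impressionist.png"))  # placeholder file names
```
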
But these cases involve extracting style only to discard it, in a sense. Might we also use style to generate new representations? Suppose you're shown the fork and spoon from a styled cutlery set; can you imagine the knife? We used a priming task to find exactly these representations.
14.05.2025 16:44

Next, we considered cases of "discounting" in vision, like when we discount lighting conditions to discern an object's color, or when we discount a cloth to discern the object beneath it. We found similar effects for style: Vision discounts style to discriminate images.
14.05.2025 16:44

First, we were inspired by "font tuning", wherein the mind adapts to typefaces in ways that aid text comprehension. Might similar effects arise for style? In other words, might perception tune to the style of images in ways that aid scene comprehension? We show: Yes!
14.05.2025 16:44

So, we thought, let's study style perception like we study those processes! We adapted a number of paradigms used in those literatures to study how the mind represents style.
14.05.2025 16:43

Here's the idea: Seeing style involves "parsing" content from form. The mind does this in other contexts too, like reading (separating typeface from letter identity) and maybe even color constancy (separating surface reflectance from lighting conditions).
14.05.2025 16:43

Style is the subject of considerable humanistic study, from art history to sociology to political theory. But a scientific account of style perception has remained elusive.
Using style transfer algorithms (a sketch of the general approach appears after this post), we generated stimuli in various styles to use in psychophysics studies.
Looking at Van Gogh's Starry Night, we see not only its content (a French village beneath a night sky) but also its *style*. How does that work? How do we see style?
In @nathumbehav.nature.com, @chazfirestone.bsky.social & I take an experimental approach to style perception! osf.io/preprints/ps...
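The stimuli mentioned in the image description above were made with style transfer algorithms. As a rough illustration of that family of methods, here is a compact sketch of classic Gatys-style neural style transfer with a pretrained VGG-19; the network, layer choices, loss weights, and file names are assumptions for illustration, not the paper's stimulus-generation pipeline.

```python
# Illustrative sketch of Gatys-style neural style transfer with a pretrained
# VGG-19 -- the general family of algorithms referred to above, not the paper's
# actual stimulus-generation pipeline. Layer choices, weights, and file names
# are assumptions.
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

IMAGENET_MEAN = torch.tensor([0.485, 0.456, 0.406], device=device).view(1, 3, 1, 1)
IMAGENET_STD = torch.tensor([0.229, 0.224, 0.225], device=device).view(1, 3, 1, 1)

def load(path, size=256):
    tf = transforms.Compose([transforms.Resize((size, size)), transforms.ToTensor()])
    return tf(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv1_1 ... conv5_1: texture/"style" statistics
CONTENT_LAYER = 21                  # conv4_2: scene "content"

def features(x):
    # Run VGG-19 and collect style- and content-layer activations.
    x = (x - IMAGENET_MEAN) / IMAGENET_STD
    style, content = [], None
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            style.append(x)
        if i == CONTENT_LAYER:
            content = x
    return style, content

def gram(x):
    # Gram matrix of feature maps: the "style" summary used for matching.
    _, c, h, w = x.shape
    f = x.view(c, h * w)
    return f @ f.t() / (c * h * w)

def stylize(content_path, style_path, steps=300, style_weight=1e6):
    content_img, style_img = load(content_path), load(style_path)
    style_grams = [gram(s).detach() for s in features(style_img)[0]]
    content_ref = features(content_img)[1].detach()
    target = content_img.clone().requires_grad_(True)
    opt = torch.optim.Adam([target], lr=0.02)
    for _ in range(steps):
        opt.zero_grad()
        s_feats, c_feat = features(target)
        loss = F.mse_loss(c_feat, content_ref)
        loss = loss + style_weight * sum(
            F.mse_loss(gram(s), g) for s, g in zip(s_feats, style_grams)
        )
        loss.backward()
        opt.step()
    return target.detach().clamp(0, 1)   # stylized image tensor in [0, 1]

# e.g. stylize("village_photo.jpg", "starry_night.jpg")  # placeholder file names
```
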
Sam Clarke and I have been writing a lot about adaptation -- what it is, what it isn't, what it reveals about perception. We've just released a preprint that pushes the boundaries of adaptation even further. We document spatially selective adaptation to arbitrary *value*.
philpapers.org/rec/CLACWS
Danny Wolf transferred from Yale to Michigan, what is he, writing a friggin metaethics dissertation, folks?
22.03.2025 23:40

I'm really, really excited about our recent paper on children's understanding of topological spatial relations (w/ Lily Goldstein and Liz Brannon). I've linked the paper here, and I'll summarize it in the thread below.
direct.mit.edu/opmi/article...
Check out the pre-print here (osf.io/preprints/ps...), and see all the experiments for yourself -- including the illusory soccer balls -- here (perceptionstudies.github.io/persistence).
04.03.2025 18:15

One fun thing that came up in the review process is that different people have vastly different definitions of what "object persistence" is. We find this really interesting, so we wrote a full section devoted to spelling out these differences in hopes of opening up some discussion.
04.03.2025 18:15

We suggest that object persistence may be the simplest and best explanation for event completion. In our paper, we discuss how this might relate to other memory distortions (like representational momentum) and to object/event cognition more broadly.
04.03.2025 18:15

We found large filling-in effects almost everywhere (not just due to inattention or an object-presence bias) -- including when we disrupted cues previously proposed to create event completion. But abolishing object persistence made event completion effects disappear entirely.
04.03.2025 18:14

This allowed us to systematically disrupt various cues that have been proposed to create event completion effects. These included causality, continuity, familiarity, physical coherence, event coherence, and object persistence.
04.03.2025 18:14

We rendered animations in Blender -- like the one you just saw -- with an object either present or absent in each half. Participants watched these animations and simply made a forced-choice judgment about whether the ball was present or absent in a given half.
04.03.2025 18:13
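For a sense of how such present/absent animation variants can be scripted, here is a hypothetical Blender Python snippet (not the authors' actual scene files; the object name and output path are made up):

```python
# Hypothetical Blender (bpy) sketch, not the authors' actual scene files: keep a
# ball visible for the first half of an animation, hide it for the second half,
# then render. The object name "Ball" and output path are made up.
import bpy

scene = bpy.context.scene
ball = bpy.data.objects["Ball"]
midpoint = (scene.frame_start + scene.frame_end) // 2

# Present in the first half...
ball.hide_render = False
ball.keyframe_insert(data_path="hide_render", frame=scene.frame_start)
# ...absent in the second half.
ball.hide_render = True
ball.keyframe_insert(data_path="hide_render", frame=midpoint)

scene.render.filepath = "//renders/ball_absent_second_half_"
bpy.ops.render.render(animation=True)
```
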
Event completion is a phenomenon where you falsely remember (i.e., "complete") a part of an event that wasn't really shown. People propose lots of interesting mechanisms for event completion (and other kinds of event-based distortions). But what really explains it?
04.03.2025 18:13

Watch this video.
Do you remember seeing a ball in the second half of the video? Up to 37% of our participants reported seeing a ball, even though it wasn't there. Why?
In a new paper in press @ Cognition, Brent Strickland and I ask what causes event completion. osf.io/preprints/ps...
Thus, we think random behavior can -- and should! -- be viewed as trait-like. That is: Just as your personality traits are stable across contexts and time, so too is the way in which you behave randomly.
05.02.2025 17:23

Results from Experiment 3: behavior was stable across a one-year delay!
But for random behavior to be truly trait-like, it should also be stable over *time*. So, in Experiment 3, we tested the same participants from Experiment 2 one full year later (!). We found remarkably stable behavior across these two timepoints.
05.02.2025 17:23

Results from Experiment 2: behavior was stable across a random-number-generation task and a two-dimensional random-location-generation task.
This provided initial evidence for stable random behavior across tasks. However, numbers and one-dimensional locations share a representational format (i.e., a mental number line). In Experiment 2, we extended this to two-dimensional random locations and found the same pattern of results.
05.02.2025 17:22
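To make the trait-like-stability logic of the last two posts concrete, here is a toy sketch on simulated data; the "signature" of random behavior, the simulation, and the self- versus other-similarity comparison are illustrative assumptions, not the paper's actual measures or analyses.

```python
# Toy illustration (not the paper's analysis): is the *way* someone behaves
# randomly stable across sessions? Summarize each sequence by the distribution
# of successive step sizes and compare self- vs. other-similarity.
import numpy as np

rng = np.random.default_rng(0)
LO, HI = 1, 10  # numbers are generated in [LO, HI]

def signature(seq):
    # Histogram of successive differences: one simple "signature" of a sequence.
    diffs = np.diff(np.asarray(seq))
    edges = np.arange(LO - HI - 0.5, HI - LO + 1.5)  # one bin per possible step
    hist, _ = np.histogram(diffs, bins=edges, density=True)
    return hist

def self_vs_other(session1, session2):
    # session1, session2: dicts mapping participant id -> list of numbers.
    ids = sorted(session1)
    s1 = {p: signature(session1[p]) for p in ids}
    s2 = {p: signature(session2[p]) for p in ids}
    self_sim = np.mean([np.corrcoef(s1[p], s2[p])[0, 1] for p in ids])
    other_sim = np.mean([np.corrcoef(s1[p], s2[q])[0, 1]
                         for p in ids for q in ids if p != q])
    return self_sim, other_sim

def simulate_participant(step_bias, n=300):
    # Fake participant whose "random" steps are biased in a characteristic way.
    seq = [int(rng.integers(LO, HI + 1))]
    for _ in range(n - 1):
        step = rng.normal(step_bias, 2.0)
        seq.append(int(np.clip(np.rint(seq[-1] + step), LO, HI)))
    return seq

biases = {f"p{i:02d}": rng.normal(0.0, 2.0) for i in range(30)}
year1 = {p: simulate_participant(b) for p, b in biases.items()}
year2 = {p: simulate_participant(b) for p, b in biases.items()}  # one "year" later

self_sim, other_sim = self_vs_other(year1, year2)
print(f"mean self-similarity:  {self_sim:.2f}")
print(f"mean other-similarity: {other_sim:.2f}")  # stability => self > other
```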