
Katrina Rose Quinn

@mightyrosequinn.bsky.social

Neuroscientist in Tübingen & mother of dragons. Interested in visual perception, decision-making & expectations.

87 Followers  |  144 Following  |  14 Posts  |  Joined: 24.11.2024

Latest posts by mightyrosequinn.bsky.social on Bluesky

🧠✨ Exciting new research alert! ✨🧠

Did you know that catecholamines can reduce choice history biases in perceptual decision-making? 🧐🔍

Paper: journals.plos.org/plosbiology/...

With @donnerlab.bsky.social and @swammerdamuva.bsky.social

05.09.2025 08:05 — 👍 22    🔁 8    💬 1    📌 0
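A quick primer for readers outside the field: a choice history bias is usually quantified by regressing the current choice on both the current stimulus and the previous choice; the weight on the previous choice measures the pull toward repeating (or alternating) responses, and the reported effect is that catecholamines shrink that weight. A minimal Python sketch of this standard analysis on simulated data (variable names and numbers are illustrative, not from the paper):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulate an observer whose choices are pulled toward the previous choice.
n_trials = 5000
stimulus = rng.normal(size=n_trials)      # signed stimulus strength per trial
history_weight = 0.8                      # > 0: tendency to repeat the last choice
choices = np.zeros(n_trials, dtype=int)
for t in range(1, n_trials):
    drive = 2.0 * stimulus[t] + history_weight * (2 * choices[t - 1] - 1)
    choices[t] = rng.random() < 1.0 / (1.0 + np.exp(-drive))

# Regress the current choice on the stimulus and the previous choice (-1/+1).
X = np.column_stack([stimulus[1:], 2 * choices[:-1] - 1])
fit = LogisticRegression().fit(X, choices[1:])
print("stimulus weight:", fit.coef_[0][0])
print("history weight:", fit.coef_[0][1])  # the bias the drug is said to reduce
```

Comparing the recovered history weight between drug and placebo sessions would then test the headline claim.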
Contents of visual predictions oscillate at alpha frequencies Predictions of future events have a major impact on how we process sensory signals. However, it remains unclear how the brain keeps predictions online in anticipation of future inputs. Here, we combin...

@dotproduct.bsky.social's first first-author paper is finally out in @sfnjournals.bsky.social! Her findings show that content-specific predictions fluctuate at alpha frequencies, suggesting a more specific role for alpha oscillations than we may have thought. With @jhaarsma.bsky.social. 🧠🟦 🧠🤖

21.10.2025 11:05 — 👍 94    🔁 38    💬 4    📌 2
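One way such a claim can be tested (whether this matches the paper's exact pipeline is an assumption) is to decode the predicted content from neural data at each time point and then look for a spectral peak in the alpha band (8–12 Hz) of the resulting decoding time course. A toy sketch with a synthetic time course:

```python
import numpy as np

fs = 250                        # sampling rate of the decoding time course (Hz)
t = np.arange(0, 2.0, 1 / fs)   # 2 s of anticipation

# Toy decoding time course: baseline accuracy + 10 Hz modulation + noise.
rng = np.random.default_rng(1)
accuracy = (0.55 + 0.03 * np.sin(2 * np.pi * 10 * t)
            + 0.01 * rng.normal(size=t.size))

# Power spectrum of the demeaned time course.
power = np.abs(np.fft.rfft(accuracy - accuracy.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

alpha = (freqs >= 8) & (freqs <= 12)
print("peak frequency (Hz):", freqs[np.argmax(power)])           # ~10 Hz
print("alpha share of power:", power[alpha].sum() / power.sum())
```

In real data one would compare the alpha power of the decoding time course against a surrogate (e.g. trial-shuffled) distribution rather than eyeballing a peak.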
Hierarchical interactions between sensory cortices defy predictive coding Perceptual experience depends on recurrent interactions between lower and higher cortices. One theory, predictive coding, posits that feedback from hi…
23.10.2025 14:01 — 👍 41    🔁 10    💬 2    📌 2

Long time in the making: our preprint of a survey study on the diversity in how people seem to experience #mentalimagery. It suggests #aphantasia should be redefined as the absence of depictive thought, not merely "not seeing". Some more take-home messages:
#psychskysci #neuroscience

doi.org/10.1101/2025...

02.10.2025 18:10 — 👍 112    🔁 35    💬 11    📌 2

Really enjoyed my weekend read on 𝐚𝐜𝐭𝐢𝐯𝐞 𝐟𝐢𝐥𝐭𝐞𝐫𝐢𝐧𝐠: local recurrence amplifies natural input patterns and suppresses stray activity. This review beautifully argues that sensory cortex itself is a site of memory and prediction. Food for thought on hallucinations!

#neuroskyence #neuroscience

27.09.2025 15:12 — 👍 46    🔁 14    💬 0    📌 1

✨ Meet our speakers! ✨

Among our speakers this year at #SNS2025 we have Marlene Cohen (@marlenecohen.bsky.social), from University of Chicago

Read the abstract here 💬 👇
meg.medizin.uni-tuebingen.de/sns_2025/abs...

#neuroskyence

29.09.2025 11:16 — 👍 3    🔁 3    💬 0    📌 0

✨ Meet our speakers! ✨

Among our speakers this year at #SNS2025 we have Floris de Lange (@predictivebrain.bsky.social)

Read the abstract here 💬 👇
meg.medizin.uni-tuebingen.de/sns_2025/abs...

#neuroskyence

26.09.2025 08:24 — 👍 13    🔁 6    💬 0    📌 0

✨ Meet our speakers! ✨

Among our speakers this year at #SNS2025 we have Tim Kietzmann (@timkietzmann.bsky.social)

Read the abstract here 💬 👇
meg.medizin.uni-tuebingen.de/sns_2025/abs...

#neuroskyence #compneurosky #NeuroAI

25.09.2025 16:12 — 👍 12    🔁 4    💬 0    📌 0
An array of 9 purple discs on a blue background. Figure from Hinnerk Schulz-Hildebrandt.

A nice shift in perceived colour between central and peripheral vision. The fixated disc looks purple while the others look blue.

The effect presumably comes from the absence of S-cones in the fovea.

From Hinnerk Schulz-Hildebrandt:
arxiv.org/pdf/2509.115...

24.09.2025 10:16 — 👍 704    🔁 263    💬 30    📌 42
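The demo is easy to recreate: nine discs of a single purple on a uniform blue background, fixating the centre disc. A rough matplotlib sketch (the hex colours below are guesses; the effect depends on the exact chromaticities, so see the linked figure for the calibrated version):

```python
import matplotlib.pyplot as plt

# Nine identical purple discs on a blue background. Fixate the centre disc:
# it should look purple while the surrounding discs tend to look blue.
fig, ax = plt.subplots(figsize=(6, 6))
ax.set_facecolor("#2050c8")                # blue background (approximate)
for row in range(3):
    for col in range(3):
        ax.add_patch(plt.Circle((col, -row), 0.3, color="#8050c8"))  # purple
ax.set_xlim(-0.7, 2.7)
ax.set_ylim(-2.7, 0.7)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```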

✨ Meet our speakers! ✨

Among our speakers this year at #SNS2025 we also have Sylvia Schröder (@sylviaschroeder.bsky.social), from University of Sussex

Read the abstract here 💬 👇
meg.medizin.uni-tuebingen.de/sns_2025/abs...

#neuroskyscience

22.09.2025 12:19 — 👍 3    🔁 3    💬 0    📌 0
a woman in front of a white board with the words take your time written on it

📢 Deadline extended! 📢

The registration deadline for #SNS2025 has been extended to Sunday, September 28th!

Register here 👉 meg.medizin.uni-tuebingen.de/sns_2025/reg...

PS: Students of the GTC (Graduate Training Center for Neuroscience) in Tübingen can earn 1 CP for presenting a poster! 👀

17.09.2025 13:11 — 👍 6    🔁 6    💬 0    📌 0

✨ Meet our speakers! ✨

Among our speakers this year at #SNS2025 we have Simone Ebert (@simoneebert.bsky.social) & Jan Lause (@janlause.bsky.social), from Hertie AI Institute, University of Tübingen

Read the abstract here 💬 👇
meg.medizin.uni-tuebingen.de/sns_2025/abs...

#neuroskyscience

16.09.2025 18:06 — 👍 10    🔁 5    💬 0    📌 0

✨ Meet our speakers! ✨

Next speaker to present is Arthur Lefevre, from University of Lyon

Read the abstract here 💬 👇
meg.medizin.uni-tuebingen.de/sns_2025/abs...

#neuroscience #neuroskyscience

15.09.2025 10:20 — 👍 5    🔁 3    💬 0    📌 0

✨ Meet our speakers! ✨

Next speaker to present is Mara Wolter, PhD student at the University of Tübingen in Yulia Oganian's lab

Read the abstract here 💬 👇
meg.medizin.uni-tuebingen.de/sns_2025/abs...

12.09.2025 09:39 — 👍 5    🔁 3    💬 0    📌 0

🔵 Tübingen SNS2025 🔵

Over the next few days, we'll be introducing you to the brilliant scientists who will be delivering their talks at #SNS2025!

✨ Get ready to meet our speakers! ✨

We are starting with Liina Pylkkänen, from New York University

💬 meg.medizin.uni-tuebingen.de/sns_2025/abs...

11.09.2025 08:30 — 👍 9    🔁 6    💬 1    📌 0

Great work from a great team. Congrats, guys! 🎉

09.09.2025 10:42 — 👍 5    🔁 0    💬 0    📌 0
A brain-wide map of neural activity during complex behaviour - Nature The International Brain Laboratory presents a brain-wide electrophysiological map obtained from pooling data from 12 laboratories that performed the same standardized perceptual decision-making task i...

The two key studies of the International Brain Laboratory @intlbrainlab.bsky.social are out today!

A brain-wide map of neural activity during complex behaviour
www.nature.com/articles/s41...

Brain-wide representations of prior information in mouse decision-making
www.nature.com/articles/s41...

03.09.2025 15:46 — 👍 144    🔁 49    💬 1    📌 1
From bench to bot: Why AI-powered writing may not deliver on its promise Efficiency isn't everything. The cognitive work of struggling with prose may be a crucial part of what drives scientific progress.

"Competent prose generated by a machine, I've come to realize, might not be what science actually needs." ?

By Tim Requarth

#neuroskyence

www.thetransmitter.org/from-bench-t...

02.09.2025 15:42 — 👍 12    🔁 7    💬 0    📌 8
Noradrenaline drives learning across scales of time and neurobiological organisation The noradrenergic system plays a diverse role in learning, from optimising learning behaviour to modulating plasticity. Work bridging across micro- and macroscale levels is revealing how noradrenaline...

Excellent new review by @claireocallaghan.bsky.social on how noradrenaline drives learning across multiple scales of neurobiological organization - from cells to networks www.cell.com/trends/cogni...

29.08.2025 05:05 — 👍 68    🔁 15    💬 2    📌 0

Not long to go now! For those of you who enjoy a more intimate conference with a chance to get to know your favourite speakers, I would highly recommend this right here. Reach out if you have any questions :)

29.08.2025 08:07 — 👍 3    🔁 1    💬 0    📌 0

Can't wait to see this fantastic line-up 🤩

09.07.2025 09:48 — 👍 3    🔁 0    💬 0    📌 0

Can humans use artificial limbs for body augmentation as flexibly as their own hands?
🚨 Our new interdisciplinary study put this question to the test with the Third Thumb (@daniclode.bsky.social), a robotic extra digit you control with your toes!
www.biorxiv.org/content/10.1...
🧵 1/10

07.07.2025 15:46 — 👍 18    🔁 8    💬 1    📌 2
Overview of the simulation strategy and analysis. a) Pial and white matter boundary surfaces are extracted from anatomical MRI volumes. b) Intermediate equidistant surfaces are generated between the pial and white matter surfaces (labeled as superficial (S) and deep (D), respectively). c) Surfaces are downsampled together, maintaining vertex correspondence across layers. Dipole orientations are constrained using vectors linking corresponding vertices (link vectors). d) The thickness of cortical laminae varies across the cortical depth (70–72), which is evenly sampled by the equidistant source surface layers. e) Each colored line represents the model evidence (relative to the worst model, ΔF) over source layer models, for a signal simulated at a particular layer (the simulated layer is indicated by the line color). The source layer model with the maximal ΔF is indicated by "˄". f) Result matrix summarizing ΔF across simulated source locations, with peak relative model evidence marked with "˄". g) Error is calculated from the result matrix as the absolute distance in mm or layers from the simulated source (*) to the peak ΔF (˄). h) Bias is calculated as the relative position of the peak ΔF (˄) to the simulated source (*) in layers or mm.

🚨🚨🚨 PREPRINT ALERT 🚨🚨🚨
Neural dynamics across cortical layers are key to brain computations - but non-invasively, we've been limited to rough "deep vs. superficial" distinctions. What if we told you that it is possible to achieve full (TRUE!) laminar (I, II, III, IV, V, VI) precision with MEG?

02.06.2025 11:54 — 👍 112    🔁 45    💬 4    📌 8
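The error and bias measures from panels g and h of the figure above boil down to simple arithmetic on the ΔF result matrix: for each simulated layer, find the layer model with maximal ΔF, then take the absolute and signed distance to the true layer. A toy illustration (the matrix is simulated here, not taken from the preprint):

```python
import numpy as np

n_layers = 11
rng = np.random.default_rng(2)

# Toy result matrix: rows = simulated source layer, columns = source layer model.
# ΔF (model evidence relative to the worst model) peaks near the true layer.
sim, mod = np.meshgrid(np.arange(n_layers), np.arange(n_layers), indexing="ij")
delta_f = -((mod - sim) ** 2) + rng.normal(scale=0.5, size=(n_layers, n_layers))
delta_f -= delta_f.min()               # express relative to the worst model

peak = delta_f.argmax(axis=1)          # layer model with maximal ΔF per row
true = np.arange(n_layers)

error = np.abs(peak - true)            # panel g: absolute distance, in layers
bias = peak - true                     # panel h: signed distance, in layers
print("mean error:", error.mean(), "| mean bias:", bias.mean())
```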

It's gotta be a Zelda playlist for me - those games trained me to problem-solve to that music 😆

28.05.2025 12:19 — 👍 1    🔁 0    💬 1    📌 0
Two examples of how contextual information can bias visual perception. Top: Luminance illusion created by shadows (source: https://persci.mit.edu/gallery/checkershadow). Square B looks brighter than square A but has the same luminance, i.e., they have identical grayscale values in the picture. Bottom: Perception of object motion is biased by self-motion. The combination of leftward self-motion and up-left object motion in the world produces retinal motion that is up-right. If the animal partially subtracts the optic flow vector (orange dashed arrow) generated by self-motion (yellow arrow) from the image motion on the retina (black arrow), they may have a biased perception of object motion (red arrow) that lies between retinal and world coordinates (green arrow).

Rewarding animals to accurately report their subjective #percept is challenging. This study formalizes this problem and overcomes it with a #Bayesian method for estimating an animal's subjective percept in real time during the experiment @plosbiology.org 🧪 plos.io/3HaxiuB

27.05.2025 18:07 — 👍 12    🔁 2    💬 0    📌 0
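The self-motion example in the figure caption above is plain vector arithmetic: the perceived object motion is the retinal motion minus some fraction k of the optic flow induced by self-motion, where k = 0 leaves the percept in retinal coordinates and k = 1 recovers world coordinates. A minimal sketch (the vectors and the value of k are illustrative):

```python
import numpy as np

# 2D motion vectors (x, y), arbitrary units.
world_motion = np.array([-1.0, 1.0])   # object moves up-left in the world
self_motion = np.array([-2.0, 0.0])    # observer translates leftward
optic_flow = -self_motion              # leftward self-motion -> rightward flow

# Retinal motion = world motion plus the self-motion-induced flow (up-right).
retinal_motion = world_motion + optic_flow

# Partial subtraction of the flow gives a percept between the two frames.
k = 0.5                                # 0 = retinal, 1 = world coordinates
perceived = retinal_motion - k * optic_flow
print("retinal:", retinal_motion)      # [1. 1.]
print("world:  ", world_motion)        # [-1. 1.]
print("percept:", perceived)           # [0. 1.] -- in between
```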

🚨 New WP! 📄 "Publish or Procreate: The Effect of Motherhood on Research Performance" (w/ @valentinatartari.bsky.social)
👩‍🔬👨‍🔬 We investigate how parenthood affects scientific productivity and impact — and find that the impact is far from equal for mothers and fathers.

22.05.2025 08:03 — 👍 203    🔁 99    💬 2    📌 7

Press release on our new paper from @hih-tuebingen.bsky.social 🧠🥳
Link: www.nature.com/articles/s42...
Thread: bsky.app/profile/migh...
#neuroskyence #compneurosky #magnetoencephalography

26.05.2025 11:58 — 👍 17    🔁 3    💬 0    📌 0
The members of the Cluster of Excellence "Machine Learning: New Perspectives for Science" raise their glasses and celebrate securing another funding period.

We're super happy: Our Cluster of Excellence will continue to receive funding from the German Research Foundation @dfg.de! Here's to 7 more years of exciting research at the intersection of #machinelearning and science! Find out more: uni-tuebingen.de/en/research/... #ExcellenceStrategy

22.05.2025 16:23 — 👍 74    🔁 20    💬 4    📌 5
Communication of perceptual predictions from the hippocampus to the deep layers of the parahippocampal cortex High-resolution neuroimaging reveals stimulus-specific predictions sent from hippocampus to the neocortex during perception.

Our study using layer fMRI to study the direction of communication between the hippocampus and cortex during perceptual predictions is finally out in Science Advances! Predicted-but-omitted shapes are represented in CA2/3 and correlate specifically with deep layers of PHC, suggesting feedback. 🧠🟦

22.05.2025 01:55 — 👍 166    🔁 53    💬 4    📌 1
Abstract choice representations during stable choice-response associations - Communications Biology Human magnetoencephalography reveals neural representations of perceptual choices that are abstracted from motor-responses even during stable choice-response associations. This suggests a general role...

Human MEG reveals neural representations of perceptual choices abstracted from motor-responses even during stable choice-response associations. @mightyrosequinn.bsky.social @siegellab.bsky.social @fsandhaeger.bsky.social @nimanoury.bsky.social @ezezelic.bsky.social. www.nature.com/articles/s42...

16.05.2025 19:39 — 👍 6    🔁 3    💬 0    📌 0
