@mightyrosequinn.bsky.social
Neuroscientist in Tübingen & mother of dragons. Interested in visual perception, decision-making & expectations.
Can't wait to see this fantastic line-up 🤩
09.07.2025 09:48
Can humans use artificial limbs for body augmentation as flexibly as their own hands?
🚨 Our new interdisciplinary study put this question to the test with the Third Thumb (@daniclode.bsky.social), a robotic extra digit you control with your toes!
www.biorxiv.org/content/10.1...
🧵 1/10
Overview of the simulation strategy and analysis. a) Pial and white matter boundary surfaces are extracted from anatomical MRI volumes. b) Intermediate equidistant surfaces are generated between the pial and white matter surfaces (labeled as superficial (S) and deep (D) respectively). c) Surfaces are downsampled together, maintaining vertex correspondence across layers. Dipole orientations are constrained using vectors linking corresponding vertices (link vectors). d) The thickness of cortical laminae varies across the cortical depth (70–72), which is evenly sampled by the equidistant source surface layers. e) Each colored line represents the model evidence (relative to the worst model, ΔF) over source layer models, for a signal simulated at a particular layer (the simulated layer is indicated by the line color). The source layer model with the maximal ΔF is indicated by '^'. f) Result matrix summarizing ΔF across simulated source locations, with peak relative model evidence marked with '^'. g) Error is calculated from the result matrix as the absolute distance in mm or layers from the simulated source (*) to the peak ΔF (^). h) Bias is calculated as the relative position of a peak ΔF (^) to a simulated source (*) in layers or mm.
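As a very rough illustration of panels b, c, g and h (this is not the authors' code; the layer count, the 0.25 mm spacing, the bias sign convention, and the random ΔF matrix are made-up assumptions), the layer construction and the error/bias metrics could be sketched in Python like this:

```python
import numpy as np

def equidistant_layers(white_verts, pial_verts, n_layers=11):
    """Interpolate n_layers equidistant surfaces between corresponding
    white-matter (deep) and pial (superficial) vertices.
    Both inputs are (n_vertices, 3) arrays with vertex correspondence."""
    fractions = np.linspace(0.0, 1.0, n_layers)   # 0 = deep, 1 = superficial
    layers = np.stack([white_verts + f * (pial_verts - white_verts) for f in fractions])
    # Link vectors join corresponding vertices across layers and can be used
    # to constrain dipole orientations (panel c).
    link_vectors = pial_verts - white_verts
    link_vectors /= np.linalg.norm(link_vectors, axis=1, keepdims=True)
    return layers, link_vectors

def error_and_bias(delta_f, layer_spacing_mm=0.25):
    """delta_f: (n_simulated_layers, n_model_layers) matrix of relative model
    evidence. Error = |peak - simulated|; bias = peak - simulated
    (sign convention assumed: positive = peak more superficial than the source)."""
    simulated = np.arange(delta_f.shape[0])
    peaks = np.argmax(delta_f, axis=1)    # layer with maximal delta F per simulated source
    error = np.abs(peaks - simulated)
    bias = peaks - simulated
    return error * layer_spacing_mm, bias * layer_spacing_mm

# Toy usage with random data, just to show the shapes involved.
rng = np.random.default_rng(0)
white = rng.normal(size=(500, 3))
pial = white + rng.normal(loc=2.0, scale=0.1, size=(500, 3))
layers, links = equidistant_layers(white, pial)
error_mm, bias_mm = error_and_bias(rng.normal(size=(11, 11)))
```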
🚨🚨🚨 PREPRINT ALERT 🚨🚨🚨
Neural dynamics across cortical layers are key to brain computations - but non-invasively, we've been limited to rough "deep vs. superficial" distinctions. What if we told you that it is possible to achieve full (TRUE!) laminar (I, II, III, IV, V, VI) precision with MEG!
It's gotta be a Zelda playlist for me - those games trained me to problem-solve to that music
28.05.2025 12:19
Two examples of how contextual information can bias visual perception. Top: Luminance illusion created by shadows (source: https://persci.mit.edu/gallery/checkershadow). Square B looks brighter than square A but has the same luminance, i.e., they have identical grayscale values in the picture. Bottom: Perception of object motion is biased by self-motion. The combination of leftward self-motion and up-left object motion in the world produces retinal motion that is up-right. If the animal partially subtracts the optic flow vector (orange dashed arrow) generated by self-motion (yellow arrow) from the image motion on the retina (black arrow), they may have a biased perception of object motion (red arrow) that lies between retinal and world coordinates (green arrow).
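A minimal NumPy sketch of the partial-subtraction idea in the bottom panel (the vector values and the gain k are made up purely for illustration):

```python
import numpy as np

# 2D motion vectors (x = rightward, y = upward); magnitudes are made up.
flow = np.array([2.0, 0.0])             # optic flow induced by leftward self-motion
world_motion = np.array([-1.0, 1.0])    # true object motion in the world: up-left
retinal_motion = world_motion + flow    # image motion on the retina: up-right

# Partially subtracting the self-motion flow (0 < k < 1) yields a percept that
# lies between retinal coordinates (k = 0) and world coordinates (k = 1).
k = 0.5
perceived_motion = retinal_motion - k * flow
print(perceived_motion)                 # [0. 1.] -> biased percept, between retinal and world motion
```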
Rewarding animals to accurately report their subjective #percept is challenging. This study formalizes this problem and overcomes it with a #Bayesian method for estimating an animal's subjective percept in real time during the experiment @plosbiology.org 🧪 plos.io/3HaxiuB
27.05.2025 18:07
🚨 New WP! "Publish or Procreate: The Effect of Motherhood on Research Performance" (w/ @valentinatartari.bsky.social
👩‍🔬👨‍🔬 We investigate how parenthood affects scientific productivity and impact, and find that the effect is far from equal for mothers and fathers.
Press release on our new paper from @hih-tuebingen.bsky.social 🧠🥳
Link: www.nature.com/articles/s42...
Thread: bsky.app/profile/migh...
#neuroskyence #compneurosky #magnetoencephalography
The members of the Cluster of Excellence "Machine Learning: New Perspectives for Science" raise their glasses and celebrate securing another funding period.
We're super happy: Our Cluster of Excellence will continue to receive funding from the German Research Foundation @dfg.de! Here's to 7 more years of exciting research at the intersection of #machinelearning and science! Find out more: uni-tuebingen.de/en/research/... #ExcellenceStrategy
22.05.2025 16:23
Our study using layer fMRI to examine the direction of communication between the hippocampus and cortex during perceptual predictions is finally out in Science Advances! Predicted-but-omitted shapes are represented in CA2/3 and correlate specifically with deep layers of PHC, suggesting feedback. 🧠
22.05.2025 01:55
Human MEG reveals neural representations of perceptual choices abstracted from motor-responses even during stable choice-response associations. @mightyrosequinn.bsky.social @siegellab.bsky.social @fsandhaeger.bsky.social @nimanoury.bsky.social @ezezelic.bsky.social. www.nature.com/articles/s42...
16.05.2025 19:39
This suggests that, far from being the exception, abstract choices could rather be the rule. 7/7
19.05.2025 07:45
Figure showing distinct cortical distributions of neural information for choice and motor-response as a function of time.
Furthermore, choice and motor-response showed distinct cortical distributions, with choice information over fronto-parietal regions. 6/7
19.05.2025 07:45
Figure showing the time-course of abstract choice information.
We found neural representations of the perceptual choice, independent of those for motor-response and stimulus. 5/7
19.05.2025 07:45
To test this, we used a motion discrimination task in which choice-motor mappings were held stable over longer time periods, in conjunction with MEG measurements and an encoding framework that allowed us to disentangle representations of choice and motor-response. 4/7
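Not the actual analysis from the paper, but a minimal sketch of the general encoding-model logic described here: putting choice and motor-response regressors into one model so each weight reflects variance the other does not explain (all variable names and numbers are made up):

```python
import numpy as np

# Toy data: trials x sensors MEG amplitudes plus per-trial labels (all made up).
rng = np.random.default_rng(1)
n_trials, n_sensors = 400, 64
meg = rng.normal(size=(n_trials, n_sensors))
choice = rng.integers(0, 2, n_trials)   # perceptual choice (e.g. up vs. down motion)
motor = rng.integers(0, 2, n_trials)    # hand/button used to report the choice
# (In this toy the labels are independent purely so the regression is well-conditioned.)

# Regress every sensor on choice and motor-response simultaneously: with both
# regressors in the same design matrix, the choice weights capture variance
# not already explained by the motor-response, and vice versa.
design = np.column_stack([np.ones(n_trials), choice, motor])   # intercept, choice, motor
weights, *_ = np.linalg.lstsq(design, meg, rcond=None)         # shape (3, n_sensors)
choice_weights, motor_weights = weights[1], weights[2]
```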
19.05.2025 07:45
But one limitation of these tasks has been that the mapping between perceptual choice and motor-response has varied rapidly. This could lead to a more flexible representation, in the form of an abstract decision variable, that might not be present in more stable contexts. 3/7
19.05.2025 07:45
Perceptual decisions are often entangled with the motor-response used to make them, but we also make decisions that are not immediately linked to an action plan. Evidence from the lab has shown neural representations for these "abstract choices", even when the motor-response is known in advance. 2/7
19.05.2025 07:45
Fresh off the press - our latest publication on abstract choices with @siegellab.bsky.social @fsandhaeger.bsky.social @nimanoury.bsky.social @ezezelic.bsky.social at @commsbio.nature.com
Link: www.nature.com/articles/s42...
🧵 below (1/7)
I'm so sorry you've had to go through this. I wish you all the best for your recovery and look forward to a conference catch-up down the line. Best wishes to you and your family.
15.05.2025 08:23
#Swarm science just got a shake-up! For decades, #locusts were thought to move like particles, but new research indicates they actually use sensory & cognitive mechanisms, not simple alignment @icouzin.bsky.social @mpi-animalbehav.bsky.social @sercansayin.bsky.social
01.03.2025 07:38
Robust encoding of stimulus–response mapping by neurons in visual cortex
doi.org/10.1073/pnas...
#neuroscience
🙋🏻‍♀️
01.03.2025 11:50
Intro time!
We are the Siegel Lab, located at @unituebingen.bsky.social and @hih-tuebingen.bsky.social.
Our central goal is to investigate how cognition emerges from dynamic interactions across widely distributed neuronal ensembles, mainly combining human MEG and animal electrophysiology.