Last week to apply for a 3-year postdoc with @tsawallis.bsky.social, Frank Jäkel, and myself. Deadline is March 15th: hmc-lab.com/TAMPostdoc.h...
Do you think this connects to your findings from self-touch?
Our computational model explains their findings by assuming that the lack of vision increases the uncertainty about sensory feedback. Therefore, state estimation in the movement task should rely more on the predictions of the forward model, i.e. suppression should be stronger.
Interesting work, thanks! How do you think this connects to the work of Colino et al. (2017) measuring suppression during reach-to-grasp movements? link.springer.com/article/10.1...
The same receptors targeted by vibration are relevant for state estimation, see e.g. the references cited on page 4 of our preprint. Our computational model prescribes how much this sensory feedback should be weighted against the predictions of a forward model in the ongoing movement task.
How suppression related to movement-dependent sensory predictions interacts with suppression related to predicted (self-)touch remains, in our view, an open question. Check out this paper from Katja and Dimitris for some ideas: www.nature.com/articles/s41...
In contrast, the studies you cite measured sensitivity on a static finger that was expected to be touched by the other hand.
However, in our experiment, tactile sensitivity was measured on the moving forearm, where our experimental manipulations should not alter predictions of tactile input arising from expected contact with the screen.
(2) We think this differs from the type of tactile suppression described in the studies you mention. Those concern prediction of actual touch, more precisely self-touch. We think that prediction of touch is a major driver of tactile suppression, as shown in self-touch and haptic exploration.
We therefore interpret tactile suppression as a proxy for how much somatosensory feedback is weighted during online sensorimotor control.
Computationally, this means that the two signal sources are weighted differently depending on their relative uncertainty over time, which leads to different amounts of "suppression" during movement.
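For anyone curious what "weighted by relative uncertainty" means concretely, here is a minimal scalar sketch in the style of a Kalman update. This is a toy illustration with made-up numbers, not the actual model from the preprint:

```python
def fuse(prediction, pred_var, feedback, feedback_var):
    """Combine a forward-model prediction with sensory feedback,
    weighting each by the inverse of its variance (scalar Kalman update)."""
    gain = pred_var / (pred_var + feedback_var)  # weight placed on feedback
    estimate = prediction + gain * (feedback - prediction)
    est_var = (1 - gain) * pred_var
    return estimate, est_var, gain

# When feedback is more uncertain (e.g. no vision), its weight drops,
# i.e. the sensory signal is "suppressed" more in the estimate:
_, _, g_reliable = fuse(0.0, 1.0, 1.0, feedback_var=0.5)
_, _, g_uncertain = fuse(0.0, 1.0, 1.0, feedback_var=4.0)
assert g_uncertain < g_reliable
```

The same logic runs in reverse: the more uncertain the forward-model prediction, the more weight falls on the feedback.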
The external stimulus itself is not predicted per se. The nervous system combines movement-related sensory input with internal predictions to best estimate its state over time.
(1) We are examining externally generated touch during movement. A substantial body of neurophysiological and behavioral work shows that cutaneous afferents carry kinesthetic information relevant for body state estimation (details in the paper), e.g., physoc.onlinelibrary.wiley.com/doi/full/10....
Hi Konstantina,
thank you for being so interested in our work! 🙂
Thank you 🙏
For those wondering: What does cutaneous touch (like our vibrotactile stimuli) have to do with body state estimation?
Neurophysiological and behavioral work has shown that cutaneous afferents carry kinesthetic information and contribute functionally to movement detection and execution.
Huge thanks to my collaborators Dimitris Voudouris, @dominikstrb.bsky.social, Katja Fiehler, @c-rothkopf.bsky.social and the excellence cluster "The Adaptive Mind" for providing such a stimulating research environment. 🙏
Together, these results provide a quantitative, normative account of tactile suppression during movement and point to a general principle for how tactile input may be regulated during action.
As internal uncertainty about hand position increases, suppression decreases, consistent with greater reliance on somatosensory input. This makes a purely fixed gating account unlikely, as suppression changes systematically when we explicitly manipulate uncertainty.
Why is touch perceived as weaker during movement?
In our new preprint 📝, we examine tactile suppression during reaching.
Using optimal control theory, we show that tactile suppression reflects dynamic, uncertainty-dependent integration of forward model predictions and sensory feedback.
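A rough intuition for what "dynamic, uncertainty-dependent integration" means here, as a toy time course (assumed parameters, not the model from the preprint): if the forward model's prediction uncertainty accumulates during the movement, the weight placed on tactile feedback changes over time.

```python
# Toy time course: prediction uncertainty grows as the movement unfolds
# (process noise accumulates each step), so the weight on sensory feedback,
# and hence the amount of "suppression", changes dynamically.
PROCESS_NOISE = 0.05   # variance added per timestep (assumed)
FEEDBACK_VAR = 0.4     # variance of tactile feedback (assumed)

pred_var = 0.01        # forward model starts out very certain
gains = []
for t in range(10):
    gain = pred_var / (pred_var + FEEDBACK_VAR)        # weight on feedback
    gains.append(gain)
    pred_var = (1 - gain) * pred_var + PROCESS_NOISE   # posterior + process noise

# Early on, the forward model dominates (low gain -> strong suppression);
# as its uncertainty accumulates, feedback is weighted more.
assert gains[0] < gains[-1]
```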
Looking forward to meeting you at #ECVP2025 in Mainz this week, including collaborative work with @tobnie.bsky.social @dominikstrb.bsky.social @ookenfooken.bsky.social @fatatai.bsky.social @tsawallis.bsky.social @mamassian.bsky.social @guidomaiello.bsky.social @mariaeckstein.bsky.social and many others
Happy to announce that I am presenting a poster today at #CogSci25: Physical reasoning during motor learning aids people at transferring mass, but not motor control mappings.
This is joint work with Dominik Ürüm, @mariaeckstein.bsky.social and @c-rothkopf.bsky.social
Find out more at P3-T-192!
Our latest work on understanding the behavior of bounded agents in more naturalistic tasks was accepted at #ICLR2025: Inverse decision-making using neural amortized Bayesian actors, with @dominikstrb.bsky.social @tobnie.bsky.social and @jan-peters.bsky.social, based on @tobnie.bsky.social's MSc thesis
Finally, we conducted a systematic model comparison to show that our data cannot be explained by simpler models, implying that participants really accounted for how friction impacts the puck's sliding behavior.
Find this and more in our pre-print:
www.biorxiv.org/content/10.1...
We fitted the data with a generative model of sliding based on statistical decision theory, and participants even seem to account for their personal variability: subjects with higher variability undershoot more.
Statistical decision theory predicts that if people maximize their task outcome while accounting for how friction impacts the puck, they should undershoot the target more the higher the penalty and the larger the distance. This is also what we find in the data:
Participants controlled the puck's initial velocity, aiming to hit the green area for a positive score while avoiding a penalty for hitting the red area. This process is subject to signal-dependent noise, which is further transformed by the physics of friction, determining the final puck position.
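For the curious, here is a minimal Monte Carlo sketch of this setup with toy parameters (`MU_G`, `NOISE_SCALE`, the target geometry, and the grid search are all assumptions for illustration, not the fitted model): signal-dependent motor noise on the initial velocity is pushed through the friction dynamics, and a decision-theoretic actor picks the intended velocity that maximizes expected score.

```python
import numpy as np

# Toy parameters (assumed for illustration, not the fitted values from the paper)
MU_G = 0.5          # friction coefficient times gravitational acceleration
NOISE_SCALE = 0.08  # signal-dependent motor noise: sd grows with intended velocity

def slide_distance(v):
    """Deterministic sliding distance under constant friction: d = v^2 / (2 * mu * g)."""
    return v ** 2 / (2 * MU_G)

def expected_score(v_intended, target=3.0, green_halfwidth=0.3, penalty=5.0, n=20_000):
    """Monte Carlo expected score: +1 for landing in the green zone,
    -penalty for overshooting into the red zone beyond it."""
    rng = np.random.default_rng(0)
    v = v_intended + rng.normal(0.0, NOISE_SCALE * v_intended, n)
    d = slide_distance(np.clip(v, 0.0, None))
    hit = np.abs(d - target) <= green_halfwidth
    over = d > target + green_halfwidth
    return hit.mean() - penalty * over.mean()

def best_velocity(penalty):
    """Intended velocity that maximizes expected score, found on a coarse grid."""
    vs = np.linspace(1.0, 2.5, 200)
    scores = [expected_score(v, penalty=penalty) for v in vs]
    return vs[int(np.argmax(scores))]

v_lenient = best_velocity(penalty=0.0)
v_strict = best_velocity(penalty=10.0)
# A steeper overshoot penalty pushes the optimal intended velocity down,
# i.e. the decision-theoretic actor undershoots the target more.
```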
To study whether people can do this, we built this fun setup, where people get to slide a real standard hockey puck at targets while viewing the scene through virtual reality. This offers participants a natural way of interacting with objects while still allowing experimental control.
1. Sensorimotor control, subject to inherent uncertainty and variability;
2. Intuitive physical reasoning as object manipulations are subject to physical dynamics;
3. Economic decision-making when actions have outcomes with monetary consequences.