
Antonin Fourcade

@toninfrc.bsky.social

PhD student at Max Planck School of Cognition and MPI CBS/MPDCC (Berlin). Background in Biomedical Engineering and Neuroscience. Interested in brain-heart interactions, emotions and VR

509 Followers  |  105 Following  |  19 Posts  |  Joined: 10.10.2023

Posts by Antonin Fourcade (@toninfrc.bsky.social)

From Body to Brain and Back: Multimodal Evidence for Interoceptive Alterations in Schizophrenia Spectrum Disorders When the brain and body misalign, emotional experience and sense of reality can be disrupted. Although such atypical experiences are central to schizophrenia spectrum disorders (SSD), interoception, p...

How the brain listens to the body matters.
Our new preprint investigates interoceptive processing in schizophrenia spectrum disorders across phenomenology, behavior, and heartbeat-evoked brain responses. 🧠🫀 DOI: doi.org/10.64898/202...

20.01.2026 09:40 — 👍 15    🔁 8    💬 0    📌 1

Trying to build an experiment in Unity and slowly losing your patience?
Spoiler: that's completely normal!

Meet EDIA - a modular framework for building studies in Unity.
🧩 Reusable modules
📊 Data sync
🕶️ Multi-headset support

And yes, a “Find Waldo” demo is included - because science should be fun!

07.11.2025 16:37 — 👍 6    🔁 3    💬 0    📌 0

Symposium 1.1, here we go!

To read more about AffectTracker, check out our latest publication: doi.org/10.3389/frvi...

@toninfrc.bsky.social @therealspr.bsky.social

16.10.2025 12:58 — 👍 10    🔁 5    💬 0    📌 0

Happy also to chat about our Brain-Body Analysis Special Interest Group (BBSIG) pipelines for preprocessing and analysing ECG, PPG and respiration (soon), openly available and ready to use with BIDS data as Jupyter Notebooks 🫀🫁

Work of 20+ wonderful collaborators! ✨

📑 Documentation: www.bbsig.de

16.10.2025 15:17 — 👍 16    🔁 6    💬 1    📌 0
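The BBSIG notebooks themselves are the reference for these pipelines; as a hedged illustration of the kind of step an ECG preprocessing pipeline performs, here is a minimal, self-contained R-peak detector run on a synthetic signal (function name and thresholds are my own, not BBSIG's):

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_frac=0.6, refractory_s=0.3):
    """Naive R-peak detector: local maxima above a fraction of the
    signal's maximum, separated by a physiological refractory period."""
    threshold = threshold_frac * np.max(ecg)
    min_dist = int(refractory_s * fs)
    peaks, last = [], -min_dist
    for i in range(1, len(ecg) - 1):
        if ecg[i] >= threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if i - last >= min_dist:
                peaks.append(i)
                last = i
    return np.array(peaks)

# Synthetic "ECG": flat baseline with one sharp spike per second at 250 Hz
fs = 250
sig = np.zeros(10 * fs)
sig[fs // 2 :: fs] = 1.0            # R-like spikes at 0.5 s, 1.5 s, ...
peaks = detect_r_peaks(sig, fs)
rr = np.diff(peaks) / fs            # RR intervals in seconds
```

Real pipelines add filtering, artifact handling, and validation against annotated beats; this only shows the core thresholding idea.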
Villringer et al. Figure 1. Conceptual framework for brain–body states

Villringer et al. Figure 2. Brain–body micro-, meso-, and macro-states can be distinguished on the basis of their duration and reversibility

'Brain–body states as a link between cardiovascular and mental health'

by Arno Villringer, Vadim Nikulin & Michael Gaebler @mbe-lab.bsky.social @michaelgaebler.com @mpicbs.bsky.social

www.cell.com/trends/neuro...

23.09.2025 20:13 — 👍 36    🔁 13    💬 1    📌 2

Check out our new article for young readers (ages 8-15) on heart-brain interactions and interoception! 🧠🫀

I had so much fun co-writing this with @agatapatyczek.bsky.social @el-rei.bsky.social, with the support of @michaelgaebler.com ✍️

👉 Share it widely with curious young minds
Yay for #scicomm ✨

12.09.2025 08:26 — 👍 32    🔁 11    💬 1    📌 1
GitHub - afourcade/AffectTracker: real-time continuous rating of affective experience in immersive virtual reality

Our studies confirmed AffectTracker is reliable, with high user experience and low interference. It opens new avenues for linking subjective experience to physiological dynamics. The tool is open-source and available on GitHub!
#OpenScience

23.09.2025 10:20 — 👍 13    🔁 2    💬 1    📌 1

AffectTracker allows users to continuously rate their valence and arousal during VR experiences. It features customizable feedback options, including a simplified affect grid and a novel abstract shape ("Flubber"), designed to be intuitive and minimally interfering.

23.09.2025 10:20 — 👍 8    🔁 3    💬 1    📌 0

👥 An amazing team effort by:
@fra-malandrone.bsky.social
@lucyroe.bsky.social
A. Ciston
@thefirstfloor.bsky.social
A. Villringer
S. Carletto
@michaelgaebler.com

#neuroskyence #vr #emotion #affect #selfreports

23.09.2025 10:20 — 👍 12    🔁 3    💬 1    📌 2
Frontiers | AffectTracker: real-time continuous rating of affective experience in immersive virtual reality Subjective experience is key to understanding affective states, characterized by valence and arousal. Traditional experiments using post-stimulus summary rat...

📢 Our peer-reviewed article about AffectTracker is finally out! 😲🕹️📈
Traditional methods for rating emotion often miss the dynamic, moment-to-moment nature of feelings. We designed a tool to capture this continuous affective experience in real-time during dynamic emotional stimulation.

23.09.2025 10:20 — 👍 43    🔁 14    💬 2    📌 2

📣 We're at the #MindBrainBody Symposium in Berlin, starting today! Looking forward to connecting with everyone and sharing our latest research 🧠

Our group has an exciting lineup of posters - come chat with us! 💬 Check out the previews below to see where and when to meet us 📌

#MBBS24 #neuroskyence

10.03.2025 08:09 — 👍 18    🔁 6    💬 1    📌 0
Tools and Software

We centralized our open-science contributions in a new "Tools & Software" section on our website; check out

- open stimuli (e.g. 3D objects)
- open data (e.g. MindBrainBody)
- tools (e.g. excite-o-meter, AffectTracker)
- analysis scripts
- & more

www.cbs.mpg.de/departments/...

#researchtransparency

30.01.2025 06:25 — 👍 37    🔁 16    💬 1    📌 2

The 1-min videos in study 1 are monoscopic, chosen as intermediate stimuli between static images and long videos to extend the classical short event-related stimulus approach. Finding suitable free videos was also challenging. Study 2's 23-min video is stereoscopic, a further step in stimulus type

17.12.2024 09:00 — 👍 2    🔁 0    💬 0    📌 0

3๏ธโƒฃTool offers a novel way to study affective dynamics with minimal interference, effectively capturing the nuances of subjective experiences. It opens new research opportunities to link affective states with physiological dynamics
๐ŸŒŸStay tuned for the full paper & we welcome feedback & discussions! ๐Ÿ’ญ

16.12.2024 13:08 — 👍 1    🔁 0    💬 1    📌 0

2๏ธโƒฃEmpirically evaluated in 2 studies at 2 sites (Berlin & Torino; N = 134) with both shorter 1-min 360ยฐ videos (low affective variability [AV] ใ€ฐ๏ธ) and longer more dynamic 23-min stimulus (high AV ๐Ÿ“ˆ)
Both Grid & Flubber โžก๏ธ high user experience ๐Ÿ˜ƒ & low interference with the affective experience itself

16.12.2024 13:08 — 👍 1    🔁 0    💬 1    📌 0
Video thumbnail

1๏ธโƒฃParticipants can rate in real-time and continuously, using the touchpad or joystick of a VR controller ๐ŸŽฎ(here: HTC Vive Pro). It comprises three customizable feedback options: a simplified affect grid (Grid), an abstract pulsating variant (Flubber), and no visual feedback (Proprioceptive)

16.12.2024 13:08 — 👍 1    🔁 0    💬 1    📌 0
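The rating mechanic described above - a 2D touchpad position mapped onto valence and arousal - can be sketched in a few lines. AffectTracker itself is a Unity prefab; this Python analogue with illustrative names only shows the mapping idea:

```python
from dataclasses import dataclass

@dataclass
class AffectRating:
    valence: float   # -1 (negative) .. +1 (positive)
    arousal: float   # -1 (calm)     .. +1 (excited)

def touchpad_to_affect(x: float, y: float) -> AffectRating:
    """Map a touchpad/joystick position (x, y nominally in [-1, 1])
    onto the two affect dimensions, clamping out-of-range input."""
    def clamp(v: float) -> float:
        return max(-1.0, min(1.0, float(v)))
    return AffectRating(valence=clamp(x), arousal=clamp(y))

# Half-right while pulled fully down: mildly positive valence, minimal arousal
sample = touchpad_to_affect(0.5, -2.0)
```

Sampling this mapping every frame yields the continuous rating trace that can later be aligned with physiological recordings.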
GitHub - afourcade/AffectTracker

👥 Together with F. Malandrone @lucyroe.bsky.social A. Ciston @thefirstfloor.bsky.social A. Villringer S. Carletto @michaelgaebler.com
🛠️ Unity prefab: github.com/afourcade/Af...

16.12.2024 13:08 — 👍 4    🔁 1    💬 1    📌 0

🚀 Preprint out! doi.org/10.31234/osf...
We developed, empirically evaluated and openly share **AffectTracker**, a new tool to collect continuous ratings of two-dimensional (valence and arousal) affective experience **during** dynamic emotional stimulation (e.g., 360° videos) in immersive VR! 🥽🧠🟦

16.12.2024 13:08 — 👍 27    🔁 13    💬 1    📌 1

I thought it could be nice to connect the community of researchers exploring body-brain interactions on bsky, so here is the Body-Brain Interactions Starter Pack! 🫀🫁👀🧠 #neuroskyence #academicsky

Let me know if you would like to be added or know someone to add. Enjoy and share!

go.bsky.app/Fwqeu32

24.09.2024 20:30 — 👍 119    🔁 64    💬 44    📌 6

Title: Real-time continuous rating of affective experience in immersive Virtual Reality

P.361 (Session 1)
@toninfrc.bsky.social

In collaboration with Torino University, we developed a fun and intuitive new tool to record moment-by-moment feelings!

28.05.2024 10:27 — 👍 10    🔁 3    💬 1    📌 0

We are coming to Psychologie und Gehirn 2024 (PuG) in Hamburg! Come chat with us! See some teasers in the comments 💬 #PuG2024

28.05.2024 10:23 — 👍 12    🔁 6    💬 1    📌 3

Thank you!

12.01.2024 14:45 — 👍 0    🔁 0    💬 0    📌 0

The picture was made using the AI image generator DALL-E3

12.01.2024 14:44 — 👍 0    🔁 0    💬 2    📌 0

We contribute to shedding light on the complex relationship between emotions & the nervous systems (or MindBrainBody coupling) under naturalistic stimulation. 🌟 Stay tuned for the full paper & we're very happy about feedback and discussions! 💭
(Illustration: DALL-E3)

11.01.2024 09:19 — 👍 3    🔁 0    💬 1    📌 0

However, whole-brain exploratory analyses revealed a temporo-occipital cluster, where higher EA was linked to decreased 🧠➡️🫀 brain-to-heart (gamma→HF-HRV) and increased 🫀➡️🧠 heart-to-brain (LF-HRV→gamma) information flow.

11.01.2024 09:18 — 👍 1    🔁 0    💬 1    📌 0

4๏ธโƒฃ Physiological modeling (using a method by @diegocandiar and others) did not provide evidence for our hypothesis that higher EA changes the bidirectional information flow between HF-HRV & posterior alpha power. ๐Ÿง ๐Ÿ”๐Ÿซ€

11.01.2024 09:18 — 👍 1    🔁 0    💬 1    📌 0

3๏ธโƒฃ Combining๐Ÿง EEG & ๐Ÿซ€ ECG, higher EA was also associated with lower heartbeat-evoked potential (HEP) amplitudes in a left fronto-central electrode cluster. This may indicate that stronger emotional states change the importance of signals from the outer world & the inner body.

11.01.2024 09:17 — 👍 1    🔁 0    💬 1    📌 0
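An HEP analysis time-locks EEG segments to the ECG R-peaks and averages them. This is a minimal single-channel sketch of that averaging step on toy data, not the study's actual pipeline (which would add artifact rejection, baseline correction, and cluster statistics):

```python
import numpy as np

def heartbeat_evoked_average(eeg, r_peaks, fs, tmin=-0.1, tmax=0.5):
    """Average single-channel EEG segments time-locked to R-peaks.
    Peaks whose window would run off the recording are skipped."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = [eeg[p - pre: p + post] for p in r_peaks
              if p - pre >= 0 and p + post <= len(eeg)]
    return np.mean(epochs, axis=0)

# Toy check: a fixed deflection 100 ms after each heartbeat survives averaging
fs = 100
eeg = np.zeros(1000)
r_peaks = [200, 400, 600, 800]
for p in r_peaks:
    eeg[p + 10] = 1.0                 # "cardiac-locked" response
hep = heartbeat_evoked_average(eeg, r_peaks, fs)
# hep spans -0.1 s to +0.5 s (60 samples); the deflection sits at index 20
```

Averaging cancels activity that is not phase-locked to the heartbeat, which is what isolates the HEP from ongoing EEG.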

2๏ธโƒฃ Higher EA was linked to ๐Ÿซ€ lower vagal cardioregulation (high-frequency heart rate variability, HF-HRV) and๐Ÿง lower posterior (parieto-occipital) alpha power.

11.01.2024 09:17 — 👍 1    🔁 0    💬 1    📌 0
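HF-HRV quantifies power of the RR-interval series in the 0.15-0.40 Hz band, where respiratory (vagal) modulation of heart rate lives. A crude, self-contained sketch of that computation (the study would use a validated toolbox; names and parameters here are illustrative):

```python
import numpy as np

def hf_hrv_power(rr_s, fs_resample=4.0, band=(0.15, 0.40)):
    """Crude HF-HRV estimate: resample the RR-interval series onto an
    even time grid, then sum FFT power inside the HF band (Hz)."""
    rr_s = np.asarray(rr_s, dtype=float)
    t = np.cumsum(rr_s)                          # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs_resample)
    rr_even = np.interp(grid, t, rr_s)           # evenly sampled RR series
    rr_even = rr_even - rr_even.mean()           # remove the DC offset
    freqs = np.fft.rfftfreq(len(rr_even), d=1.0 / fs_resample)
    psd = np.abs(np.fft.rfft(rr_even)) ** 2 / len(rr_even)
    return psd[(freqs >= band[0]) & (freqs < band[1])].sum()

# An RR series modulated at ~0.25 Hz (respiratory-like) carries HF power;
# a perfectly steady heartbeat carries essentially none.
k = np.arange(300)
rr_mod = 1.0 + 0.05 * np.sin(2 * np.pi * 0.25 * k)   # ~1 beat per second
rr_flat = np.ones(300)
```

The resampling step matters because RR intervals are sampled at irregular beat times, so a direct FFT of the raw series would misplace the frequencies.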

1๏ธโƒฃ 29 healthy adults experienced virtual๐ŸŽขwhile we recorded the electrical activity in their๐Ÿง (#EEG) and ๐Ÿซ€ (#ECG). They then continuously rated the intensity of their emotional experience (emotional arousal, EA) while viewing a replay.

11.01.2024 09:15 — 👍 2    🔁 0    💬 1    📌 0

🚀 Happy our preprint is out! 🧠🫀🥽 Using immersive VR, we investigated how our emotional experience relates to the interaction between brain & heart activities
tinyurl.com/yxn3vk8e
with @flxklotz.bsky.social @smnhfmnn.bsky.social @langestroop.bsky.social V.Nikulin A.Villringer @mgblr.bsky.social 🧠🟦
1/7 ⬇️

11.01.2024 09:15 — 👍 26    🔁 11    💬 2    📌 2