
O'Connell Lab

@connelllab.bsky.social

Prof Redmond O'Connell's lab, Trinity College Institute of Neuroscience. Seeking to understand the neural mechanisms underpinning high-level cognition. https://oconnell-lab.com/home/opportunities/

329 Followers  |  43 Following  |  14 Posts  |  Joined: 30.11.2023

Latest posts by connelllab.bsky.social on Bluesky

@elainecorbett.bsky.social

01.07.2025 16:53 - 👍 0    🔁 0    💬 0    📌 0

@epares.bsky.social @spk3lly.bsky.social @danfeuerriegel.bsky.social

01.07.2025 15:55 - 👍 1    🔁 0    💬 1    📌 0

3/3 Here we show that their deconvolution approach eliminates these same signatures when applied to a ground-truth EA signal. We also recap the many other signatures of sensory EA that the CPP has been shown to exhibit.

01.07.2025 15:38 - 👍 0    🔁 0    💬 1    📌 0

2/3 Frömer et al (2024) used a signal deconvolution method to show that one signature of evidence accumulation (EA) observed in the centro-parietal positivity (CPP) - trial-averaged response-locked buildup effects - could arise artifactually from overlapping stimulus- and response-locked components.

01.07.2025 15:38 - 👍 0    🔁 0    💬 1    📌 0

1/3 Check out our new commentary bsky.app/profile/imag....

01.07.2025 15:38 - 👍 10    🔁 5    💬 1    📌 1
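To make the argument in the thread above concrete, here is a minimal simulation sketch in Python/NumPy. It is not the commentary's or Frömer et al's actual analysis code, and every parameter (sampling rate, trial count, RT range, noise level, kernel window, ramp shape) is an illustrative assumption. The sketch builds a continuous signal that contains a ground-truth evidence-accumulation ramp on every trial, jointly fits stimulus-locked and response-locked FIR (deconvolution) kernels by least squares, and keeps the ordinary response-locked trial average alongside the deconvolved response-locked kernel for comparison.

```python
# Minimal sketch (illustrative, not the authors' pipeline): simulate a
# ground-truth ramping "evidence accumulation" signal and apply FIR-style
# deconvolution with separate stimulus- and response-locked regressors.
import numpy as np

rng = np.random.default_rng(0)
fs = 100                                        # sampling rate (Hz), assumed
n_trials = 100
iti = 3 * fs                                    # 3 s between stimulus onsets
rts = rng.uniform(0.4, 1.2, n_trials)           # reaction times in seconds
stim_onsets = np.arange(n_trials) * iti + fs    # stimulus onset samples
resp_onsets = stim_onsets + np.round(rts * fs).astype(int)

# Continuous "EEG": noise plus a ramp rising from each stimulus onset to a
# fixed bound reached exactly at the response (accumulation by construction).
n_samples = resp_onsets[-1] + 2 * fs
eeg = rng.normal(0.0, 0.5, n_samples)
for s, r in zip(stim_onsets, resp_onsets):
    eeg[s:r] += np.linspace(0.0, 1.0, r - s)

# FIR design matrix: one indicator column per lag of each event-locked kernel.
lags = np.arange(-int(0.2 * fs), int(1.0 * fs))  # -200 ms to +990 ms

def fir_columns(onsets):
    X = np.zeros((n_samples, len(lags)))
    for j, lag in enumerate(lags):
        idx = onsets + lag
        idx = idx[(idx >= 0) & (idx < n_samples)]
        X[idx, j] = 1.0
    return X

X = np.hstack([fir_columns(stim_onsets), fir_columns(resp_onsets)])
beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)
resp_kernel = beta[len(lags):]                   # deconvolved response-locked kernel

# Ordinary response-locked trial average, for comparison with resp_kernel.
resp_epochs = np.stack([eeg[r + lags[0]: r + lags[-1] + 1] for r in resp_onsets])
resp_average = resp_epochs.mean(axis=0)
```

Plotting resp_average against resp_kernel on data like these is the kind of check the thread describes: if the deconvolution attenuates the pre-response buildup even when accumulation is present by construction, then a flat deconvolved response-locked component cannot by itself be taken as evidence that the CPP's buildup is artifactual.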

@johnpgrogan1.bsky.social @lucvermeylen.bsky.social @kobedesender.bsky.social @tcdpsychology.bsky.social

12.06.2025 10:59 - 👍 1    🔁 0    💬 0    📌 0

Check out our new paper, in which we identify a model that can jointly account for the timing and accuracy of perceptual choices, the timing and level of subsequent confidence judgments, and the pre- and post-choice dynamics of neural decision signals.

12.06.2025 10:58 - 👍 4    🔁 1    💬 1    📌 0

Exciting news - early bird registration is now open for #RLDM2025!

🔗 Register now: forms.gle/QZS1GkZhYGRF...

Register now to save €100 on your ticket. Early bird prices are only available until 1st April.

11.02.2025 14:56 - 👍 16    🔁 15    💬 2    📌 2
Call to action requesting poster abstracts be submitted to the RLDM 2025 conference by the January 15th deadline. Poster is stylised in the pink and blue RLDM colours, and features the RLDM brain/robot mascot standing on a "submit now" sticker.

📢 Call for Abstracts 📢

Submit your extended abstracts on "learning and decision-making over time to achieve a goal" to #RLDM2025. Successful applications will be selected for poster or oral presentation to an interdisciplinary audience.

🗓️ Deadline: Jan 15th
🔗 Learn more: rldm.org/call-for-abs...

10.12.2024 11:54 - 👍 36    🔁 12    💬 0    📌 2
Regressing Away Common Neural Choice Signals does not make them Artifacts. Comment on Frömer et al (2024, Nature Human Behaviour) The recent paper by Frömer et al (2024, Nature Human Behaviour) examines a component of the event-related potential (ERP) known as the centro-parietal positivity (CPP) that has been widely implicated ...

"Regressing common choice signals away does not make them artifacts"
Please check out our commentary on the Frömer et al paper (2024, Nature Human Behaviour) in which we highlight important problems with the analysis and interpretation of their perceptual choice data.
www.biorxiv.org/content/10.1...

27.09.2024 14:38 - 👍 7    🔁 3    💬 0    📌 0

Our results have interesting implications for modelling two-choice and continuous-outcome dot motion tasks, and open the door to future research to further our understanding of ODMR.
5/5

07.03.2024 14:47 - 👍 0    🔁 0    💬 0    📌 0

We found supporting evidence that temporal filters are involved in ODMR, with the higher frame rate display appearing to induce more ODMR than the lower frame rate display. Interestingly, confidence data distinguished ODMR from both error and correct responses.
4/5

07.03.2024 14:47 - 👍 0    🔁 0    💬 1    📌 0

Inspired by Bae and Luck's 2022 Visual Cognition paper, we decided to investigate if changing the display frame rate would impact ODMR rates - hypothesising that temporal filtering may be involved. We also gathered confidence data to compare ODMR to correct and error responses.
3/5

07.03.2024 14:47 - 👍 0    🔁 0    💬 1    📌 0

During piloting for a related study, Pat was frustrated to find himself consistently responding in the direction opposite to the true dot motion. Diving into the literature revealed he was not the only one responding this way.
2/5

07.03.2024 14:46 - 👍 0    🔁 0    💬 1    📌 0

New paper from our very own @patmckeown.bsky.social in Visual Cognition, investigating opposite direction motion reports (ODMR) in RDKs. We looked at the role of display frame rate and confidence in these peculiar reports.
1/5 w/ Elaine Corbett and @redmondoconnell.bsky.social

07.03.2024 14:45 - 👍 0    🔁 1    💬 1    📌 1

We're hiring for a 2-year postdoctoral researcher position! Come join our ERC-funded project developing neurally-informed models of perceptual decision making and metacognition at Trinity College Dublin.

oconnell-lab.com/home/opportu...

30.11.2023 09:34 - 👍 7    🔁 9    💬 0    📌 0

We're hiring! The O'Connell Lab is offering a 2-year postdoctoral position to join our team and work on our ERC-funded project developing neurally-informed models of perceptual decision making and metacognition. Full info here: oconnell-lab.com/home/opportu... Feel free to DM us with any questions.

28.11.2023 21:19 - 👍 4    🔁 2    💬 0    📌 1
