
Avery HW Ryoo

@averyryoo.bsky.social

i like generative models, science, and Toronto sports teams | phd @ mila/udem, prev. @ uwaterloo | averyryoo.github.io πŸ‡¨πŸ‡¦πŸ‡°πŸ‡·

1,066 Followers  |  298 Following  |  49 Posts  |  Joined: 28.09.2023

Latest posts by averyryoo.bsky.social on Bluesky

The curriculum effect in visual learning: the role of readout dimensionality Generalization of visual perceptual learning (VPL) to unseen conditions varies across tasks. Previous work suggests that training curriculum may be integral to generalization, yet a theoretical explan...

🚨 New preprint alert!

πŸ§ πŸ€–
We propose a theory of how learning curriculum affects generalization through neural population dimensionality. Learning curriculum is a determining factor of neural dimensionality - where you start from determines where you end up.
πŸ§ πŸ“ˆ

A 🧡:

tinyurl.com/yr8tawj3

30.09.2025 14:25 β€” πŸ‘ 76    πŸ” 24    πŸ’¬ 1    πŸ“Œ 2

Excited to share that POSSM has been accepted to #NeurIPS2025! See you in San Diego πŸ–οΈ

20.09.2025 15:40 β€” πŸ‘ 10    πŸ” 3    πŸ’¬ 1    πŸ“Œ 0
Neural Interfaces Neural Interfaces is a comprehensive book on the foundations, major breakthroughs, and most promising future developments of neural interfaces. The bo

I'm very excited to announce the publication of our new book Neural Interfaces, published by Elsevier. The book is a comprehensive resource for anyone interested in neural interfaces and brain-computer interfaces (BCIs).

shop.elsevier.com/books/neural...

19.08.2025 20:18 β€” πŸ‘ 6    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0

🐐

12.07.2025 20:23 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

Step 1: Understand how scaling improves LLMs.
Step 2: Directly target underlying mechanism.
Step 3: Improve LLMs independent of scale. Profit.

In our ACL 2025 paper we look at Step 1 in terms of training dynamics.

Project: mirandrom.github.io/zsl
Paper: arxiv.org/pdf/2506.05447

12.07.2025 18:44 β€” πŸ‘ 4    πŸ” 1    πŸ’¬ 2    πŸ“Œ 0

(1/n)🚨Train a model that solves DFT for any geometry with almost no training data
Introducing Self-Refining Training for Amortized DFT: a variational method that predicts ground-state solutions across geometries and generates its own training data!
πŸ“œ arxiv.org/abs/2506.01225
πŸ’» github.com/majhas/self-...

10.06.2025 19:49 β€” πŸ‘ 12    πŸ” 4    πŸ’¬ 1    πŸ“Œ 1
Manitokan are images set up where one can bring a gift or receive a gift. 1930s Rocky Boy Reservation, Montana, Montana State University photograph. Colourized with AI


Preprint Alert πŸš€

Multi-agent reinforcement learning (MARL) often assumes that agents know when other agents cooperate with them. But for humans, this isn’t always the case. For example, Plains Indigenous groups used to leave resources for others to use at effigies called Manitokan.
1/8

05.06.2025 15:32 β€” πŸ‘ 35    πŸ” 13    πŸ’¬ 1    πŸ“Œ 3
Generalizable, real-time neural decoding with hybrid state-space models Real-time decoding of neural activity is central to neuroscience and neurotechnology applications, from closed-loop experiments to brain-computer interfaces, where models are subject to strict latency...

Stay tuned for the project page and code, coming soon!

Link: arxiv.org/abs/2506.05320

A big thank you to my co-authors: @nandahkrishna.bsky.social*, @ximengmao.bsky.social*, @mehdiazabou.bsky.social, Eva Dyer, @mattperich.bsky.social, and @glajoie.bsky.social!

🧡7/7

06.06.2025 17:40 β€” πŸ‘ 6    πŸ” 1    πŸ’¬ 0    πŸ“Œ 0

Finally, we show POSSM's performance on speech decoding - a long-context task that can quickly grow expensive for Transformers. In the unidirectional setting, POSSM beats the GRU baseline, achieving a phoneme error rate (PER) of 27.3 while being more robust to variation in preprocessing.

🧡6/7

06.06.2025 17:40 β€” πŸ‘ 3    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

Cross-species transfer! πŸ΅βž‘οΈπŸ§‘

Excitingly, we find that POSSM pretrained solely on monkey reaching data achieves SOTA performance when decoding imagined handwriting in human subjects! This shows the potential of leveraging non-human primate (NHP) data to bootstrap human BCI decoding in low-data clinical settings.

🧡5/7

06.06.2025 17:40 β€” πŸ‘ 4    πŸ” 0    πŸ’¬ 2    πŸ“Œ 0

By pretraining on 140 monkey reaching sessions, POSSM effectively transfers to new subjects and tasks, matching or outperforming several baselines (e.g., GRU, POYO, Mamba) across sessions.

βœ… High RΒ² across the board
βœ… 9Γ— faster inference than Transformers
βœ… <5ms latency per prediction

🧡4/7

06.06.2025 17:40 β€” πŸ‘ 3    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

POSSM combines the real-time inference of an RNN with the tokenization, pretraining, and finetuning abilities of a Transformer!

Using POYO-style tokenization, we encode spikes in 50ms windows and stream them to a recurrent model (e.g., Mamba, GRU) for fast, frequent predictions over time.
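The windowed tokenize-then-recur pipeline described above can be sketched in a few lines. This is a minimal illustration, not the POSSM implementation: the dimensions are invented, the parameters are random instead of learned, and a simple mean-pool over per-unit spike embeddings stands in for the POYO-style attention encoder. It only shows the data flow: spikes grouped into 50ms windows, each window summarized into one vector, and a GRU updated once per window so a causal prediction is available every 50ms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper).
n_units, d_embed, d_hidden = 96, 32, 64
window_ms, session_ms = 50, 500

# Parameters would be learned in practice; random here for illustration.
unit_emb = rng.normal(size=(n_units, d_embed))            # per-neuron embedding
W_z = rng.normal(size=(d_hidden, d_embed + d_hidden)) * 0.1
W_r = rng.normal(size=(d_hidden, d_embed + d_hidden)) * 0.1
W_h = rng.normal(size=(d_hidden, d_embed + d_hidden)) * 0.1
W_out = rng.normal(size=(2, d_hidden)) * 0.1              # e.g. 2D cursor velocity

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode_window(spike_units):
    """Stand-in for the attention-based encoder: embed each spike by its
    unit identity and mean-pool the window into one summary vector."""
    if len(spike_units) == 0:
        return np.zeros(d_embed)
    return unit_emb[spike_units].mean(axis=0)

def gru_step(h, x):
    """One standard GRU update, run once per 50 ms window."""
    hx = np.concatenate([x, h])
    z = sigmoid(W_z @ hx)                                  # update gate
    r = sigmoid(W_r @ hx)                                  # reset gate
    h_tilde = np.tanh(W_h @ np.concatenate([x, r * h]))
    return (1 - z) * h + z * h_tilde

# Simulated spikes: (time_ms, unit_id) pairs over one session.
spikes = sorted(zip(rng.uniform(0, session_ms, 400).tolist(),
                    rng.integers(0, n_units, 400).tolist()))

h = np.zeros(d_hidden)
preds = []
for w0 in range(0, session_ms, window_ms):
    units = [u for t, u in spikes if w0 <= t < w0 + window_ms]
    h = gru_step(h, encode_window(units))
    preds.append(W_out @ h)                                # causal output per window

print(len(preds))  # one prediction per 50 ms window -> 10
```

Because the recurrent state carries all past context, each new window costs a constant amount of compute, which is what makes the streaming, low-latency setting tractable.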

🧡3/7

06.06.2025 17:40 β€” πŸ‘ 3    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

The problem with existing decoders?

πŸ˜” RNNs offer efficient, causal inference, but rely on rigid, binned input formats - limiting generalization to new neurons or sessions.

πŸ˜” Transformers enable generalization via tokenization, but have high computational costs due to the attention mechanism.

🧡2/7

06.06.2025 17:40 β€” πŸ‘ 4    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

New preprint! πŸ§ πŸ€–

How do we build neural decoders that are:
⚑️ fast enough for real-time use
🎯 accurate across diverse tasks
🌍 generalizable to new sessions, subjects, and even species?

We present POSSM, a hybrid SSM architecture that optimizes for all three of these axes!

🧡1/7

06.06.2025 17:40 β€” πŸ‘ 52    πŸ” 23    πŸ’¬ 2    πŸ“Œ 8

I am joining @ualberta.bsky.social and @amiithinks.bsky.social as a faculty member!

My research group is recruiting MSc and PhD students at the University of Alberta in Canada. Research topics include generative modeling, representation learning, interpretability, inverse problems, and neuroAI.

29.05.2025 18:53 β€” πŸ‘ 11    πŸ” 2    πŸ’¬ 1    πŸ“Œ 0
POYO+: Multi-session, multi-task neural decoding from distinct cell-types and brain regions

Scaling models across multiple animals was a major step toward building neuro-foundation models; the next frontier is enabling multi-task decoding to expand the scope of training data we can leverage.

Excited to share our #ICLR2025 Spotlight paper introducing POYO+ 🧠

poyo-plus.github.io

🧡

25.04.2025 22:14 β€” πŸ‘ 44    πŸ” 10    πŸ’¬ 1    πŸ“Œ 1

Interested in foundation models for #neuroscience? Want to contribute to the development of the next-generation of multi-modal models? Come join us at IVADO in Montreal!

We're hiring a full-time machine learning specialist for this work.

Please share widely!

#NeuroAI πŸ§ πŸ“ˆ πŸ§ͺ

11.04.2025 16:17 β€” πŸ‘ 57    πŸ” 31    πŸ’¬ 1    πŸ“Œ 1

πŸ“½οΈRecordings from our
@cosynemeeting.bsky.social
#COSYNE2025 workshop on β€œAgent-Based Models in Neuroscience: Complex Planning, Embodiment, and Beyond" are now online: neuro-agent-models.github.io
πŸ§ πŸ€–

07.04.2025 20:57 β€” πŸ‘ 36    πŸ” 11    πŸ’¬ 1    πŸ“Œ 0

Talk recordings from our COSYNE Workshop on Neuro-foundation Models 🌐🧠 are now up on the workshop website!

neurofm-workshop.github.io

05.04.2025 00:41 β€” πŸ‘ 34    πŸ” 10    πŸ’¬ 1    πŸ“Œ 1

Very late, but had a πŸ”₯ time at my first Cosyne presenting my work with @nandahkrishna.bsky.social, Ximeng Mao, @mattperich.bsky.social, and @glajoie.bsky.social on real-time neural decoding with hybrid SSMs. Keep an eye out for a preprint (hopefully) soon πŸ‘€

#Cosyne2025 @cosynemeeting.bsky.social

04.04.2025 05:21 β€” πŸ‘ 31    πŸ” 6    πŸ’¬ 2    πŸ“Œ 0

Excited to be at #Cosyne2025 for the first time! I'll be presenting my poster [2-104] during the Friday session. E-poster here: www.world-wide.org/cosyne-25/se...

27.03.2025 19:53 β€” πŸ‘ 8    πŸ” 3    πŸ’¬ 0    πŸ“Œ 0

We'll be presenting two projects at #Cosyne2025, representing two main research directions in our lab:

πŸ§ πŸ€– πŸ§ πŸ“ˆ

1/3

27.03.2025 19:13 β€” πŸ‘ 46    πŸ” 9    πŸ’¬ 1    πŸ“Œ 1

@oliviercodol.bsky.social my opportunity to lose to scientists in a different field

27.03.2025 02:06 β€” πŸ‘ 2    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

Just a couple days until Cosyne - stop by [3-083] this Saturday and say hi! @nandahkrishna.bsky.social

24.03.2025 18:19 β€” πŸ‘ 6    πŸ” 3    πŸ’¬ 0    πŸ“Œ 0

This will be a more difficult Cosyne than normal, due to both the travel restrictions for people coming from the US and the strike that may be happening at the hotel in Montreal.

But, we can still make this an awesome meeting as usual, y'all. Let's pull together and make it happen!

πŸ§ πŸ“ˆ
#Cosyne2025

23.03.2025 21:26 β€” πŸ‘ 44    πŸ” 5    πŸ’¬ 2    πŸ“Œ 0

Hi! Currently there are no plans to livestream, but we may *potentially* post recordings in the future (contingent on speaker permission)

11.03.2025 01:54 β€” πŸ‘ 2    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

Join us at #COSYNE2025 to explore recent advancements in large-scale training and analysis of brain data! 🧠🟦

We also made a starter pack with (most of) our speakers: go.bsky.app/Ss6RaEF

10.03.2025 21:21 β€” πŸ‘ 17    πŸ” 4    πŸ’¬ 0    πŸ“Œ 0
COSYNE 2025 Workshop - Building a foundation model for the brain Join us to explore neuro-foundation models. March 31-April 1, 2025 in Mont Tremblant, Canada.

We have a great lineup of speakers and panelists, you can check out our schedule here: neurofm-workshop.github.io. Co-organized with: @mehdiazabou.bsky.social, @nandahkrishna.bsky.social, @colehurwitz.bsky.social, Eva Dyer, and @tyrellturing.bsky.social. We hope to see you there!

10.03.2025 19:55 β€” πŸ‘ 5    πŸ” 0    πŸ’¬ 1    πŸ“Œ 2

How can large-scale models + datasets revolutionize neuroscience πŸ§ πŸ€–πŸŒ? We are excited to announce our workshop: β€œBuilding a foundation model for the brain: datasets, theory, and models” at @cosynemeeting.bsky.social #COSYNE2025. Join us in Mont-Tremblant, Canada from March 31 – April 1!

10.03.2025 19:55 β€” πŸ‘ 44    πŸ” 17    πŸ’¬ 2    πŸ“Œ 9

forms.gle/1DPPVe8KLRWD...

here's a google form for ease!

01.02.2025 22:48 β€” πŸ‘ 0    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0
