
Cengiz Pehlevan

@cpehlevan.bsky.social

theory of neural networks for natural and artificial intelligence https://pehlevan.seas.harvard.edu/

1,102 Followers  |  346 Following  |  9 Posts  |  Joined: 21.09.2023

Posts by Cengiz Pehlevan (@cpehlevan.bsky.social)


NEW: #Kempner researchers develop a mean-field theory of task-trained RNNs that bridges random and learned connectivity, and find that macaque motor cortex is best captured by an intermediate, task-specific recurrent structure.
Read the blog post 👇
🔗 bit.ly/47f3Ldl

04.03.2026 16:46 — 👍 23    🔁 9    💬 2    📌 0

I am totally pumped about this new work. "Task-trained RNNs" are a powerful and influential framework in neuroscience, but one that has lacked a firm theoretical footing. This work provides one, and makes direct contact with the classical theory of random RNNs:
www.biorxiv.org/content/10.6...

04.03.2026 17:12 — 👍 84    🔁 31    💬 2    📌 3

Looking forward to presenting on "How behavior shapes recurrent circuits across sensory systems and species: from vision to touch" at the University of Chicago Neuroscience and ML workshop on Wednesday! Details below 👇🧵

23.02.2026 23:36 — 👍 15    🔁 4    💬 1    📌 1

I’m deeply thankful to my supervisor, Mark Brandon (@markbrandonlab.bsky.social), for his patience, guidance, and constant support throughout this project, and to our collaborators in the Cengiz Pehlevan (@cpehlevan.bsky.social) lab at Harvard for their thoughtful and generous contributions.

16.01.2026 18:48 — 👍 1    🔁 1    💬 0    📌 0

All theory is wrong until verified by data. Greatly indebted to @mhyaghoubi.bsky.social, @markbrandonlab.bsky.social and @douglasresearch.bsky.social for finding that the hippocampus encodes reward prediction! Grateful to my advisor @cpehlevan.bsky.social and @kempnerinstitute.bsky.social.
#RL #hippocampus

19.01.2026 09:32 — 👍 31    🔁 9    💬 0    📌 1

Delighted to have contributed to this work. Huge kudos to everyone involved.

15.01.2026 00:20 — 👍 14    🔁 2    💬 1    📌 0

I’m very happy to share the latest from my lab, published in @Nature

Hippocampal neurons that initially encode reward shift their tuning over the course of days to precede or predict reward.

Full text here:
rdcu.be/eY5nh

14.01.2026 21:32 — 👍 104    🔁 32    💬 2    📌 2

Very excited about this new work from the omnipotent Owen, with me and Ashok Litwin-Kumar! Can we reconcile low- and high-dimensional activity in neural circuits by recognizing that these circuits ~multitask~?
(Plausibly, yes 😊)

15.12.2025 20:17 — 👍 31    🔁 4    💬 0    📌 0
Preview
Schmidt Sciences Awards Early Career Fellowships to Michael Albergo, Melanie Weber - Kempner Institute Two Kempner Institute community members have received AI2050 Fellowships from Schmidt Sciences, a nonprofit organization aimed at accelerating scientific knowledge and breakthroughs. The AI2050 Progra...

Congratulations to #KempnerInstitute community members @msalbergo.bsky.social and @mweber.bsky.social — recipients of @schmidtsciences.bsky.social AI2050 Fellowships! 🎉
Discover their innovative research shaping the future of AI 👉 bit.ly/47Do4R3
#AI

06.11.2025 20:10 — 👍 13    🔁 4    💬 0    📌 0

First paper from the lab!
We propose a model that separates the estimation of odor concentration from the estimation of odor presence, and map it onto olfactory bulb circuits.
Led by @chenjiang01.bsky.social and @mattyizhenghe.bsky.social; joint work with @jzv.bsky.social, @neurovenki.bsky.social and @cpehlevan.bsky.social

04.11.2025 15:40 — 👍 36    🔁 13    💬 2    📌 1
Preview
Connectivity Structure and Dynamics of Nonlinear Recurrent Neural Networks The structure of brain connectivity predicts collective neural activity, with a small number of connectivity features determining activity dimensionality, linking circuit architecture to network-level...

Now in PRX: Theory linking connectivity structure to collective activity in nonlinear RNNs!
For neuro fans: conn. structure can be invisible in single neurons but shape pop. activity
For low-rank RNN fans: a theory of rank=O(N)
For physics fans: fluctuations around the DMFT saddle ⇒ dimension of activity
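For a flavor of the classical setting this theory builds on (a toy numpy sketch, not the paper's model or code; all parameter values are made up): an RNN with i.i.d. Gaussian couplings of gain g > 1 produces chaotic rate dynamics, and one common summary of the dimension of the resulting population activity is the participation ratio of the covariance spectrum.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g, dt = 400, 1.5, 0.05          # neurons, coupling gain, Euler step

# Classical random RNN: dx/dt = -x + J tanh(x), with J_ij ~ N(0, g^2/N).
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.normal(size=N)

samples = []
for step in range(4000):
    x = x + dt * (-x + J @ np.tanh(x))
    if step >= 1000:               # discard the initial transient
        samples.append(x.copy())
X = np.asarray(samples)            # shape (time, neurons)

# Participation ratio of the activity covariance spectrum:
# PR = (sum_i lambda_i)^2 / (sum_i lambda_i^2), which lies between 1 and N.
lam = np.linalg.eigvalsh(np.cov(X.T))
pr = lam.sum() ** 2 / (lam ** 2).sum()
print(f"participation ratio ~ {pr:.1f} of N = {N}")
```

As the post summarizes, the point of the theory is that structure added to J can reshape this population-level dimension even when it is invisible in single-neuron statistics.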

03.11.2025 21:47 — 👍 60    🔁 16    💬 2    📌 2
SciPost Phys. Lect. Notes 105 (2025) - Simplified derivations for high-dimensional convex learning problems

scipost.org/SciPostPhysL...

27.10.2025 20:07 — 👍 9    🔁 2    💬 0    📌 0

Applying to do a postdoc or PhD in theoretical ML or neuroscience this year? Consider joining my group (starting next Fall) at UT Austin!
POD Postdoc: oden.utexas.edu/programs-and...
CSEM PhD: oden.utexas.edu/academics/pr...

23.10.2025 21:36 — 👍 33    🔁 11    💬 1    📌 0

William Qian, Cengiz Pehlevan: Discovering alternative solutions beyond the simplicity bias in recurrent neural networks https://arxiv.org/abs/2509.21504 https://arxiv.org/pdf/2509.21504 https://arxiv.org/html/2509.21504

29.09.2025 06:50 — 👍 7    🔁 3    💬 0    📌 0
NeurIPS 2025 Workshop DBM Findings - OpenReview homepage

⏳ Less than 1 day left until the Brain & Mind Workshop submission deadline!
πŸ” Submit to our Finding or Tutorials track on OpenReview.
Findings track submission: openreview.net/group?id=Neu...
Tutorial track submission: openreview.net/group?id=Neu...
More info: data-brain-mind.github.io

07.09.2025 16:30 — 👍 3    🔁 1    💬 0    📌 0

Thank you so much for the kind shoutout! Grateful to be part of such a fantastic team.

07.09.2025 16:45 — 👍 1    🔁 0    💬 0    📌 0
Preview
Frontiers | Summary statistics of learning link changing neural representations to behavior How can we make sense of large-scale recordings of neural activity across learning? Theories of neural network learning with their origins in statistical phy...

Since I'm back on Bluesky: with @frostedblakess.bsky.social and @cpehlevan.bsky.social, we wrote a brief perspective on how ideas about summary statistics from the statistical physics of learning could help inform neural data analysis... (1/2)

04.09.2025 18:30 — 👍 34    🔁 10    💬 1    📌 0
Preview
Convergent motifs of early olfactory processing are recapitulated by layer-wise efficient coding The architecture of early olfactory processing is a striking example of convergent evolution. Typically, a panel of broadly tuned receptors is selectively expressed in sensory neurons (each neuron exp...

Excited to share new computational work, led by @jzv.bsky.social, driven by Juan Carlos Fernandez del Castillo + contribution from Farhad Pashakanloo. We recover 3 core motifs in the olfactory system of evolutionarily distant animals using a biophysically-grounded model + efficient coding ideas!

04.09.2025 16:51 — 👍 24    🔁 12    💬 0    📌 1
AIQ: Artificial Intelligence Quantified
YouTube video by DARPAtv

Great to have this video about my @darpa.mil Artificial Intelligence Quantified (AIQ) program out! Very exciting program with absolutely fantastic teams. Stay tuned for some jaw-dropping announcements!

www.youtube.com/watch?v=KVRF...

02.09.2025 21:50 — 👍 7    🔁 1    💬 0    📌 0

I am extremely grateful to be awarded the National University of Singapore (NUS) Development Grant, and to be a Young NUS Fellow! I look forward to collaborating with the Yong Loo Lin School of Medicine on exciting projects. This is my first grant, and hopefully many more to come! #NUS #NeuroAI

27.08.2025 14:31 — 👍 8    🔁 1    💬 1    📌 0
Preview
Simons Foundation Launches Collaboration on the Physics of Learning and Neural Computation - Simons Foundation

Our new Simons Collaboration on the Physics of Learning and Neural Computation will develop powerful tools from #physics, #math, computer science and theoretical #neuroscience to understand how large neural networks learn, compute, scale, reason and imagine: www.simonsfoundation.org/2025/08/18/s...

19.08.2025 14:43 — 👍 21    🔁 5    💬 0    📌 3
Preview
Kempner Research Fellowship - Kempner Institute The Kempner brings leading, early-stage postdoctoral scientists to Harvard to work on projects that advance the fundamental understanding of intelligence.

If you work on artificial or natural intelligence and are finishing your PhD, consider applying for a Kempner research fellowship at Harvard:
kempnerinstitute.harvard.edu/kempner-inst...

18.08.2025 17:27 — 👍 48    🔁 31    💬 0    📌 0
Preview
Simons Foundation Launches Collaboration on the Physics of Learning and Neural Computation - Simons Foundation

Congratulations to #KempnerInstitute associate faculty member @cpehlevan.bsky.social for joining the new @simonsfoundation.org Simons Collaboration on the Physics of Learning and Neural Computation!

www.simonsfoundation.org/2025/08/18/s...

#AI #neuroscience #NeuroAI #physics #ANNs

18.08.2025 18:57 — 👍 15    🔁 3    💬 0    📌 0

Very excited to lead this new @simonsfoundation.org collaboration on the physics of learning and neural computation to develop powerful tools from physics, math, CS, stats, neuro and more to elucidate the scientific principles underlying AI. See our website for more: www.physicsoflearning.org

18.08.2025 17:48 — 👍 92    🔁 14    💬 4    📌 1

🚨 Excited to announce our #NeurIPS2025 Workshop: Data on the Brain & Mind

📣 Call for: Findings (4- or 8-page) + Tutorials tracks

πŸŽ™οΈ Speakers include @dyamins.bsky.social @lauragwilliams.bsky.social @cpehlevan.bsky.social

🌐 Learn more: data-brain-mind.github.io

04.08.2025 15:28 — 👍 31    🔁 10    💬 0    📌 3
PNAS Proceedings of the National Academy of Sciences (PNAS), a peer reviewed journal of the National Academy of Sciences (NAS) - an authoritative source of high-impact, original research that broadly spans...

The post is based on a paper written with Yue M. Lu, @jzv.bsky.social, Anindita Maiti and @cpehlevan.bsky.social.

Check it out now at PNAS:

doi.org/10.1073/pnas...

(2/2)

28.07.2025 19:26 — 👍 2    🔁 1    💬 0    📌 0
Preview
Solvable Model of In-Context Learning Using Linear Attention - Kempner Institute Attention-based architectures are a powerful force in modern AI. In particular, the emergence of in-context learning enables these models to perform tasks far beyond the original next-token prediction...

New in the #DeeperLearningBlog: the #KempnerInstitute's Mary Letey presents work recently published in PNAS that offers generalizable insights into in-context learning (ICL) in an analytically solvable model architecture.

bit.ly/4lPK15p

#AI @pnas.org

(1/2)

28.07.2025 19:24 — 👍 6    🔁 4    💬 1    📌 0
Preview
In-context denoising with one-layer transformers: connections between attention and associative memory retrieval We introduce in-context denoising, a task that refines the connection between attention-based architectures and dense associative memory (DAM) networks, also known as modern Hopfield networks. Using a...

At #ICML2025, presenting work done at @flatironinstitute.org with Matt Smart and @albertobietti.bsky.social on in-context denoising (arxiv.org/abs/2502.05164). Come to Matt’s oral, Thursday, 4:15-4:30 PM, West Ballroom A, and see us right after at poster #E-3207, 4:30-7:00 PM, East Exhibition Hall A-B.

16.07.2025 18:37 — 👍 5    🔁 3    💬 0    📌 0
Preview
Asymptotic theory of in-context learning by linear attention | PNAS Transformers have a remarkable ability to learn and execute tasks based on examples provided within the input itself, without explicit prior traini...

Great to see this one finally out in PNAS! Asymptotic theory of in-context learning by linear attention www.pnas.org/doi/10.1073/... Many thanks to my amazing co-authors Yue Lu, Mary Letey, Jacob Zavatone-Veth @jzv.bsky.social and Anindita Maiti!
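For a flavor of the setup (a toy sketch with made-up dimensions, not the paper's model or notation): a single linear attention readout with identity query/key maps averages the context labels weighted by input overlaps, and for isotropic Gaussian context inputs this concentrates on the linear-regression prediction ⟨w, x_q⟩ as the context grows.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 8, 512                  # input dimension, context length

w = rng.normal(size=d)         # ground-truth task vector (unknown to the model)
X = rng.normal(size=(n, d))    # in-context inputs x_1..x_n
y = X @ w                      # in-context labels y_i = <w, x_i>
xq = rng.normal(size=d)        # query input

# Linear attention with identity query/key maps and labels as values:
# prediction = (1/n) * sum_i y_i <x_i, x_q>, whose expectation is <w, x_q>
# because E[x x^T] = I for standard Gaussian inputs.
pred = (y @ (X @ xq)) / n
print(f"attention prediction {pred:.3f} vs. target {xq @ w:.3f}")
```

The asymptotic theory in the paper analyzes far richer regimes; this sketch only shows why a linear attention head can implement in-context regression at all.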

11.07.2025 07:33 — 👍 23    🔁 5    💬 1    📌 0

#eNeuro: Obeid and Miller identify distinct neural computations in the primary visual cortex that explain how surrounding context suppresses perception of visual figures and features. @harvardseas.bsky.social
vist.ly/3n6tfb2

13.06.2025 23:32 — 👍 5    🔁 2    💬 0    📌 0