
Cengiz Pehlevan

@cpehlevan.bsky.social

theory of neural networks for natural and artificial intelligence https://pehlevan.seas.harvard.edu/

1,071 Followers | 338 Following | 8 Posts | Joined: 21.09.2023

Latest posts by cpehlevan.bsky.social on Bluesky

SciPost Phys. Lect. Notes 105 (2025): Simplified derivations for high-dimensional convex learning problems

scipost.org/SciPostPhysL...

27.10.2025 20:07 · 👍 9 · 🔁 2 · 💬 0 · 📌 0

Applying to do a postdoc or PhD in theoretical ML or neuroscience this year? Consider joining my group (starting next Fall) at UT Austin!
POD Postdoc: oden.utexas.edu/programs-and... CSEM PhD: oden.utexas.edu/academics/pr...

23.10.2025 21:36 · 👍 24 · 🔁 9 · 💬 1 · 📌 0

William Qian, Cengiz Pehlevan: Discovering alternative solutions beyond the simplicity bias in recurrent neural networks https://arxiv.org/abs/2509.21504 https://arxiv.org/pdf/2509.21504 https://arxiv.org/html/2509.21504

29.09.2025 06:50 · 👍 7 · 🔁 3 · 💬 0 · 📌 0
NeurIPS 2025 Workshop DBM Findings (OpenReview homepage)

⏳ Less than 1 day left until the Brain & Mind Workshop submission deadline!
🔍 Submit to our Findings or Tutorials track on OpenReview.
Findings track submission: openreview.net/group?id=Neu...
Tutorial track submission: openreview.net/group?id=Neu...
More info: data-brain-mind.github.io

07.09.2025 16:30 · 👍 3 · 🔁 1 · 💬 0 · 📌 0

Thank you so much for the kind shoutout! Grateful to be part of such a fantastic team.

07.09.2025 16:45 · 👍 1 · 🔁 0 · 💬 0 · 📌 0
Frontiers | Summary statistics of learning link changing neural representations to behavior: How can we make sense of large-scale recordings of neural activity across learning? Theories of neural network learning with their origins in statistical phy...

Since I'm back on BlueSky - with @frostedblakess.bsky.social and @cpehlevan.bsky.social we wrote a brief perspective on how ideas about summary statistics from the statistical physics of learning could potentially help inform neural data analysis... (1/2)

04.09.2025 18:30 · 👍 33 · 🔁 10 · 💬 1 · 📌 0
Convergent motifs of early olfactory processing are recapitulated by layer-wise efficient coding: The architecture of early olfactory processing is a striking example of convergent evolution. Typically, a panel of broadly tuned receptors is selectively expressed in sensory neurons (each neuron exp...

Excited to share new computational work, led by @jzv.bsky.social, driven by Juan Carlos Fernandez del Castillo + contribution from Farhad Pashakanloo. We recover 3 core motifs in the olfactory system of evolutionarily distant animals using a biophysically-grounded model + efficient coding ideas!

04.09.2025 16:51 · 👍 24 · 🔁 12 · 💬 0 · 📌 1
AIQ: Artificial Intelligence Quantified (YouTube video by DARPAtv)

Great to have this video about my @darpa.mil Artificial Intelligence Quantified (AIQ) program out! Very exciting program with absolutely fantastic teams. Stay tuned for some jaw-dropping announcements!

www.youtube.com/watch?v=KVRF...

02.09.2025 21:50 · 👍 6 · 🔁 1 · 💬 0 · 📌 0

I am extremely grateful to be awarded the National University of Singapore (NUS) Development Grant, and to be a Young NUS Fellow! Look forward to collaborating with the Yong Loo Lin School of Medicine on exciting projects. This is my first grant and hopefully many more to come! #NUS #NeuroAI

27.08.2025 14:31 · 👍 8 · 🔁 1 · 💬 1 · 📌 0
Simons Foundation Launches Collaboration on the Physics of Learning and Neural Computation (Simons Foundation)

Our new Simons Collaboration on the Physics of Learning and Neural Computation will develop powerful tools from #physics, #math, computer science and theoretical #neuroscience to understand how large neural networks learn, compute, scale, reason and imagine: www.simonsfoundation.org/2025/08/18/s...

19.08.2025 14:43 · 👍 21 · 🔁 5 · 💬 0 · 📌 3
Kempner Research Fellowship - Kempner Institute: The Kempner brings leading, early-stage postdoctoral scientists to Harvard to work on projects that advance the fundamental understanding of intelligence.

If you work on artificial or natural intelligence and are finishing your PhD, consider applying for a Kempner research fellowship at Harvard:
kempnerinstitute.harvard.edu/kempner-inst...

18.08.2025 17:27 · 👍 48 · 🔁 31 · 💬 0 · 📌 0
Simons Foundation Launches Collaboration on the Physics of Learning and Neural Computation (Simons Foundation)

Congratulations to #KempnerInstitute associate faculty member @cpehlevan.bsky.social for joining the new
@simonsfoundation.org Simons Collaboration on the Physics of Learning and Neural Computation!

www.simonsfoundation.org/2025/08/18/s...

#AI #neuroscience #NeuroAI #physics #ANNs

18.08.2025 18:57 · 👍 15 · 🔁 3 · 💬 0 · 📌 0

Very excited to lead this new @simonsfoundation.org collaboration on the physics of learning and neural computation to develop powerful tools from physics, math, CS, stats, neuro and more to elucidate the scientific principles underlying AI. See our website for more: www.physicsoflearning.org

18.08.2025 17:48 · 👍 92 · 🔁 14 · 💬 4 · 📌 1

🚨 Excited to announce our #NeurIPS2025 Workshop: Data on the Brain & Mind

📣 Call for: Findings (4- or 8-page) + Tutorials tracks

πŸŽ™οΈ Speakers include @dyamins.bsky.social @lauragwilliams.bsky.social @cpehlevan.bsky.social

🌐 Learn more: data-brain-mind.github.io

04.08.2025 15:28 · 👍 31 · 🔁 10 · 💬 0 · 📌 3
PNAS: Proceedings of the National Academy of Sciences (PNAS), a peer-reviewed journal of the National Academy of Sciences (NAS), an authoritative source of high-impact, original research that broadly spans...

The post is based on a paper written with Yue M. Lu, @jzv.bsky.social, Anindita Maiti, and @cpehlevan.bsky.social.

Check it out now at PNAS:

doi.org/10.1073/pnas...

(2/2)

28.07.2025 19:26 · 👍 2 · 🔁 1 · 💬 0 · 📌 0
Solvable Model of In-Context Learning Using Linear Attention - Kempner Institute: Attention-based architectures are a powerful force in modern AI. In particular, the emergence of in-context learning enables these models to perform tasks far beyond the original next-token prediction...

New in the #DeeperLearningBlog: the #KempnerInstitute's Mary Letey presents work recently published in PNAS that offers generalizable insights into in-context learning (ICL) in an analytically solvable model architecture.

bit.ly/4lPK15p

#AI @pnas.org

(1/2)

28.07.2025 19:24 · 👍 6 · 🔁 4 · 💬 1 · 📌 0
In-context denoising with one-layer transformers: connections between attention and associative memory retrieval. We introduce in-context denoising, a task that refines the connection between attention-based architectures and dense associative memory (DAM) networks, also known as modern Hopfield networks. Using a...

At #ICML2025, presenting work done at @flatironinstitute.org w Matt Smart and @albertobietti.bsky.social on in-context denoising (arxiv.org/abs/2502.05164). Come to Matt’s oral, Thursday, 4:15-4:30 PM, West Ballroom A, and see us right after at poster #E-3207, 4:30-7:00 PM, East Exhibition Hall A-B.

16.07.2025 18:37 · 👍 5 · 🔁 3 · 💬 0 · 📌 0
Asymptotic theory of in-context learning by linear attention | PNAS: Transformers have a remarkable ability to learn and execute tasks based on examples provided within the input itself, without explicit prior traini...

Great to see this one finally out in PNAS! Asymptotic theory of in-context learning by linear attention www.pnas.org/doi/10.1073/... Many thanks to my amazing co-authors Yue Lu, Mary Letey, Jacob Zavatone-Veth @jzv.bsky.social and Anindita Maiti!

11.07.2025 07:33 · 👍 21 · 🔁 5 · 💬 1 · 📌 0

#eNeuro: Obeid and Miller identify distinct neural computations in the primary visual cortex that explain how surrounding context suppresses perception of visual figures and features. @harvardseas.bsky.social
vist.ly/3n6tfb2

13.06.2025 23:32 · 👍 5 · 🔁 2 · 💬 0 · 📌 0

πŸ“£ Grad students and postdocs in computational and theoretical neuroscience: please consider applying for the 2025 Flatiron Institute Junior Theoretical Neuroscience Workshop! All expenses are covered. Apply by April 14. jtnworkshop2025.flatironinstitute.org

09.04.2025 16:11 · 👍 21 · 🔁 16 · 💬 0 · 📌 0

New preprint! We trained an RNN using RL to solve a decision-making task used to characterize suboptimal decision making in patients with schizophrenia. First project exploring computational psychiatry models; thanks to @adam-manoogian.bsky.social @shawnrhoadsphd.bsky.social @bqian.bsky.social @cpehlevan.bsky.social

27.03.2025 17:00 · 👍 10 · 🔁 5 · 💬 1 · 📌 0

Congratulations!

19.02.2025 01:12 · 👍 1 · 🔁 0 · 💬 0 · 📌 0

Honoured to have been selected as a #SloanFellow
Thankful for all the support from family, mentors, collaborators, colleagues and students along the way!
@sloanfoundation.bsky.social

18.02.2025 22:13 · 👍 48 · 🔁 9 · 💬 3 · 📌 0
Symmetries and Continuous Attractors in Disordered Neural Circuits: A major challenge in neuroscience is reconciling idealized theoretical models with complex, heterogeneous experimental data. We address this challenge through the lens of continuous-attractor networks...

(1/30) New preprint! "Symmetries and continuous attractors in disordered neural circuits" with Larry Abbott and Haim Sompolinsky
bioRxiv: www.biorxiv.org/content/10.1...

29.01.2025 18:26 · 👍 96 · 🔁 34 · 💬 7 · 📌 2
Symmetries and continuous attractors in disordered neural circuits: A major challenge in neuroscience is reconciling idealized theoretical models with complex, heterogeneous experimental data. We address this challenge through the lens of continuous-attractor networks...

Theory in neuroscience, you say? How about this preprint by @david-g-clark.bsky.social, with a couple of others you might recognize? :-) #neuroscience

27.01.2025 00:23 · 👍 35 · 🔁 9 · 💬 0 · 📌 1

Our preprint with @frostedblakess.bsky.social, @jzv.bsky.social, @cpehlevan.bsky.social is out!

We develop a simple reinforcement learning model that recapitulates 3 disparate hippocampal dynamics. In ablation studies, we show that these representations improve the speed and flexibility of policy learning.

17.12.2024 19:46 · 👍 16 · 🔁 5 · 💬 3 · 📌 1
Postdoctoral Fellow in Theoretical & Computational Neuroscience: Postdoctoral positions are available in Jacob Zavatone-Veth's research group at Harvard's Center for Brain Science. We are broadly interested in the theory of neural computation; see jzv.io for more i...

The official ad for this postdoc position is now (finally) live, see academicpositions.harvard.edu/postings/14486!

16.12.2024 14:42 · 👍 10 · 🔁 5 · 💬 2 · 📌 0

Come by at NeurIPS to hear Hamza present on properties of various feature-learning, infinite-parameter limits of transformer models.

Poster in Hall A-C #4804 at 11 AM PST

Paper arxiv.org/abs/2405.15712 , code github.com/Pehlevan-Gro...

Work with Hamza Chaudhry and @cpehlevan.bsky.social

13.12.2024 01:04 · 👍 16 · 🔁 3 · 💬 0 · 📌 0

Excited to share my #NeurIPS2024 paper with @jzv.bsky.social, @BenjaminSRuben, and @cpehlevan.bsky.social on mechanistic mismatches in data-constrained models of neural dynamics! (1/n)

12.12.2024 19:51 · 👍 37 · 🔁 10 · 💬 1 · 📌 3

I have to check how these works use/interpret these equations, but typically, in your notation, r is interpreted as the firing rate and x as the current.

11.12.2024 20:02 · 👍 0 · 🔁 0 · 💬 0 · 📌 0
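For context, the convention contrasted in the reply above can be written out explicitly. This is a minimal sketch of the two standard rate-network formulations, not taken from the thread itself; the symbols (W for recurrent weights, φ for a pointwise nonlinearity, I for external input, τ for the time constant) are illustrative:

```latex
% "Current" convention: x_i is the synaptic current, r_i = \phi(x_i) the firing rate.
\tau \dot{x}_i = -x_i + \sum_j W_{ij}\,\phi(x_j) + I_i, \qquad r_i = \phi(x_i)
% "Rate" convention: the dynamical variable is the firing rate itself.
\tau \dot{r}_i = -r_i + \phi\!\Big(\sum_j W_{ij}\, r_j + I_i\Big)
```

The two forms are closely related (their fixed points coincide under mild conditions), but which variable a paper calls r versus x changes how the equations should be read.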
