The Transformer Cookbook
We present the transformer cookbook: a collection of techniques for directly encoding algorithms into a transformer's parameters. This work addresses the steep learning curve of such endeavors, a prob...
We present The Transformer Cookbook: a collection of recipes for programming algorithms directly into transformers!
Hungry for an induction head? Craving a Dyck language recognizer? We show you step-by-step how to cook up transformers for these algorithms and many more!
03.10.2025 16:24 —
👍 5
🔁 5
💬 1
📌 0
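The paper mentions cooking up an induction head. As a rough illustration of the induction-head pattern (match the current token against earlier previous-tokens, then copy what followed the match), here is a minimal Python sketch in plain hard-attention pseudocode; it is not taken from the paper, and the function name is hypothetical.

```python
def induction_head(tokens):
    """Induction-head pattern, sketched as hard attention:
    Layer 1 ("previous token" head): each position t stores tokens[t-1].
    Layer 2: the last position attends to a position j whose stored
    previous token matches the current token, and copies tokens[j]."""
    # Layer 1: shift the sequence right by one.
    prev = [None] + tokens[:-1]
    t = len(tokens) - 1
    # Layer 2: scan backwards so the most recent match wins.
    for j in range(t, 0, -1):
        if prev[j] == tokens[t]:
            return tokens[j]
    return None  # no earlier occurrence to continue from

# After seeing "a b c a", the head predicts "b" (what followed "a" before).
print(induction_head(["a", "b", "c", "a"]))  # → b
```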
You can find a thread about the paper here: x.com/satwik1729/s...
09.12.2024 19:09 —
👍 1
🔁 0
💬 0
📌 0
Excited to head to @NeurIPSConf today! I'll be presenting our work on the representational capabilities of Transformers and RNNs/SSMs. If you're interested in meeting up to discuss research or chat, feel free to reach out via DM or email!
09.12.2024 19:09 —
👍 3
🔁 0
💬 1
📌 0
Karp: A Language for NP Reductions
I wanted my Algorithms students to program NP-hardness reductions, so we developed Karp, a domain-specific language for writing Karp reductions. Our students are quite good with a debugger, so reducing learning theory to debugging seemed like a win. docs.racket-lang.org/karp/index.h...
27.11.2024 04:48 —
👍 27
🔁 9
💬 1
📌 1