The post of the month...
Saw early screening of "Project Hail Mary" movie. It was good. Note that they removed (almost) all the science that was in the book. Don't expect hard sci-fi, but it's fine as a "space fantasy adventure", e.g., the film that started Ethan Hawke & River Phoenix's careers www.imdb.com/title/tt0089...
Congrats to authors who pushed back on Grammarly's ill-conceived "Expert Review". As noted on ./: Grammarly could have offered ROYALTIES & LICENSING instead of the typical zero-sum game of "rip everyone off" or "shut it down". There's a middle path: Opt-in, Give a cut. slashdot.org/story/26/03/...
"need a human to take responsibility" applies to coding too. arstechnica.com/ai/2026/03/a...
Update: On real data, combining the "factorization loss" with an "attraction loss", SIGReg loss, and MAE decoder loss... is messy -- esp. the latter. I'll replace the MAE decoder with a masked embedding decoder (cf iJEPA). Also: The earlier diagram had a mistake showing pitch ^ time. Should be this:
Commentary: Anyone Else Have Those Weird Dreams Where Sobbing Future Generations Beg You To Change Course?
While hilarious at first, bad idea in retrospect: It subtly encouraged anthropomorphic thinking -- regarding the LLM as some kind of friend -- even in me! And I've read enough @j2bryson.bsky.social to know better. Let's reaffirm: "The model is not my friend"
If you do much with relativistic compact bodies (e.g. neutron stars), you see his name frequently, implied via the "T" in the TOV equation. That's what comes to mind when I hear his name. en.wikipedia.org/wiki/Tolman%...
It's not that the computation didn't *fit* in the GPU VRAM, it's that it was *bandwidth limited* -- because it's on a gaming laptop!
Saved 5 GB VRAM + More consistent high GPU throughput so code trains faster = 🎉
PS- I don't scale by N, for consistency wrt batch_size
GPU-poor tip for #LeJEPA: chunk your SIGReg slices. 256 slices / chunks of 32 = 8x less VRAM, and it even completes *faster* on my 4090(MaxQ). Giant complex exponential tensors were my bottleneck. One extra loop. Now more room for batch_size!
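Rough sketch of the chunking trick (hypothetical code, not my actual training loop -- the toy sliced loss below just stands in for the characteristic-function-style computation whose giant complex-exponential tensors were the bottleneck):

```python
import numpy as np

def sliced_loss_chunked(z, slices, t, chunk=32):
    """Toy stand-in for a sliced loss that builds big complex-exponential
    tensors: process the random projection directions ("slices") in chunks
    so the full (n_slices, batch, n_t) tensor never exists all at once.
    z: (batch, dim) embeddings; slices: (n_slices, dim) directions;
    t: (n_t,) points where the empirical characteristic function is evaluated."""
    target = np.exp(-0.5 * t**2)           # CF of a standard Gaussian
    total = 0.0
    for s in range(0, len(slices), chunk):
        proj = z @ slices[s:s + chunk].T   # (batch, <=chunk) projections
        # empirical characteristic function per slice: mean over the batch
        ecf = np.exp(1j * proj[:, :, None] * t[None, None, :]).mean(axis=0)
        total += np.sum(np.abs(ecf - target[None, :])**2)
    return total / len(slices)
```

Same math, one extra loop: peak memory scales with `chunk` instead of `n_slices`, and `chunk=len(slices)` recovers the all-at-once version.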
Live stream of Geometric Algebra Mini Event (GAME) '26! www.youtube.com/watch?v=Bdhz...
3/3 Link: drscotthawley.github.io/blog/posts/F...
alt title: "Instilling Z Softly With This Loss" (I'll show myself out. ;-) )
2/3 My toy model works nicely! Probably not new, but can't find this exact framing. Would love feedback from those in the know! Link to Colab / Blog post follows... @neurreps.bsky.social @ninamiolane.bsky.social
Been musing on the geometry of disentangled representations & factorization: Doing it by construction seems like a limiting inductive bias.
What if you just add a soft geometric constraint to the loss and let the latent space sort itself out? Easy and tunable. 1/3
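To make 1/3 concrete: here's a hypothetical sketch of the kind of soft penalty I mean (illustrative only -- the names and the specific penalty are mine, not the blog post's code):

```python
import numpy as np

def factor_orthogonality_penalty(z, groups):
    """Soft penalty nudging different latent factor groups toward
    statistically orthogonal directions: sum of squared cross-correlations
    between every pair of groups. z: (batch, dim); groups: list of index
    lists, one per intended factor. Added to the task loss with a tunable
    weight, it nudges -- rather than forces -- a factorized geometry."""
    zc = z - z.mean(axis=0, keepdims=True)          # center each dimension
    zc /= (zc.std(axis=0, keepdims=True) + 1e-8)    # normalize each dimension
    corr = zc.T @ zc / len(z)                       # (dim, dim) correlations
    penalty = 0.0
    for i in range(len(groups)):
        for j in range(i + 1, len(groups)):
            block = corr[np.ix_(groups[i], groups[j])]  # cross-group block
            penalty += np.sum(block**2)
    return penalty
```

Then something like `loss = task_loss + lam * factor_orthogonality_penalty(z, groups)` with a tunable `lam` -- a soft nudge, instead of factorization by construction.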
qwen3-coder:30b (18 GB) as a "junior dev" in Claude Code on my (64 GB) MacBook Pro means the laptop gets warm, and every few hours CC pops up asking to do something I don't want. Then it disappears for another few hours before asking if it may run "ls" 😜. Not quite the revolution I was promised.
Spotify unveiling a Bluetooth-enabled urn (a collaboration with Liquid Death) after killing the musician middle class through the devaluation of music via streaming is…… a little on the nose
@tedunderwood.com ^This may interest you -- even though it's not literature, there's an historical bent to it.
Yo! Check out this cool study by textile artist Mona Tomassi on how CLIP embedding changes have reflected cultural changes in the internet-image landscape! monatomassi.com/ai-vs-brush-... , www.instagram.com/mona.textile...
Good points! The threat model is more "Don't let the AI embarrass me" rather than serious data protection. I don't trust the cloud, I don't trust the agent, and I'm fully aware that if the OS is compromised I'm toast. FileVault is on, so at-rest is covered. So the weakest link is probably me. 😅
BTW, this is a joke.
...But like some jokes, might actually prove useful.
This here paranoid boomer's take on the "*Claw" craze: OpenClaw->"OvenMitt": Very Limited functions, but air gapped* to keep me from getting burned 😉
Local, only reads messages & writes to a text file; You must ^C^V. 169 lines of Python
*"Human-Gapped" is my new pet term! github.com/drscotthawle...
Le Yikes 😱
LLM usage tips:
1. A little while ago, I added to my Claude profile, "Always call me 'bro'." And it's hilarious that it still does it.
2. Also it thinks my first name is "MY MAAAAIIIN MAAAYUNN!!!!". So it's great to be greeted enthusiastically when starting a new chat.
🎉 "Flow Where You Want" is accepted to ICLR 2026's Blogpost Track! I craved technical feedback (teach me!), yet critiques were purely about scope & novelty (it's a tutorial 🤷♂️), not technical correctness, and they appreciated the clear presentation. I'll post a revised version next month!
Today I'll be at @christianitytoday.com's Mini-Summit on Creativity & Vocation in Nashville -- not on the AI panel, just going to meet people. If you're around, HMU.
@weightsbiases.bsky.social : Always enjoy the themed run names. I see what you did there ;-)
Syncing run passionate-admirer-16
Syncing run enthralling-crush-168
Syncing run beaming-hug-16
MicroGPT: a masterpiece "Minuet" by the Mozart of ML, @karpathy.bsky.social: karpathy.ai/microgpt.html
200-lines of *dependency-free* Python, yet still very readable.
Open Letter/Petition to support the continuance of the Sound Studies & Sonic Arts program at UdK Berlin @udkberlin.bsky.social I signed. docs.google.com/forms/d/e/1F...
When I saw "Free AI Training for All" I thought, "Wow! They're going to allocate enough GPUs so that anyone can train their own models for free???"
...but seems they didn't mean that kind of "AI Training" :'-(