Well we finally know what's harder... Gold medal @ IMO or a few hundred random GitHub issues
16.08.2025 14:23 — 👍 0 🔁 0 💬 0 📌 0

AMD gpu can't multiply matrices... I want my money back :/ youtube.com/shorts/Imj5j...
17.06.2025 02:34 — 👍 2 🔁 0 💬 0 📌 0

new conspiracy theory: language models write such verbose, spaghetti code bc they charge you per token
30.04.2025 21:45 — 👍 1 🔁 0 💬 0 📌 0
Code: github.com/samuela/torc...
Install with `pip install torch2jax` to get started!
Adding support for random ops and batch norm required a near-complete rewrite of the library internals. v0.1.0 now makes extensive use of PyTorch Modes (pytorch.org/docs/stable/...), an underrated part of the PyTorch API IMHO.
Shout out to Nick Boyd who inspired me to undertake this rewrite!
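For a flavor of what PyTorch Modes make possible, here's a minimal, illustrative sketch (not torch2jax's actual internals) using `torch.overrides.TorchFunctionMode`: a mode that intercepts every `torch.*` call made inside its context, which is the kind of hook a conversion library can build on.

```python
import torch
from torch.overrides import TorchFunctionMode

class LoggingMode(TorchFunctionMode):
    """Record the name of every torch operation executed under this mode."""

    def __init__(self):
        super().__init__()
        self.calls = []

    def __torch_function__(self, func, types, args=(), kwargs=None):
        kwargs = kwargs or {}
        self.calls.append(getattr(func, "__name__", str(func)))
        # Delegate to the real op so behavior is unchanged.
        return func(*args, **kwargs)

mode = LoggingMode()
with mode:
    x = torch.ones(3)
    y = torch.sin(x) + x

print(mode.calls)  # every torch op (including the factory call) seen by the mode
```

A mode like this sees tensor factories, math ops, and method calls alike, without monkey-patching anything — which is why it's handy for tracing PyTorch code into another framework.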
As a reminder, torch2jax enables running PyTorch code in JAX. Mix-and-match PyTorch and JAX code with seamless, end-to-end autodiff, use JAX classics like jit, grad, and vmap on PyTorch code, and run PyTorch models on TPUs.
See x.com/SamuelAinswo... for the initial project announcement.
I'm excited to announce the release of torch2jax v0.1.0!
Now with support for random functions like `torch.rand` and modules that use mutable buffers like BatchNorm 🎲
the wild thing, for me, is that i really do not think the lina khan era of tech regulation was anywhere close to reining in the free-for-all of consumer harm
14.02.2025 01:05 — 👍 725 🔁 32 💬 16 📌 4