27.02.2026 13:02 — 👍 2 🔁 0 💬 0 📌 0Classic sneaky enrollment. No barrage of emails threatening desk reject thus far, might be ahead of us :)
27.02.2026 13:02 — 👍 2 🔁 0 💬 0 📌 0I second this. Feels great.
24.02.2026 13:55 — 👍 2 🔁 0 💬 0 📌 0Yes exactly, ViT-scale training of around 300M-600M params is the ballpark here
24.02.2026 13:54 — 👍 2 🔁 0 💬 0 📌 0Happy accidents like forgetting to turn off a run and then taking vacation never happen with an LR schedule (I guess they never happen on our clusters either, but hypothetically :D)
23.02.2026 21:16 — 👍 0 🔁 0 💬 1 📌 0Just chugging along at a safe LR for a long time feels good in spirit. DINOv3 style (without the degradation in feature quality, don't mind that part...)
23.02.2026 21:15 — 👍 0 🔁 0 💬 1 📌 0Shoutout Västerås btw (from your picture). Pretty middling Swedish town XD
23.02.2026 19:05 — 👍 1 🔁 0 💬 0 📌 0As LR decays, I empirically find grads almost always go up, which risks derailing your training loss. That almost never happens to me with constant LR, especially with big networks. Also, without a schedule you can resume at any time and leave training on for as long as you want :)
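A minimal sketch of the point above (hypothetical names and values, not from the thread): a cosine schedule bakes in a fixed training horizon and drives the LR toward zero at its end, while a constant schedule has no horizon, so a run can be resumed or extended indefinitely.

```python
import math

def lr_at(step, base_lr=3e-4, total_steps=None, warmup=1000):
    """LR at a given step, after linear warmup.

    total_steps=None -> constant schedule: LR stays at base_lr forever,
    so training can be resumed/extended at any point.
    total_steps set  -> cosine decay to zero at total_steps, which must
    be decided up front.
    """
    if step < warmup:
        return base_lr * step / warmup
    if total_steps is None:  # constant LR: no horizon needed
        return base_lr
    progress = (step - warmup) / (total_steps - warmup)
    return 0.5 * base_lr * (1 + math.cos(math.pi * min(progress, 1.0)))

# Constant LR is identical at step 10k and step 10M: the run can just go on.
assert lr_at(10_000) == lr_at(10_000_000) == 3e-4
# Cosine LR is nearly zero close to its horizon; extending the run past
# total_steps means the schedule (and often the run) has to be re-planned.
assert lr_at(99_000, total_steps=100_000) < 1e-6
```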
23.02.2026 19:00 — 👍 3 🔁 1 💬 1 📌 0Biggest psyop: Learning rate decay. @parskatt.bsky.social showed me the truth but I refused to accept it...
23.02.2026 18:41 — 👍 5 🔁 0 💬 2 📌 1Hehe I am a zoomer, only digital :)
23.02.2026 18:39 — 👍 1 🔁 0 💬 1 📌 0Malmooo
23.02.2026 18:37 — 👍 1 🔁 0 💬 0 📌 0I don't like the NeurIPS template. Feels like you have to do hacky stuff to fit your figures / tables.
23.02.2026 16:25 — 👍 0 🔁 0 💬 0 📌 0I am starting to like the ECCV template. It feels like you can include more tables / figures and thus I can fit a lot more within the page limit. Might be bad for readers :)
23.02.2026 16:24 — 👍 2 🔁 0 💬 3 📌 0East coast US also going to sleep without decisions out 🤔
21.02.2026 04:42 — 👍 2 🔁 0 💬 0 📌 0ICLR: Radio silence and then full reset due to openreview leak... :)
05.02.2026 19:37 — 👍 1 🔁 0 💬 0 📌 0Big!
04.02.2026 19:14 — 👍 0 🔁 0 💬 0 📌 0"Patience you must have my young researcher"...
04.02.2026 19:13 — 👍 0 🔁 0 💬 1 📌 0Malmoooo, let's go!
30.01.2026 11:53 — 👍 4 🔁 0 💬 0 📌 0RoCo = Robust Correspondences
13.01.2026 11:31 — 👍 1 🔁 0 💬 0 📌 0He died for our sins
12.01.2026 15:15 — 👍 1 🔁 0 💬 0 📌 0Interesting :), thanks for sharing
19.12.2025 07:25 — 👍 0 🔁 0 💬 0 📌 01337 > 1024
17.12.2025 22:27 — 👍 0 🔁 0 💬 1 📌 0Sounds nice!
13.12.2025 21:19 — 👍 1 🔁 0 💬 0 📌 0Congratulations! Well deserved!
10.12.2025 22:49 — 👍 2 🔁 0 💬 0 📌 0Hehe yeah I did not interpret the email as our submissions would be redacted FBI-style :D
09.12.2025 22:36 — 👍 1 🔁 0 💬 1 📌 0RIP those who stayed up for an early submission number... :)
09.12.2025 22:35 — 👍 2 🔁 1 💬 1 📌 0I joined a PhD program with no publications, no research internships, and no experience in the area. I have, thus far, found the experience very enjoyable. You seem to be much further along in research than I was, keep it up!
27.11.2025 20:17 — 👍 1 🔁 0 💬 1 📌 0Alvis most wanted
27.11.2025 08:37 — 👍 2 🔁 0 💬 1 📌 0Thanks for sharing! :)
26.11.2025 15:16 — 👍 1 🔁 0 💬 1 📌 0
Presenting today at #BMVC2025 our follow-up work on anisotropic rotation averaging, which is particularly useful in global SfM. We propose a fast solver, ACD, and integrate robust optimization.
If you're at the conference, you're welcome to come by our poster #516!
bmvc2025.bmva.org/proceedings/...