
David Nordström

@davnords.bsky.social

PhD student @ Chalmers, Deep Learning for Computer Vision. Strengthen your ViTs: https://github.com/davnords/octic-vits

100 Followers  |  102 Following  |  148 Posts  |  Joined: 10.02.2025

Posts by David Nordström (@davnords.bsky.social)

Post image 09.03.2026 08:51 — 👍 6    🔁 0    💬 0    📌 0

Vaguepostmaxxing

08.03.2026 21:26 — 👍 2    🔁 0    💬 1    📌 0

Hand-annotation is underrated for matching benchmarks

07.03.2026 19:38 — 👍 4    🔁 0    💬 1    📌 0

Big if true 👀

07.03.2026 08:37 — 👍 3    🔁 0    💬 0    📌 0

Nice squad! Good luck.

05.03.2026 22:23 — 👍 1    🔁 0    💬 0    📌 0

Classic sneaky enrollment. No barrage of emails threatening desk reject thus far, might be ahead of us :)

27.02.2026 13:02 — 👍 2    🔁 0    💬 0    📌 0

I second this. Feel great.

24.02.2026 13:55 — 👍 2    🔁 0    💬 0    📌 0

Yes exactly, ViT-scale training of around 300M-600M params is the ballpark here

24.02.2026 13:54 — 👍 2    🔁 0    💬 0    📌 0

Happy accidents like forgetting to turn off a run and then taking vacation never happen with an LR schedule (I guess they never happen on our clusters either, but hypothetically :D)

23.02.2026 21:16 — 👍 0    🔁 0    💬 1    📌 0

Just chugging along at a safe LR for a long time feels good in spirit. DINOv3 style (without the degradation in feature quality, don't mind that part...)

23.02.2026 21:15 — 👍 0    🔁 0    💬 1    📌 0

Shoutout Västerås btw (from your picture). Pretty middling Swedish town XD

23.02.2026 19:05 — 👍 1    🔁 0    💬 0    📌 0
Post image

As LR decays I empirically find grads almost always go up, which risks derailing your training loss. That almost never happens to me with constant LR, especially with big networks. Also, without a schedule you can resume at any time and leave training on for as long as you want :)

23.02.2026 19:00 — 👍 3    🔁 1    💬 1    📌 0
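The resumability point above can be sketched with a toy example (plain Python, hypothetical quadratic loss, not the author's actual setup): with a constant LR there is no scheduler state, so a checkpoint is just the step count and the parameters, and a stopped-and-resumed run reproduces an uninterrupted one exactly.

```python
# Minimal sketch, assuming plain SGD at a constant LR on a toy loss w**2.
# The point: no schedule state means stop/resume is trivially exact.

def sgd_step(w, lr=0.1):
    grad = 2.0 * w          # gradient of the toy loss w**2
    return w - lr * grad

def train(w, n_steps, lr=0.1):
    for _ in range(n_steps):
        w = sgd_step(w, lr)
    return w

# Uninterrupted run vs. "checkpoint after 40 steps, resume for 60 more":
# identical parameter trajectory, since no scheduler state is saved or restored.
full = train(5.0, 100)
resumed = train(train(5.0, 40), 60)
assert full == resumed
```

With a decaying schedule, the resumed run would additionally need the scheduler's internal step counter restored to continue on the same LR curve.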

Biggest psyop: Learning rate decay. @parskatt.bsky.social showed me the truth but I refused to accept it...

23.02.2026 18:41 — 👍 6    🔁 0    💬 2    📌 1

Hehe I am a zoomer, only digital :)

23.02.2026 18:39 — 👍 2    🔁 0    💬 1    📌 0

Malmooo

23.02.2026 18:37 — 👍 1    🔁 0    💬 0    📌 0

I don't like the NeurIPS template. It feels like you have to do hacky stuff to fit your figures / tables.

23.02.2026 16:25 — 👍 0    🔁 0    💬 0    📌 0

I am starting to like the ECCV template. It feels like you can include more tables / figures, and thus I can fit a lot more within the page limit. Might be bad for readers :)

23.02.2026 16:24 — 👍 2    🔁 0    💬 3    📌 0

East coast US also going to sleep without decisions out 🤔

21.02.2026 04:42 — 👍 2    🔁 0    💬 0    📌 0

ICLR: Radio silence and then full reset due to openreview leak... :)

05.02.2026 19:37 — 👍 1    🔁 0    💬 0    📌 0

Big!

04.02.2026 19:14 — 👍 0    🔁 0    💬 0    📌 0

"Patience you must have my young researcher"...

04.02.2026 19:13 — 👍 0    🔁 0    💬 1    📌 0

Malmoooo, let's go!

30.01.2026 11:53 — 👍 4    🔁 0    💬 0    📌 0

RoCo = Robust Correspondences

13.01.2026 11:31 — 👍 1    🔁 0    💬 0    📌 0

He died for our sins

12.01.2026 15:15 — 👍 1    🔁 0    💬 0    📌 0

Interesting :), thanks for sharing

19.12.2025 07:25 — 👍 0    🔁 0    💬 0    📌 0

1337 > 1024

17.12.2025 22:27 — 👍 0    🔁 0    💬 1    📌 0

Sounds nice!

13.12.2025 21:19 — 👍 1    🔁 0    💬 0    📌 0

Congratulations! Well deserved!

10.12.2025 22:49 — 👍 2    🔁 0    💬 0    📌 0

Hehe yeah I did not interpret the email as our submissions would be redacted FBI-style :D

09.12.2025 22:36 — 👍 1    🔁 0    💬 1    📌 0

RIP those who stayed up for an early submission number... :)

09.12.2025 22:35 — 👍 2    🔁 1    💬 1    📌 0