
David Nordström

@davnords.bsky.social

PhD Student @ Chalmers, Deep Learning for Computer Vision. Strengthen your ViTs: https://github.com/davnords/octic-vits

99 Followers  |  102 Following  |  143 Posts  |  Joined: 10.02.2025

Posts by David Nordström (@davnords.bsky.social)

Classic sneaky enrollment. No barrage of emails threatening desk reject thus far, might be ahead of us :)

27.02.2026 13:02 — 👍 2    🔁 0    💬 0    📌 0

I second this. Feels great.

24.02.2026 13:55 — 👍 2    🔁 0    💬 0    📌 0

Yes exactly, ViT-scale training of around 300M-600M params is the ballpark here

24.02.2026 13:54 — 👍 2    🔁 0    💬 0    📌 0

Happy accidents like forgetting to turn off a run and then taking vacation never happen with an LR schedule (I guess they never happen on our clusters either, but hypothetically :D)

23.02.2026 21:16 — 👍 0    🔁 0    💬 1    📌 0

Just chugging along at a safe LR for a long time feels good in spirit. DINOv3 style (without the degradation in feature quality, don't mind that part...)

23.02.2026 21:15 — 👍 0    🔁 0    💬 1    📌 0

Shoutout Västerås btw (from your picture). Pretty middling Swedish town XD

23.02.2026 19:05 — 👍 1    🔁 0    💬 0    📌 0

As LR decays, I empirically find grads almost always go up, which risks derailing your training loss. That almost never happens to me with constant LR, especially with big networks. Also, without a schedule you can resume at any time and leave training on for as long as you want :)

23.02.2026 19:00 — 👍 3    🔁 1    💬 1    📌 0
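A minimal sketch (function names mine, not from the thread) of why a constant LR lets you "resume at any time": a cosine schedule bakes a total-step budget into the LR formula, so extending training past that budget changes the decay math, while a constant LR is the same at every step.

```python
import math

def cosine_lr(step, total_steps, base_lr=1e-3, min_lr=0.0):
    """Cosine decay: the LR value is tied to a fixed total_steps budget."""
    t = min(step / total_steps, 1.0)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * t))

def constant_lr(step, base_lr=1e-3):
    """Constant LR: independent of any horizon, so training can be extended freely."""
    return base_lr

print(cosine_lr(0, 1000))     # base_lr at the start
print(cosine_lr(1000, 1000))  # fully decayed to min_lr at the end of the budget
print(constant_lr(5000))      # unchanged no matter how long you run
```

With the schedule, a run planned for 1000 steps has already hit min_lr by step 1000; resuming past that point means either training at min_lr or re-deriving a new schedule. With the constant rule, step 5000 looks like step 0.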

Biggest psyop: Learning rate decay. @parskatt.bsky.social showed me the truth but I refused to accept it...

23.02.2026 18:41 — 👍 5    🔁 0    💬 2    📌 1

Hehe I am a zoomer, only digital :)

23.02.2026 18:39 — 👍 1    🔁 0    💬 1    📌 0

Malmooo

23.02.2026 18:37 — 👍 1    🔁 0    💬 0    📌 0

I don't like the NeurIPS template. It feels like you have to do hacky stuff to fit your figures / tables.

23.02.2026 16:25 — 👍 0    🔁 0    💬 0    📌 0

I am starting to like the ECCV template. It feels like you can include more tables / figures, and thus I can fit a lot more within the page limit. Might be bad for readers :)

23.02.2026 16:24 — 👍 2    🔁 0    💬 3    📌 0

East coast US also going to sleep without decisions out 🤔

21.02.2026 04:42 — 👍 2    🔁 0    💬 0    📌 0

ICLR: Radio silence and then full reset due to openreview leak... :)

05.02.2026 19:37 — 👍 1    🔁 0    💬 0    📌 0

Big!

04.02.2026 19:14 — 👍 0    🔁 0    💬 0    📌 0

"Patience you must have my young researcher"...

04.02.2026 19:13 — 👍 0    🔁 0    💬 1    📌 0

Malmoooo, let's go!

30.01.2026 11:53 — 👍 4    🔁 0    💬 0    📌 0

RoCo = Robust Correspondences

13.01.2026 11:31 — 👍 1    🔁 0    💬 0    📌 0

He died for our sins

12.01.2026 15:15 — 👍 1    🔁 0    💬 0    📌 0

Interesting :), thanks for sharing

19.12.2025 07:25 — 👍 0    🔁 0    💬 0    📌 0

1337 > 1024

17.12.2025 22:27 — 👍 0    🔁 0    💬 1    📌 0

Sounds nice!

13.12.2025 21:19 — 👍 1    🔁 0    💬 0    📌 0

Congratulations! Well deserved!

10.12.2025 22:49 — 👍 2    🔁 0    💬 0    📌 0

Hehe yeah I did not interpret the email as our submissions would be redacted FBI-style :D

09.12.2025 22:36 — 👍 1    🔁 0    💬 1    📌 0

RIP those who stayed up for an early submission number... :)

09.12.2025 22:35 — 👍 2    🔁 1    💬 1    📌 0

I joined a PhD program with no publications, no research internships, and no experience in the area. I have, thus far, found the experience very enjoyable. You seem to be much further along in research than I was, keep it up!

27.11.2025 20:17 — 👍 1    🔁 0    💬 1    📌 0

Alvis most wanted

27.11.2025 08:37 — 👍 2    🔁 0    💬 1    📌 0

Thanks for sharing! :)

26.11.2025 15:16 — 👍 1    🔁 0    💬 1    📌 0

Presenting today at #BMVC2025 our follow-up work on anisotropic rotation averaging which is particularly useful in global SfM. We propose a fast solver ACD and integrate robust optimization.

If you’re at the conference, welcome to come to our poster #516!

bmvc2025.bmva.org/proceedings/...

25.11.2025 10:16 — 👍 6    🔁 2    💬 0    📌 0