Naver Labs Europe is organizing the 4th edition of its Workshop on AI for Robotics in the French Alps (Grenoble). This year's topic is 'Spatial AI', and registration is open!
29.07.2025 16:14 · 16 likes · 4 reposts · 0 replies · 0 quotes

@ericdexheimer.bsky.social
PhD student at Dyson Robotics Lab, Imperial College London http://edexheim.github.io
Looking for a multi-view depth method that just works?
We're excited to share MVSAnywhere, which we will present at #CVPR2025. MVSAnywhere produces sharp depths, generalizes well, is robust to all kinds of scenes, and is scale-agnostic.
More info:
nianticlabs.github.io/mvsanywhere/
Cool work! Do you think there's any architectural bias that prevents learning extrema rather than only minima or only maxima? Or is it mostly the repeatability issue? It has me thinking about how classical SIFT DoG and Harris handle both light and dark features.
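As an aside on that last point, here is a minimal sketch (my own illustration, not code from any of the work above) of why Difference-of-Gaussians responds to both polarities: a bright blob shows up as a local maximum of the DoG response and a dark blob as a local minimum, so a detector that keeps both extrema handles light and dark features symmetrically.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def dog_extrema(img, sigma1=1.0, sigma2=1.6, thresh=0.02):
    """Difference-of-Gaussians response, keeping BOTH polarities:
    local maxima correspond to bright blobs, local minima to dark blobs."""
    dog = gaussian_filter(img, sigma1) - gaussian_filter(img, sigma2)
    maxima = (dog == maximum_filter(dog, size=3)) & (dog > thresh)
    minima = (dog == minimum_filter(dog, size=3)) & (dog < -thresh)
    return maxima, minima

# A bright blob on a dark background, and its photometric inverse.
img = np.zeros((64, 64))
img[32, 32] = 1.0
img = gaussian_filter(img, 1.0)       # bright blob centered at (32, 32)
bright_max, _ = dog_extrema(img)      # the bright blob fires as a DoG maximum
_, dark_min = dog_extrema(0.5 - img)  # the inverted (dark) blob fires as a minimum
```

A detector that thresholded only on `dog > thresh` would miss the dark blob entirely, which is the asymmetry the question is getting at.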
11.03.2025 22:02 · 1 like · 0 reposts · 1 reply · 0 quotes

This is a fun example with a continuous transition between distinct 3D scenes!
25.02.2025 19:22 · 9 likes · 0 reposts · 0 replies · 1 quote

Here's a reconstruction of a movie establishing shot.
25.02.2025 19:22 · 11 likes · 0 reposts · 1 reply · 1 quote

We've had fun testing the limits of MASt3R-SLAM on in-the-wild videos. Here's the drone video of a Minnesota bowling alley that we've always wanted to reconstruct! Different scene scales, dynamic objects, specular surfaces, and fast motion.
25.02.2025 19:22 · 46 likes · 5 reposts · 3 replies · 3 quotes

I'm not too familiar with it, but it seems there is some equivalence noted in Section 3.2 of "Feature preserving point set surfaces based on non-linear kernel regression".
13.01.2025 16:03 · 3 likes · 0 reposts · 0 replies · 0 quotes

Introducing MASt3R-SLAM, the first real-time monocular dense SLAM with MASt3R as a foundation.
Easy to use like DUSt3R/MASt3R: from an uncalibrated RGB video it recovers accurate, globally consistent poses and a dense map.
With @ericdexheimer.bsky.social* @ajdavison.bsky.social (*Equal Contribution)