Code is now out! Try it for yourself here: github.com/abhimadan/st...
30.07.2025 21:42
Also: this paper was recognized with a best paper award at SGP! Huge thanks to the organizers & congrats to the other awardees.
I was super lucky to work with Yousuf on this one; he's truly the mastermind behind it all!
Actually, Yousuf did a quick experiment which is related (though a different formulation), using @markgillespie64.bsky.social et al's Discrete Torsion Connection markjgillespie.com/Research/Dis.... You get fun spiraling log maps! (image attached)
02.07.2025 18:54
Yeah! That diffused frame is "the most regular frame field in the sense of transport along geodesics from the source", so you get out a log map that is as-regular-as-possible, in the same sense.
You could definitely use another frame field, and you'd get "log maps" warped along that field.
Website: www.yousufsoliman.com/projects/the...
Paper: www.yousufsoliman.com/projects/dow...
Code (C++ library): geometry-central.net/surface/algo...
Code (python bindings): github.com/nmwsharp/pot...
(point cloud code not available yet, let us know if you're interested!)
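If you want to try it from Python, a minimal sketch looks like the following. I'm writing it against the MeshVectorHeatSolver interface the bindings already expose for the earlier vector heat method, so treat the exact entry point for the affine variant as an assumption:

```python
# Minimal usage sketch. Assumes the MeshVectorHeatSolver interface from the earlier
# vector heat method; the affine solver may be exposed under a different name/option.
import potpourri3d as pp3d

V, F = pp3d.read_mesh("mesh.obj")             # vertices (V, 3) and faces (F, 3)
solver = pp3d.MeshVectorHeatSolver(V, F)      # prefactors the relevant linear systems

source_vert = 0
logmap = solver.compute_log_map(source_vert)  # (V, 2) local 2D coordinates about the source
```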
We give two variants of the algorithm, and show use cases for many problems like averaging values on surfaces, decaling, and stroke-aligned parameterization. It even works on point clouds!
02.07.2025 06:23
Instead of the usual VxV scalar Laplacian, or a 2Vx2V vector Laplacian, we build a 3Vx3V homogeneous "affine" Laplacian! This Laplacian allows new algorithms for simpler and more accurate computation of the logarithmic map, since it captures rotation and translation at once.
02.07.2025 06:23
Previously in "The Vector Heat Method", we computed log maps with short-time heat flow, via a vector-valued Laplace matrix rotating between adjacent vertex tangent spaces.
The big new idea is to rotate **and translate** vectors, by working in homogeneous coordinates.
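For anyone who hasn't seen the homogeneous-coordinate trick: a rotation plus a translation of 2D tangent-plane coordinates can be packed into a single 3x3 matrix acting on (x, y, 1). A toy numpy illustration of just that trick (not the actual Laplacian construction):

```python
import numpy as np

def homogeneous_block(theta, t):
    """3x3 block: rotate by theta, then translate by t, in a 2D tangent plane."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, t[0]],
                     [s,  c, t[1]],
                     [0., 0., 1.0]])

# Carry local coordinates (0.2, 0.1) from one vertex's tangent frame to a neighbor's
# (the rotation angle and offset here are made-up numbers).
A = homogeneous_block(theta=0.7, t=[0.5, -0.3])
p = np.array([0.2, 0.1, 1.0])   # homogeneous coordinates (x, y, 1)
print(A @ p)                    # rotated *and* translated, in one matrix product
```

Roughly speaking, it's 3x3 blocks like this, rather than the 2x2 rotation blocks of the vector heat method, that populate the 3Vx3V affine Laplacian.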
Logarithmic maps are incredibly useful for algorithms on surfaces--they're local 2D coordinates centered at a given source.
Yousuf Soliman and I found a better way to compute log maps w/ fast short-time heat flow in "The Affine Heat Method" presented @ SGP2025 today! 🧵
Holding SIGGRAPH Asia 2026 in Malaysia is a slap in the face to the rights of LGBTQ+ people. Especially now, when underrepresented people need as much support as we can possibly give them! Angry like me? Sign this open letter to let them know. 🏳️‍⚧️🏳️‍🌈
docs.google.com/document/d/1...
Sampling points on an implicit surface is surprisingly tricky, but we know how to cast rays against implicit surfaces! There's a classic relationship between line-intersections and surface-sampling, which turns out to be quite useful for geometry processing.
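To give the flavor of that relationship (a toy sketch only, not the paper's sampler): shoot a line through the domain, find where the implicit function changes sign along it, and refine each crossing by bisection; every crossing is a point exactly on the surface. How the lines are distributed is where the actual theory comes in.

```python
import numpy as np

def sample_surface_points_via_lines(f, n_lines=100, steps=256, box=1.0):
    """Toy sampler: intersect random lines with the zero level set of f(p)."""
    rng = np.random.default_rng(0)
    pts = []
    for _ in range(n_lines):
        o = rng.uniform(-box, box, 3)                    # random point in the box
        d = rng.normal(size=3); d /= np.linalg.norm(d)   # random direction
        t = np.linspace(-2 * box, 2 * box, steps)
        vals = np.array([f(o + ti * d) for ti in t])
        crossings = np.nonzero(np.sign(vals[:-1]) != np.sign(vals[1:]))[0]
        for i in crossings:                              # bisect each bracketed root
            lo, hi = t[i], t[i + 1]
            for _ in range(30):
                mid = 0.5 * (lo + hi)
                if np.sign(f(o + mid * d)) == np.sign(f(o + lo * d)):
                    lo = mid
                else:
                    hi = mid
            pts.append(o + 0.5 * (lo + hi) * d)
    return np.array(pts)

sphere = lambda p: np.linalg.norm(p) - 0.8   # example implicit surface (SDF of a sphere)
print(sample_surface_points_via_lines(sphere).shape)
```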
10.06.2025 18:01
Thank you! There's definitely a low-frequency bias when stochastic preconditioning is enabled, but we only use it for the first ~half of training, then train as-usual. The hypothesis is that the bias in the 1st half helps escape bad minima, then we fit high-freqs in the 2nd half. Coarse to fine!
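In schedule form, that heuristic looks roughly like this (illustrative only; the constant is made up and the exact annealing in the paper may differ):

```python
def alpha_schedule(step, total_steps, alpha0=0.02):
    """Illustrative: stochastic preconditioning for the first ~half of training, off afterwards."""
    half = total_steps // 2
    if step >= half:
        return 0.0                           # 2nd half: train as usual, fit high frequencies
    return alpha0 * (1.0 - step / half)      # 1st half: low-frequency bias, annealed toward 0
```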
08.06.2025 05:57
My child's doll and tools I captured as 3D Gaussians, turned digital with collisions and dynamics. We are getting closer to bridging the gap between the world we can touch and digital 3D. Experience the bleeding edge at #NVIDIA Kaolin hands-on lab, #CVPR2025! Wed, 8-noon. tinyurl.com/nv-kaolin-cv...
06.06.2025 15:05
Check out Abhishek's research!
I was honestly surprised by this result: classic Barnes-Hut already builds a good spatial hierarchy for approximating kernel summations, but you can do even better by adding some stochastic sampling, for significant speedups on the GPU @ matching average error.
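To give a feel for the "stochastic sampling" ingredient in isolation (a generic Monte Carlo toy, not the paper's hierarchical estimator): a big kernel summation can be estimated without bias from a random subset of its terms.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)                       # query point
Y = rng.normal(size=(100_000, 3))            # source points
w = rng.uniform(size=100_000)                # source weights

kernel = lambda r: 1.0 / (r + 1e-6)          # e.g. a 1/r-style kernel

# Exact summation: sum_j w_j * K(|x - y_j|)
exact = np.sum(w * kernel(np.linalg.norm(Y - x, axis=1)))

# Stochastic estimate from m uniformly sampled terms, reweighted by N/m (unbiased)
m = 2_000
idx = rng.integers(0, len(Y), size=m)
estimate = (len(Y) / m) * np.sum(w[idx] * kernel(np.linalg.norm(Y[idx] - x, axis=1)))

print(exact, estimate)
```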
Ah yes absolutely. That's a great example, we totally should have cited it!
When we looked around we found mannnnnny "coarse-to-fine"-like schemes appearing in the context of particular problems or architectures. As you say, what most excited us here is having a simple+general option.
Thank you for the kind words :) The technique is very much in-the-vein of lots of related ideas in ML, graphics, and elsewhere, but hopefully directly studying it & sharing is useful to the community!
04.06.2025 22:10
We did not try it w/ the Gaussians in this project (we really focused on the "query an Eulerian field" setting, which is not quite how Gaussian rendering works).
There are some very cool projects doing related things in that setting:
- ubc-vision.github.io/3dgs-mcmc/
- diglib.eg.org/items/b8ace7...
Tagging @selenaling.bsky.social and @merlin.ninja, who are both on here it turns out!
03.06.2025 01:19
website: research.nvidia.com/labs/toronto...
arxiv: arxiv.org/abs/2505.20473
code: github.com/iszihan/stoc...
Kudos to Selena Ling, the lead author of this work, done during her internship with us at NVIDIA. Reach out to Selena or myself if you have any questions!
Closing thought: In geometry, half our algorithms are "just" Laplacians/smoothness/heat flow under the hood. In ML, half our techniques are "just" adding noise in the right place. Unsurprisingly, these two tools work great together in this project. I think there's a lot more to do in this vein!
03.06.2025 00:43
Geometric initialization is a commonly-used technique to accelerate SDF field fitting, yet it often results in disastrous artifacts for non-object-centric scenes. Stochastic preconditioning also helps to avoid floaters, both with and without geometric initialization.
03.06.2025 00:43
Neural field training can be sensitive to changes in hyperparameters. Stochastic preconditioning makes training more robust to hyperparameter choices, shown here in a histogram of PSNRs from fitting preconditioned and non-preconditioned fields across a range of hyperparameters.
03.06.2025 00:43
We argue that this is a quick and easy form of coarse-to-fine optimization, applicable to nearly any objective or field representation. It matches or outperforms custom-designed policies and staged coarse-to-fine schemes.
03.06.2025 00:43
Surprisingly, optimizing this blurred field to fit the objective greatly improves convergence, and in the end we anneal α to 0 and are left with an ordinary un-blurred field.
03.06.2025 00:43
And implementing our method requires changing just a few lines of code!
03.06.2025 00:43
It's as simple as perturbing query locations according to a normal distribution. This produces a stochastic estimate of the blurred neural field, with the level of blur proportional to a scale parameter α.
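In code, those few lines amount to something like this minimal PyTorch-style sketch (`field` is any neural field callable and `query_pts` a batch of query points; both names are placeholders):

```python
import torch

def query_blurred(field, query_pts, alpha):
    """Stochastic preconditioning sketch: jitter query locations with Gaussian noise
    of scale alpha, giving a one-sample estimate of the alpha-blurred field."""
    if alpha > 0:
        query_pts = query_pts + alpha * torch.randn_like(query_pts)
    return field(query_pts)

# Train against query_blurred(...) instead of field(...), and anneal alpha -> 0
# so the final optimized field is the ordinary, un-blurred one.
```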
03.06.2025 00:43
Selena's #Siggraph25 work found a simple, nearly one-line change that greatly eases neural field optimization for a wide variety of existing representations.
โStochastic Preconditioning for Neural Field Optimizationโ by Selena Ling, Merlin Nimier-David, Alec Jacobson, & me.
Fun new paper at #SIGGRAPH2025:
What if instead of two 6-sided dice, you could roll a single "funky-shaped" die that gives the same statistics (e.g., 7 is twice as likely as 4 or 10)?
Or make fair dice in any shape, e.g., dragons rather than cubes?
That's exactly what we do! 1/n
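(The target statistics quoted above are just the two-dice sum distribution; here's a quick sanity check of that 7-vs-4-or-10 factor of two.)

```python
from collections import Counter
from itertools import product

# Sum of two fair 6-sided dice: P(7) = 6/36 is exactly twice P(4) = P(10) = 3/36.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
print({s: f"{c}/36" for s, c in sorted(counts.items())})
```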
The Symposium on Geometry Processing is an amazing venue for geometry research: meshes, point clouds, neural fields, 3D ML, etc. Reviews are quick and high-quality.
The deadline is in ~10 days. Consider submitting your work; I'm planning to submit!
sgp2025.my.canva.site/submit-page-...
Hi BlueSky! I'm trying to get back to a "write" relationship with social media after hiding from it for a while.
I like geometry research, useful code, pierogies [sic], triangles, outdoorsy life, etc. I mainly post about research/software, but glad to chat about anything.