I'll be at GDC next week! With rising RAM costs and ever-growing asset sizes, texture compression has never mattered more. Get in touch if you'd like to meet and chat.
I'm excited to announce the release of spark.js 0.1, now with support for WebGL!
www.ludicon.com/castano/blog...
#webgl #webgpu #sparkjs
That's quite impressive. I'll have to give hardware decoding a try. It's unfortunate that the Vulkan video extensions are not exposed on any mobile device, where the power savings would make them much more attractive.
Reminds me of the character rendering pipeline in Hades: www.youtube.com/watch?v=Vj9e...
Real-time compression would address the memory footprint problem, but the other challenges remain, and it's unclear to me that the hardware decoder would perform better than Bink.
For Vulkan we provide SPIR-V shaders and example code showing how to use them in your app. You would be responsible for resource management and kernel dispatch.
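To give a feel for the dispatch side: block-compressed formats encode 4x4 texel blocks, so the kernel is dispatched over the block grid, not the texel grid. A minimal sketch (not Spark's actual code; the 8x8 workgroup size is a hypothetical choice):

```python
# Sizing a compute dispatch for a block-compression kernel.
# BC/ASTC-style formats encode 4x4 texel blocks, so one invocation
# per block means dispatching over ceil(width/4) x ceil(height/4).
BLOCK_DIM = 4
WG_X, WG_Y = 8, 8  # hypothetical workgroup size

def div_round_up(a, b):
    return (a + b - 1) // b

def dispatch_size(width, height):
    blocks_x = div_round_up(width, BLOCK_DIM)
    blocks_y = div_round_up(height, BLOCK_DIM)
    # workgroup counts to pass to vkCmdDispatch(cmd, x, y, 1)
    return div_round_up(blocks_x, WG_X), div_round_up(blocks_y, WG_Y)

print(dispatch_size(1021, 509))  # → (32, 16)
```

The resulting pair is what you'd hand to vkCmdDispatch after binding the pipeline and descriptor sets.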
I’m curious how the Vulkan video extensions are working for you. How’s the latency and what kind of decode rates are you able to get?
Yes, the base texture and mips are block-compressed in real time. Spark is very fast and produces better results than all other real-time codecs. It uses a constrained search space and is optimized for perceptual metrics. Happy to provide specific numbers for your target hardware.
Thanks, happy to help co-develop a solution for your use case, including Switch support.
spark.js is just a WebGPU runtime. The Spark codecs should run on both Switch consoles, but access to the Nintendo SDK requires demand from a licensed studio. I should be able to provide Switch support the moment a developer requests it.
Runtime mipmap generation can reduce texture transmission by 25%, with no loss of quality.
Real-time doesn’t mean compromise: the same algorithms used offline can run efficiently on the GPU.
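For the curious, the 25% figure falls out of the mip-chain geometry: each level is a quarter the size of the one above it, so a full chain weighs about 4/3 of the base level. A quick back-of-the-envelope check (illustrative, not from spark.js):

```python
# Transmission cost of a full mip chain vs. the base level alone,
# measured as a fraction of the base level's size.
def mip_chain_size(levels):
    # each mip level is 1/4 the size of the previous one
    return sum(0.25 ** i for i in range(levels))

full = mip_chain_size(12)  # e.g. a 2048x2048 texture has 12 mip levels
base_only = 1.0
savings = 1 - base_only / full
print(f"{savings:.1%}")  # → 25.0%
```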
Learn about the spark.js implementation here:
www.ludicon.com/castano/blog...
I've been working on something new since August 2025 and have decided to finally spill the beans:
Introducing "Project Echo", a deterministic Record & Replay tool for PS5:
www.youtube.com/watch?v=K_sd...
Please read the video description and let me know your brutally honest feedback!
I’ve recently been tinkering with MLPs, exploring different ways they might be used in rendering to get a feel for their potential. I also put together a blog post with some initial impressions. interplayoflight.wordpress.com/2026/02/10/a...
For offline compression, NVTT used to be the best option, but these days most people have forked or reimplemented it and amalgamated several codecs. Kram is a good example. I wish there were a more lightweight option for texture processing, in the same spirit as meshoptimizer.
I generated the UASTC glTF after taking the screenshots, so I don't have the exact same shot. UASTC quality is pretty good, but it takes 3x the disk space of AVIF and doesn't scale down to lower video memory requirements.
And my fork of the webgpu demo that inspired this article: github.com/Ludicon/webg...
The tool that I built to update the model with AVIF textures:
github.com/Ludicon/gltf...
I've put together an updated version of the Sponza scene with uncompressed PNG and compressed AVIF textures. I wrote about the process and compared the results against KTX.
www.ludicon.com/castano/blog...
#webgpu #web3d #sparkjs
While working on spark.js, I realized that common normal map compression formats weren’t supported in popular frameworks like three.js. I added the necessary support to three.js and wrote an article to shed some light on the topic:
ludicon.com/castano/blog...
#webgpu #webgl #threejs #sparkjs
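The idea behind two-channel normal map formats like BC5 and EAC_RG is that a unit-length normal only needs two stored components; the shader reconstructs the third from the unit constraint. A sketch of that reconstruction, written in Python for clarity (the real thing lives in the fragment shader):

```python
import math

# RG normal maps store only x and y in [0,1] texture range;
# z is rebuilt from the unit-length constraint: z = sqrt(1 - x^2 - y^2).
def reconstruct_normal(r, g):
    # unpack from [0,1] to [-1,1]
    x, y = 2.0 * r - 1.0, 2.0 * g - 1.0
    # clamp guards against rounding pushing x^2 + y^2 slightly above 1
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

print(reconstruct_normal(0.5, 0.5))  # → (0.0, 0.0, 1.0) — flat surface
```

Dropping z is what makes the two-channel formats a good fit: all the encoder bits go to the two components that actually vary.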
Graphics Programming weekly - Issue 426 - February 1st, 2026 www.jendrikillner.com/post/graphic...
For what I'm working on, I can change any file, asset, shader, or piece of C++ code and see the changes in under a second.
Shaders take roughly 100ms, C++ code 500ms.
Using my own custom tech and Live++.
Really worthwhile reading from @icastano.bsky.social! If you're building with WebGPU, spark.js gives you new choices for texture compression that weren't possible with WebGL.
Also, I need to update my older blog post. 😅
Choosing texture formats for WebGPU apps
www.ludicon.com/castano/blog...
Loading large scenes and running out of VRAM? This guide is for you. A follow-up to @donmccurdy.com's guide, with a new option: ship AVIF/WebP over the network, with native block-compressed textures in VRAM.
#webgpu #webgl
spark.js has been featured on webgpu.com!
"Real-Time Texture Transcoding for Faster Asset Delivery"
www.webgpu.com/showcase/spa...
#webgpu #threejs #gamedev
I'm excited to announce that @maxonvfx.bsky.social has licensed Spark for their cutting-edge Redshift renderer! Looking forward to supporting their work.
Fuck Donald Trump
We have to impeach him.
I've also submitted a small spark.js update that enables the use of these formats when using the three.js GLTF loader:
github.com/Ludicon/spar...
Shaved a few more bytes too! The package is now down to 256KB!
#webgpu #threejs #sparkjs
three.js r182 was just released! 🎉
github.com/mrdoob/three.js/rel
I contributed support for RG normal maps, enabling BC5 and EAC_RG normal map compression in both the WebGL and WebGPU renderers.
#webgpu #webgl #threejs
After nine years of development, meshoptimizer has reached its first major version, 1.0!
This release focuses on improvements in clusterization and simplification as well as stabilization. Here's a release announcement with more details on past, present and future; please RT!
meshoptimizer.org/v1