Thank you! :)
Yes, we are fortunate to have access to PS5 development through our Uni.
A first pass determines which 8x8x8 chunks need to be updated; the second pass is then dispatched indirectly to perform only the required re-voxelization.
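As a rough illustration of that two-pass idea, here is a CPU-side sketch (types and names are made up for illustration; in the engine both passes are compute dispatches, with the second one dispatched indirectly):

```cpp
#include <cstdint>
#include <vector>

// Pass 1 on the GPU appends dirty chunk indices to a buffer and writes the
// indirect dispatch arguments; pass 2 then runs one workgroup per dirty chunk.

struct Chunk {
    uint32_t change_flag = 0; // set whenever the chunk's contents change
};

// Pass 1: gather the indices of chunks that need re-voxelization.
std::vector<uint32_t> gather_dirty_chunks(std::vector<Chunk>& chunks) {
    std::vector<uint32_t> dirty;
    for (uint32_t i = 0; i < chunks.size(); ++i) {
        if (chunks[i].change_flag) {
            dirty.push_back(i);        // equivalent of an atomic append on the GPU
            chunks[i].change_flag = 0; // clear the flag for the next frame
        }
    }
    return dirty; // dirty.size() becomes the indirect dispatch count
}

// Pass 2: re-voxelize only the chunks collected by pass 1.
void revoxelize_dirty(const std::vector<uint32_t>& dirty) {
    for (uint32_t chunk_index : dirty) {
        // one 8x8x8 workgroup per dirty chunk in the compute version:
        // rebuild this chunk's voxels and write them into the scene volume.
        (void)chunk_index;
    }
}
```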
A big optimization I worked on after my last post was making our voxelizer lazy. So, it only updates parts of the scene that changed. This made a huge difference for our performance and allowed us to push for larger level sizes!
Below is a high-level drawing of how it works:
A picture showing the render graph interface, which uses a builder pattern.
The engine supports both PC and PS5 through a platform-agnostic render graph I wrote.
It was a big gamble, since I had written it as a prototype in 2 weeks, but it really paid off in the end, saving us a ton of time and keeping our renderer clean.
Below is a code snippet showcasing the interface :)
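Since the snippet itself is shown as an image, here is a rough, interface-only sketch of what a builder-pattern render-graph API along these lines can look like (all names and signatures are illustrative, not the engine's actual API):

```cpp
#include <functional>
#include <string>

// Illustrative resource handles and descriptions.
struct TextureDesc { int width = 0, height = 0; /* format, usage, ... */ };
struct TextureHandle { int id = -1; };

// The builder a pass uses to declare the resources it creates, reads, and writes.
class PassBuilder {
public:
    TextureHandle create(const std::string& name, const TextureDesc& desc);
    TextureHandle read(TextureHandle tex);   // declare an input dependency
    TextureHandle write(TextureHandle tex);  // declare an output
};

class RenderGraph {
public:
    // Record a pass: the setup lambda declares resources up front, the execute
    // lambda records the platform-specific commands (PC or PS5) when the graph runs.
    template <typename Data>
    void add_pass(const std::string& name,
                  std::function<void(PassBuilder&, Data&)> setup,
                  std::function<void(const Data&)> execute);

    void compile(); // resolve dependencies, cull unused passes, alias memory
    void execute(); // run the recorded passes on the active backend
};
```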
We've used the #voxel game engine we built as a team of students to make a little diorama puzzle game :)
This has been an incredibly fun experience, building this from the ground up, working together with designers and artists!
Engine: youtu.be/uvLZn1X_R0Q
Game: buas.itch.io/zentera
I'm assuming that by cluster you're referring to a voxel brick in this context :) (correct me if I'm wrong)
If each brick has its own transform, couldn't that result in gaps between the bricks? Do you do anything to combat that, or is it not an issue?
Excited to finally show off Nanite Foliage www.youtube.com/watch?v=FJtF...
Awesome work, the demo looks amazing! :)
I'm curious: what is the voxel size difference between LODs?
Does one brick in LOD1 cover 4x4x4 bricks in LOD0?
Or does it cover 2x2x2 bricks in LOD0?
Hey, sorry for my late response, I'm happy you enjoyed reading my blog post :)
I don't currently have an RSS feed; I always share the posts on socials instead.
But I might look into RSS sometime.
Here's a video of the demo we created using the engine, showing that it's capable of being used to make actual games! :D
Cone traced reflections.
Cone traced soft shadows.
Cone traced soft shadows & ambient occlusion.
For the past 8 weeks I've been working in a team of fellow students on a voxel game engine. I've primarily been working on the graphics: creating a cross-platform render graph for us and working together with @scarak.bsky.social on our cone-traced lighting and various graphics features! :)
I decided to open source my implementation of Surfel Radiance Cascades Diffuse Global Illumination, since I'm no longer actively working on it. Hopefully the code can serve as a guide to others who can push this idea further :)
github.com/mxcop/src-dgi
It's interesting to see the CWBVH performing worse here. If I remember correctly, it usually outperforms the others, right?
Graphics Programming weekly - Issue 376 - January 26th, 2025 www.jendrikillner.com/post/graphic...
I wrote a blog post on my implementation of the Surfel maintenance pipeline from my Surfel Radiance Cascades project. Most of what I learned came from "SIGGRAPH 2021: Global Illumination Based on Surfels" a great presentation from EA SEED :)
m4xc.dev/blog/surfel-...
In case you're looking for the perfect university for 2025/2026:
Consider the game program of Breda University. :) Teaching team straight out of gamedev, C++, strong focus on graphics, and dedicated tracks for programming, art and design.
Tuition fee this year is €2,530 for EU citizens.
games.buas.nl
My new blog post explains spectral radiometric quantities, photometry and basics of spectral rendering. This is part 2/2 in a series on radiometry. Learn what it means when a light bulb has 800 lumen and how your renderer can account for that.
momentsingraphics.de/Radiometry2P...
With RC we store intervals for every probe (intervals are rays with a min and max time)
They are represented as HDR RGB radiance and a visibility term which is either 0 or 1.
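As a small, hedged sketch of what that representation and the usual radiance-cascades merge of two intervals can look like in code (the struct layout is illustrative, and the merge rule is the standard RC one rather than something stated in this thread):

```cpp
// One interval of a probe: HDR RGB radiance plus a binary visibility term.
struct Interval {
    float radiance[3]; // HDR RGB radiance gathered along the interval
    float visibility;  // 0 if the interval hit geometry, 1 if it reached its max time
};

// Merge a near interval with the far interval behind it: radiance from behind
// only contributes if the near interval didn't hit anything.
Interval merge(const Interval& near_interval, const Interval& far_interval) {
    Interval out;
    for (int c = 0; c < 3; ++c) {
        out.radiance[c] = near_interval.radiance[c]
                        + near_interval.visibility * far_interval.radiance[c];
    }
    out.visibility = near_interval.visibility * far_interval.visibility;
    return out;
}
```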
Here's the discord link :)
discord.gg/6sUYQjMj
That's correct :)
GT stands for Ground Truth, it's the brute force correct result which I want to achieve.
Now I'm working on integrating a multi-level hash grid for the Surfels based on NVIDIA's SHARC and @h3r2tic.bsky.social's fork of kajiya. Here's a sneak peek of the heatmap debug view :)
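To give an idea of what a multi-level hash grid key can look like, here is a heavily simplified sketch (this is not NVIDIA's actual SHARC code; the level selection and hash are illustrative): the cell size grows with distance from the camera, and the quantized cell coordinates plus the level are hashed together.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

struct Vec3 { float x, y, z; };

// Pick a grid level so the cell size roughly doubles each time the distance
// to the camera doubles. base_cell_size is an illustrative tuning knob.
uint32_t grid_level(const Vec3& pos, const Vec3& camera, float base_cell_size) {
    float dx = pos.x - camera.x, dy = pos.y - camera.y, dz = pos.z - camera.z;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    return static_cast<uint32_t>(std::log2(std::max(dist / base_cell_size, 1.0f)));
}

// Hash the quantized cell coordinates together with the level (FNV-1a style mix).
uint64_t grid_key(const Vec3& pos, uint32_t level, float base_cell_size) {
    float cell_size = base_cell_size * static_cast<float>(1u << level);
    uint32_t cx = static_cast<uint32_t>(static_cast<int32_t>(std::floor(pos.x / cell_size)));
    uint32_t cy = static_cast<uint32_t>(static_cast<int32_t>(std::floor(pos.y / cell_size)));
    uint32_t cz = static_cast<uint32_t>(static_cast<int32_t>(std::floor(pos.z / cell_size)));
    uint64_t h = 1469598103934665603ull;
    for (uint64_t v : { (uint64_t)cx, (uint64_t)cy, (uint64_t)cz, (uint64_t)level }) {
        h ^= v;
        h *= 1099511628211ull;
    }
    return h;
}
```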
I've been validating the results of my Surfel Radiance Cascades, comparing it to ground-truth and using the white furnace test. I think I'm really close to GT now, attached are 2 images, one is GT, the other is SRC :)
I've been improving the Surfel Radiance Cascades since my last post, ironing out bugs one by one. It's starting to get difficult to see the difference between my ground truth renderer and the SRC one :)
(in this rather low frequency scene, more high frequency tests coming soon)
After many weeks of effort I finally have radiance on screen now! I've still got issues mostly related to my Surfel Acceleration Structure, however I'd say the results are already looking quite promising. It has kind of a hand painted look to it now :)
I think that this is indeed the best next step, the acceleration structure I have right now is starting to become the bottleneck.
Thanks for the tip!
It's time for an upgrade :)
Surfel debugger showing Cascade statistics to get an idea of what kind of work we're going to be doing.
A Surfel-covered terrain with some debug GUI overlaid on top.
I finally got around to adding ImGui to my project, it was about time after logging info for a while :p
I'm now working on splitting the Surfels into Cascades, after which I can start implementing the radiance gathering, merging, and finally applying the diffuse irradiance to the screen :)
Thanks! We have a discord community where you can ask questions in case you get stuck :)
Warning: once you go RC there's no going back :p (jk)
discord.gg/DezF3NF3
Awesome! :)
Is the attached image 1 frame of accumulation? (single-shot)
If not, I'm curious: how many frames does it take to converge to that level, on average?
Something I'm considering for the acceleration structure is an extra layer of indirection (Surfel Clusters), which would allow me to store an offset & count in each cell.
Instead of storing the index of each Surfel in the cell itself.
For the acceleration structure I'm using a simple uniform grid at the moment, which can store a fixed number of Surfels per cell.
This is quite annoying, because Surfels closer to the camera are physically smaller, so those cells need a higher capacity and end up using a lot of memory.
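For illustration, here is a rough sketch of the two layouts being compared (names and sizes are made up): the current fixed-capacity cell versus a cell that only stores an offset and a count into one packed index list.

```cpp
#include <cstdint>
#include <vector>

// Current approach: every cell reserves a fixed number of Surfel slots, so the
// capacity has to be sized for the worst case (cells close to the camera).
constexpr int MAX_SURFELS_PER_CELL = 31; // illustrative capacity
struct FixedCell {
    uint32_t count;
    uint32_t surfel_indices[MAX_SURFELS_PER_CELL];
};

// Considered approach: one extra level of indirection. Each cell stores only an
// offset and a count into a single tightly packed index list, so the per-cell
// capacity is no longer fixed and far less memory is wasted.
struct ClusterCell {
    uint32_t offset; // first entry in packed_surfel_indices for this cell
    uint32_t count;  // number of Surfels overlapping this cell
};
struct ClusteredGrid {
    std::vector<ClusterCell> cells;              // one entry per grid cell
    std::vector<uint32_t> packed_surfel_indices; // all cells' Surfel indices back to back
};
```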
~2500 Surfels distributed across the screen using gbuffer data.
This week I've been implementing Surfels for the first time; the most challenging part so far has been the spawning algorithm: stopping Surfels from spawning on top of each other. The spatial look-up acceleration structure is also something I still need to iterate on :)
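As a hedged sketch of that coverage test during spawning (not the actual implementation; the neighbour query through the acceleration structure is reduced to a flat loop here): before placing a new Surfel at a G-buffer position, check whether an existing Surfel already covers that point.

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

struct Surfel {
    Vec3 position;
    float radius;
};

// Returns true if any nearby Surfel already covers the point.
bool is_covered(const Vec3& p, const std::vector<Surfel>& nearby_surfels) {
    for (const Surfel& s : nearby_surfels) {
        float dx = p.x - s.position.x;
        float dy = p.y - s.position.y;
        float dz = p.z - s.position.z;
        if (dx * dx + dy * dy + dz * dz < s.radius * s.radius) {
            return true; // point already covered, don't spawn here
        }
    }
    return false;
}

// Spawn a Surfel at a reconstructed G-buffer position only if nothing covers it yet.
void try_spawn(const Vec3& gbuffer_pos, std::vector<Surfel>& surfels, float spawn_radius) {
    if (!is_covered(gbuffer_pos, surfels)) {
        surfels.push_back(Surfel{ gbuffer_pos, spawn_radius });
    }
}
```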
I'm using a method we call SPWI (Screen Probes World Intervals), which means probes are distributed across the screen. But they are actually placed in the world, and their intervals (rays) are traced in world space.
So, in this demo lighting is also gathered outside of the screen-space.
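A minimal sketch of that placement idea (the types and reconstruction are illustrative, not the project's actual code): each probe belongs to a screen tile, but its position comes from the G-buffer, and its intervals are plain world-space ray segments with a min and max distance.

```cpp
struct Vec3 { float x, y, z; };

// A screen probe whose position has been reconstructed into world space
// from the G-buffer depth at its pixel.
struct Probe {
    Vec3 world_position;
};

// One interval of a probe: a world-space ray segment between t_min and t_max.
struct IntervalRay {
    Vec3 origin;
    Vec3 direction;
    float t_min;
    float t_max;
};

// Because the interval is traced against world-space scene data, it can gather
// lighting from geometry that is not visible on screen.
IntervalRay make_interval(const Probe& probe, const Vec3& dir, float t_min, float t_max) {
    return IntervalRay{ probe.world_position, dir, t_min, t_max };
}
```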