
Max Liani

@maxliani.bsky.social

Tracing rays at NVIDIA. Former lighting artist, working on computer graphics rendering tech for all humans and robots alike. I mostly post stuff about light transport simulation, and my hobby project Workbench.

1,112 Followers  |  49 Following  |  490 Posts  |  Joined: 27.10.2024

Posts by Max Liani (@maxliani.bsky.social)

That’s also how my old renderer “Glimpse” worked; it is still in active development at Netflix Animation.

In Glimpse you could mark an arbitrary point in the hierarchy as an “instance group”, making it a BVH nesting level (an instance of instances), to avoid flattening too much and producing exorbitantly large scenes.

25.02.2026 08:36 — 👍 3    🔁 0    💬 0    📌 0
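The instance-group idea can be sketched roughly like this (invented structures and names for illustration, not Glimpse's actual code): flattening stops at user-marked nodes, so everything below them becomes a nested BVH referenced by a single top-level entry.

```cpp
#include <string>
#include <vector>

// Hypothetical scene-graph node; names and flags are invented.
struct SceneNode {
    std::string name;
    bool instanceGroup = false;          // user-marked BVH nesting point
    bool renderable = false;
    std::vector<SceneNode*> children;
};

// Flattening stops at nodes marked as instance groups: everything below
// them becomes a nested BVH referenced by a single instance, instead of
// being expanded into the top-level acceleration structure.
static void collectTopLevel(SceneNode* node,
                            std::vector<SceneNode*>& topLevelLeaves)
{
    if (node->instanceGroup || node->renderable) {
        topLevelLeaves.push_back(node);  // one entry, builds its own sub-BVH
        return;
    }
    for (SceneNode* child : node->children)
        collectTopLevel(child, topLevelLeaves);
}
```

A subtree with thousands of renderables under one instance group thus contributes a single top-level leaf, keeping the flattened scene small.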

Love the thought 😊 I miss working on artist tools.
In a nutshell, instances are discovered as part of DAG traversal. Transform, visibility, and materials (the attribute state in general) accumulate during traversal and get stored in the instance when traversal reaches a renderable object.

25.02.2026 08:36 — 👍 5    🔁 0    💬 2    📌 0
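That traversal can be sketched as follows (a minimal illustration with invented names; a single float stands in for the full transform matrix and attribute state). Note how a renderable reachable through two parents is discovered twice, yielding two instances with different accumulated state.

```cpp
#include <vector>

// Illustrative DAG node; the 1D "translate" stands in for a transform
// matrix and the rest of the attribute state.
struct DagNode {
    float translate = 0.0f;
    bool renderable = false;
    std::vector<DagNode*> children;
};

struct Instance {
    const DagNode* object;
    float worldTranslate;                // state accumulated at discovery
};

// State accumulates down the DAG; when traversal reaches a renderable
// object, the accumulated state is baked into a new instance.
static void discover(const DagNode* node, float parentTranslate,
                     std::vector<Instance>& out)
{
    float world = parentTranslate + node->translate;
    if (node->renderable) {
        out.push_back({node, world});
        return;
    }
    for (const DagNode* child : node->children)
        discover(child, world, out);
}
```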

Also, because the scene graph is a DAG, not a tree, one can add parents by holding Ctrl during drag & drop and create instances that way.

25.02.2026 08:02 — 👍 4    🔁 0    💬 1    📌 0

Last weekend I refined the scene tree view in Workbench. I used the drag & drop support in dear imgui to implement DAG reordering and reparenting, plus renaming. Custom behavior and drop-region drawing show where the node would be inserted, and indicate where and why it cannot be.

25.02.2026 07:54 — 👍 24    🔁 2    💬 1    📌 0
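One part of such a tool that is easy to get wrong is deciding when a drop is legal. A minimal sketch of the check a DAG reparent needs (invented names, not Workbench's code): a target is invalid if it is the dragged node itself or one of its descendants, since reparenting there would create a cycle.

```cpp
#include <vector>

// Hypothetical node in a DAG scene tree; multiple parents are allowed.
struct TreeNode {
    std::vector<TreeNode*> parents;
    std::vector<TreeNode*> children;
};

// True if `maybeDesc` is `node` itself or reachable below it.
static bool isDescendantOrSelf(const TreeNode* node, const TreeNode* maybeDesc)
{
    if (node == maybeDesc) return true;
    for (const TreeNode* child : node->children)
        if (isDescendantOrSelf(child, maybeDesc)) return true;
    return false;
}

// Dropping `dragged` under `newParent` must not create a cycle; this is
// the kind of predicate behind "display where and why it cannot".
static bool canReparent(const TreeNode* dragged, const TreeNode* newParent)
{
    return !isDescendantOrSelf(dragged, newParent);
}
```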

That looks insane. Very cool

22.02.2026 23:10 — 👍 1    🔁 0    💬 1    📌 0

Yes, exactly. It probably found some very bizarre issue nobody would have ever found otherwise. But it’s just the start of the rabbit hole, I think. I am hopeful there are some good lessons in it for me.

21.02.2026 11:04 — 👍 2    🔁 0    💬 0    📌 0

Uh, it crashed after half a million iterations 🧐

21.02.2026 08:33 — 👍 6    🔁 0    💬 1    📌 0

100,000 fuzz iterations. I'd say this part of the system works :)

21.02.2026 06:28 — 👍 28    🔁 2    💬 2    📌 0

Oh boy, I am letting Cursor run automated mutation testing in Workbench... 🍿

21.02.2026 02:09 — 👍 1    🔁 0    💬 0    📌 0

Yep, just another day.

18.02.2026 09:24 — 👍 1    🔁 0    💬 0    📌 0

I have just spent 7 hours straight, barely leaving my desk, looking at the wrong problem. I thought I was lost in reindexing hell, but the complex data structure I just made works just fine; the problem was in the incomplete input data all along. How is your day?

17.02.2026 07:35 — 👍 11    🔁 0    💬 1    📌 0

I misunderstood your question yesterday. Yes, DOF through mirrors IRL behaves this way.

14.02.2026 20:42 — 👍 0    🔁 0    💬 0    📌 0

The behavior in my implementation is due to the average in the measuring region gradually shifting from one object to the other as the pointer moves.

14.02.2026 08:27 — 👍 0    🔁 0    💬 0    📌 0

I don’t know, but I guess it depends on the lens AF motor and the sensing hardware. I know some cameras use an iterative algorithm to refine focus, while others have dual-pixel sensors that measure the phase change and can gauge focus exactly at any time.

14.02.2026 08:27 — 👍 0    🔁 0    💬 1    📌 0

Estimation of the focus distance works through reflections too, as long as they are sharp enough. I use a path solid-angle heuristic to determine whether the "sight" stopped at the surface or peered through along the scattered path.

14.02.2026 01:02 — 👍 12    🔁 0    💬 1    📌 0
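One plausible reading of such a heuristic (an assumption on my part, not his exact formula): track an angular spread along the path, where specular bounces keep it tight and rough bounces widen it. If the spread at a hit stays under a threshold, the reflection counts as sharp enough and focusing continues through it; otherwise the sight is considered stopped at the surface.

```cpp
// Sketch of a path-spread heuristic; the scaling constant and threshold
// are placeholder values, not tuned numbers from the renderer.
struct PathSpread {
    float angle = 0.0f;                  // accumulated spread, radians
};

// Rough surfaces scatter the path cone; a perfect mirror adds nothing.
static void addBounce(PathSpread& s, float surfaceRoughness)
{
    s.angle += surfaceRoughness * 0.5f;
}

// Keep refining the focus distance through this reflection only while
// the path is still tight enough to resolve what lies beyond it.
static bool continueThroughReflection(const PathSpread& s,
                                      float maxSpread = 0.02f)
{
    return s.angle < maxSpread;
}
```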

Finally, I got around to implementing DOF and wiring it to the camera manipulator system. Click and drag samples the scene to estimate the focus distance, while scrolling modulates the aperture.

14.02.2026 01:02 — 👍 34    🔁 1    💬 1    📌 0
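The focus distance and aperture feed the standard thin-lens camera model; a minimal sketch of that model (the textbook technique, not Workbench's code): jitter the ray origin on the lens disk and re-aim it at the point the original pinhole ray reaches on the plane of focus.

```cpp
#include <cmath>

struct Ray { float ox, oy, oz, dx, dy, dz; };

// Thin-lens sampling: offset the origin within the aperture and re-aim
// at the focus point. Objects at focusDistance stay sharp; everything
// else blurs in proportion to the aperture radius. (lensU, lensV) is a
// 2D sample on the unit disk; focusDistance is parametric along the ray.
static Ray thinLens(const Ray& pinhole, float lensU, float lensV,
                    float apertureRadius, float focusDistance)
{
    // Point the original pinhole ray reaches on the plane of focus.
    float fx = pinhole.ox + pinhole.dx * focusDistance;
    float fy = pinhole.oy + pinhole.dy * focusDistance;
    float fz = pinhole.oz + pinhole.dz * focusDistance;

    Ray r = pinhole;
    r.ox += lensU * apertureRadius;      // offset origin on the lens disk
    r.oy += lensV * apertureRadius;

    float dx = fx - r.ox, dy = fy - r.oy, dz = fz - r.oz;
    float len = std::sqrt(dx * dx + dy * dy + dz * dz);
    r.dx = dx / len; r.dy = dy / len; r.dz = dz / len;
    return r;
}
```

With a zero aperture the model degenerates back to the pinhole camera, which makes a handy sanity check.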

Yes it does. You have to point the AI to a test source file that shows the convention you want it to follow. I configured agent instructions for that. So far, when I ask it to write a new test, it’s been spot on.

11.02.2026 08:23 — 👍 1    🔁 0    💬 0    📌 0
About fast 2D CDF construction In a path tracer, sampling a light source that is driven by a texture, such as a HDR map, requires mapping a 2d random number to texels of the map, with a probability that is proportional to the br…

Cool. I posted a while ago about fast CDF reconstruction. I wonder if you are doing something like this or have a different technique. maxliani.wordpress.com/2024/03/09/a...

11.02.2026 04:58 — 👍 3    🔁 1    💬 1    📌 0
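For context, the classic 2D CDF construction his post improves on looks like this (the standard textbook approach, not the fast variant from his article): a marginal CDF over row sums of luminance, plus one conditional CDF per row, sampled with two binary searches.

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

struct EnvCdf {
    int w = 0, h = 0;
    std::vector<float> marginal;     // h entries, cumulative over rows
    std::vector<float> conditional;  // h*w entries, cumulative per row
};

// Build normalized cumulative tables from a luminance grid.
static EnvCdf buildCdf(const std::vector<float>& lum, int w, int h)
{
    EnvCdf c{w, h, std::vector<float>(h), std::vector<float>(size_t(w) * h)};
    float rowAccum = 0.0f;
    for (int y = 0; y < h; ++y) {
        float accum = 0.0f;
        for (int x = 0; x < w; ++x) {
            accum += lum[size_t(y) * w + x];
            c.conditional[size_t(y) * w + x] = accum;
        }
        if (accum > 0.0f)            // guard fully black rows
            for (int x = 0; x < w; ++x)
                c.conditional[size_t(y) * w + x] /= accum;
        rowAccum += accum;
        c.marginal[y] = rowAccum;
    }
    if (rowAccum > 0.0f)
        for (int y = 0; y < h; ++y) c.marginal[y] /= rowAccum;
    return c;
}

// Map a 2d random number (u, v) in (0,1) to a texel with probability
// proportional to its luminance: binary-search the marginal for the
// row, then that row's conditional for the column.
static std::pair<int, int> sampleCdf(const EnvCdf& c, float u, float v)
{
    int y = int(std::lower_bound(c.marginal.begin(), c.marginal.end(), u)
                - c.marginal.begin());
    auto rowBegin = c.conditional.begin() + size_t(y) * c.w;
    int x = int(std::lower_bound(rowBegin, rowBegin + c.w, v) - rowBegin);
    return {x, y};
}
```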

As we speak about unit testing, this is an example of what my unit tests look like. In particular the assertion messages are AI autocompletions. So, I didn't have to type much. Personally, I find this much more appealing than what I get with macro-based systems.

11.02.2026 04:24 — 👍 5    🔁 0    💬 1    📌 0

I believe those are good too. They simply compress a statement without any inscrutable magic. I don’t like when macros define a class or hide away complex control flow, where it’s hard to tell what I am looking at… like when you get a missing-symbol linker error and that name is nowhere to be found.

09.02.2026 23:44 — 👍 1    🔁 0    💬 0    📌 0

Friends out there, I have a question related to the use of unit test frameworks. Most frameworks for C/C++ are macro based.
Do you like using macros for the purpose?
Or do you think macros make your life harder, and you wish for another way?

09.02.2026 10:25 — 👍 3    🔁 1    💬 1    📌 0

Eventually I did a good cleanup of my unit testing framework. I added support for JUnit XML export (in case I ever want to set up a CI process, even though it would be nonsense for a personal project 😅). But the cool thing is that it produces a complete log and XML export even in case of a crash.

08.02.2026 01:05 — 👍 6    🔁 1    💬 0    📌 0
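The crash-surviving part usually comes down to writing and flushing each result as it happens, so a crash in a later test loses nothing already recorded. A hedged sketch of that idea (invented names; a real JUnit XML writer would additionally need to close or repair the open tags):

```cpp
#include <fstream>
#include <string>

// Illustrative crash-tolerant result log: each record hits the disk
// before the next test runs, so a crash cannot erase earlier results.
struct CrashSafeLog {
    std::ofstream out;
    explicit CrashSafeLog(const std::string& path) : out(path) {}

    void record(const std::string& testName, bool passed)
    {
        out << (passed ? "PASS " : "FAIL ") << testName << '\n';
        out.flush();                 // survive a crash in the next test
    }
};
```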

Crossed off the procrastination list: added support for multiple cameras in the scene. Up to now, I couldn't create cameras from the GUI, couldn't select them for rendering either. It's funny how something like this takes 20 minutes to do but can stay on the list for years.

01.02.2026 05:00 — 👍 37    🔁 0    💬 0    📌 0

Chamfering of disjoint edges works, at least for simple cases... lots more cases still to handle.

01.02.2026 00:17 — 👍 8    🔁 0    💬 0    📌 0

Some progress, but still rather broken...

31.01.2026 02:01 — 👍 3    🔁 0    💬 1    📌 0
ALT: a man with a mustache is wearing a suit and tie and making a funny face.

“Sometimes”

30.01.2026 06:28 — 👍 7    🔁 0    💬 0    📌 0

Definitely real life. Besides the stove, nobody in their right mind would model/texture grout lines around the cabinets. It has the hallmarks of a home renovation :)

29.01.2026 23:39 — 👍 1    🔁 0    💬 0    📌 0

The more I scribble diagrams and examples in my notebook, the more complex the problem seems to become.

27.01.2026 01:28 — 👍 0    🔁 0    💬 0    📌 0

I am back at it. I decided to revamp my modeling operators to carve out half a decent modeler. There are a lot of cool manipulator options to be explored there. I started on chamfer... clearly, it's going to take a while to get it right.

25.01.2026 06:22 — 👍 31    🔁 2    💬 3    📌 0

Are you saying that the “fat rendering” and “concrete rendering” industries have stolen and repurposed the name too? They have their magazines and international conventions on rendering, you know 😄
Jokes aside, 3D graphics didn’t invent the term; it is general and means transforming something to its final appearance.

23.01.2026 23:40 — 👍 1    🔁 0    💬 1    📌 0