Also found something else, which is now included in v0.2.2. I forgot to allow the sensor to yield a film plugin.
02.03.2026 13:05 · @ben.graphics.bsky.social
I'm doing my PhD on physically based (differentiable) rendering and material appearance modeling/perception in @tsawallis.bsky.social's Perception Lab. I enjoy photography, animation/VFX/games, working on my renderer, languages, and contributing to open source.
A new bugfix release is available for: github.com/pixelsandpoi...
Now nested plugins serialize/resolve correctly in your scene, so you can start using BlendedBSDFs :)
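For context, once a nested plugin is resolved it ends up as plain nested dictionaries in Mitsuba's scene-dict format. A minimal sketch of what a blended BSDF might look like after serialization (Mitsuba 3's blendbsdf plugin and its weight parameter are real; the child key names "fine" and "coarse" are illustrative, not this package's confirmed output):

```python
# Sketch of a resolved Mitsuba-style scene dict containing a nested
# blendbsdf. Child key names ("fine", "coarse") are illustrative.
blended = {
    "type": "blendbsdf",
    "weight": 0.2,  # mixing weight between the two nested BSDFs
    "fine": {
        "type": "diffuse",
        "reflectance": {"type": "rgb", "value": [0.8, 0.2, 0.2]},
    },
    "coarse": {"type": "conductor"},
}

# The nested BSDF slots into a shape like any other plugin dict:
scene_fragment = {
    "type": "scene",
    "ball": {"type": "sphere", "bsdf": blended},
}
print(scene_fragment["ball"]["bsdf"]["type"])  # blendbsdf
```

The point of the fix is that such child plugins now survive the serialize/resolve round trip instead of being flattened or dropped.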
Are you talking about Johannes' jo.dreggn.org/home/2015_mn...?
26.02.2026 17:43
Fly-through of part of a Keeper level directly in the Unreal Engine editor. Partway through I turn on all of the game objects the player doesn't see -- it takes a lot to make games work.
25.02.2026 19:23
I was today years old when I found: github.com/rlguy/Blende...
25.02.2026 17:21
A couple of great Slang shader talks are ready for viewing:
24.02.2026 18:57
Vulkan releases game engine tutorial
The Vulkan Working Group has published Building a Simple Game Engine, a new in-depth tutorial for developers ready to move beyond the basics and into professional-grade engine development.
Learn more: www.khronos.org/blog/new-vul...
#vulkan #tutorial #programming #gpu #gameengine
Late to the party, but…
I just had the most cinematic encounter in my entire video game career fighting against a primed Titan in #FFXVI.
Good job @square-enix-games.com, I'm flabbergasted.
This release mainly stabilizes the scraping process so the API can be generated automatically during the package build. I also introduced a builder pattern for constructing scenes.
If you use Mitsuba and want to build scenes programmatically, give it a try and let me know if you find any rough edges! :)
Cheers!
3/3
You can set the MITSUBA_VERSION env var (also inline during your pip install mitsuba-scene-description) to specify which Mitsuba version to build the API against. If you don't set it, the package attempts to import Mitsuba and sources the plugin reference from that version, falling back to v3.7.1.
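The inline form looks like this (the env var name and package name are from the post; the version number 3.7.1 is just an example value):

```shell
# Pin the Mitsuba version used to generate the plugin API at build time
MITSUBA_VERSION=3.7.1 pip install mitsuba-scene-description
```

Without the variable, the installed Mitsuba (or the v3.7.1 fallback) decides which plugin reference gets built.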
2/x
Image shows a code example for the Python package introduced in this post. The code is as follows (for screen readers):

import mitsuba_scene_description as msd
import mitsuba as mi

mi.set_variant("llvm_ad_rgb")

# Define components
diffuse = msd.SmoothDiffuseMaterial(reflectance=msd.RGB([0.8, 0.2, 0.2]))
ball = msd.Sphere(
    radius=1.0,
    bsdf=diffuse,
    to_world=msd.Transform().translate(0, 0, 3).scale(0.4),
)
cam = msd.PerspectivePinholeCamera(
    fov=45,
    to_world=msd.Transform().look_at(
        origin=[0, 1, -6], target=[0, 0, 0], up=[0, 1, 0]
    ),
)
integrator = msd.PathTracer()
emitter = msd.ConstantEnvironmentEmitter()

# Builder pattern
scene = (
    msd.SceneBuilder()
    .integrator(integrator)
    .sensor(cam)
    .shape("ball", ball)
    .emitter("sun", emitter)
    .build()
)

# Or equivalently
scene = msd.Scene(
    integrator=integrator,
    sensors=cam,  # also accepts a list for multi-sensor setups
    shapes={"ball": ball},
    emitters={"sun": emitter},
)

mi.load_dict(scene.to_dict())

# scene.to_dict() returns:
# {'ball': {'bsdf': {'reflectance': {'type': 'rgb', 'value': [0.8, 0.2, 0.2]},
#                    'type': 'diffuse'},
#           'radius': 1.0,
#           'to_world': Transform[
#               matrix=[[0.4, 0, 0, 0],
#                       [0, 0.4, 0, 0],
#                       [0, 0, 0.4, 1.2],
#                       [0, 0, 0, 1]],
#               ...
#           ],
#           'type': 'sphere'},
#  'integrator': {'type': 'path'},
#  'sensor': {'fov': 45, 'to_world': Transform[...], 'type': 'perspective'},
#  'sun': {'type': 'constant'},
#  'type': 'scene'}
G'day!
I've just published a new version of mitsuba-scene-description to GitHub and PyPI: github.com/pixelsandpoi...
I've changed the generation process, so you no longer need to manually clone and build the API yourself. The Mitsuba plugin API will now be generated during package build.
1/x
Thank you for the warm words!
It would already suffice to use git and check with a git diff before packaging the TeX project. Claude actually provides diffs and asks for permission. Maybe there was just one permission too many.
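That check can be sketched as follows (the throwaway repo and the file name main.tex are illustrative; in practice you'd run git diff in your existing paper repository):

```shell
set -e
# Illustrative setup: a throwaway repo standing in for the paper repo
tmp=$(mktemp -d)
cd "$tmp"
git init -q
printf 'original line\n' > main.tex
git add main.tex
git -c user.email=you@example.com -c user.name=You commit -qm "state before automated edits"

# Simulate an automated (LLM) edit to the TeX source
printf 'edited line\n' > main.tex

git diff --stat        # which files changed, and by how much
git diff -- main.tex   # review the exact changes before packaging
```

The key habit is committing before handing the sources to any tool, so every applied change shows up in the diff.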
cover it up with the hope that no one will see the error. I'm an honest guy, and I'd like to keep it that way. I just feel sorry for the people whose reputations I may have hurt with this, i.e. Tom and the respective authors of the paper.
Here's to new scientific integrity.
I just decided to let Claude do it for me instead. While it fixed some issues, it really did more harm than good. To be frank, I just messed up by not checking everything meticulously after I let Claude do its thing. In the end, I had my learning, and I'd rather be open about my wrongdoing than try to...
20.02.2026 11:32
Pretty much my reaction to this as well. I'm pretty anti-LLM for the scientific process, but the paper I revised had tons of weird TikZ hacks in it, where I wanted to embed TikZ plots in a table, which caused all sorts of problems that arXiv didn't accept. After 2 hours of trying to fix it myself...
20.02.2026 11:32
What a banger set of contributions 🥳 Ordered!
20.02.2026 08:41
Excited to share that GPU Zen 4: Advanced Rendering Techniques is officially out. This volume features work from some of the most visually impactful projects of recent years, including #AssassinsCreed, #DOOM, #StarwarsOutLaws and NVIDIA's Zorah. And a killer cover :)
20.02.2026 01:02
Summary:
arxiv.org/abs/2512.123...
Contains the correct citation (w/o appendix and revised table).
arxiv.org/abs/2512.123...
arxiv.org/abs/2512.123...
Contain the incorrect citation.
Version 4 is on its way with the fixes.
100% agreed. I had everything in check with the first version I uploaded; I'm usually more cautious about these things. I had split off the submission to another journal and didn't copy the git folder, so I didn't see the applied changes in the end. Otherwise this would've been an easy find with git.
19.02.2026 10:55
Fortunately, only one citation was affected. Still, kind of mad about the fact that this happened. Once more, sorry to all colleagues affected.
I also added the missing dates from a few citations and fixed the entries where PO Box was a co-author (sources resolved correctly).
My sincere apologies to the authors whom I failed to cite correctly. And a note of caution: Never let LLMs touch your papers, even for small fixes.
19.02.2026 09:48
[16] Sriram Guna Elumalai, Sree Harsha Nelaturu, Subha Nagarajan, Ines Rieger, Bjoern Eskofier, and Andreas Maier. 2025. Beyond Texture: Generating Interpretable Extremely-High Activation Images for Robust Vision Models. arXiv:2501.07827 [cs.CV]

This entry should point to: arxiv.org/abs/2510.13433
2/x
Unfortunately, I let Claude fix arXiv compilation errors. Apparently, it changed a citation in the process (one that was correct in v1 of the paper). I am now reviewing all the citations again by hand and preparing a new version that fixes them. I will also post them here:
1/x
Just watched the talk. Thanks for sharing and great job on integrating the RT backend in Godot. Appreciate the code samples!
18.02.2026 17:40
Not gonna lie, I'm impressed by Unreal's new layered material system (Substrate): www.youtube.com/watch?v=d1nc...
Reminds me a lot of Wenzel's and Andrea's work on layered BSDFs, but for real-time graphics. Happy to see it in Unreal.
www.cg.tuwien.ac.at/research/pub...
rgl.epfl.ch/publications...
Cheers mate!
16.02.2026 22:55
Thanks for sharing our stuff!
Stay tuned for more 🎬
Sorry for the wait (I was in China for 4 weeks over the holidays), but I finally managed to update the pre-print with more results, and the code is now also available at: github.com/ag-perceptio...
If you run into issues, let me know.
The talk is scheduled for:
Talk Session: 3D Shape and Space Perception
Date/Time: Monday, May 18, 2026, 8:15–9:45 am
Location: Talk Room 2
Happy to announce that I will be giving a remote talk @vssmtg.bsky.social this year. I will be presenting our recent pre-print MRD: Metamers rendered differentiably (arxiv.org/abs/2512.12307).
Happy to take questions before the presentation in May, or live during the online Q&A.
#VSS2026 #VSS26