@theorangeduck.bsky.social
Animation & Machine Learning at Epic Games. Ex Ubisoft La Forge. Programmer & Occasional Writer. https://theorangeduck.com/
Time for another short story :)
theorangeduck.com/page/review-...
This year I had the chance to visit the Peace Museum in Hiroshima. Of all the things I saw in Japan this is the place that will stay with me forever.
theorangeduck.com/page/hiroshima
With a bit of help from the authors I've added another huge, high-quality dataset to my Geno retargetings. This time it's the very nice InterAct dataset (hku-cg.github.io/interact/).
You can find it here: github.com/orangeduck/i...
Yeah - very much possible if you drop down to C or use something like numba/cython. The Python overhead is just insanely large - often 100x slower than the C equivalent for something like quaternion maths, even in places where you can use numpy.
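To make that overhead concrete, here's a rough, self-contained benchmark sketch (not from the original thread) comparing a pure-Python quaternion multiply loop against a vectorized numpy version. Exact numbers vary by machine, but the gap is usually one to two orders of magnitude.

```python
# Rough benchmark: pure-Python quaternion multiplication vs a vectorized
# numpy version. Illustrative only; exact timings depend on the machine.
import time
import numpy as np

def quat_mul_py(a, b):
    # Hamilton product of two (w, x, y, z) quaternions given as plain lists.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return [aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw]

def quat_mul_np(a, b):
    # Same product applied to (N, 4) arrays in one vectorized call.
    aw, ax, ay, az = a[:, 0], a[:, 1], a[:, 2], a[:, 3]
    bw, bx, by, bz = b[:, 0], b[:, 1], b[:, 2], b[:, 3]
    return np.stack([aw*bw - ax*bx - ay*by - az*bz,
                     aw*bx + ax*bw + ay*bz - az*by,
                     aw*by - ax*bz + ay*bw + az*bx,
                     aw*bz + ax*by - ay*bx + az*bw], axis=-1)

N = 100000
a = np.random.randn(N, 4)
b = np.random.randn(N, 4)
a_list, b_list = a.tolist(), b.tolist()

t0 = time.perf_counter()
_ = [quat_mul_py(x, y) for x, y in zip(a_list, b_list)]
t1 = time.perf_counter()
_ = quat_mul_np(a, b)
t2 = time.perf_counter()
print('python loop: %.2fms   numpy: %.2fms' % ((t1 - t0) * 1e3, (t2 - t1) * 1e3))
```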
Made a bare-bones Motion Matching implementation to test it: github.com/orangeduck/G...
Sadly I had to drop to 30fps because Python is so painfully slow that even something as simple as Forward Kinematics can take several milliseconds. Eeek.
My latest article is on the problem of joint-error propagation - and some techniques for tackling it when doing Machine Learning or other data-driven numerical methods.
theorangeduck.com/page/joint-e...
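As a toy illustration of the problem (my own sketch, not code from the article): even a one-degree error on a root joint becomes a positional error that grows the further down the chain you go.

```python
# Toy illustration of joint-error propagation: a tiny rotational error at
# the root of a chain becomes a growing positional error further down it.
import numpy as np

def fk_chain(local_angles, bone_length=0.3):
    # Forward kinematics for a planar chain: each joint rotates about Z and
    # offsets by bone_length along its local X axis. Returns joint positions.
    positions = [np.zeros(2)]
    total_angle = 0.0
    for angle in local_angles:
        total_angle += angle
        offset = bone_length * np.array([np.cos(total_angle), np.sin(total_angle)])
        positions.append(positions[-1] + offset)
    return np.array(positions)

angles = np.deg2rad([10.0, 20.0, -15.0, 30.0, 5.0])
clean = fk_chain(angles)

# Add a 1-degree error to the root joint only.
noisy_angles = angles.copy()
noisy_angles[0] += np.deg2rad(1.0)
noisy = fk_chain(noisy_angles)

# Positional error grows with distance from the erroneous joint.
for i, err in enumerate(np.linalg.norm(noisy - clean, axis=-1)):
    print('joint %d: error %.4f' % (i, err))
```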
I've just pushed up a version of my GenoView project implemented in Python (using the raylib bindings). Hopefully it's useful to anyone doing ML research on interactive animation systems who wants to make use of PyTorch, NumPy, etc.
github.com/orangeduck/GenoViewPython
We have a rare opening on our team for an Animation Programmer. If working with us on shipping Machine Learning animation tech inside UE appeals to you, please go ahead and apply!
www.epicgames.com/site/en-US/c...
I've added an appendix to my page on debug drawing text with lines, with a contribution from @mikkomononen.bsky.social: his version of a non-monospaced font based on the Hershey Fonts Simplex characters:
theorangeduck.com/page/debug-d...
Does your font have a name? And would you mind if I included a version of it in my article as an appendix?
Oh awesome, didn't know about these!
There may not be many obvious alternatives, but CLIP is not a great embedding for describing motion. It's trained on static images so has a confused concept of time/tense/ordering, will often focus on the object rather than the action, and usually interprets verbs as nouns.
More than once I've wanted to debug draw text in a game scene but found it not as easy as expected. I made a lookup table that allows you to draw text via line segments. It was a fun little weekend project, even if I'm sure it's something people have done before!
theorangeduck.com/page/debug-d...
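The article has the full lookup table; the basic idea is just a table from character to line segments in a local box, plus a function that offsets and scales them per character. A minimal sketch (with a made-up two-character table, not the real data):

```python
# Minimal sketch of debug text drawn as line segments. The real lookup
# table in the article covers the full character set; this toy table only
# defines 'L' and 'T' to show the structure.
SEGMENTS = {
    'L': [((0.0, 1.0), (0.0, 0.0)), ((0.0, 0.0), (0.7, 0.0))],
    'T': [((0.0, 1.0), (0.7, 1.0)), ((0.35, 1.0), (0.35, 0.0))],
}

def text_segments(text, x, y, scale=1.0, advance=1.0):
    # Convert a string into world-space line segments ready to hand to
    # whatever debug line-drawing function the engine provides.
    segments = []
    for i, char in enumerate(text):
        ox = x + i * advance * scale
        for (x0, y0), (x1, y1) in SEGMENTS.get(char, []):
            segments.append(((ox + x0 * scale, y + y0 * scale),
                             (ox + x1 * scale, y + y1 * scale)))
    return segments

for seg in text_segments('LT', 10.0, 5.0, scale=2.0):
    print(seg)
```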
Time for another long blog post... and today it's all about how we handle the temporal aspect of animation data. And more specifically, how to approach things from a signal processing perspective, covering up-sampling, down-sampling, and everything in-between.
theorangeduck.com/page/filteri...
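As one tiny example of the kind of operation the post covers (my own sketch, not code from the article): resampling a 30Hz animation channel to 60Hz, once with naive linear interpolation and once with a proper low-pass polyphase filter via scipy.

```python
# Sketch: resampling a 30Hz animation channel to 60Hz, once with naive
# linear interpolation and once with a polyphase (low-pass filtered)
# resampler. Which is appropriate depends on the data; see the article.
import numpy as np
from scipy.signal import resample_poly

fps_in, fps_out = 30, 60
t_in = np.arange(0.0, 2.0, 1.0 / fps_in)
channel = np.sin(2.0 * np.pi * 1.5 * t_in)     # stand-in for a joint channel

# Naive linear interpolation onto the new timeline.
t_out = np.arange(0.0, 2.0, 1.0 / fps_out)
linear = np.interp(t_out, t_in, channel)

# Polyphase resampling, up by 60, down by 30 (note it assumes the signal
# is zero outside its endpoints, so expect some edge artifacts).
filtered = resample_poly(channel, fps_out, fps_in)

print(linear.shape, filtered.shape)
```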
Really interesting... scary how much the inner robot is capable of without a conscious control loop. I can simply envision the words I want to appear on the screen and it will type them out for me without any visual feedback or semantic understanding of what a "key", "keyboard" or "screen" is.
Jérôme Eippers' YouTube channel is quickly becoming a goldmine for Animation Programming. He is covering so many classic animation papers - reviving a lot of great ideas and techniques that are either under-utilized or forgotten by the industry.
www.youtube.com/@JeromeEippe...
Learned a new spring fact from my colleague @rat-face.bsky.social recently. For a critically damped spring tracking a target moving at a constant velocity, the amount of time the spring will lag behind the target is given by "halflife / ln(2)". Added to my blog post theorangeduck.com/page/spring-...
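A quick numerical sanity check of that fact (my own sketch; it assumes the damping = 4·ln(2)/halflife convention from the linked spring article):

```python
# Simulate a critically damped spring tracking a target moving at constant
# velocity and measure how far (in time) it trails behind. Assumes the
# halflife convention damping = 4*ln(2)/halflife from the spring article.
import numpy as np

halflife = 0.5                        # seconds
damping = 4.0 * np.log(2.0) / halflife
stiffness = damping * damping / 4.0   # critical damping condition

dt = 0.001
steps = 10000
velocity_target = 1.0
x, v = 0.0, 0.0
for step in range(steps):
    goal = velocity_target * step * dt
    a = stiffness * (goal - x) - damping * v
    v += a * dt                        # semi-implicit Euler
    x += v * dt

goal = velocity_target * steps * dt
lag_seconds = (goal - x) / velocity_target
print('measured lag: %.4f   predicted halflife/ln(2): %.4f'
      % (lag_seconds, halflife / np.log(2.0)))
```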
I've uploaded a retargeted version of the 100STYLE dataset matching the other datasets: github.com/orangeduck/1...
I wasn't sure if I should add it at first. The quality is lower due to the inertial capture, and it's missing finger motion, but overall I figured it's still worth it.
It's been a while since I did a pure Machine Learning article on my blog, but today I have a post for you about adding noise to Neural Networks, and some fun experiments with Flow-Matching:
theorangeduck.com/page/noise-n...
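For anyone unfamiliar with the setup, a basic flow-matching training step looks roughly like this in PyTorch (a generic sketch, not the article's code or experiments):

```python
# Generic flow-matching training step sketch (PyTorch). The network learns
# a velocity field that transports Gaussian noise x0 to data x1 along
# straight-line paths; nothing here is specific to the article's experiments.
import torch
import torch.nn as nn

dim = 32
net = nn.Sequential(nn.Linear(dim + 1, 256), nn.ReLU(),
                    nn.Linear(256, 256), nn.ReLU(),
                    nn.Linear(256, dim))
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

def training_step(x1):
    # x1: batch of real data samples, shape (batch, dim)
    x0 = torch.randn_like(x1)                    # noise samples
    t = torch.rand(x1.shape[0], 1)               # random interpolation times
    xt = (1.0 - t) * x0 + t * x1                 # points on the straight path
    target_velocity = x1 - x0                    # velocity of that path
    pred_velocity = net(torch.cat([xt, t], dim=-1))
    loss = ((pred_velocity - target_velocity) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy data just to show the call.
print(training_step(torch.randn(64, dim)))
```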
Big engines are using the FBX SDK (it's crappy and slow but the only sane way to get data out of an FBX file).
For prototypes I usually end up writing a Maya script to dump out the raw mesh data and bind pose into a binary format I can handle. Usually something like this: github.com/orangeduck/G...
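Roughly along these lines (a minimal sketch rather than the actual script, with a binary layout made up for illustration):

```python
# Minimal Maya-side sketch: dump the selected mesh's world-space vertex
# positions to a flat binary file. A real exporter would also write the
# bind pose, skinning weights, and topology.
import struct
import maya.cmds as cmds

mesh = cmds.ls(selection=True)[0]
vertex_count = cmds.polyEvaluate(mesh, vertex=True)

with open('C:/temp/mesh_dump.bin', 'wb') as f:
    f.write(struct.pack('I', vertex_count))
    for i in range(vertex_count):
        # World-space position of vertex i
        px, py, pz = cmds.xform('%s.vtx[%d]' % (mesh, i),
                                query=True, worldSpace=True, translation=True)
        f.write(struct.pack('3f', px, py, pz))
```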
My latest article is about one of my favorite techniques in graphics - Müller's method of Polar Decomposition.
It also details some variations, including one which is ~2x faster to compute and has more regular convergence.
theorangeduck.com/page/variati...
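For context, the baseline iteration from Müller et al. looks roughly like this (a sketch of the standard method, not the faster variation described in the article):

```python
# Sketch of the standard Mueller-style iterative rotation extraction
# (the baseline method, not the ~2x faster variation from the article).
import numpy as np
from scipy.spatial.transform import Rotation

def extract_rotation(A, q=None, iterations=100, eps=1e-9):
    # Iteratively rotate q so its matrix R aligns with the columns of the
    # 3x3 matrix A, converging on the rotational part of A.
    q = Rotation.identity() if q is None else q
    for _ in range(iterations):
        R = q.as_matrix()
        omega = (np.cross(R[:, 0], A[:, 0]) +
                 np.cross(R[:, 1], A[:, 1]) +
                 np.cross(R[:, 2], A[:, 2])) / (
                 abs(np.dot(R[:, 0], A[:, 0]) +
                     np.dot(R[:, 1], A[:, 1]) +
                     np.dot(R[:, 2], A[:, 2])) + eps)
        if np.linalg.norm(omega) < eps:
            break
        q = Rotation.from_rotvec(omega) * q
    return q

# Example: recover the rotation from a rotation-plus-stretch matrix.
true_R = Rotation.from_rotvec([0.3, -0.5, 0.2]).as_matrix()
A = true_R @ np.diag([1.2, 0.8, 1.1])   # rotation times symmetric stretch
print(extract_rotation(A).as_matrix().round(3))
print(true_R.round(3))                   # should roughly match the above
```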