Just added a brief appendix with a derivation for the stopping time that takes into account the initial acceleration:
theorangeduck.com/page/fitting...
By the way, do you have any links for how Inertial Easing works? I have been meaning to cover it on my blog for a while, but now it seems whatever information there was on it has disappeared from the internet :(
16.02.2026 13:52
Yeah honestly the more I look at motion capture data the less sure I am of what model is the most "correct"...
The critical spring damper is very nice from a mathematical standpoint. From a user flexibility standpoint this one is probably not too bad: theorangeduck.com/page/new-mov...
Recently I've been taking another look at fitting movement model parameters from data and managed to derive some formulations that work in terms of starting and stopping times and distances - something that is potentially a lot more intuitive for designers to use.
theorangeduck.com/page/fitting...
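For reference, a critically damped spring can be stepped with its exact closed-form solution, which keeps it stable at any timestep. Below is a minimal sketch using the halflife parameterisation (the function name and usage are my own, not the article's code):

```python
import math

def damper_exact(x, v, goal, halflife, dt):
    """One step of a critically damped spring toward a static goal,
    using the exact closed-form solution (stable for any dt)."""
    y = (2.0 * math.log(2.0)) / halflife  # half of the critical damping
    j0 = x - goal
    j1 = v + j0 * y
    eydt = math.exp(-y * dt)
    x_new = eydt * (j0 + j1 * dt) + goal
    v_new = eydt * (v - j1 * y * dt)
    return x_new, v_new

# Usage: step a value toward 1.0 for one second at 60fps.
x, v = 0.0, 0.0
for _ in range(60):
    x, v = damper_exact(x, v, 1.0, 0.25, 1.0 / 60.0)
```

With a halflife of 0.25s the value closes most of the gap within a second, without ever overshooting.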
You could probably make a copy of the SmoothWalkingMode in C++ and go from there.
09.02.2026 21:17
I think you'll need to implement a new movement mode, which should be possible to do in Blueprints.
08.02.2026 22:55
A reader found a bug in the code listings on my "New Movement Model" article. The intermediate spring should take "track_vel" as a target, not "next_vel" (see github.com/orangeduck/M...). The article has been updated!
02.02.2026 13:49
I've prepared a little blog post discussing the ideas behind the new Smooth Walking Mode we developed for UE's Mover plugin. This Movement Mode is also used in the new UE 5.8 GASP release!
theorangeduck.com/page/new-mov...
Because we couldn't release the code and data we've also prepared a small example implementation which I hope will prove useful to those trying to implement the method:
github.com/gouruiyu/Con...
Briefly, Control Operators provide a way of encoding arbitrary inputs to Neural Networks. This is particularly useful for game development as you can't be sure what the structure of input data will be like. For a full explanation, check out this blog post we prepared:
theorangeduck.com/page/impleme...
I'm so happy to finally be able to talk about our new SIGGRAPH Asia paper "Control Operators for Interactive Character Animation". It's been by far the most work I've seen go into a SIGGRAPH paper (thanks goes to @gouruiyu.bsky.social) and a fun project all round.
theorangeduck.com/page/control...
Time for another short story :)
theorangeduck.com/page/review-...
This year I had the chance to visit the Peace Museum in Hiroshima. Of all the things I saw in Japan this is the place that will stay with me forever.
theorangeduck.com/page/hiroshima
With a bit of help from the authors I've added another huge high quality dataset to my Geno retargetings. This time the very nice InterAct dataset (hku-cg.github.io/interact/).
You can find it here: github.com/orangeduck/i...
Yeah - very much possible if you drop down to C or use something like numba/cython. The Python overhead is just insanely large - often 100x slower than the C equivalent for something like quaternion maths, even in places where you can use numpy.
06.09.2025 19:29
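To illustrate the point, batching the quaternion product into a single vectorized numpy call is the usual way to claw back that interpreter overhead, instead of multiplying one quaternion at a time in a Python loop. A sketch (the layout and name are my own):

```python
import numpy as np

def quat_mul(a, b):
    """Batched quaternion product, (w, x, y, z) layout, shapes (..., 4).
    One vectorized call over the whole batch replaces a per-element
    Python loop, which is where most of the overhead goes."""
    aw, ax, ay, az = a[..., 0], a[..., 1], a[..., 2], a[..., 3]
    bw, bx, by, bz = b[..., 0], b[..., 1], b[..., 2], b[..., 3]
    return np.stack([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ], axis=-1)
```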
Made a bare-bones Motion Matching implementation to test it: github.com/orangeduck/G...
Sadly I had to drop to 30fps because Python is so painfully slow that even something as simple as Forward Kinematics can take several milliseconds. Eeek.
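For context, a bare-bones Forward Kinematics pass is just a walk over the joint hierarchy accumulating parent transforms. A toy sketch with rotation matrices (my own illustration, not the repo's code):

```python
import numpy as np

def forward_kinematics(parents, local_rot, local_pos):
    """Compose per-joint local transforms into world space.
    parents[i] is the parent index of joint i (-1 for the root),
    with joints ordered so each joint comes after its parent.
    local_rot: (J, 3, 3) rotations, local_pos: (J, 3) offsets."""
    num_joints = len(parents)
    world_rot = np.zeros((num_joints, 3, 3))
    world_pos = np.zeros((num_joints, 3))
    for i in range(num_joints):
        p = parents[i]
        if p == -1:
            world_rot[i] = local_rot[i]
            world_pos[i] = local_pos[i]
        else:
            world_rot[i] = world_rot[p] @ local_rot[i]
            world_pos[i] = world_pos[p] + world_rot[p] @ local_pos[i]
    return world_rot, world_pos
```

The per-joint Python loop here is exactly the kind of thing the interpreter makes slow, which is why this often ends up vectorized over whole clips or pushed down to C.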
My latest article is on the problem of joint-error propagation - and some techniques for tackling it when doing Machine Learning or other data-driven numerical methods.
theorangeduck.com/page/joint-e...
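As a toy illustration of the propagation effect (my own example, not from the article): a small angular error at a parent joint shifts every descendant by roughly angle times distance, so the further down the chain, the larger the positional error.

```python
import numpy as np

def tip_error(angle_err, chain_length):
    """Position error at the tip of a straight chain of the given
    length when the root rotation is off by angle_err radians."""
    true_tip = np.array([chain_length, 0.0])
    c, s = np.cos(angle_err), np.sin(angle_err)
    noisy_tip = np.array([c * chain_length, s * chain_length])
    return float(np.linalg.norm(noisy_tip - true_tip))

# A single degree of error at the shoulder of a 1m arm moves the hand
# by nearly 2cm, and doubling the arm length doubles the error.
```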
I've just pushed up a version of my GenoView project but implemented in Python (using raylib bindings). Hopefully that is useful to anyone doing ML research building interactive animation systems who wants to make use of PyTorch, Numpy, etc.
github.com/orangeduck/GenoViewPython
We have a rare opening on our team for an Animation Programmer. If working with us on shipping Machine Learning animation tech inside UE appeals to you, please go ahead and apply!
www.epicgames.com/site/en-US/c...
I've added an appendix to my page on debug drawing text with lines with a contribution from @mikkomononen.bsky.social and his version of a non-monospaced font based on the Hershey Fonts Simplex characters:
theorangeduck.com/page/debug-d...
Does your font have a name? And would you mind if I included a version of it in my article as an appendix π?
16.06.2025 01:50
Oh awesome, I didn't know about these!
15.06.2025 01:30
There may not be many obvious alternatives, but CLIP is not a great embedding for describing motion. It's trained on static images, so it has a confused concept of time/tense/ordering, will often focus on the object rather than the action, and usually interprets verbs as nouns:
14.06.2025 22:07
More than once I've wanted to debug draw text in a game scene but found it not as easy as expected. I made a lookup table that allows you to draw text via line segments. It was a fun little weekend project, even if I'm sure it's something people have done before!
theorangeduck.com/page/debug-d...
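The core idea can be sketched in a few lines: a table mapping characters to line segments in a glyph box, expanded into world-space lines at draw time. The two-glyph table below is invented for illustration, it is not the font from the article:

```python
# Each character maps to a list of segments ((x0, y0), (x1, y1))
# inside a unit-square glyph box. Only two glyphs shown here.
SEGMENTS = {
    'L': [((0, 1), (0, 0)), ((0, 0), (1, 0))],
    'T': [((0, 1), (1, 1)), ((0.5, 1), (0.5, 0))],
}

def text_to_lines(text, origin=(0.0, 0.0), size=1.0, spacing=1.2):
    """Expand a string into world-space line segments, ready to be
    handed to whatever debug line-drawing call the engine provides."""
    ox, oy = origin
    lines = []
    for i, ch in enumerate(text):
        cx = ox + i * spacing * size  # advance one glyph cell per char
        for (x0, y0), (x1, y1) in SEGMENTS.get(ch, []):
            lines.append(((cx + x0 * size, oy + y0 * size),
                          (cx + x1 * size, oy + y1 * size)))
    return lines
```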
Time for another long blog post... and today it's all about how we handle the temporal aspect of animation data. And more specifically, how to approach things from a signal processing perspective, covering up-sampling, down-sampling, and everything in-between.
theorangeduck.com/page/filteri...
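As a small taste of the topic, here is a naive linear-interpolation resampler between frame rates (my own sketch; a real pipeline would low-pass filter before down-sampling to avoid aliasing, which is exactly the kind of detail the post goes into):

```python
import numpy as np

def resample(x, src_fps, dst_fps):
    """Resample a (frames, features) animation clip from src_fps to
    dst_fps by per-feature linear interpolation over time."""
    n_src = x.shape[0]
    duration = (n_src - 1) / src_fps
    n_dst = int(duration * dst_fps) + 1
    t_src = np.arange(n_src) / src_fps
    t_dst = np.arange(n_dst) / dst_fps
    return np.stack([np.interp(t_dst, t_src, x[:, j])
                     for j in range(x.shape[1])], axis=-1)
```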
Really interesting... scary how much the inner robot is capable of without a conscious control loop. I can simply envision the words I want to appear on the screen and it will type them out for me, without any visual feedback or semantic understanding of what a "key", "keyboard", or "screen" is.
07.03.2025 18:46
Jérôme Eippers' YouTube channel is quickly becoming a goldmine for Animation Programming. He is covering so many classic animation papers - reviving a lot of great ideas and techniques that are either under-utilized or forgotten by the industry.
www.youtube.com/@JeromeEippe...
Learned a new spring fact from my colleague @rat-face.bsky.social recently. For a critically damped spring tracking a target moving at a constant velocity, the amount of time the spring will lag behind the target is given by "halflife / ln(2)". Added to my blog post theorangeduck.com/page/spring-...
25.02.2025 01:28
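The fact is easy to verify numerically: simulate a critically damped spring chasing a constant-velocity target and measure how far behind it settles. My own check, assuming the usual halflife parameterisation with stiffness y² and damping 2y:

```python
import math

def measure_lag(halflife, target_speed, dt=0.001, duration=10.0):
    """Simulate a critically damped spring tracking a target moving at
    constant speed, and return the steady-state lag in seconds."""
    y = (2.0 * math.log(2.0)) / halflife  # stiffness = y*y, damping = 2*y
    x, v, t = 0.0, 0.0, 0.0
    while t < duration:
        goal = target_speed * t
        v += dt * (y * y * (goal - x) - 2.0 * y * v)  # semi-implicit Euler
        x += dt * v
        t += dt
    return (target_speed * t - x) / target_speed

# measure_lag(0.5, 1.0) should come out near 0.5 / ln(2) ≈ 0.721s,
# independent of the target's speed.
```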
I've uploaded a retargeted version of the 100STYLE dataset matching the other datasets: github.com/orangeduck/1...
I wasn't sure if I should add it at first. The quality is lower due to the inertial capture, and it's missing finger motion, but overall I figured it's still worth it.
It's been a while since I did a pure Machine Learning article on my blog, but today I have a post for you about adding noise to Neural Networks, and some fun experiments with Flow-Matching:
theorangeduck.com/page/noise-n...
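For anyone curious what the flow-matching setup looks like mechanically, here's a toy sketch of building one training pair: noise blended with data at a random time, with the straight-line velocity as the regression target. Names are mine, not from the article:

```python
import numpy as np

def flow_matching_pair(x1, rng):
    """Build one (input, target) pair for flow matching: sample noise
    x0, pick a random time t, blend to a point on the straight path
    from x0 to x1, and regress toward the velocity (x1 - x0)."""
    x0 = rng.standard_normal(x1.shape)       # pure noise sample
    t = rng.uniform(size=(x1.shape[0], 1))   # per-example time in [0, 1]
    xt = (1.0 - t) * x0 + t * x1             # point on the straight path
    v_target = x1 - x0                       # velocity the network learns
    return xt, t, v_target
```

A network trained on pairs like these can then generate samples by integrating its predicted velocity from noise at t=0 to data at t=1.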