Ask Unreal Anything: Animation | July 13, 2022 at 11AM EDT

We are not planning on integrating Valorant’s engine modifications (such as recording animation graph state to enable server-side rollback). We are, however, taking the requirements on board with regard to planning future animation-logic features.

Control Rig is continually being improved and refined. We have many plans for where Control Rig is headed and look forward to sharing more information soon. We always welcome workflow and feature suggestions!

You can’t change “Reset Child on Activation” dynamically, but your options would be:

  • Make a C++ anim node function library that allows changing that value
  • Make an anim node function to do this when you need to:
    - Store the playback time
    - Restore the playback time

The second option should be doable in Blueprint, but it won’t scale if your graph branch is more complex.
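The second option can also be sketched in native code. This is a hedged sketch assuming UE 5.x’s anim node functions and `USequencePlayerLibrary`; the anim instance class, function names, and the `SavedPlaybackTime` member are illustrative, not engine API:

```cpp
// Illustrative anim instance exposing two anim node functions. Bind
// StorePlaybackTime to the sequence player node's "On Update" (or cache it
// when the node ceases to be relevant) and RestorePlaybackTime to
// "On Become Relevant".
#include "Animation/AnimInstance.h"
#include "Animation/AnimNodeReference.h"
#include "Animation/AnimExecutionContext.h"
#include "SequencePlayerLibrary.h"
#include "MyAnimInstance.generated.h"

UCLASS()
class UMyAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    UFUNCTION(BlueprintCallable, meta = (BlueprintThreadSafe))
    void StorePlaybackTime(const FAnimUpdateContext& Context, const FAnimNodeReference& Node)
    {
        EAnimNodeReferenceConversionResult Result;
        FSequencePlayerReference Player = USequencePlayerLibrary::ConvertToSequencePlayer(Node, Result);
        if (Result == EAnimNodeReferenceConversionResult::Succeeded)
        {
            // Remember where this branch was playing before it goes irrelevant.
            SavedPlaybackTime = USequencePlayerLibrary::GetAccumulatedTime(Player);
        }
    }

    UFUNCTION(BlueprintCallable, meta = (BlueprintThreadSafe))
    void RestorePlaybackTime(const FAnimUpdateContext& Context, const FAnimNodeReference& Node)
    {
        EAnimNodeReferenceConversionResult Result;
        FSequencePlayerReference Player = USequencePlayerLibrary::ConvertToSequencePlayer(Node, Result);
        if (Result == EAnimNodeReferenceConversionResult::Succeeded)
        {
            // Resume from the stored time instead of restarting from zero.
            USequencePlayerLibrary::SetAccumulatedTime(Player, SavedPlaybackTime);
        }
    }

private:
    float SavedPlaybackTime = 0.0f;
};
```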

Thank you for looking out for the artist! It’s hard to learn a tool workflow in different programs when we’ve been using one way for 15+ years and that was the only way we’ve been operating.

It’s exciting to see how fast artists will get as the product matures, the user base becomes more fluent, and adoption grows!

From your screenshot it appears that you are running 4.x. This is not easy to do in older versions of the engine; however, in 5.0 we added “Anim Node Functions”. These allow you to write functions in Blueprint or native code that are called at specific points in anim graph execution, for example “On Become Relevant”. In that function you then have full control over what the sequence player is doing, e.g. selecting a new animation or start time (or not).

The Lyra sample provides some good examples of using anim node functions.
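As a hedged sketch of the pattern described above, an anim node function bound to a sequence player’s “On Become Relevant” can pick a new animation and start time. This assumes UE 5.0’s `USequencePlayerLibrary`; `ChosenIdle` and `ChosenStartTime` are illustrative members selected elsewhere in your anim instance:

```cpp
// Illustrative anim node function on a UAnimInstance subclass; bind it to the
// sequence player node's "On Become Relevant" binding in the node's details panel.
UFUNCTION(BlueprintCallable, meta = (BlueprintThreadSafe))
void OnPlayerBecomeRelevant(const FAnimUpdateContext& Context, const FAnimNodeReference& Node)
{
    EAnimNodeReferenceConversionResult Result;
    FSequencePlayerReference Player = USequencePlayerLibrary::ConvertToSequencePlayer(Node, Result);
    if (Result == EAnimNodeReferenceConversionResult::Succeeded)
    {
        // Swap in a different sequence and start position before the node updates.
        USequencePlayerLibrary::SetSequence(Player, ChosenIdle);
        USequencePlayerLibrary::SetStartPosition(Player, ChosenStartTime);
    }
}
```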

I’m not sure of the exact cause of this issue, but here are a couple of things to check:

  1. Ensure you’re using the latest UE5 release; there was previously a bug related to this.
  2. Make sure the Setup Event queries the Initial Transform when setting Control transforms.

We are always evaluating where machine learning can provide solutions across the engine.

We recently released the “ML Deformer” plugin which uses an offline process to train a neural network to reproduce mesh deformations (at runtime) that match any arbitrary offline deformations.

There are several other areas of research utilizing ML for animation-related technology, and we are refining our toolset to better support developers who wish to utilize ML in Unreal.


We have plans to make it a lot easier for users to create paths and then get characters to follow those paths nicely, on the AI and Motion Matching tooling side. Unfortunately, I can’t speak to the other areas.

Thanks for the bug report - we will look into it!

I refer you to the documentation on animating a MetaHuman:

You could create an Actor Blueprint that contains both the sphere and the character inside it. As long as they’re not parented to each other, you should be able to animate the sphere without rotating your character inside.

MetaHumans provide a good example of how to combine modular props and apparel into a single character. I recommend adding additional props, clothing, etc. in the same way: as a new skeletal mesh added to the Blueprint, using a Post Process Animation Blueprint or Master Pose to attach it to the MetaHuman. This can have its own Control Rig for custom animation.
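The Master Pose attachment step can be sketched in C++ as well. This is a hedged sketch for UE 5.0 (in 5.1+ `SetMasterPoseComponent` was renamed `SetLeaderPoseComponent`); `Owner`, `BodyMesh`, and `ClothingAsset` are illustrative variables:

```cpp
// Illustrative: add a clothing skeletal mesh component that follows the body's pose.
USkeletalMeshComponent* ClothingMesh = NewObject<USkeletalMeshComponent>(Owner);
ClothingMesh->SetSkeletalMesh(ClothingAsset);
ClothingMesh->SetupAttachment(BodyMesh);
// Follow BodyMesh's pose directly; ClothingMesh needs no Animation Blueprint of its own.
ClothingMesh->SetMasterPoseComponent(BodyMesh);
ClothingMesh->RegisterComponent();
```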

The best way to find MH Control Rigs is to use the Content Browser’s Filter functionality and filter by Control Rigs in the project.

Creating an animation pipeline to support differently proportioned characters requires some pre-planning and careful setup.

First and foremost is to ensure that your characters share the same Skeleton asset and have identical reference poses (I recommend the A-pose).

Then make sure that the Skeleton’s bone translation retargeting settings are set up correctly. Usually the Pelvis is set to “Animation Relative”. This ensures that the height of the hips will play back correctly on characters of different heights.

The rest of the skeleton is usually best set to “Skeleton”. This ensures each bone’s translation keeps the proportions coming from the Skeletal Mesh, not the translation from the animation sequence. You will know you got this right when your “dwarf” animation does not squash your “giant” down to dwarf proportions.

This concludes the basic “translation retargeting” setup.
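The same settings can also be applied from editor code. A hedged sketch assuming `USkeleton::SetBoneTranslationRetargetingMode` and the common UE mannequin bone names (“root”, “pelvis”); adapt the names to your skeleton:

```cpp
// Illustrative editor-side helper: set the whole hierarchy to "Skeleton" so bone
// translations keep the mesh's proportions, then override the pelvis with
// "Animation Relative" so hip height plays back correctly on all characters.
#include "Animation/Skeleton.h"

void SetupTranslationRetargeting(USkeleton* Skeleton)
{
    const FReferenceSkeleton& RefSkel = Skeleton->GetReferenceSkeleton();

    const int32 RootIndex = RefSkel.FindBoneIndex(TEXT("root"));
    Skeleton->SetBoneTranslationRetargetingMode(RootIndex, EBoneTranslationRetargetingMode::Skeleton, /*bChildrenToo=*/true);

    const int32 PelvisIndex = RefSkel.FindBoneIndex(TEXT("pelvis"));
    Skeleton->SetBoneTranslationRetargetingMode(PelvisIndex, EBoneTranslationRetargetingMode::AnimationRelative, /*bChildrenToo=*/false);
}
```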

From here, there are things you will have to do to “fix up” contact points. IK bones are often added to skeletons and baked to “contact” points (like door knobs) in animations that have contact in them. These IK bones can then be used as target IK locations directly in an IK Rig or Control Rig at runtime (and blended on/off as needed depending on the context).

If you are looking to copy animation to a different skeletal mesh to modify it (for example to customize your dwarf animation to look better on a giant) then you would be best served by the IK Retargeting system that can bake out new animation sequences and provide more control over how the retarget is applied.

Modular clothing remains a tough challenge. There are no magic solutions here that enable sharing of clothing assets between different proportions. There are offline solutions that can accelerate the creation of meshes for different proportions (like Wrap 3d), but these may be out of scope for smaller teams.

Creating assets like Capes that can be attached to any character and still look great on a single platform is a complex problem. Combining that with a wide range of platforms that games like Fortnite support greatly increases the complexity.

For tutorials about Cloth Simulation in UE, check out the tutorials available here: Epic Developer Community Learning | Tutorials, Courses, Demos & More – Epic Developer Community

You can “pin” hands/feet using IK in the retargeter by selecting the bone chain with the IK goal you want to pin and set “Blend to Source” to 1.0 (default is 0).

It will position the IK goal at the location of the source bone. So, for example, if you used “Blend to Source” on the arms in an “open door” animation, the character’s hands would go to the door knob regardless of how tall they are.

There are many other features like this in the chains settings and more to come!

Well, the issue here is that it doesn’t scale, and I need to globally scale my rig to the actor for the day. I’m sure there’s a way to figure it out from there, but I just haven’t been able to get scaling my actor and controlling the source blends to work the way Mobu does.

  • We would like to add the ability for Control Rig to get and modify assets like Pose Assets. It’s not there yet, but stay tuned.
  • Unfortunately, not yet.
  • You can use “Bake Animation Sequence” to generate an Animation Sequence asset, “Edit with FK Control Rig” to generate keys, and “Bake to Control Rig” to generate keys for the Control Rig. These can all be used together to swap out rigs of differing complexity.

This is a great question for the MetaHuman Creator team. You can find their community page linked here: MetaHuman - Unreal Engine

That’s all the time we have for today! Thank you all so much for coming by and asking questions from our awesome team, and of course a big thank you to Jeremiah, Kiaran, Paddy and Tom!!

Happy developing everyone!