At the moment, creating complex animations feels more difficult in the Engine than in dedicated animation software. While it is difficult to pinpoint exactly why, here are a few small things I have noticed:
Limited support for bone/control based constraints.
Switching between pivot points (Average location / Individual elements / Custom) is difficult.
Copying / pasting poses takes time as it requires creating a control rig pose asset.
There is no way to filter the tracks shown in the Sequencer down to the controls selected in the viewport.
Are there plans to further expand the Animation Mode in the editor? Having a fully-fledged animation suite in the Editor is a dream scenario for developers who want to move their animation workflow entirely to the Engine!
Are there plans to support control rig controls being able to interact with other objects in the scene?
+1! This would also be useful for creating interactions between different characters (e.g. when creating a grapple move). The following features could also be a big help:
Allow selection and manipulation of items on multiple rigs simultaneously, including using a control on one rig as a pivot point for items on another rig.
Facilitate passing data between control rigs (possible using the control rig component, but a bit difficult outside of gameplay when working in the Editor’s Animation Mode).
I reached out to Epic’s resident machine learning expert Daniel Holden for an answer.
“The challenge for Epic is in finding a workflow for these tools which is accessible to everyone, not just machine learning experts and researchers. Machine Learning has already proven it can produce incredible results so it is definitely something Epic is looking at but there is no specific focus on quadrupeds right now.”
Slots/montages are designed to allow gameplay code to insert and control animation directly in the graph, for example for contextual interactions, melee attacks, etc. This allows logic to be somewhat decoupled between gameplay and the animation graph.
They also bundle a number of features that are more or less only available in the context of Montages, e.g. efficient server-side root motion and built-in split-body support (via slot groups).
Montages also have the advantage of being more closely integrated with replication systems in the engine (see docs here, at the bottom).
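The decoupling idea above can be illustrated with a minimal sketch. This is plain Python with hypothetical names, not engine code: a slot is an insertion point in the graph that gameplay code can override without the graph logic knowing anything about the gameplay context.

```python
# Conceptual sketch of slot-based decoupling. All names here are
# hypothetical illustrations, not the Unreal API.

class Slot:
    """A named insertion point in the anim graph."""
    def __init__(self, name):
        self.name = name
        self.active_montage = None  # set only by gameplay code

    def evaluate(self, source_pose):
        # The graph evaluates normally; the slot only overrides the
        # pose while gameplay has a montage playing in it.
        if self.active_montage is not None:
            return self.active_montage
        return source_pose

def montage_play(slot, montage_pose):
    # Gameplay inserts animation without touching graph logic.
    slot.active_montage = montage_pose

def montage_stop(slot):
    slot.active_montage = None

upper_body = Slot("UpperBody")
print(upper_body.evaluate("locomotion_pose"))  # locomotion_pose
montage_play(upper_body, "melee_attack_pose")
print(upper_body.evaluate("locomotion_pose"))  # melee_attack_pose
montage_stop(upper_body)
print(upper_body.evaluate("locomotion_pose"))  # locomotion_pose
```

The point of the pattern is that the graph never branches on gameplay state directly; gameplay only ever talks to the slot.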
This is something that unfortunately falls completely on my shoulders: why is there no documentation for this big tool you’re making?
Firstly, to all who are excited about the Motion Matching work, apologies for our lack of communication. I actually did write up a big May update on how the system works and how to use it. However, in May we decided to re-write a large piece of the backend of our Motion Matching system, which made that document out of date and is why it didn’t go public.
The re-write is intended both to make the system as extensible by users as possible and to simplify the structure of the tool. It should be finished this month, and I expect to deliver a public update on how to use Motion Matching and where it’s going in the future.
The main idea behind using a separate hierarchy of IK bones is to provide a stable target that represents the location of contact points for the hands and feet.
For example, if you have an animation of a short character opening a door, reaching for a door handle, the location/height of the door handle is essentially “baked” into the animation.
But if you then play this animation back on a tall character, they will reach for a location ABOVE the door handle.
To fix this, an IK bone can be baked at the actual height of the door handle, where it remains regardless of blending or playback on different proportions. The IK bone stays at the correct location because it is in a separate hierarchy and its translation retargeting setting is set to “Animation” (meaning the translation is not modified by the target character’s proportions).
Then you can utilize notifications or curves to blend IK on the arm to reach the location of the IK bone, which is at the location of the door handle!
There are many use cases for IK bones, but they all follow a similar pattern of needing a stable representation of a contact point to use as a target for IK. Hope that helps!
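The door-handle example can be reduced to a tiny numeric sketch. This is plain Python with hypothetical numbers, not engine code; it just contrasts a normal bone, whose translation is retargeted by the target skeleton’s proportions, with an IK bone whose translation is taken from the animation as authored.

```python
# Height of the door handle baked into the source animation (meters).
DOOR_HANDLE_HEIGHT = 1.0

def fk_hand_height(authored_height, proportion_scale):
    # A normal bone's translation is retargeted by the target
    # skeleton's proportions, so the hand drifts on characters
    # of a different height.
    return authored_height * proportion_scale

def ik_bone_height(authored_height, proportion_scale):
    # "Animation" translation retargeting: the translation comes
    # from the animation as authored, independent of proportions.
    return authored_height

tall_scale = 1.3  # hypothetical tall character
print(fk_hand_height(DOOR_HANDLE_HEIGHT, tall_scale))  # 1.3 -> above the handle
print(ik_bone_height(DOOR_HANDLE_HEIGHT, tall_scale))  # 1.0 -> on the handle
```

Blending IK toward the IK bone then pulls the hand back down from 1.3 to the stable 1.0 target.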
We are not currently aware of any plugins that allow import of this format. The files might need translating to another interchange format (e.g. FBX) via a DCC tool to get them into the engine.
The DefaultAnimationRig is used to tell tools like Sequencer what Control Rig to use for animating. Sequencer will automatically add a Default Animation Rig track when the Skeletal Mesh is added to the level.
Without ResetChildOnActivation, the animation sequence stays at whatever playback time it reached before blending out, so it behaves differently after the first activation. With ResetChildOnActivation enabled, the branch is reinitialized every time you switch to it.
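A minimal sketch of the two behaviors, in plain Python with hypothetical names (not engine code): one branch resumes its playback time on reactivation, the other resets it.

```python
# Sketch of a blend branch with and without reset-on-activation.

class Branch:
    def __init__(self, reset_child_on_activation):
        self.reset_child_on_activation = reset_child_on_activation
        self.playback_time = 0.0

    def activate(self):
        # With the flag set, the branch reinitializes on every switch;
        # without it, playback resumes where it left off.
        if self.reset_child_on_activation:
            self.playback_time = 0.0

    def tick(self, dt):
        self.playback_time += dt

resuming = Branch(reset_child_on_activation=False)
resetting = Branch(reset_child_on_activation=True)
for b in (resuming, resetting):
    b.activate()
    b.tick(0.5)   # first activation: play half a second
    b.activate()  # switch away and back

print(resuming.playback_time)   # 0.5 -> stays where it was
print(resetting.playback_time)  # 0.0 -> reinitialized
```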
While we do not currently have python support for generating / editing blueprints (or animation blueprints) we do have some degree of python support in various places across the engine and in our animation toolset.
The Lyra sample provides a pretty comprehensive example of how to dynamically link and encapsulate logic into your animation graphs at runtime alongside typical gameplay concerns. There is a live stream scheduled around the Fall in which this will be covered in detail.
As to the two sub questions:
Smoothly blending between linked graphs is currently only achievable using either manual blending (flip-flopping slots or a custom node) or inertial blending. The blend time is specified in the graph’s properties.
There are a number of ways to pass variables to sub-graphs. The original implementation (which still functions in UE5) allowed you to expose variables on your linked graphs/layers as pins on the node. These days the recommended workflow is to ‘pull’ data in your linked anim instances via Property Access, either from your ‘main’ anim instance or directly from the rest of the gameplay framework.
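The push-vs-pull distinction can be sketched in a few lines. This is plain Python with hypothetical names, not the Unreal API: the old style pushes a value into the linked layer through an exposed pin, while the Property Access style has the linked instance pull what it needs from the main instance itself.

```python
# Conceptual sketch: exposed input pins (push) vs Property Access-style
# reads (pull). All names are hypothetical illustrations.

class MainAnimInstance:
    def __init__(self):
        self.speed = 0.0  # updated from gameplay each frame

class LinkedLayerPush:
    # Original style: the outer graph pushes the value in via a pin,
    # so the caller must wire the value through every frame.
    def update(self, speed_pin_value):
        self.speed = speed_pin_value

class LinkedLayerPull:
    # Recommended style: the linked instance pulls what it needs
    # directly from the main instance (or the gameplay framework).
    def __init__(self, main_instance):
        self.main = main_instance

    def update(self):
        self.speed = self.main.speed

main = MainAnimInstance()
main.speed = 3.0

push_layer = LinkedLayerPush()
push_layer.update(main.speed)  # caller wires the value through

pull_layer = LinkedLayerPull(main)
pull_layer.update()            # layer reads it directly

print(push_layer.speed, pull_layer.speed)  # 3.0 3.0
```

The pull style keeps the node interface clean: adding a new input does not require adding a new pin and re-wiring every call site.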
Hey
Do you have any plans to make tutorials or documentation on Motion Matching (Motion Trajectory)? It looks like a really great feature for the future. If there is any existing documentation on the topic that you can share, I would also highly appreciate it.
Thanks!
Curve type is, as you say, currently only modifiable on a whole-sequence basis. There is an internal effort underway to bring the tooling around animation sequences to parity with Sequencer, which should allow per-key modifications like this in the future.
For your rigid body sim, while the exact problems you are having aren’t clear, this is something that was discussed in a live stream a while ago, which may help.
In the Unreal docs it is noted that both IK Rig and Control Rig can be procedurally affected in animation blueprints. What are the ideal use cases for using one over the other to achieve Full Body IK adjustments to animations at runtime, if I am not intending to ever manually animate on the Control Rig? Can IK Rig be used for things like the Valley of the Ancients slope warping just as well as Control Rig was?
Thanks!
IK Rig is a simplified “rig” that specializes in IK. It uses a stack of solvers and an array of input goal transforms to generate a pose. The IK Rig also acts as a home for retargeting characterization data.
Control Rig is a much broader rigging system that can do procedural modifications at runtime, or be used for keyframe animation in Sequencer.
In the future, IK Rigs will be embeddable in Control Rigs, providing a convenient way to create and tune IK setups for use in larger, more sophisticated runtime or keyframe rigs.
For now, IK Rigs are primarily useful as a way to interactively create and tune IK setups for use in an animation blueprint, or to transfer animation between different skeletal meshes using the IK Retargeter asset.
IK Rigs can be used anywhere you would otherwise use one of the many different IK nodes in the anim graph. Regarding Full Body IK, the solver is identical between IK Rig and Control Rig, so it’s really a matter of which suits your particular use case better. For retargeting, you will need an IK Rig. For simple IK tasks at runtime, I would probably opt for an IK Rig anim node. For sophisticated procedural modifications or keyframe animation, go with Control Rig!