We are trying to use animation data from an animation sequence created for one character with its own skeleton and naming convention on another character with a different skeleton and naming convention. We need this to work both at game runtime and in Sequencer.
For example, I’d like to connect the animation curve CTRL_expressions_jawOpen from a MetaHuman animation sequence to a custom skeleton that has its own morph targets. I have run many tests, but the most promising ones are the following:
Control Rig only: This seems to be the best solution for us, but I can’t access the currently playing animation to use it as input to the Control Rig, and I can’t use the Get Curve Value node to read a value from a curve that doesn’t exist in the current skeleton’s curves. If we add this curve name to the custom skeleton curves or to the Control Rig curves, the value is never updated from the currently playing animation (in Sequencer or in game).
Animation Blueprint: This works exactly as expected if I plug the animation sequence directly into the Animation Blueprint and then use a Modify Curve node to send the value to our custom morph target, or to a Control Rig with an input variable that receives the value from a Get Curve Value node. However, as with Control Rig, I can’t find a way to access the currently playing animation sequence. I also tried using the Event Blueprint Update Animation, but it seems to be ignored by Sequencer.
Animation Post Process: Our Get Curve Value node is never updated from the currently playing animation, either in Sequencer or in game.
Actor Blueprint: I tried using Event Tick. It works when Simulation mode is on, but the evaluation is inconsistent: it alternates between the value from the current animation and the default value.
Does anyone know what is going wrong in our process?
In Unreal, animations and animation sequences use the skeleton asset to drive everything, especially at runtime. That said, there are a couple of tools to help you play animation data from one skeleton on another.
First, the tool I would recommend is IK Retargeting. This allows you to transfer data from one source to another, and those sources don't have to match. It can be done offline, creating new animation sequences that you can then use in traditional ways, or it can be done at runtime. I've linked some documentation and a video below.
Documentation - This is slightly out of date, but the video after talks about
This provides the most flexibility while being more performant than running Control Rig to play every animation. I would still highly recommend that you retarget and save new animations that match your new source.
Second: depending on the differences between your skeletons (say they share most core bones but differ on some leaf bones), you can use the compatible skeletons feature.
This allows you to specify that an animation authored on one skeleton can play on another. Bones that don't exist on the target remain at the reference pose.
Lastly, I wouldn't recommend playing animation through Control Rig as your default approach, especially if you're building a game. It will not be performant, and the tools for blending animation for gameplay won't run through it either. Control Rig is really good at modifying poses for runtime IK and creating motion for parts that you didn't hand-key.
Also, I want to add another tool. In the animation graph we also have Remap Curves.
Unfortunately, we don't have documentation for this, but it is a node that sits in your animation graph and maps curves from your animation input and animation sources so that they play on a different target. This is how you would take the CTRL_expressions_jawOpen curve you mention and play it on a different output.
As an example, this is the technical documentation for the struct that holds that node's settings.
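To make the idea concrete, here is a minimal conceptual sketch of what a curve remap does. This is not the Unreal API, just plain C++ illustrating the mechanism: a name-mapping table takes a value recorded under a source curve name and republishes it under a differently named target curve.

```cpp
#include <cassert>
#include <map>
#include <string>

// Conceptual sketch (not Unreal API): a curve set is a name -> value map,
// and remapping copies values across under new names.
using CurveSet = std::map<std::string, float>;

CurveSet RemapCurves(const CurveSet& Source,
                     const std::map<std::string, std::string>& NameMap)
{
    CurveSet Result;
    for (const auto& [SourceName, TargetName] : NameMap)
    {
        auto It = Source.find(SourceName);
        if (It != Source.end())
        {
            // The target name can drive a morph target or any other curve.
            Result[TargetName] = It->second;
        }
    }
    return Result;
}
```

So a source value on CTRL_expressions_jawOpen would come out under whatever target name you configure, ready to drive the target character's curve or morph target.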
I might not have been precise enough in my first post, so I’ll try to explain my issue better.
The main purpose of these tests is to use the same MetaHuman animation sequence captured with the MetaHuman Performance tool on multiple characters, including custom non-MetaHuman skeletal meshes. I would like to avoid duplicating the "same" data, so I need to re-route some animation data, such as CTRL_expressions_jawOpen, which is an additive animation curve in a MetaHuman animation sequence, to a custom morph target (head_mouthOpen) and apply some automatic corrections to the jaw bone if possible (through Control Rig, I guess?).
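In other words, the routing I'm after looks roughly like this sketch (plain C++, not Unreal API; MaxJawAngleDeg is a made-up parameter for illustration): one incoming curve value feeds both a morph target weight and a derived jaw-bone correction.

```cpp
#include <algorithm>
#include <cassert>

// Conceptual sketch (not Unreal API): route one additive facial curve
// value to a morph target weight plus a simple jaw-bone correction.
struct FacialPose
{
    float MouthOpenMorphWeight; // would drive the head_mouthOpen morph target
    float JawPitchDeg;          // additive rotation applied to the jaw bone
};

FacialPose RouteJawOpen(float JawOpenCurveValue, float MaxJawAngleDeg)
{
    FacialPose Pose{};
    // Morph target weights are typically clamped to [0, 1].
    Pose.MouthOpenMorphWeight = std::clamp(JawOpenCurveValue, 0.0f, 1.0f);
    // A proportional correction on the jaw bone, e.g. applied via Control Rig.
    Pose.JawPitchDeg = Pose.MouthOpenMorphWeight * MaxJawAngleDeg;
    return Pose;
}
```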
We have a couple of tools for this, but the primary tool we recommend is the RemapCurveOP in the retargeting system. The IK Retargeter does much more now, and in 5.6 it was refactored to include some new operations. So in the case you describe, you would have a retarget asset with a single remap curve operation.
[Image Removed]
Then, in the AnimGraph, you can run that retarget asset on your incoming pose by creating a RetargetPoseFromMesh node and setting the RetargetFrom variable to SourcePinAsPose.
[Image Removed]
This is also faster than running a Control Rig.
Sorry for the late reply. I can't get this method to work; here are screenshots of my settings. I have no idea what I'm missing.
The animation sequence I use has values on the two curves I'm trying to transfer: [Image Removed]
The skeletal mesh (and the skeleton) has curves corresponding to the destination names, and the auto morph targets option is checked.
[Image Removed]
The curves to remap are set up in the retargeter asset.
[Image Removed]
The retargeter settings in the Animation Blueprint (I linked the animation sequence directly; which node do you use to get the animation from Sequencer? It seems like the Animation Blueprint is overridden by Sequencer).
Just wanted to reach out: we've just gotten back from our holiday break, and I hope to have an answer for you, with an example, over the next couple of days.
Apologies for the long delay. After doing a bit more research, we found a bug, which I've logged; you can follow it here: https://issues.unrealengine.com/issue/UE-360794 It will take about 24 hours for it to show up.
Note that the above is the intended workflow, but when the Retarget node is set to retarget the pose using the pose input pin, the method will not work. Unfortunately there isn't a workaround for that; you would have to use RetargetFromMesh to be able to do this. The reason that method does work is that it is the typical workflow for Fortnite, shown here: https://dev.epicgames.com/documentation/en-us/unreal-engine/facial-animation-sharing-in-unreal-engine
Now, I can't provide an easy workaround because of the way the retargeter operations function, but if you would like to investigate on your own, there are two specific areas of code to look at. FIKRetargetCurveRemapOp::AnimGraphPreUpdateMainThread is where we build up the curves that we want to transfer through the CurveRemap operation. It is called inside FAnimNode_RetargetPoseFromMesh::PreUpdate, but that is where the code is currently blocked from running when the input pose pin is used instead of pose-from-mesh.
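The gating described above can be pictured with a simplified, hypothetical sketch (not the actual engine code; the enum and members are invented for illustration): when the node takes its pose from the input pin rather than from a source mesh, the pre-update step never gathers the curves, so the curve remap operation has nothing to transfer.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch of the behavior behind UE-360794, not engine code:
// curve gathering only runs on the pose-from-mesh path.
enum class ERetargetSourceMode { FromMesh, FromInputPose };

struct RetargetNodeSketch
{
    ERetargetSourceMode SourceMode = ERetargetSourceMode::FromInputPose;
    std::vector<std::string> GatheredCurves;

    void PreUpdate(const std::vector<std::string>& SourceCurves)
    {
        if (SourceMode == ERetargetSourceMode::FromMesh)
        {
            // Pose-from-mesh: curves are gathered and feed the remap op.
            GatheredCurves = SourceCurves;
        }
        // FromInputPose: gathering is skipped, so the remap op sees nothing.
    }
};
```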