Retarget MetaHuman Facial Animation to Custom Character

Hi Community,

I am trying to achieve a complete Unreal Engine based animation workflow where I can animate, retarget mocap, and use it on my in-game custom skeletal mesh, which is based on the UE5 Manny rig.

The main issue is: how do I retarget facial animations if my custom skeletal character has its own custom facial rig setup? I could not find any workflow for retargeting MetaHuman facial animation to a custom skeletal mesh that has a custom facial rig (unfortunately, UE5 Manny and Quinn have no facial rig or facial features).

Is it possible at all? Can I do everything inside Unreal without depending on external 3D software like Blender or Maya?

PS: I cannot use the MetaHuman bone setup for my game, it's too heavy; that is the main reason I am looking for alternatives.

If your character has a control rig for the face, you can then use recorded MetaHuman facial animations (captured with an iPhone, Faceware, or Facegood), read the control rig values (pretty much all 0 to 1), and use them to drive the controls on your facial rig.

So basically you use the MetaHuman facial rig as a value proxy for your rig.


Can you point me in the right direction on how to do this technically?

Hi, I wanted to do this but I am not sure how to achieve it. I have an animation sequence on an ARKit character and want to transfer it to an MMD character; the two have different sets of bones. Can you kindly point me in the right direction? Thank you

Hi, it's pretty much what I said in the post above, meaning that you need to gather the 0-to-1 values from ARKit and use them in your facial rig.
If your facial rig has blend shapes, you just need to match the names to transfer the values. For example:
The opening of the jaw in ARKit corresponds to a blend shape called JawOpen.
Your character also has a similar blend shape, but it is called OpenMouth.
What you can do is read the JawOpen value (from the Blueprint or the Anim Blueprint) using a simple GetCurveValue node, then store/send the data to your character to drive the OpenMouth blend shape.
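For reference, here is a rough C++ sketch of that curve-to-blend-shape transfer (the same wiring can be built with Blueprint nodes instead). The actor class, the property names, and the JawOpen/OpenMouth pairing are placeholders for this example; the only engine calls relied on are GetCurveValue and SetMorphTarget.

```cpp
// FaceCurveRetargeter.h (hypothetical helper actor that owns both meshes)
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "FaceCurveRetargeter.generated.h"

class USkeletalMeshComponent;

UCLASS()
class AFaceCurveRetargeter : public AActor
{
    GENERATED_BODY()

public:
    AFaceCurveRetargeter() { PrimaryActorTick.bCanEverTick = true; }

    // Mesh playing the recorded MetaHuman/ARKit facial animation.
    UPROPERTY(EditAnywhere)
    USkeletalMeshComponent* SourceMesh = nullptr;

    // Custom character mesh that owns the OpenMouth blend shape.
    UPROPERTY(EditAnywhere)
    USkeletalMeshComponent* TargetMesh = nullptr;

    virtual void Tick(float DeltaSeconds) override;
};

// FaceCurveRetargeter.cpp
#include "FaceCurveRetargeter.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"

void AFaceCurveRetargeter::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (!SourceMesh || !TargetMesh)
    {
        return;
    }

    if (UAnimInstance* SourceAnim = SourceMesh->GetAnimInstance())
    {
        // GetCurveValue returns 0 when the curve is missing, so this is safe.
        const float JawOpen = SourceAnim->GetCurveValue(TEXT("JawOpen"));

        // Drive the differently named blend shape with the same 0-1 value.
        TargetMesh->SetMorphTarget(TEXT("OpenMouth"), JawOpen);
    }
}
```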

If instead you have a joint-based rig, you have two options:

  1. Build a rig inside Unreal Engine using Control Rig, then set up something similar to a pose library, where you store the control values so that you can reach a pose simply by choosing one (see the sketch after this list).
  2. If you already have a facial rig in Maya and you don’t want to redo the rig in Unreal, you can export the ARKit blend shape values from UE to Maya and do pretty much the same pose library setup, but entirely in Maya.
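As a rough illustration of option 1, here is a minimal C++ sketch of the pose-library idea. It assumes the custom character's face is driven by a Control Rig Component whose facial controls are simple float channels; the pose and control names are made up, and you should verify calls like UControlRigComponent::SetControlFloat against your engine version.

```cpp
// Minimal sketch: a stored facial pose is just a map of control values that
// can be pushed onto the character's Control Rig at a chosen weight.
#include "CoreMinimal.h"
#include "ControlRigComponent.h"

// One stored facial pose: control name -> value (typically 0-1).
struct FFacePose
{
    TMap<FName, float> ControlValues;
};

// Apply a stored pose, scaled by a 0-1 weight, to the character's face rig.
static void ApplyFacePose(UControlRigComponent* FaceRig,
                          const FFacePose& Pose,
                          float Weight)
{
    if (!FaceRig)
    {
        return;
    }

    for (const TPair<FName, float>& Entry : Pose.ControlValues)
    {
        FaceRig->SetControlFloat(Entry.Key, Entry.Value * Weight);
    }
}

// Example usage with made-up control names:
// FFacePose Smile;
// Smile.ControlValues.Add(TEXT("MouthCornerL_Up"), 1.0f);
// Smile.ControlValues.Add(TEXT("MouthCornerR_Up"), 1.0f);
// ApplyFacePose(FaceRigComponent, Smile, 0.75f);
```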

If you have multiple characters with the same joint structure, I strongly suggest creating the facial control rig in Unreal, since you can reuse it across characters and you can also transfer the ARKit values in realtime by using Blueprint logic.
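To make that realtime transfer reusable, the single-curve example above can be generalized to a name map (for example stored in a Data Asset and evaluated every Tick), so the same mapping works for every character that follows your naming convention. This is only a sketch; every curve and morph target name is illustrative.

```cpp
// Sketch: push a whole set of ARKit curves onto a custom character's
// blend shapes every frame, using a configurable name map.
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"

void TransferFacialCurves(USkeletalMeshComponent* SourceMesh,
                          USkeletalMeshComponent* TargetMesh,
                          const TMap<FName, FName>& CurveToMorph)
{
    if (!SourceMesh || !TargetMesh)
    {
        return;
    }

    UAnimInstance* SourceAnim = SourceMesh->GetAnimInstance();
    if (!SourceAnim)
    {
        return;
    }

    for (const TPair<FName, FName>& Pair : CurveToMorph)
    {
        // Read the ARKit curve (e.g. "JawOpen") and forward its value to the
        // matching blend shape on the custom character (e.g. "OpenMouth").
        const float Value = SourceAnim->GetCurveValue(Pair.Key);
        TargetMesh->SetMorphTarget(Pair.Value, Value);
    }
}
```

For a joint-based face, the same loop would forward the values to your Control Rig controls instead of morph targets.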


Thank you so much for replying so swiftly. I wanted to do exactly what you mentioned above: connect one blend shape value to another, then play the ARKit animation sequence on the custom character.

I understand the concept but I lack the technical knowledge to do it using the BP/ABP; I'm still new to UE :frowning: Would you be so kind as to show me? Thank you

I think this is what you’re looking for.