Retarget MetaHuman Facial Animation to Custom Character

Hi, it's pretty much what I said in the post above: you need to gather the 0-to-1 values from ARKit and use them in your facial rig.
If your facial rig has blend shapes, you just need to match the names to transfer the values. For example:
The opening of the jaw in ARKit corresponds to a blend shape called JawOpen.
Your character also has a similar blend shape, but called OpenMouth.
What you can do is read the JawOpen value (from the BP or from the ABP) using a simple GetCurveValue node, and store/send the data to your character, driving the OpenMouth blend shape.
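The name-matching step above is just a lookup table from ARKit curve names to your rig's blend shape names. Here is a minimal language-agnostic sketch of that idea (the custom shape names like OpenMouth and Blink_L are made-up examples; substitute your character's actual shapes):

```python
# Hypothetical map: ARKit curve name -> your character's blend shape name.
ARKIT_TO_CUSTOM = {
    "JawOpen": "OpenMouth",
    "EyeBlinkLeft": "Blink_L",
    "EyeBlinkRight": "Blink_R",
}

def retarget_curves(arkit_values):
    """Pass the 0-1 ARKit curve values through the name map,
    dropping any curve the custom rig has no matching shape for."""
    return {
        ARKIT_TO_CUSTOM[name]: value
        for name, value in arkit_values.items()
        if name in ARKIT_TO_CUSTOM
    }

# Example: JawOpen is remapped, TongueOut is dropped (no custom shape for it).
print(retarget_curves({"JawOpen": 0.5, "TongueOut": 1.0}))
# → {'OpenMouth': 0.5}
```

In Unreal you would do the same thing with a GetCurveValue per ARKit curve feeding the matching morph target on the other character; the table is the only part you have to author per rig.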

If instead you have a joint based rig, you have two options:

  1. Build a rig inside Unreal Engine using Control Rig, then set up something similar to a pose library, where you store the control values so that you can achieve a pose by simply selecting it.
  2. If you already have a facial rig in Maya and you don’t want to redo the rig in Unreal, you can export the ARKit blend shape values from UE to Maya and do pretty much the same pose library setup, but entirely in Maya.

If you have multiple characters with the same joint structure, I strongly suggest creating the facial control rig in Unreal, since you can reuse it across characters and have the ARKit values transferring in real time using Blueprint logic.
