Retarget Metahuman face animation to an external custom skeletal mesh with different bones.

I have a custom character in Maya with a custom face and body rig. And I have some stereo HMC facial recordings that I want to apply to it. I am trying to avoid a 3rd party vendor to process all this footage into useful mocap animation data. Trying to figure out if there is a pipeline for this. I have been able to calibrate, solve and process 1 take of this facial mocap recording, and apply this facial animation to a metahuman.

The next step would hopefully be to retarget this animation to this separate custom character skeleton rig, with different face bones. And then export that out as a baked FBX to be used in Maya, and ideally also reapply the face and body rig that the character has in Maya.

I'd imagine that I import this custom skeletal mesh into Unreal, then retarget the MetaHuman face to the imported skeleton, somehow bake the facial animation sequence to this new skeleton, then export out a new baked .fbx and import it back into Maya. Is any of this possible?

Would something like this work in my situation? The solution is a little vague and I don't fully wrap my head around it, but I have some limited experience with BPs.

Appreciate any help. Thanks

Since around 2020 the faces generally use curves/morphs to work.
All you need to do is move the curve values from one model to another as part of an animation - provided they use the standard (Apple-defined) naming for the morph targets, "everything just works™".
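To illustrate the naming point: if the animation's float curves and the target mesh's morph targets both follow the standard ARKit names, the transfer is just a name match. A minimal Python sketch of that idea (the curve/keyframe data structures here are made up for illustration, not any engine or DCC API; only a subset of the 52 ARKit names is shown):

```python
# A handful of the 52 standard ARKit blendshape names (subset for brevity).
ARKIT_NAMES = {"jawOpen", "eyeBlinkLeft", "eyeBlinkRight",
               "mouthSmileLeft", "mouthSmileRight", "browInnerUp"}

def transferable_curves(anim_curves, target_morphs):
    """Return only the curves whose names exist as morph targets
    on the target mesh -- these 'just work' when copied over."""
    return {name: keys for name, keys in anim_curves.items()
            if name in target_morphs}

# anim_curves: curve name -> list of (frame, value) keys
anim_curves = {
    "jawOpen":      [(0, 0.0), (12, 0.85), (24, 0.0)],
    "browInnerUp":  [(0, 0.1), (24, 0.4)],
    "someCustomCtrl": [(0, 1.0)],   # no matching morph -> won't transfer
}

print(sorted(transferable_curves(anim_curves, ARKIT_NAMES)))
# -> ['browInnerUp', 'jawOpen']
```

Curves with non-standard names (like `someCustomCtrl` above) are the ones that need a manual mapping or tweaking afterwards.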

Thanks but can you point me to how I can do this? Where do I move the curve values? How do I move the values?
My custom fbx exported from Maya into Unreal doesn’t have any morph targets. I have so many questions.

You open any animation with what you want in any DCC and you copy the curves over to your file - each DCC does it differently, so you need to look it up for whatever you use.

Your FBX doesn't have curves because you probably never animated it.
But if you export an animation from the engine and play it back in the DCC you should see the curves (and other stuff like bones).
You just move things between animations, and assuming your mesh was properly set up you are generally good to go. (Obviously it's an approximation, not the exact same expression, so you will need to tweak some things at some point.)
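To make the "move things between animations" step concrete, here is a rough Python sketch of copying per-curve keyframes from one animation into another, with an optional rename map for curves whose names differ between rigs. The dict-based animation data and the `jaw_open_ctrl` name are hypothetical; in a real DCC you would use its copy/paste-keys tooling (e.g. Maya's `copyKey`/`pasteKey`), so treat this as the idea rather than a pipeline:

```python
def copy_curves(source_anim, target_anim, rename=None):
    """Copy float-curve tracks (name -> [(frame, value), ...]) from
    source_anim into a copy of target_anim. Curves listed in `rename`
    are written under the target rig's curve name; others keep their own."""
    rename = rename or {}
    merged = {name: list(keys) for name, keys in target_anim.items()}
    for name, keys in source_anim.items():
        merged[rename.get(name, name)] = list(keys)
    return merged

source = {"jawOpen": [(0, 0.0), (10, 0.9)]}
target = {"blink_L": [(0, 0.0)]}

# Hypothetical rename: the custom rig calls its jaw curve "jaw_open_ctrl".
result = copy_curves(source, target, rename={"jawOpen": "jaw_open_ctrl"})
print(sorted(result))   # -> ['blink_L', 'jaw_open_ctrl']
```

The rename map is where the "more bones/curves on the MetaHuman than on my rig" mismatch would be handled: curves with no sensible target simply aren't listed, and they pass through (or get dropped) unchanged.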

thanks again for the quick response. I didn't create this custom rig, so I am not sure how it was set up. But it is a professionally made rig that many animators will use.
So you are saying I export out the Metahuman facial animation to an FBX file
Import that into Maya with my custom skeleton rig.
And then manually copy the animation curve of every bone individually, and paste it onto the similar bone on my custom rig? I will google whether there is a better process for moving the animations.
It does seem the Metahuman has a lot more bones than my rig, what do you do in that instance?
If that is correct, is this the best process for doing such a thing, especially if I end up having 100s of animations that need to be moved/retargeted like this?

Thanks again, appreciate the help.

The better way for high volume would be to just retarget within the engine…
Aside from that, if you retarget first and then export the retargeted animation, you will get working animation bone/curve tracks with the right naming and minimal work…

right, that was my first and original question in my OP. I haven't found any info online on how to retarget a MetaHuman face to a custom face with different bones, like you can easily do with body performance retargets. So I came here asking if anyone could point me in the right direction on how to do that exactly. I've seen stuff on how to use a custom mesh on a MetaHuman, but that is not what I want.

I also did export my MetaHuman as an FBX to bring into Maya with my Maya rig. But with default options the export is all sorts of messed up. It works about 80% of the time, but gets all wrinkly when doing expressions, and every few seconds the mesh gets completely mangled, like it's scaling in on itself randomly.

Thanks again for the help.

Edit: I am seeing there is a tool or asset in the MetaHuman sample project that can convert ARKit blendshapes to rigs designed with ARKit, which I believe is what you were alluding to in your original response. It does seem this custom rig was designed for ARKit plus additional blendshapes on top, so I will be checking out that route.

The face animation done with ARKit doesn't need anything special.

If both rigs are correctly rigged, all you need to do is have an animation with the animated curves.

Unlike bones, curves are freestanding. You can have any curve on any skeletal mesh.
That's why copy-pasting the curves works fine to move animations between assets.