How can I fit facial muscle movement tracked outside the Live Link app to the MetaHuman rig?

It seems the Live Link Face app for iPhone provides a function to fit the mesh to an individual user’s face.
Then, at some point in the pipeline, the tracked data should somehow be ‘fitted’ to the standard blendshapes or to an individual MetaHuman model’s blendshapes, right?
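
For context on what that incoming data looks like: the app streams roughly 52 ARKit blendshape coefficients per frame, each a normalized curve value in [0, 1], keyed by curve name. A minimal sketch of the renaming/retargeting step onto a character’s own curves might look like the following — the target curve names here are purely illustrative, not the actual MetaHuman control names:

```python
# Conceptual sketch only -- not the actual Live Link / MetaHuman pipeline code.
from typing import Dict

# Hypothetical mapping from incoming ARKit-style curve names to a character's
# own blendshape/curve names (real MetaHuman control names differ).
CURVE_MAP: Dict[str, str] = {
    "EyeBlinkLeft":   "CTRL_eye_blink_L",
    "EyeBlinkRight":  "CTRL_eye_blink_R",
    "JawOpen":        "CTRL_jaw_open",
    "MouthSmileLeft": "CTRL_mouth_smile_L",
}

def retarget_frame(arkit_frame: Dict[str, float]) -> Dict[str, float]:
    """Rename incoming curves to the character's curve names, clamped to [0, 1]."""
    out = {}
    for arkit_name, value in arkit_frame.items():
        target = CURVE_MAP.get(arkit_name)
        if target is not None:
            out[target] = max(0.0, min(1.0, value))
    return out

# Example frame as it might arrive from the app:
frame = {"EyeBlinkLeft": 0.82, "JawOpen": 0.15, "MouthSmileLeft": 0.40}
print(retarget_frame(frame))
```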

Is there any good documentation about those fitting steps?
Are they done in the app, or somewhere in the Unreal Engine Blueprint?
If the former, can the user access the algorithm?

Yes, this is a problem… You need a 1-to-1 mapping of your face to a digital character’s face. I created a product to do this and you can see it here…
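
To make the idea concrete, one simple interpretation of that per-user “fitting” is a calibration pass: record the performer’s neutral and maximum value for each curve, then rescale incoming values so the character sees the full 0–1 range. This is only an illustrative sketch of the concept, not the algorithm the app or any particular product actually uses:

```python
# Conceptual per-user calibration sketch (illustrative, not the app's algorithm).
from typing import Dict

def calibrate(neutral: Dict[str, float],
              maximum: Dict[str, float],
              frame: Dict[str, float]) -> Dict[str, float]:
    """Rescale each curve so the performer's own range maps to the full 0..1 range."""
    out = {}
    for name, value in frame.items():
        lo = neutral.get(name, 0.0)
        hi = maximum.get(name, 1.0)
        span = max(hi - lo, 1e-6)               # avoid division by zero
        out[name] = max(0.0, min(1.0, (value - lo) / span))
    return out

# Example: this performer's JawOpen curve never exceeds 0.70,
# so 0.70 should drive the character's jaw all the way to 1.0.
neutral = {"JawOpen": 0.05}
maximum = {"JawOpen": 0.70}
print(calibrate(neutral, maximum, {"JawOpen": 0.70}))   # -> {'JawOpen': 1.0}
```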