Creating a custom MetaHuman with scanned blendshapes... is it possible?

Hi everyone. I am creating a short film with realistic characters based on scanned faces of real actors. My idea is to scan not only their neutral faces but also some expressions, so I can create a few blendshapes. Then, when creating the MetaHuman character, I adapt the scanned mesh to the MetaHuman topology, run it through the Mesh to MetaHuman tool, and get the final version.

My problem is how to turn those scanned blendshapes into internal blendshapes inside Unreal and use them in Sequencer. If you are wondering why I don't just use facial mocap from Live Link: I actually will, but the results it gives are far from recreating some of the actors' real expressions, at least in the few tests I did.

The blendshape meshes will have the exact same topology as the MetaHuman head I will be using inside Unreal, so I guess there should be a way to use them, maybe by adding them to the blendshapes list, but I couldn't find a clear solution for this. I have also been looking into external plugins like Mesh Morpher, but I still don't see how I could use them in Sequencer as another layer on top of my animations, or anything similar.

Any ideas or thoughts about this would be very appreciated. And for the Live Link mocap, if you have suggestions on how to improve it… I tried with a tripod and frontal lighting, but some of the expressions look awkward. I'm not sure whether adding trackers around the face would improve the result?

Thanks a lot!