I am completely new to Unreal Engine, but the MetaHuman project caught my eye for the master's thesis I am currently working on. My thesis generates facial animation, most likely as an animated 3D point cloud. To see these results, I would like the 3D points to animate a face. MetaHuman faces are very realistic, and my generated animation should be realistic as well, so I want to use MetaHuman to visualize these points.
However, how can I animate the MetaHuman face with an FBX file that contains animated 3D points?
These 3D points are extracted with Dynamixyz. I know Dynamixyz and Unreal have worked together, and that you can map the recordings live onto a 3D model in Unreal Engine. In my case, however, there will be a neural network in between that generates the animation from the extracted 3D points, so the connection cannot be live.
If my main question is not possible, can someone refer me to a tutorial where the Dynamixyz-to-Unreal connection isn't live? And is it possible to export the blendshape values from MetaHuman?
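To make the second question more concrete: one way I imagine bridging the point cloud and the MetaHuman rig is to solve for blendshape weights that best reproduce each point-cloud frame. This is only a sketch of that idea with NumPy and made-up sizes, not MetaHuman's actual API; the neutral mesh, delta shapes, and weights below are all synthetic placeholders.

```python
import numpy as np

# Hypothetical sketch: given a neutral face mesh and per-blendshape vertex
# deltas, solve for the blendshape weights that best reproduce one animated
# point-cloud frame in a least-squares sense. All sizes are invented.
rng = np.random.default_rng(0)

n_points = 100        # vertices in a (downsampled) face point cloud
n_blendshapes = 5     # number of blendshape targets

neutral = rng.normal(size=(n_points, 3))                # rest pose
deltas = rng.normal(size=(n_blendshapes, n_points, 3))  # per-shape offsets

# Synthesize a "captured" frame from known ground-truth weights.
true_weights = np.array([0.2, 0.0, 0.7, 0.1, 0.4])
frame = neutral + np.tensordot(true_weights, deltas, axes=1)

# Stack the deltas into a (3 * n_points, n_blendshapes) basis matrix and
# solve basis @ w ~= (frame - neutral) for the weight vector w.
basis = deltas.reshape(n_blendshapes, -1).T
target = (frame - neutral).ravel()
weights, *_ = np.linalg.lstsq(basis, target, rcond=None)

print(np.round(weights, 3))  # recovers the ground-truth weights
```

If something like this is viable, the exported MetaHuman blendshapes would take the place of the random `deltas`, and the solved weights could then be keyed onto the face rig per frame.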