Making a Static Mesh from a MetaHuman results in disappearing vertices and bad indexing

Hi, I’m currently working with the MetaHuman assets.
In the node “mh_arkit_mapping_pose” (screenshot below), I can drive the MetaHuman mesh by modifying the curve weights, e.g. setting the weight of EyeBlinkLeft to 1 makes the mesh blink the left eye.
As you can see, the stats readout says the mesh has 64904 triangles and 35184 vertices.

Say I’d like to capture this eye-blinking mesh, so I click the “MAKE STATIC MESH” button. I create a lot of them, one for each ARKit property.

STRANGELY, all the static meshes I created have fewer than 35184 vertices:


What’s more, when I export these meshes as .OBJ files, I find that the INDEXING is bad, i.e. the vertex indices are shuffled, so the “f” (face) lines are different in every .OBJ.
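For anyone who wants to reproduce the check, here is a minimal Python sketch that counts the `v` lines and hashes the `f` lines of each exported OBJ (the file names in the comment are placeholders, not my actual exports). If the meshes shared topology, every file would report the same vertex count and the same face hash:

```python
def obj_stats(path):
    """Count 'v' (vertex) lines and hash the tuple of 'f' (face) lines."""
    n_verts = 0
    faces = []
    with open(path) as fh:
        for line in fh:
            if line.startswith("v "):
                n_verts += 1
            elif line.startswith("f "):
                faces.append(line.strip())
    return n_verts, hash(tuple(faces))

# Placeholder file names -- substitute your own exports:
# for p in ["EyeBlinkLeft.obj", "JawOpen.obj"]:
#     print(p, obj_stats(p))
```

In my case every file reports a different vertex count and a different face hash, which is exactly the problem.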

I’ve been stuck here for a while. WHY does making a static mesh result in lost vertices and shuffled indexing? Can this be solved somehow?

Thanks in advance for any suggestion!

Waiting for a hero…

Hello, thanks for sharing. I ran into a question similar to yours: do you know how to export an animation sequence to a static mesh per frame in UE5?

Hi,
As you can see from my issue, exporting a static mesh for each frame leads to an uncontrollable problem: the mesh points aren’t semantically aligned between frames.

Hence, I’d recommend first exporting the morphable mesh (something you can control and drive in other 3D software such as Maya or 3ds Max).

Then export the animation sequence. Using the sequence data, you can drive that mesh in Maya, and from there obtain the mesh points of each frame more stably and relatively easily.
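To make the reasoning concrete: a morph-target (blendshape) mesh keeps one fixed vertex list, and each frame is just the base positions plus weighted per-target deltas, so vertex i in one frame always corresponds to vertex i in the next. A minimal numpy sketch of that math (the arrays here are made-up stand-ins for the exported mesh and curve data):

```python
import numpy as np

def apply_blendshapes(base, deltas, weights):
    """base: (V, 3) rest positions; deltas: (K, V, 3) per-target offsets;
    weights: (K,) curve values for one frame.
    The vertex order never changes, so frame-to-frame
    correspondence is preserved automatically."""
    return base + np.tensordot(weights, deltas, axes=1)

# One deformed mesh per frame of the exported animation curves:
# frames = [apply_blendshapes(base, deltas, w) for w in weight_track]
```

This is why driving the morphable mesh per frame gives aligned points, while baking a separate static mesh per frame does not.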