MetaHuman - Blender

I haven't examined it myself, but are they identical in name, or do they just seem identical in function? If the latter, it may be an unfinished naming convention the developers have in progress for later UI implementations. But the blendshapes seem different from what most of us are probably used to: they appear to control wrinkle maps rather than the actual movement itself. The blendshapes just augment the bone movement (i.e. a drop/open-jaw blendshape makes the jaw's wrinkles appear, but the jaw only actually opens when the jaw bone is rotated, whereas traditionally we would expect the "jaw open" blendshape to actually show the jaw opening).
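If you want to check this for yourself in Blender's Python console once the FBX is imported, something like the following should show the difference. The object and bone names are placeholders/assumptions, so adjust them to whatever your export contains:

```python
import bpy

head = bpy.data.objects['head_lod0_mesh']   # placeholder: your imported head mesh
rig = bpy.data.objects['root']              # placeholder: the imported armature

# Driving the shape key alone barely moves the mesh -- it mainly feeds the wrinkle maps
head.data.shape_keys.key_blocks['jawOpen'].value = 1.0

# The actual opening comes from rotating the jaw bone
jaw = rig.pose.bones['FACIAL_C_Jaw']        # bone name may differ in your export
jaw.rotation_mode = 'XYZ'
jaw.rotation_euler.x = 0.35                 # radians, just to make the difference visible
```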

These blendshapes are different in name but seem identical in function. I opened the FBX file in Maya, as shown below. You can see that I have hidden all other blendshapes and shown only the two blendshapes jawOpen and jawLeft; however, these two blendshapes are almost the same. I do not know if it is because I changed the number of LODs to 1 when I exported the FBX file (following the solution of ****), but the blendshapes are really different from traditional blendshapes. I have used Live Link Face on an iPhone to test the facial animation, which seems to work fine, so I am quite confused about the blendshape mesh, which seems really different. Can anyone give an explanation?
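For anyone who wants to compare two targets numerically rather than by eye, something like this in Maya's Script Editor should do it. It's an untested sketch: the mesh name and vertex index are placeholders.

```python
import maya.cmds as cmds

bs = cmds.ls(type='blendShape')[0]                    # first blendShape node found in the scene
targets = cmds.listAttr(bs + '.w', multi=True)        # all target weight attributes
for t in targets:                                     # zero every target first
    cmds.setAttr('{}.{}'.format(bs, t), 0)

def sample(target, vtx='head_lod0_mesh.vtx[100]'):    # placeholder mesh and vertex
    """World-space position of one vertex with a single target at full weight."""
    cmds.setAttr('{}.{}'.format(bs, target), 1)
    pos = cmds.xform(vtx, query=True, worldSpace=True, translation=True)
    cmds.setAttr('{}.{}'.format(bs, target), 0)
    return pos

print(sample('jawOpen'))
print(sample('jawLeft'))   # near-identical output means near-identical targets
```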

Lituanhui, hm not sure, as I don’t have Maya. You might have to wait until Epic releases the Metahuman Creator, since a Maya file will be included when you download each human. To quote from the Unreal blog: “When you’re happy with your human, you can download the asset via Quixel Bridge, fully rigged and ready for animation and motion capture in Unreal Engine, and complete with LODs. You’ll also get the source data in the form of a Maya file, including meshes, skeleton, facial rig, animation controls, and materials.”

By the way, I suspect the blendshapes you're seeing are correct in Maya. If you're wondering why Live Link works in UE4, that's because the animation is using the control rig assigned to the head, which manipulates all the bones (it's mapping the ARKit values to the bones, and using the blendshapes for fine-tuning afterwards). You can see it yourself if you try editing some of the curve values in the AnimBP… the traditionally named ones like mouth open, etc. hardly do anything. You have to adjust the control-rig-assigned curve values, which start with CTRL in front. If you check out the control rig you can see how it's handling all this under the hood.


As you look closer you can see what the shape keys do. See if you can spot any more?

There are some masks that go along with the bones and blend/morph/key shapes. In the PC version you can see how the normal maps are used with these masks: when you move the forehead bones, the correct key shape is bound to the bone movement so that the forehead wrinkles, and the RGB mask is used to reveal the part of the normal map that is relevant to the animation, giving a better visual of what is going on by showing the shadowing caused by the wrinkling.
There are three extra textures and three extra normal maps that get masked this way, on top of the key shapes, which only give a very subtle effect and help the bone movements look more realistic.
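If you want to dump the full list of key shapes to go hunting for more, a quick Blender console snippet (with the head mesh selected) might look like this:

```python
import bpy

obj = bpy.context.active_object                 # the selected head mesh
if obj.data.shape_keys:
    for kb in obj.data.shape_keys.key_blocks:
        # name, current value and allowed range of each shape key
        print(kb.name, kb.value, kb.slider_min, kb.slider_max)
```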

Yes, there are a ton of bones in the face, but you can hide each bone in Pose Mode so you can see what you are working with. The bones are in groups, and the root of each group will move every bone in that group. Reference the UE4 face rig to see which ones do the animating and which ones are just used to form the shape of the character.
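In Blender 3.x you can also do the hiding per bone group from the Python console, which makes it easier to see what each group drives (in 4.0 bone groups were replaced by bone collections, so this would need adapting; the group name is just a guess):

```python
import bpy

rig = bpy.context.active_object        # the face armature, in Pose Mode
keep = 'jaw'                           # hypothetical: substring of the group you want to keep visible

for pbone in rig.pose.bones:
    group = pbone.bone_group.name if pbone.bone_group else ''
    # hide every bone whose group doesn't match, so only one group stays visible
    pbone.bone.hide = keep not in group.lower()
```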

Remember, this is not a finished product. There will be a set of creation sliders that will probably use the extra bones to puff out the lips, change the type of nose, etc.

It's not easy to reverse engineer something… when it's not finished it's even harder, as you don't know what you're supposed to be making. Good luck!

Have you guys been able to export hair to Blender? I'm aware that the hair is present exclusively in LODs 0 and 1; however, after exporting from UE and importing into Blender, the hair is not there (only the polygons for the eyelashes are present).

I managed to export the MetaHuman head into Blender, then back into UE and get the rig working again. Here's the whole process:

  1. Export the MetaHuman head from UE with the following options:
  2. Download the FBX Converter: https://images.autodesk.com/adsk/files/fbx20132_converter_win_x64.exe
  3. Open the converter and convert the exported FBX file:
  4. Import the converted FBX into Blender with the following options:
  5. Sculpt the model in Blender as you like.
  6. Click on "root" in the Outliner, press Alt + P, then choose "Clear and Keep Transformation". Delete the empty.
  7. Select all objects in the scene and export the model from Blender in FBX format with the following options (steps 6 and 7 can also be scripted; see the sketch after this list):
  8. Import the FBX into UE with the following options:
  9. Open the imported mesh in UE alongside the original MetaHuman head.
  10. Copy all settings from the original head to the imported one.
  11. Asset User Data is the most important setting. You need to copy it from the original head and paste it into the imported one. I tried to manually add the same options and it doesn't work, so just copy and paste.
  12. Everything should work.
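Since steps 6 and 7 are easy to get wrong by hand, here is roughly what they look like scripted in Blender. Treat it as an untested sketch: the export options are my guesses, not the exact ones from the screenshots, and object names may differ in your export.

```python
import bpy

# Step 6: clear the parent of every mesh but keep its transform, then delete the empty
for obj in list(bpy.data.objects):
    if obj.type == 'MESH':
        bpy.ops.object.select_all(action='DESELECT')
        obj.select_set(True)
        bpy.context.view_layer.objects.active = obj
        bpy.ops.object.parent_clear(type='CLEAR_KEEP_TRANSFORM')

root = bpy.data.objects.get('root')              # the leftover empty; name may differ
if root and root.type == 'EMPTY':
    bpy.data.objects.remove(root)

# Step 7: select everything and export as FBX
bpy.ops.object.select_all(action='SELECT')
bpy.ops.export_scene.fbx(
    filepath='//metahuman_head_edit.fbx',        # placeholder path, relative to the .blend file
    use_selection=True,
    add_leaf_bones=False,                        # avoid extra "_end" bones UE doesn't expect
    mesh_smooth_type='FACE',
    bake_anim=False,
)
```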

Thanks a lot for this step-by-step guide! It really helped in defining a MetaHuman pipeline for Blender.

Unfortunately I am having an issue when reimporting the head mesh and I do not understand why:
the vertex count does not match that of the original mesh, even though I did everything correctly and only reshaped the existing vertices without adding or removing any.
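A quick way to double-check the count from Blender's Python console before exporting (with the head mesh as the active object) would be:

```python
import bpy

head = bpy.context.active_object
# Compare this number against the vertex count of the originally exported head mesh
print(head.name, len(head.data.vertices))
```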

Morphs are intact and everything looks like it's working as intended until I try to use the modified head mesh in the MetaHuman sample scene. The face does not react to the facial animation sequence, and I fear that this could be related to the vertex count mismatch.

Were you able to use the modified head mesh for motion capture or in animated sequences?

It can't be the vertex count. Blender is pretty good at creating new geometry in a mesh with shape keys.

Here is my test with a sculpted model and a sphere added to the mesh:

As you can see, the sequence works with the rig, and the mesh doesn't explode or do anything weird.

Have you tried the modified head with the MetaHuman sample scene sequence, though?
Any idea why the modified head would not behave and speak like the default one during that sequence?

If you do not have errors when importing into UE and all the morphs work on the imported mesh, then in my opinion the only thing it can be is that not all settings are set correctly.

Check the Post Process Anim Blueprint and the Animating Rig on the imported mesh. It should look like this:

And you need to Copy and Paste them from the original head. Manual selection of Asset User Data will not make the rig work.
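If you would rather script the copy than do it by hand, something along these lines in the UE Python console might work. It is an untested sketch: the asset paths are placeholders, the property names are my assumptions about how they appear to Python (check with help(unreal.SkeletalMesh)), and I cannot promise it transfers the DNA data the same way the editor's Copy/Paste does:

```python
import unreal

# Placeholder asset paths -- point these at the original and the re-imported head meshes
original = unreal.load_asset('/Game/MetaHumans/MyHuman/Face/head_original')
imported = unreal.load_asset('/Game/Imported/head_edited')

# Assumed property names for the Post Process Anim Blueprint and Asset User Data
for prop in ('post_process_anim_blueprint', 'asset_user_data'):
    imported.set_editor_property(prop, original.get_editor_property(prop))

unreal.EditorAssetLibrary.save_loaded_asset(imported)
```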

And I haven't tried the MetaHuman sample scene sequence. When you create a new level sequence, does everything work there?

Thanks Lorenzo_Drake, your pipeline worked fine for me. As a note, for the control rig to work back in UE I had to download “with Source Assets” and then import the .dna asset into the User Data, as I read somewhere else. It would be good if it weren’t necessary to do so, but just copying didn’t solve it.

I think this link will help you

Thanks for this tutorial! It worked for me. The only problem I have is that even though I created new bindings for the eyelashes with the sculpted mesh, they don't follow the eyes. Any idea how to solve this?

Have you tried setting the ‘Source’ asset in addition to the Target skeletal mesh in the Create new binding pop-up window? I believe you should set the original mesh in the Source slot for the eyebrows, eyelashes and peach fuzz, and also for hair if you are using a different head but the same original groom.

Hello there,

I see there is a workaround to get the facial mesh data exported to Blender and reimported into Unreal. I do not see a result, though, for the hair (or any mesh skinned to the face). Has anyone done a face mesh morph like that with an actual result on the whole character in Unreal?

I would like to know what is needed to morph a face without breaking anything about animations and attached skinned meshes (hair that has to follow the head morph).

Thank you

Hi there,
the guide above was really helpful, thanks a lot! But I also had the issue that the hair did not follow the mesh anymore. After searching for way too long, I found a solution in another forum post.

Create new bindings for all groom assets as mentioned above, and in the Project Settings set "Default Skin Cache Behavior" to "Inclusive". That fixed it for me.
Unreal will prompt you to change the setting back every time you open the project, but as long as it works and I don't get any issues, I will just ignore that…
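If you would rather set it in config than through the Project Settings UI, I believe the setting maps to this console variable in DefaultEngine.ini, though it is worth double-checking in your engine version:

```ini
[/Script/Engine.RendererSettings]
; "Default Skin Cache Behavior": 0 = Exclusive, 1 = Inclusive
r.SkinCache.DefaultBehavior=1
```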

Hope this will help some people to get this working properly.

Hello Lorenzo_Drake, thanks for your post. I followed it and it gave me a lot of good ideas. Now I have run into another, similar problem: I have a custom skeletal mesh, and it has a different skeleton and different morph targets from the MetaHuman's. I imported this skeletal mesh into UE5, created its Control Rig and Post Process Anim Blueprint, and set them up on my mesh as you mentioned. I want to use the Control Rig to drive it like in your last picture, but my skeletal mesh has no DNA Asset. Do you know how I can set the Asset User Data and control my custom skeletal mesh?