I’d like to share a couple of tips & tricks related to the Mesh to MetaHuman workflow for all of you who’d like to experiment with the technology a little deeper than the simple Promote Frame -> Track Active Frame -> Identity Solve -> Mesh to MetaHuman wizard-like flow. Read below how to overcome MHC limitations (e.g., my avatar has a haircut that isn’t available in the Creator) and how to fix an Unreal bug related to ARKit Live Link.
First of all - don’t rely on MetaHuman Creator Sculpt/Move tools too much, as they’re tricky. I don’t use them at all because:
- their flexibility is minimal,
- MHC uses a very long focal length (I’d estimate the field of view is somewhere between 8 and 15 degrees) - far from what pictures taken with your phone usually have.
Therefore - unless you’re a character artist with a trained eye who can capture a likeness from pictures with a different FOV - you’ll get into trouble: even if you achieve fairly good results from the front view, your MH won’t look like you once you rotate it. That’s why it’s better to make all tweaks in advance, before you send the mesh to MHC.
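To get a feel for how extreme that viewport really is, here’s a quick conversion from horizontal FOV to a 35mm-equivalent focal length (the 8-15 degree range is my estimate from above; the 36mm frame width is the standard 35mm-format assumption):

```python
import math

def fov_to_focal_35mm(horizontal_fov_deg: float) -> float:
    """Convert a horizontal FOV to a 35mm-equivalent focal length (36mm-wide frame)."""
    return 36.0 / (2.0 * math.tan(math.radians(horizontal_fov_deg) / 2.0))

# MHC-like narrow FOV behaves like a long telephoto lens,
# while a phone's main camera is a moderate wide-angle.
print(round(fov_to_focal_35mm(10), 1))  # MHC-like view: ~200mm telephoto
print(round(fov_to_focal_35mm(70), 1))  # typical phone camera: ~26mm wide-angle
```

A roughly 200mm-equivalent lens flattens facial features dramatically compared to a phone shot, which is exactly why sculpting toward a phone photo inside MHC goes wrong.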
Here’s my full workflow:
1/ Scan your face (I assume most of us don’t have sophisticated 3D scanners and use Polycam or similar software instead)
2/ Clean up your mesh (I’ve noticed it gives slightly better results) - smooth out bumps and sculpt missing parts (e.g., I had to sculpt my ears from scratch because they were obscured by hair). You don’t have to care about mesh topology at this stage
3/ Bake your albedo/diffuse map onto the modified mesh (it’s important because auto-trackers leverage this texture, but don’t worry if it’s not very accurate - you’ll be able to tweak markers manually)
4/ Import your mesh into Unreal, add MetaHuman Identity asset, click + Components from Mesh (this will automatically add the Capture Data, Face and Body parts, and a Neutral Pose)
5/ Set viewport FOV to a small value (e.g., 5) and set the camera to view your face from the front. Promote Frame and Track Active Frame
Points 6-12 are ESSENTIAL:
6/ Auto-trackers are not good enough if you strive for likeness. You should leverage more curves, but remember they’re not available until you click “MetaHuman Identity Solve” for the first time. So click it once, then enable the additional curves (in the right panel).
7/ To tweak the curves, drag their control points. You can add control points (CTRL+LMB on the curve) or remove existing ones (CTRL+LMB on a control point). Their count doesn’t influence the final topology at all - use as many as you need to follow the original shape (but not too many, because you want a smooth, organic shape). You can multi-select control points with a SHIFT+LMB drag, but the drag has to go left-to-right and top-to-bottom; otherwise, the points aren’t selected.
8/ Spend a lot of time on the eyes, as they are very important for likeness (I usually spend 20-30 minutes here). Switch between the mesh/template view frequently and make sure all eye marks and eyelids (and their widths/depths) are accurate. The “Eye crease” curve is tricky, as the fold doesn’t follow it exactly, so you have to tweak it, click Identity Solve, compare the template with the original mesh, and repeat (tweak, Identity Solve, compare…). You can press L + LMB drag to rotate the light.
9/ Auto-trackers (at least for me) place the brows way too high. Use the Brow curves to fix this.
10/ You’re not limited to a single Frame. You can promote more frames. Just make sure you don’t use the same curves on different frames. This is especially useful if the auto-tracking doesn’t detect your ears accurately.
11/ Now the MAGIC console command which will make a huge difference: mh.Identity.ExportMeshes 1
This will export both your scanned AND the conformal mesh (the retopologized one, with MH topology) to your “Saved” directory the next time you click MetaHuman Identity Solve. You can then open both meshes in a DCC and they’re perfectly aligned (no more moving/rotating/rescaling). This is the perfect moment to bake an additional normal map. Because Polycam models have poor quality, it doesn’t make sense to bake the normal map from the scan directly. What I usually do here is:
a) copy my conformal model,
b) shrinkwrap the copy onto the scanned model - you’ll get better alignment while the surface stays smooth enough,
c) because some parts will get broken (especially the eyelids and mouth), limit the shrinkwrap by weight-painting the vertices,
d) bake a normal map from the copied, shrinkwrapped mesh onto the conformal one.
After this step, your model will look much better (this normal map won’t be visible in MHC, but we’ll use it later in Unreal).
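The weight-limited shrinkwrap in steps b) and c) can be pictured in plain Python. This is only a conceptual sketch, not Blender’s actual modifier: nearest-vertex snapping stands in for the real surface projection, and the weight plays the role of the vertex-group mask:

```python
import math

def limited_shrinkwrap(verts, target_verts, weights):
    """Move each vertex toward its nearest target vertex, scaled by a
    per-vertex weight (0 = keep original position, 1 = snap fully)."""
    out = []
    for v, w in zip(verts, weights):
        nearest = min(target_verts, key=lambda t: math.dist(v, t))
        out.append(tuple(a + w * (b - a) for a, b in zip(v, nearest)))
    return out

# Toy example: one vertex fully shrinkwrapped onto the scan,
# one locked in place (think of an eyelid vertex painted to weight 0).
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scan = [(0.0, 0.0, 0.5), (1.0, 0.0, 0.5)]
weights = [1.0, 0.0]
snapped = limited_shrinkwrap(verts, scan, weights)
print(snapped)
```

In Blender the same effect comes from a Shrinkwrap modifier with its Vertex Group field set to your painted mask, so the eyelids and mouth keep the conformal topology’s clean shape.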
12/ When you’re ready, click Mesh to MetaHuman, open the Creator, and set the skin, eyes, brows, hair, etc. Don’t worry if you can’t find a perfect haircut - select the one that resembles yours as much as possible. Don’t try to sculpt/move your custom mesh even if it seems misplaced - remember the focal length is really long and the perspective is different.
13/ Download your MetaHuman, but DON’T modify it in a DCC. Some tutorials show you how to tweak the blendshapes or resculpt the mesh, but don’t do it. It will get out of sync with the rigging system and facial expressions will look awkward. This is why it’s crucial to make all the mesh modifications at the beginning of the process.
14/ You can tweak your haircut by modifying the appropriate Groom asset (as long as you use grooms). It provides two groups: Group ID 0 refers to the sideburns, so if you want to get rid of them, simply set their Hair Width to 0. Group ID 1 refers to the rest of your hair. You can shorten it, change its scale at the root and the tip (to imitate density), and also update the physics if you need it. Do the same with the brows. BTW - the brows can be rendered using Cards (in that case you can change the cards mesh to modify the shape and density of your brows).
Even if your avatar looks good now, it will probably look strange as soon as you start animating it with ARKit (LiveLink), because the facial expressions will not resemble your own. There are a couple of steps to be taken to fix this:
1/ Things like the teeth/jaw position can be modified with the appropriate curves in the Anim Blueprint - open your MH Blueprint, select Face, find out which Anim Class is used, and edit it. Open the AnimGraph, add a Modify Curve node with the appropriate curve pins, and set their values.
2/ You can make further adjustments to how your avatar reacts to your ARKit expressions. ARKit provides head and eye rotations plus 52 blendshape weights. The mapping is done in mh_arkit_mapping_pose; for each of those 52 weights there’s a specific animation frame (in mh_arkit_mapping_anim). If, for instance, you don’t want your nasolabial fat pad to be that apparent, you can edit the appropriate frame. Keep in mind, though - Unreal has a bug: the provided mh_arkit_mapping_pose is missing one Pose Name (MouthClose), so if you change the animation asset and click “Update Poses from Source Animation” in mh_arkit_mapping_pose, it’ll break. To fix this, you have to add the MouthClose pose name right after JawOpen (and before MouthFunnel). As soon as you do this, you can use “Update Poses from Source Animation” again.
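The fix above is just a list insertion, which can be pictured like this (the surrounding names come from Apple’s ARKit blendshape set; I’m showing only a fragment of the 52-entry list):

```python
# A fragment of the pose-name list as the broken asset exposes it:
# MouthClose is missing between JawOpen and MouthFunnel.
pose_names = ["JawForward", "JawLeft", "JawRight", "JawOpen",
              "MouthFunnel", "MouthPucker"]

# Insert MouthClose right after JawOpen, as described above.
pose_names.insert(pose_names.index("JawOpen") + 1, "MouthClose")
print(pose_names)
```

In the editor you do the same thing by hand: add a Pose Name entry in mh_arkit_mapping_pose at that exact position, then “Update Poses from Source Animation” works again.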
3/ You should export at least 8 textures: FaceColor_MAIN, FaceColor_CM1, FaceColor_CM2, FaceColor_CM3, FaceNormal_MAIN, FaceNormal_WM1, FaceNormal_WM2, FaceNormal_WM3. These are the textures blended in for specific facial expressions (normal maps for skin folds and color maps for blood flow - e.g., parts of your face become reddish when you frown). I’ll write another post on how to use these textures and describe my whole workflow there (especially the way I blend my photogrammetry face texture and the baked normal maps with the MH).
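The idea behind those animated maps can be sketched as a per-pixel lerp. This is a conceptual illustration only - the real MetaHuman face material blends via mask textures and rig-driven weights, not a plain loop like this:

```python
def blend_animated_map(main_px, expr_pxs, weights):
    """Blend the MAIN texture pixel toward each expression map pixel
    (CM1-CM3 for color, WM1-WM3 for normals) by its animation weight."""
    out = list(main_px)
    for expr, w in zip(expr_pxs, weights):
        out = [o + w * (e - o) for o, e in zip(out, expr)]
    return out

# One RGB pixel: a neutral skin tone pushed halfway toward a reddish
# CM-style map, as when an expression weight reaches 0.5.
main = (0.8, 0.6, 0.5)
cm1 = (0.9, 0.4, 0.4)
blended = blend_animated_map(main, [cm1], [0.5])
print(blended)
```

As the expression weight animates from 0 to 1, the pixel slides from the neutral MAIN value toward the expression map, which is what produces the reddening and the wrinkle normals on a frown.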