MetaHuman - PROPERLY attaching a head to a custom rig body (no sockets)

Hello,
So, like a lot of people, I've been trying to find a decent way to attach the MetaHuman head to a custom rig body while keeping the functionality of the face. The most common solution I've seen online involves attaching the “Meta_FaceMesh” skeletal mesh asset to the custom body using the socket of either the head or neck joint, and then hiding the shoulders with an opacity mask or custom material.

This works if your custom body already has a neck, or if you don’t mind that the neck does not deform.
My body does not have a neck, and the MetaHuman head actually has some pretty nice neck and shoulder deformation that I'd like to keep if I can.

The MetaHuman body rig drives the head through an animation blueprint called “Face_AnimBP”. It seems logical to me that there should be some way to modify that animBP (or a new version of it) to reference a custom body instead of the MetaHuman body and drive the “Meta_FaceMesh” skeletal mesh asset in a similar way.

Face_AnimBP

I’m still learning my way around blueprints and visual scripting in Unreal, so if anyone can think of a relatively simple way to do this or can point me to a good resource it would be very much appreciated.

*Also, is there a good resource explaining how the MetaHuman blueprints are set up in the first place? Obviously you can go in and look at them, but being as new to visual scripting as I am, it can be difficult to figure out what's referencing what.


So, I've been diving into this and learning a lot along the way. I'll post what worked for me, both to help others who might be looking for a similar solution and for my own future reference.

Method #1

So my assumption was correct. How Face_AnimBP gets its animation to sync with the body is actually very simple. It's just a single node in the AnimGraph called Copy Pose From Mesh that produces the BodyPose and applies it to the head skeletal mesh.

The documentation for that node is here: Copy a Pose from another Skeletal Mesh | Unreal Engine 4.27 Documentation
I'm no programmer, but from testing, the node seems to essentially look for bone names in the Skeletal Mesh that match bones in the Source Mesh Component and apply animation to the ones that match. The Source Mesh Component can be defined through the pin on the node, or, if nothing is hooked up, it will look at the parent component of the skeletal mesh.

So, if you have a body with the same skeleton/joint structure as the MetaHuman (the UE4/UE5 mannequins share this base structure too, I believe), simply making the Meta_FaceMesh component a child of that body in the Character Blueprint will let it inherit the same animations without editing Face_AnimBP at all. Easy. Fantastic.
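Based on my testing, the name-matching behavior can be sketched like this (plain C++, not the engine's actual implementation; the `Transform` struct and bone names are stand-ins for illustration):

```cpp
#include <map>
#include <string>

// Stand-in for a bone's local transform (the engine uses FTransform).
struct Transform {
    double tx = 0, ty = 0, tz = 0;   // translation
    double rx = 0, ry = 0, rz = 0;   // rotation (Euler, degrees)
};

using Pose = std::map<std::string, Transform>;  // bone name -> transform

// Sketch of what Copy Pose From Mesh appears to do: for every bone in
// the target skeleton, if the source skeleton has a bone with the same
// name, copy that bone's transform; bones with no match keep their
// existing (reference) pose.
Pose CopyPoseByBoneName(const Pose& source, const Pose& targetRefPose) {
    Pose result = targetRefPose;
    for (auto& [boneName, transform] : result) {
        auto it = source.find(boneName);
        if (it != source.end()) {
            transform = it->second;  // names match -> copy animation
        }
    }
    return result;
}
```

This is why parenting the Meta_FaceMesh under a body with an identical joint structure "just works": every spine/neck/head bone name finds a match, and bones unique to the face are simply left alone.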

Unfortunately, my custom skeleton was not based on the MetaHuman/UE4/UE5 skeleton so this method did not work for me.

Method #2

In a case where the skeleton for the body and the MetaHead are different, there is another node called Retarget Pose From Mesh.
RetargetNode
This node can be plugged into the BodyPose similarly to Copy Pose From Mesh and will reference an IK Retargeter asset.
There are tons of tutorials online explaining how to retarget animation using Unreal's IK Retargeter, so I won't get into that here. But basically: make an IK Retargeter asset that retargets the animation from the custom body skeletal mesh to the Meta_FaceMesh skeletal mesh, then reference that asset in the Retarget Pose From Mesh details panel.

Unfortunately, for some reason I couldn't get the retargeted animation to match closely enough to really look like it was connected. I'm not sure if that was something I was doing wrong in the process, or if the retarget tool just isn't THAT accurate (when retargeting animation in other software, I typically lose some fidelity as well).
If anyone has any insight into why that might have been I’m all ears.

Which brings us to…

Method #3

This method, though more work and probably much messier, ended up being closer to what I originally assumed the solution would be. Full disclosure though… I did still end up using sockets. :)

My methodology was this:
If I had this problem in a typical 3D package (Maya/3DSMax/Blender) I would use joint constraints. Meaning, the rotation/translation of bones that I specify from the body skeleton will directly drive the rotation/translation of bones I specify in the Meta_FaceMesh skeletal asset.

So I figured that to get this working in Unreal, I needed two things:

  1. A way to evaluate the rotation/translation values from the driver bones (the body).
  2. A way to bring those rotation/translation values into the Face_AnimBP so they can be applied to the driven bones (Meta_FaceMesh).

To solve the first problem, I went into the Skeletal Mesh for the body and created sockets for the neck and head joints.
Sockets
This allows me to access the rotation/translation of those joints in the CharacterBP.
I then created a function to get those sockets' values and output them. I called this function FindPilotGirlHeadLocation.

Now to solve the second problem. I went to the Face_AnimBP and created variables corresponding to the sockets I'd made: two variables, one for the head and one for the neck. I then created a simple function in the Face_AnimBP to set those variables. I called this function SetHeadROTRecieve.

Now, back in the CharacterBP, I call that function. In the CharacterBP event graph, set up a cast to the Face_AnimBP, grab that SetHeadROTRecieve function, and plug in the outputs from the FindPilotGirlHeadLocation function.
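For anyone more comfortable reading code than Blueprint graphs, the wiring above amounts to this (C++-flavored pseudocode only, not working engine code; `UFaceAnimInstance`, the socket names, and the member variables are hypothetical stand-ins for my Blueprint setup):

```cpp
// Pseudocode sketch of the CharacterBP event graph wiring.
// Runs whenever the UpdateHeadRotations custom event fires.
void ACharacterBP::UpdateHeadRotations()
{
    // 1. Read the driver transforms from the body's sockets
    //    (what FindPilotGirlHeadLocation does in Blueprint).
    FTransform HeadSocket = BodyMesh->GetSocketTransform("head_socket");
    FTransform NeckSocket = BodyMesh->GetSocketTransform("neck_socket");

    // 2. Cast the face mesh's anim instance to the Face_AnimBP type...
    auto* FaceAnim = Cast<UFaceAnimInstance>(FaceMesh->GetAnimInstance());
    if (FaceAnim)
    {
        // 3. ...and push the socket values into its variables
        //    (what SetHeadROTRecieve does in Blueprint).
        FaceAnim->SetHeadROTRecieve(HeadSocket.GetRotation(),
                                    NeckSocket.GetRotation());
    }
}
```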


I also set up a custom event, called UpdateHeadRotations, so I could trigger all of this.

Now that the neck and head joint values are hooked up to the variables in the Face_AnimBP, all we need to do is hook that up to the bones we want to drive in the Meta_FaceMesh.
So, going back to the Face_AnimBP AnimGraph, I hooked the variable for the head up to a Transform (Modify) Bone node.
I hooked the variable for the neck up to a Spline IK node (my body has only one neck joint and the MetaHead has two, so I had to use this node to spread the rotation values over both joints).
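The "spread one rotation over two joints" idea can be illustrated with some simple single-axis math (a deliberately simplified sketch; Spline IK does far more than this, and the angles here are made up): give each of the two neck bones an equal share of the source rotation, so the chain's total rotation at the head still matches the original.

```cpp
// A rotation about one fixed axis, in degrees. Real joint rotations are
// full 3D, but the distribution idea is the same.
struct AxisRotation {
    double angleDeg;
};

// Split one source-neck rotation evenly across N target joints.
// Rotations about the same axis compose by adding their angles, so N
// copies of (angle / N) reproduce the original rotation at the head.
AxisRotation SplitAcrossJoints(const AxisRotation& source, int jointCount) {
    return AxisRotation{ source.angleDeg / jointCount };
}
```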


As you can see, I had to use a lot of utility nodes to add values to the rotations to get them into the right orientation. The joint orientation for my body was not the same as the MetaHuman's (my rig uses Y-down bones, MetaHuman uses X-down bones), so I had to solve the problem in the Blueprint, which was a huge pain and very messy. But what are you gonna do…
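For the record, the Y-down vs. X-down mismatch is a change-of-basis problem. Here's a self-contained sketch of the math those utility nodes are effectively doing (plain C++; the matrices in any real rig depend on your actual joint orients): if C is the fixed rotation taking the source bone frame to the target bone frame, each driver rotation R can be re-expressed in the target's frame as C·R·Cᵀ.

```cpp
#include <array>

using Mat3 = std::array<std::array<double, 3>, 3>;

Mat3 Multiply(const Mat3& a, const Mat3& b) {
    Mat3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                r[i][j] += a[i][k] * b[k][j];
    return r;
}

Mat3 Transpose(const Mat3& a) {
    Mat3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            r[i][j] = a[j][i];
    return r;
}

// Re-express a rotation R given in the source bone frame (e.g. Y-down)
// in the target bone frame (e.g. X-down). C is the fixed rotation that
// maps source bone axes onto target bone axes; for a pure rotation,
// its inverse is its transpose.
Mat3 ConvertBoneRotation(const Mat3& C, const Mat3& R) {
    return Multiply(Multiply(C, R), Transpose(C));
}
```

For example, with C chosen to map the Y axis onto the X axis, a 90° rotation about Y in the source frame becomes a 90° rotation about X in the target frame.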

Next, I went back into the CharacterBP and attached the Meta_FaceMesh to the body component's top spine joint using the Parent Socket field. This ensures the Meta_FaceMesh component moves with the chest/shoulders of the body while the head and neck are driven by the Blueprint. Now everything is hooked up and working.

The last thing I had to figure out was a way to trigger that UpdateHeadRotations custom event every frame, so the head rotation would be applied when I scrubbed animation. Since I'm setting this character up for cinematic use and not a game, I couldn't use the Tick event.

I found this video explaining a super handy way to update blueprints from the Sequencer: https://www.youtube.com/watch?v=qt3x64Grlq0&list=LL&index=1&ab_channel=MattLakeTA

Aaaand that solved all my problems. Kind of long-winded and complicated, I know. If anyone sees this and thinks “why didn't he just do it this other way,” I'd love to hear alternatives.

-Cheers

