Which brings us to…
Method 3
This method, though more work and probably much messier, ended up being closer to what I originally assumed the solution would be. Full disclosure, though… I did still end up using sockets.
My methodology was this:
If I had this problem in a typical 3D package (Maya/3DSMax/Blender) I would use joint constraints. Meaning, the rotation/translation of bones that I specify from the body skeleton will directly drive the rotation/translation of bones I specify in the Meta_FaceMesh skeletal asset.
So I figured that to get this working in Unreal, I needed two things:
- A way to evaluate the rotation/translation values from the driver bones (the body).
- A way to bring those rotation/translation values into the Face_AnimBP so they can be applied to the driven bones (Meta_FaceMesh).
To solve the first problem, I went into the Skeletal Mesh for the body and created sockets for the neck and head joints.
This allows me to access the rotation/translation of those joints in the CharacterBP.
I then created a function to get those socket values and output them. I called this function FindPilotGirlHeadLocation.
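For anyone who prefers to see it as code, here's a rough C++ sketch of what that function does. The AMyCharacter class, the BodyMesh component, and the socket names are placeholders for my setup, not anything that ships with MetaHuman:

```cpp
// Hypothetical C++ equivalent of the FindPilotGirlHeadLocation Blueprint function.
// BodyMesh is the body's USkeletalMeshComponent; "head_socket" and "neck_socket"
// are the sockets created on the head and neck joints (names are placeholders).
void AMyCharacter::FindPilotGirlHeadLocation(FTransform& OutHead, FTransform& OutNeck) const
{
    // GetSocketTransform reads the socket's current world-space transform,
    // which follows the animated joint the socket is attached to.
    OutHead = BodyMesh->GetSocketTransform(TEXT("head_socket"), RTS_World);
    OutNeck = BodyMesh->GetSocketTransform(TEXT("neck_socket"), RTS_World);
}
```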
Now to solve the second problem: I went to the Face_AnimBP and created variables corresponding to the sockets I had created, so two variables, one for the head and one for the neck. I then created a simple function in the Face_AnimBP to set those variables. I called this function SetHeadROTRecieve.
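If you were doing this part in C++ instead of Blueprint, the Face_AnimBP side would look roughly like the sketch below. The class and property names are placeholders; SetHeadROTRecieve is the same function described above:

```cpp
#include "Animation/AnimInstance.h"
#include "FaceAnimInstance.generated.h"

// Hypothetical C++ stand-in for the Face_AnimBP: two transform variables that
// the AnimGraph reads, plus a setter the CharacterBP can call.
UCLASS()
class UFaceAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Driven by the body's head and neck sockets every update.
    UPROPERTY(BlueprintReadWrite, Category = "HeadTracking")
    FTransform HeadTransform;

    UPROPERTY(BlueprintReadWrite, Category = "HeadTracking")
    FTransform NeckTransform;

    // Equivalent of the SetHeadROTRecieve Blueprint function.
    UFUNCTION(BlueprintCallable, Category = "HeadTracking")
    void SetHeadROTRecieve(const FTransform& InHead, const FTransform& InNeck)
    {
        HeadTransform = InHead;
        NeckTransform = InNeck;
    }
};
```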
Now, back in the CharacterBP, it's time to call that function. In the CharacterBP event graph, I set up a cast to the Face_AnimBP, grab that SetHeadROTRecieve function, and plug in the outputs from the FindPilotGirlHeadLocation function.
I also set up a custom event called UpdateHeadRotations so I could trigger all of this.
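In C++ terms, that custom event boils down to something like this (MetaFaceMesh and UFaceAnimInstance are the placeholder names from the sketches above):

```cpp
// Hypothetical C++ version of the UpdateHeadRotations custom event:
// read the socket transforms from the body and push them into the face AnimBP.
void AMyCharacter::UpdateHeadRotations()
{
    FTransform Head, Neck;
    FindPilotGirlHeadLocation(Head, Neck);

    // Same idea as the Blueprint cast to Face_AnimBP before calling the setter.
    if (UFaceAnimInstance* FaceAnim = Cast<UFaceAnimInstance>(MetaFaceMesh->GetAnimInstance()))
    {
        FaceAnim->SetHeadROTRecieve(Head, Neck);
    }
}
```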
Now that the neck and head joint values are hooked up to the variables in the Face_AnimBP, all we need to do is hook those variables up to the bones we want to drive in the Meta_FaceMesh.
So going back to the Face_AnimBP AnimGraph, I hooked the variable for the head up to a Transform (Modify) Bone node.
I hooked the variable for the neck up to a Spline IK node (my body has only one neck joint and the MetaHuman head has two, so I had to use this node to spread the rotation values over the two joints).
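The Spline IK node handles the spreading for me, but the underlying idea is just splitting one rotation across two joints, something like the conceptual sketch below (this is not what the node computes internally, just the intent):

```cpp
// Conceptual sketch only: distribute one neck rotation across two joints by
// giving each joint half of the rotation (applied cumulatively down the chain).
const FQuat NeckRotation = NeckTransform.GetRotation();
const FQuat HalfRotation = FQuat::Slerp(FQuat::Identity, NeckRotation, 0.5f);
// Apply HalfRotation to neck_01, then HalfRotation again on top of it for neck_02.
```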
As you can see, I had to use a lot of utility nodes to add offsets to the rotations to get them into the right orientation. The joint orientation of my body rig was not the same as the MetaHuman's (my rig uses Y-down bones, MetaHuman uses X-down bones), so I had to solve the problem in the blueprint, which was a huge pain and very messy. But what are you gonna do…
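For what it's worth, all those utility nodes are really doing is composing a fixed offset rotation onto the incoming values, roughly like this. The actual offset depends entirely on your rig; the 90° below is only a placeholder:

```cpp
// Placeholder sketch of the axis fix-up: my driver rig is Y-down bone,
// MetaHuman is X-down bone, so a constant offset has to be composed with
// every incoming rotation. The angles below are NOT the real values.
const FQuat AxisFixup = FRotator(0.f, 0.f, 90.f).Quaternion();
const FQuat CorrectedHead = HeadTransform.GetRotation() * AxisFixup;
```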
Next, I went back into the CharacterBP and attached the Meta_FaceMesh to the body component's top spine joint using the parent socket. This ensures the Meta_FaceMesh component moves with the chest/shoulders of the body while the head and neck are driven by the blueprint. Now everything is hooked up and working.
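In C++ that attachment would look something like this; the socket name "spine_04" is just an example, use whatever your top spine joint/socket is called:

```cpp
// Attach the MetaHuman face mesh component to the body's top spine socket so it
// follows the chest/shoulders; the head and neck are still driven by the AnimBP.
MetaFaceMesh->AttachToComponent(
    BodyMesh,
    FAttachmentTransformRules::KeepRelativeTransform,
    TEXT("spine_04"));
```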
The last thing I had to figure out was a way to trigger that UpdateHeadRotations custom event every frame, so the head rotation would be applied while I scrubbed animation. Since I'm setting this character up for cinematic use and not a game, I couldn't use the Tick event.
I found this video that explained a super handy way to update blueprints from the sequencer: https://www.youtube.com/watch?v=qt3x64Grlq0&list=LL&index=1&ab_channel=MattLakeTA
Aaaand that solved all my problems. Kind of long-winded and complicated, I know. If anyone sees this and thinks "why didn't he just do it this other way," I'd love to hear alternatives.
-Cheers