Currently I am in the process of making a markerless facial animation capture tool for one of my projects, which I intend to release as open source for the community once it reaches a usable stage.
One question I had for folks familiar with facial animation: how do you all manage to use both face joints/bones and blend shapes/morph targets in unison?
Looking at the Face AR example (Face AR Sample | Unreal Engine Documentation), they mention using corrective blend shapes, but I am not sure exactly how to go about applying them. Do you combine both to create “poses” and use those on top of morph targets?
So far, I have mostly focused on interpolating blend shapes/morph targets on a 0-1 scale, but I figure I should use joint-driven animation to give me greater control.
Any insight into your own process for handling facial animation would be helpful. I am using a Daz3D Genesis 3 character, if that helps.
You can combine them. For corrective blend shapes there’s the Pose Driver node and the Bone Driven Controller node. The first uses radial basis functions (so it avoids gimbal-lock issues), and the second basically remaps “raw” rotation values. I use the Pose Driver for corrective blend shapes on shoulders/hips etc. and the Bone Driven Controller to get eyelids to follow the up/down rotation of the eyes.
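If it helps to picture what the Bone Driven Controller does, it’s essentially a multiply-and-clamp remap from a source rotation to a driven value. Here’s a rough sketch in Python; the eyelid example, the 50% multiplier, and the ±25-degree range are made-up illustration values, not anything from the engine:

```python
def remap_driven_value(driver_deg, multiplier, out_min, out_max):
    """Remap a raw driver rotation (e.g. eye pitch, in degrees) into a
    driven value (e.g. eyelid pitch), roughly the way a Bone Driven
    Controller remaps a source bone rotation onto a target."""
    return max(out_min, min(driver_deg * multiplier, out_max))

# Eye looks down 30 degrees; the lid follows at 50%, clamped to [-25, 25].
print(remap_driven_value(30.0, 0.5, -25.0, 25.0))   # 15.0
# An extreme eye rotation is clamped so the lid never over-rotates.
print(remap_driven_value(100.0, 0.5, -25.0, 25.0))  # 25.0
```

In the actual node you’d set the same idea up with the source bone/component, a multiplier, and an output range on the target.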
These are for having bones drive morph targets. I’m not sure there’s anything built-in for making morph targets drive bones. You might have to set that up manually by reading curve values and using them to set bone transforms.
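For the manual direction, the idea would be: each frame, read the 0–1 expression curve (in UE4 you can read it with UAnimInstance::GetCurveValue and apply the result with a Transform (Modify) Bone node) and convert it to a bone rotation. A quick Python sketch of that mapping; the “JawOpen” curve name and the 22-degree jaw range are hypothetical values for the example:

```python
# Made-up maximum jaw rotation for illustration.
JAW_OPEN_MAX_DEG = 22.0

def jaw_pitch_from_curve(curves):
    """Map a 'JawOpen' morph curve (0-1) to a jaw pitch in degrees,
    clamping the curve so out-of-range values can't over-rotate the bone."""
    weight = max(0.0, min(curves.get("JawOpen", 0.0), 1.0))
    return weight * JAW_OPEN_MAX_DEG

print(jaw_pitch_from_curve({"JawOpen": 0.5}))  # 11.0
print(jaw_pitch_from_curve({}))                # 0.0
```

That’s the whole trick: the curve stays the animation-facing control, and the bone transform is derived from it per frame.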
I haven’t done much with facial expressions, but I’m currently doing them all with morphs. FYI, I believe UE4 has a max of 8 influences per vertex, and Daz Genesis 8 (and 3?) has some vertices in the head with 9 or 10 influences, so trying to drive facial expressions using bones and Daz’s built-in weights might not work as expected.
Thanks, I wasn’t aware of the Pose Driver node!
That’s a good point about the default Daz Genesis built-in weights. I’ll have to keep that in mind as I proceed down the pipeline and maybe create custom weights at some point.
We have been using G3 for a while, and although we really don’t have a pressing need for facial expressions, the fact that they are there is an added bonus; both clusters and morphs work with the usual animation-curve setup.
Clusters (joints) have the advantage of being animated and implemented like any other keyframed animation, and the same set of animations can be used on many different characters without the per-object limitation of morph targets. There is a small issue with the jaw line that is easy enough to fix using the blend-weight brush in 3ds Max, and for most needs, like real-time gameplay and NPC interaction, it does a fair job.
Using MotionBuilder, it’s easy enough to wire in the voice device or facial animations via the relationship constraint.
A quick test using clusters.
With around 70 clusters, most shapes can be achieved.
Morph targets will of course give you the best results, as they’re not dependent on weighting, and it’s easy to either purchase the sets you need or make the targets in Blender or, better still, ZBrush.
With Genesis 3 you can use both at the same time; it’s more a question of the authoring tools you have, given the lack of such tools in Unreal 4. As a replacement for the Epic base mannequin, it’s a good starting point to use as a framework for much more complex actors, whether you’re building ready-to-use materials or using the advanced materials already supplied by Epic.
Thanks, FrankieV! You have been an incredible resource for G3/DAZ and animation questions on these forums (and via your YouTube channel) in general. That video you shared of the StorytellerTest with clusters is impressive. Hard to distinguish that on a quick look from phoneme-driven blend shapes.