Greetings,
Please shed some light on the process of implementing facial animations.
We are new to facial animation and are looking for an exhaustive FAQ/guide/overview of the process and the steps to be done.
Thanks
This is a very complex topic, but I’ll try to explain how it’s usually done in most AAA games and what makes sense for a very streamlined pipeline in UE4:
The two most common methods for facial animation are morphs and bone weights. Morphs have the disadvantage of being targeted to one specific character only, so what’s commonly done is that the base facial structure and movement is driven by bone weights / joints placed at various spots in the face, and morphs are then used, if needed, to correct anything the bones can’t achieve easily due to linear skinning, such as specific lip compression or extreme lip positions.
When it comes to animation, the big advantage of this method is that you can build a pose database of the most common poses a face can do (look into FACS for this). Each unique face gets its own pose database, and for animation, instead of animating the bones directly, you only animate the pose attributes (Smile, Sneer Left). This lets you share any facial animation no matter how different the proportions or the character are.
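To make the idea concrete, here is a minimal sketch in plain Python of what curve-driven pose blending boils down to. All pose names, joint names, and numbers are made up; the point is that the animation stores only pose weights, while each character supplies its own pose library under shared FACS-style names:

```python
# pose_library: pose name -> {joint name -> (tx, ty, tz) offset from this
# character's neutral face}. Offsets differ per character, but the pose
# names are shared across the whole cast.
pose_library = {
    "Smile":     {"jaw": (0.0, -0.2, 0.0), "lip_corner_L": (0.4, 0.3, 0.0)},
    "SneerLeft": {"nose_L": (0.0, 0.15, 0.0), "lip_corner_L": (0.1, 0.25, 0.0)},
}

def evaluate_pose_curves(curve_weights, pose_library):
    """Blend joint offsets from weighted poses (linear, additive)."""
    joint_offsets = {}
    for pose_name, weight in curve_weights.items():
        for joint, (x, y, z) in pose_library.get(pose_name, {}).items():
            ox, oy, oz = joint_offsets.get(joint, (0.0, 0.0, 0.0))
            joint_offsets[joint] = (ox + weight * x, oy + weight * y, oz + weight * z)
    return joint_offsets

# One animation frame stores only pose weights, no bone transforms, so it
# can be reused on any character that defines the same pose names.
frame = {"Smile": 0.8, "SneerLeft": 0.3}
print(evaluate_pose_curves(frame, pose_library))
```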
Epic added support for this technique, I believe in 4.18, as they are using it in Fortnite now too.
You can find some docs on that here:
This is just a brief overview but hope it helps.
Thank you a lot, this is a very high-quality post.
The thing that confuses me most:
With bone weights, are you just adding a layer on top of whatever animation is currently happening on the mesh?
Do I have to add face bones to the character in a 3D package?
You rig the face the same way you would rig a character body with bones; for example, you have a few bones for the lips, then you do a normal linear skinning pass. The bones drive that geometry, and blending happens at the pose level, so you could combine a smile with a sneer, and so on.
Here is a tutorial and walk-through of doing this with maya that might be helpful:
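For anyone unfamiliar with what "linear skinning" means here, the following is a minimal sketch with NumPy of linear blend skinning (LBS): each vertex is deformed by the weighted sum of its influencing bones' transforms. The bone matrices and weights are made-up example data:

```python
import numpy as np

def skin_vertex(v_rest, weights, bone_matrices):
    """Deform one vertex as the weighted sum of its bones' transforms (LBS)."""
    v_h = np.append(v_rest, 1.0)        # homogeneous coordinates
    v_out = np.zeros(4)
    for w, M in zip(weights, bone_matrices):
        v_out += w * (M @ v_h)          # each bone pulls the vertex its way
    return v_out[:3]

# Example: a lip vertex influenced 70/30 by two joints (made-up data).
identity = np.eye(4)
lift = np.eye(4)
lift[1, 3] = 0.5                        # second joint translated 0.5 up
print(skin_vertex(np.array([1.0, 0.0, 0.0]), [0.7, 0.3], [identity, lift]))
```

The linear mixing of matrices is exactly why bones alone can't hit certain shapes (e.g. tight lip compression), which is where the corrective morphs mentioned above come in.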
Thanks again. But are the facial bones added on top of the existing rig used for body movement? Because I never saw a tutorial with the head attached to a body where the body actually had bones like legs, spine, etc.
I found this via a search for Facial Animation Sharing | Unreal Engine Documentation - but where do I find these characters/faces in the first place? I’m looking, in general, for a quick introduction and some use cases, so I’m not likely to make my own characters in Maya as my first task. I would rather play with some sample body/face animations, just to explore what could be done. (Will post here if/when I find what I’m looking for.)
@Highflex I’m confused as to what exactly this means: "One important caveat however is that your animation must not have any bone transform data within it. Any bone transform data, even with one mesh’s reference pose, won’t work for other meshes so it is important to remove bone transforms (keeping only curves) and start with each mesh’s own reference pose if you want to share the curve between different meshes. This enables you to share the facial curves between different faces."
How do you remove transform data and keep curve data? It doesn’t make sense to me. If you delete the keyframes for translation, rotation, and scale, that gets rid of the curve data in Maya.
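The confusion clears up once you note that in Maya the keys on a node's transform channels and the keys on its custom attributes live on separate animation curves, so you can delete one set without touching the other. A hedged sketch using Maya's Python API (`maya.cmds`, runs in Maya's Script Editor; which nodes you pass in is up to your rig):

```python
import maya.cmds as cmds

TRANSFORM_ATTRS = [
    "translateX", "translateY", "translateZ",
    "rotateX", "rotateY", "rotateZ",
    "scaleX", "scaleY", "scaleZ",
]

def strip_transform_keys(nodes):
    """Delete keys on the transform channels only; keys on any custom
    (pose/curve) attributes on the same nodes are left untouched."""
    for node in nodes:
        # clear=True deletes the cut keys instead of copying them to the
        # clipboard, and the attribute filter limits which curves are cut.
        cmds.cutKey(node, attribute=TRANSFORM_ATTRS, clear=True)

strip_transform_keys(cmds.ls(type="joint"))
```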
There’s a big disadvantage to using bones for facial expressions. If you have characters with different reference poses (same skeleton, different bone locations due to different head shapes/sizes), you usually need to set retargeting translation to Skeleton for those bones. That means they will ignore any translation data coming from an animation. But facial bones often drive facial expressions by changing location, not just rotation. That means either you cannot use bone translations to drive facial expressions, or you need to set retargeting translation to Animation and make sure all of your characters have facial bones with the same base-pose locations.
With morphs, it doesn’t matter, since you are just sending curve data to the mesh and the mesh blends in whatever morph you’ve assigned to the curve name. It’s also more flexible since, for example, you could have two characters with very different smiles (e.g., Donald Glover vs. Bill Belichick) driven by the same animation. With bone-driven expressions, you’d need to create a different pose for each one.
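As a sketch of why morphs sidestep the retargeting problem: the animation only carries named curve values, and each mesh resolves those names against its own deltas. A minimal illustration with NumPy (all vertex data and morph names made up):

```python
import numpy as np

base = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])        # neutral vertices
morph_deltas = {
    # This character's own "Smile" shape; another character would ship
    # completely different deltas under the same name.
    "Smile": np.array([[0.0, 0.2, 0.0], [0.1, 0.3, 0.0]]),
}

def apply_morphs(base, morph_deltas, curve_weights):
    """Morphs blend additively: base + sum of weighted per-morph deltas."""
    out = base.copy()
    for name, weight in curve_weights.items():
        if name in morph_deltas:
            out += weight * morph_deltas[name]
    return out

# The same curve data ({"Smile": 0.5}) works on any mesh defining "Smile".
print(apply_morphs(base, morph_deltas, {"Smile": 0.5}))
```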
Well, if you are looking for a resource to play with, the Genesis series that comes with Daz Studio is a good choice, as the base rig supports both morph and cluster facial animation and is already rigged with the necessary facial joints.
The hard part is the setup, as it’s more of a learn-by-exploration kind of process; the final result will depend on the type of project you are working on, and the learning curve is rather steep. But with a ready-to-use resource like Genesis you can get right into the required process. Not to worry though, it’s one of those things that is hard until it becomes easy.
Note: clusters are a rigging feature, so Genesis is ready to go on that front, but you will usually need to purchase the shapes if you want to go the morph direction, as you can get into thousands of different shapes for a single character.
What you need:
First, UE4 does not have the means of directly creating dialogue takes, so you will need some way to author the animations, be it with clusters or morphs.
Morph targets are easy, as they are additive, whereas with clusters you animate using transforms just as you would a run cycle; for the facial animation to work as an additive on top of an action state, you will need to layer the animations.
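In other words, the face layer is stored as deltas from a reference pose and added on top of whatever body state is playing, which is how UE4's additive animation layers behave. A minimal sketch in plain Python (bone names and values are made up):

```python
def layer_additive(body_pose, facial_deltas, alpha=1.0):
    """Add facial deltas (per-bone translation offsets) onto the body pose."""
    out = dict(body_pose)
    for bone, (dx, dy, dz) in facial_deltas.items():
        x, y, z = out.get(bone, (0.0, 0.0, 0.0))
        out[bone] = (x + alpha * dx, y + alpha * dy, z + alpha * dz)
    return out

run_cycle_frame = {"spine": (0.0, 1.0, 0.0), "jaw": (0.0, 0.0, 0.0)}
talk_frame = {"jaw": (0.0, -0.3, 0.0)}      # mouth-open delta from reference
print(layer_additive(run_cycle_frame, talk_frame))
```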
First things first though: using Daz Studio and Genesis (I would recommend Genesis 3, as it’s best suited for video game use), do some basic "make it move" animations in DS, including facial animation using clusters, and export the package to figure out the setup in UE4.
This is how I started, one step at a time.
Can anyone do facial morph services to be used with iPhone Live Link in UE4 without the need to do any remapping? Interested candidates, please email me at: sandra_kawar@yahoo.com
If anyone is interested: as of 2021, for face animations we use Character Creator + iClone + iClone Live Link. It’s a pretty costly setup, but worth it.
Is there any alternative solution, similar to iClone or UE face mocap, that does not involve an iPhone? I don’t want to buy an iPhone just for this purpose. I’d prefer a cheaper depth-sensing camera (Kinect, SoftKinetic, RealSense, etc.).