This is a very complex topic, but I'll try to explain how it's usually done in most AAA games and what makes sense for a streamlined pipeline in UE4:
The two most common methods for facial animation are morph targets and bone weights. Morphs have the disadvantage of being tied to one specific character, so what's commonly done is to drive the base facial structure and movement with bone weights (joints placed at various spots in the face), then use morphs where needed to correct anything the bones can't achieve easily due to linear skinning, such as specific lip compression or extreme lip positions.
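To make the bones-plus-corrective-morphs idea concrete, here is a minimal sketch (not engine code; all names and numbers are made up for illustration): a vertex is first deformed by linear blend skinning, then a sculpted per-vertex morph delta is added on top to fix what the skinning alone can't reach.

```python
# Sketch: bone-driven vertex with a corrective morph applied on top.
# 2D positions and toy transforms for brevity; illustrative only.

def skin_vertex(rest_pos, bone_transforms, weights):
    """Linear blend skinning: weighted sum of each bone's transform
    applied to the rest-pose vertex. Each transform is a
    (uniform_scale, (tx, ty)) pair standing in for a full matrix."""
    x = y = 0.0
    for (scale, (tx, ty)), w in zip(bone_transforms, weights):
        x += w * (scale * rest_pos[0] + tx)
        y += w * (scale * rest_pos[1] + ty)
    return (x, y)

def apply_corrective_morph(pos, morph_delta, morph_weight):
    """Add a sculpted per-vertex delta (e.g. for lip compression)
    that the bone skinning alone cannot produce."""
    return (pos[0] + morph_weight * morph_delta[0],
            pos[1] + morph_weight * morph_delta[1])

# A lip vertex influenced equally by two jaw/lip bones:
skinned = skin_vertex((1.0, 0.0),
                      [(1.0, (0.0, -0.2)), (1.0, (0.0, 0.0))],
                      [0.5, 0.5])
# Corrective morph pushes the vertex inward at full strength:
final = apply_corrective_morph(skinned, (-0.1, 0.0), 1.0)
```

The key point is the ordering: bones do the broad deformation, and the morph only stores the small residual difference, so it stays cheap and character-specific corrections don't pollute the shared rig.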
When it comes to animation, the big advantage of this method is that you can build a pose database of the most common poses a face can perform (look into FACS for this). Each unique face gets its own pose database, and instead of animating the bones directly you animate only the pose attributes (Smile, Sneer Left). This lets you share any facial animation between characters, no matter how different their proportions are.
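A rough sketch of that pose-database idea (the pose names are FACS-style labels, and the bone values are invented for illustration): each character maps the same pose names to its own bone values, and an animation stores only the pose weights, so one animation can drive any character.

```python
# Sketch: per-character pose databases driven by shared pose attributes.
# Animation data stores only pose weights, never raw bone values.

class PoseDatabase:
    def __init__(self, poses):
        # poses: {"Smile": {"lip_corner_l": 1.0, ...}, ...}
        # Bone values are authored per character for its proportions.
        self.poses = poses

    def evaluate(self, pose_weights):
        """Blend this character's bone values from weighted pose
        attributes. The same pose_weights (the animation) can be
        replayed on any character with its own database."""
        out = {}
        for pose, w in pose_weights.items():
            for bone, value in self.poses.get(pose, {}).items():
                out[bone] = out.get(bone, 0.0) + w * value
        return out

# Two characters with very different proportions, same pose names:
hero = PoseDatabase({"Smile": {"lip_corner_l": 1.0, "lip_corner_r": 1.0},
                     "SneerLeft": {"nose_l": 0.6}})
ogre = PoseDatabase({"Smile": {"lip_corner_l": 2.5, "lip_corner_r": 2.5},
                     "SneerLeft": {"nose_l": 1.4}})

# One shared animation frame: 80% smile, 30% left sneer.
frame = {"Smile": 0.8, "SneerLeft": 0.3}
hero_bones = hero.evaluate(frame)  # scaled to the hero's face
ogre_bones = ogre.evaluate(frame)  # same frame, ogre's proportions
```

Because the animation only ever references pose names and weights, retargeting is just swapping which database evaluates the frame.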
Epic added support for this technique, I believe in 4.18, as they are now using it in Fortnite too.
You can find some docs on that here:
This is just a brief overview, but I hope it helps.