Facial Animations in UE4

Hey guys! This is kind of a getting-started question, so please bear with me. I have a few questions regarding facial animations:

  1. If I’m using performance capture, or just face-capture tech like FaceShift in this case, is it better to create a facial rig or create blendshapes for use inside UE4? People on various forum threads suggest that blendshapes are pretty heavy on memory and give you limited flexibility.
  2. If I manage to get my animations done in Maya after capturing the data and cleaning it up a bit, and then export them as an .FBX, how can I play the various facial animations? Is it similar to body animations, where you can create different montage sections and sequences and play them via Blueprints (Anim Blueprint)?
  3. How can I play facial animations along with the full body animations?

It is pretty difficult to find any tutorials or info related to facial animation inside UE4, so I thought I’d better ask the community.

Hope you guys can help me out here.

Thank You.

Hi Amaresh,

It all depends on what you need to accomplish.

  1. In my case I’m testing both blend shapes and a joint-based rig, and yes, blend shapes are heavier to compute, but unless you’re developing an MMO à la GTA V, a blend shape rig won’t hit the CPU too hard.
    Regarding flexibility, remember that you’re doing everything inside Maya, so it all depends on whether you can achieve what you want there with the blend shapes you created; more specifically, in the case of Faceshift, you’re given N amount of blend shapes to work with.

  2. For both joint and blend shape rigs you end up with an animation which you can use in the anim blueprint. I know there aren’t any tutorials, but with a bit of trial and error you’ll see that it’s quite easy to set up, not least because you can trigger them easily during gameplay or during a cutscene.

  3. In this case you’ll have a body animation and a face animation that play together, so worst case you blend between the neck rotation and the face; otherwise you can just play them together without any issues.

I might do a tutorial in the future, but currently I’m quite busy, so I won’t promise anything :wink:

PS: I saw your comment on my video, but I’m replying here so that anyone can add/suggest/learn about it :slight_smile:

To elaborate a little on 2: through some trial and error on my part, the animations come in as regular animations that you can use just like any others. So you can use a Montage to call the facial animation, or layer them onto other animations. I’m using animations for specific attacks that trigger when the player uses the fire button, via animation montages. Still in the early proof-of-concept phase, but they work so far.

I’m going for a single player third person game.

As far as I know, FaceShift requires 48 blendshapes. Is that good enough?

In terms of flexibility, there are pros and cons. I don’t think blendshapes are reusable, whereas rigs are.

Blendshapes are good in this case, as they can give different emotions or effects for each character. But at the same time, a full set of blendshapes needs to be created for every main character, which requires more memory.

Rigs are good for reusability. But they will produce the same effect on every character, which can look a bit artificial, so they’re better suited to the side characters (the less important ones).

I could be wrong. If that’s the case, feel free to correct me.

Thanks for replying and looking forward to your tutorial.

Yes, 48 should be good enough. The problem is Kinect itself, which at 30 fps is not capable of tracking lip movement, so you’ll end up with a very poor lipsync performance.
The trick for me was to track the upper part of the face with Kinect and then combine it with lipsync from Softimage; perfect results :slight_smile:

Both blend shape and joint rigs have pros and cons, and both of them are reusable, but overall the weighting, the corrective shapes and the tracking software are what make the difference between a natural and an artificial performance. I was really inspired by Judd Simantov’s making-of of The Last of Us (I strongly suggest taking a look at it) and Jeremy Ernst’s 2011 GDC paper, which have some very nice tips and tricks for facial animation :wink:

The problem with blend shapes is that they are unique to the character, so for each character you would need a full set of shapes: 10 characters * 48 = 480 blends.

The better option would be clusters, as you can reuse the animation on any character you wish and clamp to taste.

I’m looking to use the PrimeSense Carmine 1.09 that comes with the kit. It gives a better result, I think.

I’ve seen the presentation by Judd Simantov and it is awesome. Now I’ll take a look at the GDC paper you suggested.

Thanks.

FACE ANIMATION PROCESS

The value of a blendshape weight is something that can be moved from one character to another. So what’s the trick to creating hundreds of blendshapes automatically, for all character types, with the click of a button? Using a preset file for all the joint transforms. The joint-transform presets file contains the min/max xyz translation and rotation values of the driven keys for each joint that is associated with a control. This settings file is easily customizable per character, and allows animations to be moved from one character to another without much issue. The driven keys act as a barrier so the tracked locators can never transform the mesh into an undesirable position.

A blend shape is created after setting each control value to -x/+x/-y/+y. The control rig is used as an in-between for retargeting between characters. The motion capture footage drives the x and y translations of the control rig, and the scale of the tracked locators determines how exaggerated the animation appears. The rig’s x and y translations can then be exported into UE4 to drive the blendshape values and reproduce the identical animation. In UE4, use the Tick() function and lock playback to 30 fps. To stay in sync with the audio, you have to account for dropped frames by using DeltaTime and skipping those frames.
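The dropped-frame handling described above can be sketched in plain C++ outside of Unreal. `KeyframeForTime` is a hypothetical helper, not part of the actual player; the point is that the keyframe index is derived from total elapsed time rather than a per-tick counter, so skipped render frames don’t desync the audio:

```cpp
#include <cmath>

// Map accumulated playback time to a keyframe index at a locked 30 fps.
// Because the index comes from total elapsed time (the sum of DeltaTime),
// dropped render frames are skipped automatically instead of delaying
// the animation relative to the audio.
int KeyframeForTime(float totalSeconds, int maxKeyframes, float fps = 30.0f)
{
    // Total animation frames that should have elapsed since playback began.
    int framesPassed = static_cast<int>(std::floor(fps * totalSeconds));

    // Wrap into the animation's frame range so the animation loops.
    return framesPassed % maxKeyframes;
}
```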

Whether you create blendshapes or use FBX skeletal animations, it should not impact the animation process for setting keyframes, transferring animations between characters, or how anything is set up (it’s just an export process). The control rig, or the imported motion capture data, are the only things that should ever be animated directly. Whether custom blendshapes are built in ZBrush initially, or not built until the very end after the animation is completed, the keyframes on the rig remain the same.

//-------------------------------------------
// Skeletal animations
//-------------------------------------------
Before exporting a skeletal animation, bake the keyframes on the joint hierarchy and export as FBX.
Play the face skeletal animation in a separate animation state so it doesn’t interfere with the slot node or montage that drives the body animations.

Skeletal face animations should be baked on the base skeleton if the character is retargeted onto other characters. The face animations won’t retarget correctly using any of the retargeting methods if the face structure is very different, unless your transforms are based on rotation, in which case ‘AnimationRelative’ seems to work best. For best results, bake skeletal face animations onto the character they’re made for and set the retargeting method to Skeleton or Animation. To prevent the disastrous transforms that occur on retargeted characters with no animation playing on the face joints, create an idle state that takes in no animation.

idleFaceState.png

Playing a face animation is just a matter of passing a reference to an animation into the final pose. One thing to consider is that UE4 will crash if you swap out animation references while they are being accessed in the animation state. You can prevent this by setting a boolean to blend out of the current animation, then calling an event that allows the animation file to be switched out.

//-------------------------------------------
// Blendshape animations
//-------------------------------------------
Blend shape animations are baked onto the control rig, and the animation is exported to a text file that drives the blend shape weights.
The text file can be imported into Unreal by dragging and dropping it into the content browser and defining a data type.
I have chosen to have each row in the data file represent a keyframe, and each column represent a control name on the rig.
The min and max values of the animation can be scaled up or down at runtime by multiplying by a float value in order to increase its exaggeration.
The animation is played by referencing the data file, pulling out all the rows of the animation (total keyframes), and then setting the weight value of each morph target with “Set Morph Target”.
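As a standalone sketch of that per-cell format: each data cell holds an x and a y control translation separated by a literal `x` (e.g. `0.25x-0.5`), mirroring the split that the `ReformatMinMax` function in the player code performs. `ParseKeyframeCell` and `MorphWeights` are names made up for this illustration:

```cpp
#include <cstdlib>
#include <string>

struct MorphWeights { float minX, maxX, minY, maxY; };

// Parse one keyframe cell of the form "<x>x<y>" (e.g. "0.25x-0.5") into
// four directional morph-target weights, scaled by an exaggeration factor.
// A negative value drives the "Min" shape, a positive value the "Max" shape.
MorphWeights ParseKeyframeCell(const std::string& cell, float exaggeration = 1.0f)
{
    MorphWeights w{0.0f, 0.0f, 0.0f, 0.0f};
    std::size_t split = cell.find('x');
    if (split == std::string::npos)
        return w; // empty or malformed cell: leave all weights at zero

    float x = std::strtof(cell.substr(0, split).c_str(), nullptr) * exaggeration;
    float y = std::strtof(cell.substr(split + 1).c_str(), nullptr) * exaggeration;

    if (x > 0.0f) w.maxX = x; else w.minX = -x;
    if (y > 0.0f) w.maxY = y; else w.minY = -y;
    return w;
}
```

Scaling the exaggeration is then just a matter of passing a multiplier other than 1.0.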

Here is the blueprint diagram I used to play blendshape animations with synced audio that can loop a set amount of times.

How do you setup blendshapes for a face that contains multiple meshes?
As far as I know, you have to create a blendshape for each of the objects that need to deform. If they transform in the same fashion, it’s beneficial to combine the meshes. This obviously isn’t possible in all cases, for instance when you want to assign multiple materials, but it’s no real issue and there is an easy way to solve it. All the blendshapes for all the meshes attached to the skeleton are accessible from the skeletal mesh that owns them, so after they’re created you can think of them as global variables.

The blendshapes that are created have to be identified in some way in order to be used, so you run into issues when you have two objects (teeth and face, for example) that use the same animation values but have different blend shape names. There is a simple fix for this: keep an adjustable two-dimensional array of all the blendshape names you want to use and their associated animation control names. It can become an issue when each character has different or additional blendshape names, but luckily it doesn’t matter if you try to modify a blendshape value that doesn’t exist. And Blueprint makes it very simple to quickly adjust an array containing all the blendshapes being used on a character.
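A minimal sketch of that lookup table in standard C++ (the control and shape names here are invented for illustration): each rig control maps to the blendshape names it drives across the separate meshes, and a control with no entry for a given character is harmlessly skipped.

```cpp
#include <map>
#include <string>
#include <vector>

// Per-character table: rig control name -> blendshape names it drives on
// the different meshes (face, teeth, eyes, ...).
using BlendshapeMap = std::map<std::string, std::vector<std::string>>;

// Return the blendshapes to update for a control. An unknown control just
// yields an empty list, so extra or missing shapes per character are harmless.
std::vector<std::string> ShapesForControl(const BlendshapeMap& table,
                                          const std::string& control)
{
    auto it = table.find(control);
    return (it != table.end()) ? it->second : std::vector<std::string>{};
}
```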

In this example, the eyes, the teeth, and the face are each a single mesh. This shows how to scale a blendshape animation, and how scaling past the 0-1 range creates undesirable animation.

Blendshapes can supplement the face skeletal animations already in place. They can also entirely replace the bones in the face, allowing you to skin a character quickly, adjust the face details easily later, and animate the character’s face shapes off of the body mesh without having to retarget anything.

This is just my process; I know there are a lot of really good options out there. I have created a tool that automates creating the blendshapes, building a face animation rig, setting relationships between mocap locators and rig controls, setting the driven keys from a presets file, saving driven-key presets per character, transferring face animations between characters with different joint placements, cleaning up skeleton transforms, and so forth. I have also built a face animation player in Blueprint that allows the x and y rig translations to drive the blendshape weights.

[DOWNLOAD](BFPTools) - For Maya
*Use with caution; still undergoing development.

Installation: place contents into C:\BFPTools\

Run the command in Maya: source “C:\BFPTools\UnrealTools\FaceMocapMain_2015.mel”;

Auto-create blendshapes: all blend shapes for the character are created with a single button click.

Automatic face rig creation, and easy association of mocap data to the rig.

Hey, I’m using a modified mesh of Mecaw, the generic guy from Faceshift, and I’m having problems using blendshapes/morph targets…

In short, animations don’t run well in Build/Standalone mode.

I applaud you, sir, this is pretty amazing!

My method is quite different, but this could help a lot of people

Well done :wink:

Wow, Garner! Thanks for that detailed and informative write-up!

Thank you very much!
fireapache: I’ll check out the project file you uploaded; hopefully there’s something I can help out with, though I have no idea right off the bat why it isn’t working. Nicholas: thanks, I’ve checked out your full body morph system; that’s pretty amazing! If the tool helps others save time, that would make it all worth it. Jared: thanks a lot!

I’ve added some new features to the tool:

New features

Overriding a control with a blendshape instead of the skeleton, for any control direction
Limiting the blendshapes created based on whether there’s an x, y, z value for a control’s influence or a blendshape override
Up to 4 blendshape overrides per control, for +x, -x, +y, -y
Sending the current pose to ZBrush for editing (auto duplicate, unlock, freeze, GoZ)
Disabling controls from creating blendshapes if they have animation data applied (useful when using a single blendshape based on several controls)
Added an Excel template for viewing the data file as a whole.

[Download 1.2](http://leemangarner.weebly.com/uploads/7/1/0/9/7109684/bfptoolsv1.2.rar)

Hey Lee,

that’s a pretty decent face pipeline you’ve set up there. Kudos! :slight_smile:
Is it for a private project only?
If I understand you correctly, your facial rigging and animation system can output both blendshape and bone animation data, because everything is based on your predefined default facial poses tied to the driven-key poses triggered by the “poor man’s” mocap data?
How many poses do you usually use for it? Is it comparable to Faceshift’s basic 48?

Cheers,

Hey Chrisschulte,

It’s for any public project; I’d be happy if you tried it out and gave some feedback, and possibly suggestions to make it better. Also feel free to adapt the code. I haven’t used Faceshift, but I’ve seen good things in their videos. As of now it’s just poor man’s mocap data, plus some phoneme presets for manual keying. If I get around to it, I know I can include a real-time image tracker straight into Maya to control the rig over UDP. I’ve tried some things in the past and they completely froze Maya, so hopefully there’s time in the future.

The output can be in the form of:

  • CSV keyframe data - which can be exported and reimported onto any character, in Maya or Unreal
    
  • FBX animation - which can be imported and played normally in Unreal
    

The initial setup defined in the settings file is 84 blendshapes, but it’s easy to drop the count all the way to 1. Just set a control’s min/max x or y values to 0 and it won’t auto-create a blendshape.

Rig controls can easily be enabled and disabled by selecting the control and pressing the enable/disable button in the rig window. If a control is enabled and has an absolute value greater than 0, a blendshape is created for that control direction. A maximum of 4 blendshapes are created per control, per mesh assigned to the control.
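That creation rule can be sketched as follows. The `MinX`/`MaxX`/`MinY`/`MaxY` suffixes match the morph-target names used in the player code in this thread; `ShapesToCreate` itself is a hypothetical helper, not part of the tool:

```cpp
#include <cmath>
#include <string>
#include <vector>

// Given a control's driven-key ranges, list the directional blendshapes to
// auto-create: at most one each for -x, +x, -y, +y, skipping any direction
// whose range is zero. This yields at most 4 shapes per control per mesh.
std::vector<std::string> ShapesToCreate(const std::string& control,
                                        float minX, float maxX,
                                        float minY, float maxY)
{
    std::vector<std::string> shapes;
    if (std::fabs(minX) > 0.0f) shapes.push_back(control + "MinX");
    if (std::fabs(maxX) > 0.0f) shapes.push_back(control + "MaxX");
    if (std::fabs(minY) > 0.0f) shapes.push_back(control + "MinY");
    if (std::fabs(maxY) > 0.0f) shapes.push_back(control + "MaxY");
    return shapes;
}
```

Setting a control's min/max values to zero, as described above, simply produces no shapes for those directions.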

Latest Version 1.3

[Download](http://leemangarner.weebly.com/uploads/7/1/0/9/7109684/bfptools1.3.rar)

I’ll be posting most updates here
http://leemangarner.weebly.com/programming.html

As soon as I convert the blendshape keyframe player to a single C++ class I’ll share it.

Here is the code for the blendshape player. The blendshape player adjusts the blendshape values based on the keyframes exported from the plugin uploaded earlier.

The blendshape player consists of:

  • FaceAnimationPlayer.h / FaceAnimationPlayer.cpp (Handles playing the keyframe data)
    
  • FFaceAnimationData (struct containing keyframe table and audio references)
    
  • FKeyframeData (struct holding all preset control rig names that hold keyframe data)
    

How to use: create an instance of the FaceAnimationPlayer class. Pass a reference to the skeletal mesh containing the blendshapes into InitPlayer(). Play an animation by passing a reference to the data table into PlayNewAnimation(). The animation will play immediately.

Keyframe structs to hold keyframes in a data table:


//Blendshape Face Animation
USTRUCT(BlueprintType)
struct FFaceAnimationData : public FTableRowBase
{
	GENERATED_USTRUCT_BODY()

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Anim", meta = (ToolTip = "Animation data reference"))
		UDataTable* keyframeData; //Reference to keyframe data struct holding the blendshape keyframes
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Anim", meta = (ToolTip = "Audio file to play"))
		USoundCue* voiceAudio;


	FFaceAnimationData()
	{
	}
};


//------------------------------------------------------------------
// Face animation keyframe data for blendshapes for single control
//------------------------------------------------------------------
USTRUCT(BlueprintType)
struct FKeyframeData : public FTableRowBase
{
	GENERATED_USTRUCT_BODY()

	//Name: frameNumber

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString chin;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString nose;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString brow;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString topBrow;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString middleMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString topMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftBottomMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftTopMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftBottomMouthTwo;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftTopMouthTwo;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftCornerMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftJaw;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftBottomCheek;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftMiddleCheek;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftTopCheek;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftBottomEyeRidge;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftEyeRidge;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString leftTopBrow;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightBottomMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightTopMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightBottomMouthTwo;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightTopMouthTwo;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightCornerMouth;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightJaw;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightBottomCheek;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightMiddleCheek;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightTopCheek;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightBottomEyeRidge;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightEyeRidge;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rightTopBrow;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rEye;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString lEye;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rBottomLid;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rTopLid;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString lBottomLid;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString lTopLid;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rBrow1;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rBrow2;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString rBrow3;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString lBrow1;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString lBrow2;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString lBrow3;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString mShape;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString wShape;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Keyframe Data")
		FString tShape;



	FKeyframeData()
	{

	}
};

FaceAnimationPlayer.h


// Candax Productions

#pragma once

#include "GameFramework/Actor.h"
#include "MyStaticLibrary.h" //FaceAnimation Struct
#include "FaceAnimationPlayer.generated.h"


/**
 * Blendshape Animation player
 */
UCLASS(Blueprintable)
class BFPONLINE_API AFaceAnimationPlayer : public AActor
{
	GENERATED_UCLASS_BODY()
	
	
public:

	virtual void BeginPlay() override;
	virtual void Tick(float DeltaSeconds) override;

	//---------------------------------------------------------
	// PROPERTIES
	//---------------------------------------------------------
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Anim")
		FFaceAnimationData animData;

	UPROPERTY(VisibleAnywhere, BlueprintReadWrite, Category = "Mesh")
		USkeletalMeshComponent* sMeshComponent;

	UPROPERTY(VisibleAnywhere, BlueprintReadWrite, Category = "Audio")
		UAudioComponent* voiceAudioComponent;

	UPROPERTY(VisibleAnywhere, BlueprintReadWrite, Category = "Audio")
		USoundCue* voiceAudio;

	UPROPERTY(VisibleAnywhere, BlueprintReadWrite, Category = "Animation")
		float fps;

	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		float deltaTime;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 currentFrame;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 maxKeyframes;

	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		float lastTime;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		float currentTime;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		float totalTime;

	//Loops
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 loops;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 maxLoops;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 lastLoop;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		bool finishedCycle;


	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 framesPassed;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 useableFrame;

	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		bool bHasStarted;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		bool bUseJointRotation;

	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		float animationScale;

	//Blendshape Controls
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 curControl;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 maxControls;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		FName controlName;

	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		int32 additionalBlendshapeIndex;


	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		TArray<int32> AdditionalBlendshapeControls;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Animation")
		TArray<FName> AdditionalBlendshapeNames;


	//---------------------------------------------------------
	// FUNCTIONS
	//---------------------------------------------------------
	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		void InitPlayer(USkeletalMeshComponent* meshC);

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		void ReformatMinMax(FString keyframeData);

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		void GetMaxKeyframe();

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		bool LimitFPSRate();

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		bool PlayNewAnimation(FFaceAnimationData animation, int32 newMaxLoops);

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		bool ResetAnim();

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		void AnimationCycle();

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		void AddAdditionalBlendshape(FName blendshapeName, int32 controlID);

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		FString GetFaceAnimKeyframe(UDataTable* dataTable, FName frameNumber, int32 controlNum);

	UFUNCTION(BlueprintCallable, Category = "Blendshape Anim Player")
		FName GetFaceAnimControlName(int32 controlNum);

private:

	float minXValue;

	float maxXValue;

	float minYValue;

	float maxYValue;


};


FaceAnimationPlayer.cpp


// Candax Productions

#include "BFPOnline.h"
#include "FaceAnimationPlayer.h"

AFaceAnimationPlayer::AFaceAnimationPlayer(const class FObjectInitializer& PCIP)
	: Super(PCIP)
{
	//Enable Ticking
	PrimaryActorTick.bCanEverTick = true;
	PrimaryActorTick.bStartWithTickEnabled = true;
	PrimaryActorTick.bAllowTickOnDedicatedServer = true;

	sMeshComponent = PCIP.CreateDefaultSubobject<USkeletalMeshComponent>(this, TEXT("SkeletalMesh"));
	sMeshComponent->AttachTo(RootComponent);
}

void AFaceAnimationPlayer::BeginPlay()
{
	Super::BeginPlay();

	fps = 30.0f;
	lastLoop = -1;
	maxControls = 44;
	finishedCycle = true;
	animationScale = 1.0f;
}

void AFaceAnimationPlayer::InitPlayer(USkeletalMeshComponent* meshC)
{
	sMeshComponent = meshC;

	//Add additional blendshapes
	AddAdditionalBlendshape("eyesMesh_rEye", 30);
	AddAdditionalBlendshape("eyesMesh_lEye", 31);
	AddAdditionalBlendshape("teethMesh_chin", 0);
}

void AFaceAnimationPlayer::AddAdditionalBlendshape(FName blendshapeName, int32 controlID){
	AdditionalBlendshapeNames.Add(blendshapeName);
	AdditionalBlendshapeControls.Add(controlID);
}


bool AFaceAnimationPlayer::PlayNewAnimation(FFaceAnimationData animation, int32 newMaxLoops)
{
	ResetAnim();
	maxLoops = newMaxLoops;
	animData = animation;

	GetMaxKeyframe();

	//Set voice audio
	voiceAudio = animData.voiceAudio;

	bHasStarted = true;

	return true;
}


void AFaceAnimationPlayer::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	//----------------------------------
	//Play Animation
	//----------------------------------
	deltaTime = DeltaSeconds;


	//Wait until an animation has started
	if (bHasStarted){

		//Accumulate delta time
		totalTime += deltaTime;

		AnimationCycle();
	}
}


void AFaceAnimationPlayer::AnimationCycle(){


	//Wait until frame has finished processing
	if (finishedCycle){

		finishedCycle = false;

		//Set keyframe based on fps
		LimitFPSRate();

		//Keyframes table
		UDataTable* tableRef = animData.keyframeData;

		//-------------------------------------------------
		//Loop through all predefined blendshape controls
		//-------------------------------------------------
		for (int32 i = 0; i < maxControls; i++){
			//Get the value for the current keyframe
			FName useableFrameName = FName(*FString::FromInt(useableFrame));
			FString curKey = GetFaceAnimKeyframe(tableRef, useableFrameName, i);

			//Reformat min max keyframe values
			ReformatMinMax(curKey);

			//Get the controlname from the id
			FName curControlName = GetFaceAnimControlName(i);

			//Set Morph Targets
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MinX")), minXValue);
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MaxX")), maxXValue);
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MinY")), minYValue);
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MaxY")), maxYValue);

		}
		//-------------------------------------------------
		//Loop through additional blendshape controls
		//-------------------------------------------------
		for (int32 j = 0; j < AdditionalBlendshapeControls.Num(); j++){

			//Get the value for the current keyframe
			FName useableFrameName = FName(*FString::FromInt(useableFrame)); //Use the current keyframe, as in the main loop
			FString curKey = GetFaceAnimKeyframe(tableRef, useableFrameName, AdditionalBlendshapeControls[j]);

			//Reformat min max keyframe values
			ReformatMinMax(curKey);

			//Get the controlname from the id
			FName curControlName = AdditionalBlendshapeNames[j];

			//Set Morph Targets
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MinX")), minXValue);
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MaxX")), maxXValue);
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MinY")), minYValue);
			sMeshComponent->SetMorphTarget(FName(*(curControlName.ToString() + "MaxY")), maxYValue);
		}


		//Complete Cycle
		curControl = 0;
		finishedCycle = true;
	}
}



void AFaceAnimationPlayer::ReformatMinMax(FString keyframeData)
{
	//Handle null keyframes
	if (keyframeData == "")
	{
		return; //Nothing to apply
	}

	//Split keyframe data
	FString xLitVal = "";
	FString yLitVal = "";
	keyframeData.Split(TEXT("x"), &xLitVal, &yLitVal, ESearchCase::IgnoreCase, ESearchDir::FromStart);

	//Convert keyframe into float
	float xVal = FCString::Atof(*xLitVal);
	float yVal = FCString::Atof(*yLitVal);

	//Scale animation
	yVal = animationScale * yVal;
	xVal = animationScale * xVal;

	//Greater or less than X
	if (xVal > 0){
		minXValue = 0;
		maxXValue = xVal;
	}
	else{
		minXValue = xVal * -1;
		maxXValue = 0;
	}

	//Greater or less than Y
	if (yVal > 0){
		minYValue = 0;
		maxYValue = yVal;
	}
	else{
		minYValue = yVal * -1;
		maxYValue = 0;
	}
}

void AFaceAnimationPlayer::GetMaxKeyframe()
{
	UDataTable* tableRef = animData.keyframeData;
	FString fMaxKeyframes = GetFaceAnimKeyframe(tableRef, "maxFrames", 0);
	maxKeyframes = FCString::Atoi(*fMaxKeyframes);
	maxKeyframes--; //Frames are indexed from 0, so the last index is count - 1

	//------------------------------------------------
	// Can get and store all keyframe data locally *
	//------------------------------------------------
}

bool AFaceAnimationPlayer::LimitFPSRate()
{
	//Convert fps to milliseconds
	float fpsMil = 1000 / fps;

	//Add delta time to current time
	currentTime += deltaTime * 1000;

	//Check if elapsed time is greater than the last time + a new frame time
	if (currentTime > (lastTime + fpsMil)){

		//Determine if we have to skip keyframes based on time elapsed
		framesPassed = FMath::FloorToInt(fps * totalTime);

		//Go to next frame
		lastTime = currentTime;

		//Increment frame counter
		currentFrame++;

		if (currentFrame >= FMath::FloorToInt(fps)){
			currentFrame = 0;
		}

		//Set the amount of times the animation has looped
		loops = framesPassed / maxKeyframes;


		//Check if we have started a new loop
		if (lastLoop < loops){
			//Only play set number of loops
			if (maxLoops > 0){ 
				//has gone over defined maxloops
				if (loops >= maxLoops)
				{
					bHasStarted = false; //Stop Play
					return false;
				}
			}
			
			//Infinite Loop
			lastLoop = loops;

			//Replay Audio
			UGameplayStatics::PlaySoundAtLocation(this, voiceAudio, sMeshComponent->GetComponentLocation());
		}


		//Get the keyframe to use
		useableFrame = framesPassed - (loops * maxKeyframes);
	}
	else{
		//Not enough time has elapsed - wait for the next tick
	}

	return true;
}



bool AFaceAnimationPlayer::ResetAnim()
{
	bHasStarted = false;
	currentFrame = 0;
	loops = 0;
	lastLoop = -1;
	finishedCycle = true;
	curControl = 0;
	framesPassed = 0;
	totalTime = 0.0f;

	return true;
}



FString AFaceAnimationPlayer::GetFaceAnimKeyframe(UDataTable* dataTable, FName frameNumber, int32 controlNum)
{
	if (frameNumber != NAME_None){
		//FindRow can return nullptr for a missing row - guard before dereferencing
		FKeyframeData* foundRow = dataTable->FindRow<FKeyframeData>(frameNumber, "", false);
		if (foundRow == nullptr){
			return "";
		}
		FKeyframeData newData = *foundRow;

		//Get the keyframe data from the control 
		//The id of the blendshape name
		switch (controlNum){
		case 0:
			return newData.chin;
		case 1:
			return newData.nose;
		case 2:
			return newData.brow;
		case 3:
			return newData.topBrow;
		case 4:
			return newData.middleMouth;
		case 5:
			return newData.topMouth;
		case 6:
			return newData.leftBottomMouth;
		case 7:
			return newData.leftTopMouth;
		case 8:
			return newData.leftBottomMouthTwo;
		case 9:
			return newData.leftTopMouthTwo;
		case 10:
			return newData.leftCornerMouth;
		case 11:
			return newData.leftJaw;
		case 12:
			return newData.leftBottomCheek;
		case 13:
			return newData.leftMiddleCheek;
		case 14:
			return newData.leftTopCheek;
		case 15:
			return newData.leftBottomEyeRidge;
		case 16:
			return newData.leftEyeRidge;
		case 17:
			return newData.leftTopBrow;
		case 18:
			return newData.rightBottomMouth;
		case 19:
			return newData.rightTopMouth;
		case 20:
			return newData.rightBottomMouthTwo;
		case 21:
			return newData.rightTopMouthTwo;
		case 22:
			return newData.rightCornerMouth;
		case 23:
			return newData.rightJaw;
		case 24:
			return newData.rightBottomCheek;
		case 25:
			return newData.rightMiddleCheek;
		case 26:
			return newData.rightTopCheek;
		case 27:
			return newData.rightBottomEyeRidge;
		case 28:
			return newData.rightEyeRidge;
		case 29:
			return newData.rightTopBrow;
		case 30:
			return newData.rEye;
		case 31:
			return newData.lEye;
		case 32:
			return newData.rBottomLid;
		case 33:
			return newData.rTopLid;
		case 34:
			return newData.lBottomLid;
		case 35:
			return newData.lTopLid;
		case 36:
			return newData.rBrow1;
		case 37:
			return newData.rBrow2;
		case 38:
			return newData.rBrow3;
		case 39:
			return newData.lBrow1;
		case 40:
			return newData.lBrow2;
		case 41:
			return newData.lBrow3;
		case 42:
			return newData.mShape;
		case 43:
			return newData.wShape;
		case 44:
			return newData.tShape;
		}

		return "";

	}
	else
		return "";
}


FName AFaceAnimationPlayer::GetFaceAnimControlName(int32 controlNum)
{
	//Control names indexed by control number (order matches GetFaceAnimKeyframe)
	static const FName controlNames[] = {
		"chin", "nose", "brow", "topBrow", "middleMouth", "topMouth",
		"leftBottomMouth", "leftTopMouth", "leftBottomMouthTwo", "leftTopMouthTwo",
		"leftCornerMouth", "leftJaw", "leftBottomCheek", "leftMiddleCheek",
		"leftTopCheek", "leftBottomEyeRidge", "leftEyeRidge", "leftTopBrow",
		"rightBottomMouth", "rightTopMouth", "rightBottomMouthTwo", "rightTopMouthTwo",
		"rightCornerMouth", "rightJaw", "rightBottomCheek", "rightMiddleCheek",
		"rightTopCheek", "rightBottomEyeRidge", "rightEyeRidge", "rightTopBrow",
		"rEye", "lEye", "rBottomLid", "rTopLid", "lBottomLid", "lTopLid",
		"rBrow1", "rBrow2", "rBrow3", "lBrow1", "lBrow2", "lBrow3",
		"mShape", "wShape", "tShape"
	};
	static const int32 numControls = sizeof(controlNames) / sizeof(controlNames[0]);

	if (controlNum >= 0 && controlNum < numControls){
		return controlNames[controlNum];
	}

	//Default
	return "chin";
}

Somewhat of a necro, but how does all of this play into a system with customizable character faces, since those are usually done with blend shapes too?

Hey Damir,

Every blendshape deformation has a maximum bound, determined by the 0-1 range of the blendshape value. The actual deformation differs per character, but the 0-1 range is always the same (the base mesh corresponds to 0, the fully applied blendshape mesh to 1).
The blendshapes themselves are stored on the character model, so no extra per-character customization is needed to transfer animations between characters inside UE4. The only thing being manipulated is each blendshape's 0-1 value, which is driven by a keyframe file. The keyframe file stays the same, while each character's actual transformations are unique to its own blendshapes. The player code above simply reads each blendshape's value from the file 30 times per second. Because every character maps its own range of deformation onto the same scalar value, animation transfers cleanly between them. Multiplying a blendshape value by a float will either amplify or lessen the effect for that particular animation.
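To make the scaling idea concrete, here is a minimal, engine-free C++ sketch of the principle. `ScaleBlendshapeValue` is a hypothetical helper (not part of the tool above or the UE4 API): it takes a normalized 0-1 keyframe value and a per-animation scale float, then clamps the result back into the valid morph target range, which is essentially what feeding a scaled value into `SetMorphTarget` amounts to.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical helper: scale a normalized 0-1 blendshape keyframe value
// by a per-animation multiplier, then clamp back into the valid range.
// Each character interprets the same 0-1 value against its own sculpted
// blendshape, which is why one keyframe file can drive any face.
float ScaleBlendshapeValue(float keyframeValue, float animationScale)
{
	//Scale first, then clamp so the morph target never over/undershoots
	return std::clamp(keyframeValue * animationScale, 0.0f, 1.0f);
}
```

A scale of 1.0 leaves the animation untouched, values above 1.0 amplify the deformation (up to the clamp), and values below 1.0 soften it.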

I hope this answers your question. I’m making significant improvements to this tool and will release a new version soon, which already includes:
+easy file management of all character face and body animations
+right-click thumbnails to apply a face animation to a character
+automatically apply blendshape sets to characters with a button click

and tons more features. I’m in the process of incorporating OpenCV to allow for real-time solving; got my fingers crossed on that.