Is there a tutorial for setting up hands and assigning animations to common hand actions like grabbing?

I’ve been trawling forums and examining examples in the UE4 editor, but I get the feeling there’s no tutorial for the basics, from scratch: setting up the VR hand pawns, assigning meshes to them, applying textures, then mapping the appropriate animations for grabbing, pointing, or whatever other animations one wants to add. I know there are hands in the blank VR template, but for reasons only the Lord knows they mapped the grab event to the trigger button and ignored the grip buttons entirely… Instead of just clicking everything until I figure out how to re-map those buttons, I thought I should actually understand how the controllers and animations work rather than trusting that a demo or sample setup will be adequate for my project.

Is there a tutorial that breaks this down well?

I came up with this after a bit of work. The beauty of this solution is that it is hardware agnostic, so I can very easily support Oculus Touch, HTC Vive, Leap Motion, and whatever new motion controllers Valve eventually releases; it is designed to be future-proof. Here is a short video of the final result working with Oculus Touch:

I decided to decouple my VR hardware from characters. I think I may be one of the few people who did this, but it is one of the best software engineering decisions I’ve made so far, and the results speak for themselves. I’m not going to go into detail on how I did that (since it’s not the topic here). At this stage in the VR industry, there isn’t going to be much, if any, documentation or tutorials on how people set this up, because most people are still figuring it out for themselves. I’ll share a lot of my hand implementation so you can use it as inspiration for your own solution. Maybe someone will improve upon it or point out some limitations :)

The basic gist of my approach is to have a skeletal mesh with bones for each finger, and then I manually set the bone rotations for each finger based on the various states of the VR input hardware. I do this within the animation blueprint for the controlled character. Here is the most important bit from my animation blueprint, which modifies per-finger bone rotations. This is for the left pointer finger.
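To illustrate the idea in code: the actual logic lives in "Transform (Modify) Bone" nodes inside the animation blueprint, but the per-bone math boils down to something like the sketch below, using the MinimumConstraint, MaximumConstraint, and Curl fields from the structs further down (the function name is just for illustration).

//Sketch: compute the local-space rotation for one finger bone from its curl value.
//MinimumConstraint corresponds to Curl == 0 (finger open), MaximumConstraint to Curl == 1 (fully curled).
FRotator ComputeFingerBoneRotation(const FBoneXForm& Bone)
{
	//Blend linearly between the two rotational constraints by the normalized curl.
	//A plain lerp is fine here because the constraint ranges are small enough to avoid rotator wrap-around.
	return FMath::Lerp(Bone.MinimumConstraint, Bone.MaximumConstraint, Bone.Curl);
}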

I collect the input from hardware devices such as motion controllers. When the player pulls the trigger button, the character clenches their hand into a fist. Here’s how I do it:

It may not be super obvious from that image, but I have a custom “Input” interface which invokes interface calls on whatever character is being controlled. Here, the hardware inputs generated a “grab” event. I send an interface call to whatever creature I’m controlling, and it may have its own handling logic for how to process grab events. Then, I set all fingers in the hand to their maximum curl value. I use a “desired curl”, which means that the hand will try to move each finger’s curl value toward the desired value over time. This is done within the creature’s tick function, and the end result is a gradual curling and uncurling of fingers over time. Fingers could even curl around the geometry of a held object, such as a soda bottle, without too much extra coding.
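As a rough sketch of what that tick update looks like (the function name and interpolation speed here are illustrative, not my exact code), using the FPlayerFinger struct shown below:

//Sketch: gradually move a finger's curl toward its desired value each tick.
void UpdateFingerCurl(FPlayerFinger& Finger, float DeltaTime)
{
	const float CurlSpeed = 4.0f; //curl units per second (a tunable constant)

	//Never curl less than the capacitive-touch minimum.
	const float Target = FMath::Max(Finger.DesiredCurl, Finger.CurMinCurl);

	//Constant-rate interpolation gives the gradual curling/uncurling over time.
	Finger.Curl = FMath::FInterpConstantTo(Finger.Curl, Target, DeltaTime, CurlSpeed);

	//Propagate the finger curl to each of its three bones.
	for (FBoneXForm& Bone : Finger.Bone)
	{
		Bone.Curl = Finger.Curl;
	}
}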

Some devices, such as the Oculus Touch, have capacitive touch. That means that if your finger is even resting on a button, you get input data which lets you respond to those events. Using this, I created finger pointing, thumbs up, and a combination of both, based on various finger-touching configurations. The implementation really isn’t hardware specific, so if a second motion controller ever comes out on the market (ahem, Valve’s new one), it should be relatively easy and straightforward to add support with minimal refactoring.
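For example, here is roughly how a pointing pose can fall out of the touch state (the handler name is hypothetical; in practice it would be bound to whatever trigger touch/untouch events your input setup exposes):

//Sketch: derive a pointing pose from capacitive touch on the trigger.
void OnIndexTouchChanged(FPlayerHand& Hand, bool bTouching)
{
	Hand.Pointer.CapTouch = bTouching ? 1.0f : 0.0f;

	//Finger resting on the trigger: keep the slightly cupped default.
	//Finger lifted off the trigger: let it extend fully, so the index finger points
	//even while the other fingers stay curled into a grip.
	Hand.Pointer.CurMinCurl = bTouching ? 0.2f : 0.0f;
	if (!bTouching)
	{
		Hand.Pointer.DesiredCurl = 0.0f;
	}
}

The same pattern with the thumb resting on (or lifted off) the face buttons gives you thumbs up, and doing both at once gives the combined pose.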

A lot of the magic happens in my hand data structures in code. Here are the relevant structs which represent a player hand. Look through these and read the comments. If you have any questions, let me know.

USTRUCT(BlueprintType)
struct FBoneXForm
{
	GENERATED_USTRUCT_BODY()

	//world space position of the bone
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Bone")
		FVector Position;

	//This is the minimum rotational constraint value of this socket bone. This is where we are at when "curl == 0".
	UPROPERTY(BlueprintReadOnly, EditDefaultsOnly, Category = "Bone")
		FRotator MinimumConstraint;

	//This is the maximum rotational constraint value for this socket bone. This is where we are at when "curl == 1".
	UPROPERTY(BlueprintReadOnly, EditDefaultsOnly, Category = "Bone")
		FRotator MaximumConstraint;

	//This is the current curl of a finger. Range: 0.0 -> 1.0
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Bone", meta = (UIMin = "0", UIMax = "1", ClampMin = "0", ClampMax = "1"))
		float Curl = 0.0f;

	FORCEINLINE FBoneXForm();
	FORCEINLINE FBoneXForm(FVector Pos, FRotator MinRot, FRotator MaxRot);
};

FBoneXForm::FBoneXForm()
{
	Position = FVector::ZeroVector;
	MinimumConstraint = FRotator::ZeroRotator;
	MaximumConstraint = FRotator::ZeroRotator;
}

FBoneXForm::FBoneXForm(FVector Pos, FRotator MinRot, FRotator MaxRot)
{
	Position = Pos;
	MinimumConstraint = MinRot;
	MaximumConstraint = MaxRot;

	//default to a slightly cupped hand so that any capacitive touch code can be visually obvious
	//CurMinCurl = 0.2f;
	Curl = 0.2f;
	//DesiredCurl = 0.2f;
}

USTRUCT(BlueprintType)
struct FPlayerFinger
{
	GENERATED_USTRUCT_BODY()

	//3 bones for the finger
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger")
		TArray<FBoneXForm> Bone;

	//Whether the finger is touching the controller (0 = not touching, 1 = touching). By default, it's touching the controller.
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger")
		float CapTouch = 1.0f;

	//How much the finger is curled. 
	//0 = fully opened; 
	//1 = fully closed;
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger", meta = (UIMin = "0", UIMax = "1", ClampMin = "0", ClampMax = "1"))
		float Curl = 0.0f;

	//This is the current minimum curl for a finger when we have capacitive touch.
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger", meta = (UIMin = "0", UIMax = "1", ClampMin = "0", ClampMax = "1"))
		float CurMinCurl = 0.0f;

	//This is the value which "curl" is trying to reach over time
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger", meta = (UIMin = "0", UIMax = "1", ClampMin = "0", ClampMax = "1"))
		float DesiredCurl = 0.0f;

	FORCEINLINE FPlayerFinger();
};

FPlayerFinger::FPlayerFinger()
{
	Bone.SetNumZeroed(3, false);

	Bone[0] = FBoneXForm(FVector(0, 0, 0), FRotator(0, -20, 0), FRotator(0, 90, 0));
	Bone[1] = FBoneXForm(FVector(0, 0, 0), FRotator(0, -10, 0), FRotator(0, 110, 0));
	Bone[2] = FBoneXForm(FVector(0, 0, 0), FRotator(0, -10, 0), FRotator(0, 30, 0));

	Bone[0].Curl = Curl;
	Bone[1].Curl = Curl;
	Bone[2].Curl = Curl;
}

USTRUCT(BlueprintType)
struct FPlayerHand
{
	GENERATED_USTRUCT_BODY()

	//Whether or not this hand was tracked
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Hand")
		bool Tracked = false;

	//This is the blend weight between the prebaked animation pose and the hardware-driven pose.
	//0 = full anim weight
	//1 = full hardware weight
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Hand", meta = (UIMin = "0", UIMax = "1", ClampMin = "0", ClampMax = "1"))
		float AnimWeight = 0.0f;

	//This is where we want to move our anim weight to over time
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Hand", meta = (UIMin = "0", UIMax = "1", ClampMin = "0", ClampMax = "1"))
		float DesiredAnimWeight = 0.0f;

	//A normalized value which indicates the blend weight of a hand between fully open hand and fully closed hand.
	//This is used for non-leap motion, per-finger bone rotation values.
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Hand")
		float Curl = 0.2f;

	//This is the elbow position
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Socket")
		FTransform Elbow;

	//This is the pivot point for the whole hand
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Socket")
		FTransform WristBone;

	//the center position of the palm
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Socket")
		FTransform Palm;

	//3 bones for the thumb
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger")
		FPlayerFinger Thumb;

	//3 bones for the pointer finger
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger")
		FPlayerFinger Pointer;

	//3 bones for the middle finger
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger")
		FPlayerFinger Middle;

	//3 bones for the ring finger
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger")
		FPlayerFinger Ring;

	//3 bones for the pinky finger
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Finger")
		FPlayerFinger Pinky;

	FORCEINLINE FPlayerHand();
};

FPlayerHand::FPlayerHand()
{
	Palm = FTransform(FRotator(5, 0, 0), FVector(4.5, 0, 0));

	Thumb.Bone[0] = FBoneXForm(FVector(3.1f, 0, 0), FRotator(0, -20, 30), FRotator(0, 20, 0));
	Thumb.Bone[1] = FBoneXForm(FVector(3.2f, 0, 0), FRotator(0, 0, 0), FRotator(0, 15, 0));
	Thumb.Bone[2] = FBoneXForm(FVector(3.2f, 0, 0), FRotator(0, 0, 0), FRotator(0, 60, 0));

	Pointer.Bone[0] = FBoneXForm(FVector(9.61f, 0, 0), FRotator(0, -20, 0), FRotator(0, 90, 0));
	Pointer.Bone[1] = FBoneXForm(FVector(3.25f, 0, 0), FRotator(0, -10, 0), FRotator(0, 110, 0));
	Pointer.Bone[2] = FBoneXForm(FVector(2.28f, 0, 0), FRotator(0, -10, 0), FRotator(0, 30, 0));

	Middle.Bone[0] = FBoneXForm(FVector(9.28f, 0, 0), FRotator(0, -20, 0), FRotator(0, 90, 0));
	Middle.Bone[1] = FBoneXForm(FVector(3.8f, 0, 0), FRotator(0, -10, 0), FRotator(0, 110, 0));
	Middle.Bone[2] = FBoneXForm(FVector(2.8f, 0, 0), FRotator(0, -10, 0), FRotator(0, 30, 0));

	Ring.Bone[0] = FBoneXForm(FVector(8.5f, 0, 0), FRotator(0, -20, 0), FRotator(0, 90, 0));
	Ring.Bone[1] = FBoneXForm(FVector(3.18f, 0, 0), FRotator(0, -10, 0), FRotator(0, 110, 0));
	Ring.Bone[2] = FBoneXForm(FVector(2.57f, 0, 0), FRotator(0, -10, 0), FRotator(0, 30, 0));

	Pinky.Bone[0] = FBoneXForm(FVector(7.5f, 0, 0), FRotator(0, -20, 0), FRotator(0, 90, 0));
	Pinky.Bone[1] = FBoneXForm(FVector(3.06f, 0, 0), FRotator(0, -10, 0), FRotator(0, 110, 0));
	Pinky.Bone[2] = FBoneXForm(FVector(1.68f, 0, 0), FRotator(0, -10, 0), FRotator(0, 30, 0));
}
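To give a sense of how these structs plug in (the class and property names here are illustrative, not my exact code), a controlled character can simply expose a hand for each side, and the animation blueprint reads the per-bone curl values out of them each frame:

//Sketch: exposing hand state on a character so the animation blueprint can read it.
UCLASS()
class AVRCreature : public ACharacter
{
	GENERATED_BODY()

public:
	//Updated by the input/tick logic; read by the animation blueprint.
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "VR Hands")
		FPlayerHand LeftHand;

	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "VR Hands")
		FPlayerHand RightHand;
};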


I would tentatively say this isn’t something anyone can just copy/paste and have running in a few hours. Expect a few days or weeks of planning, design, implementation, and testing. I hope what I’ve shared accelerates your dev cycles or gives people a solid head start on a direction.

Why not create the enumerator or struct in UE4 itself?

Also, what does your first comment mean: “I decided to decouple my VR hardware from characters”?
Just wondering.

It is in UE4… it’s just defined in C++ code instead of in a blueprint. I discovered long ago that the best practice is to create enums in C++, because modifying a C++ enum later doesn’t break things as much.
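For reference, a blueprint-usable enum declared in C++ looks like this (the enum values are just an example):

//Example: a C++ enum exposed to blueprints via UENUM.
UENUM(BlueprintType)
enum class EHandPose : uint8
{
	Open,
	Pointing,
	ThumbsUp,
	Fist
};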

Normally, if you are creating a VR-controlled character, you would create motion controller components within the character blueprint and have logic which reads the motion controller transforms and places the hands at those locations. The HMD transform logic would also be done within the character blueprint. This causes “tight coupling”: if you add a second category of character which can be controlled with VR inputs, you have to reproduce the existing logic for the second character type. Now, say you modify your VR logic to add some new capability (such as ducking down): you have to go through all of your VR characters and update their logic, one by one. Say you also want to add support for a new VR hardware device: you initially supported the HTC Vive, but now you want to support the Oculus Rift as well. Again, you have to go through every VR-controlled character and update the control logic. It gets tedious and introduces chances for human error, which scale with the number of VR-controlled characters, and it also increases the amount of testing you need to do.

So, when I “decouple” my VR hardware interfaces from my characters, it means that none of my characters know anything about VR or motion controller components. I have a single, common VR interface which can be used to control any character that implements it. Think of it like puppets and puppeteers: each character is a puppet that can be controlled by a puppeteer. The puppet doesn’t care about how it is being controlled; it just cares about satisfying the incoming commands as best it can. The puppeteer doesn’t care about the implementation details of how a command is satisfied; it just gives commands and expects the puppet to handle them. If the puppeteer changes the way they control a puppet, the puppet should have no idea, nor care: it just has strings attached to its arms, and it moves when a string is pulled. How that string is pulled is irrelevant to the puppet.

I accomplish this with a “VRHead” pawn class I created. It contains all of the VR components and interfaces, and it translates all of those hardware inputs into commands which can be sent to a character, such as “move your hand to this position”. The VRHead “possesses” a character, similar to how a ghost would possess a living being and cause it to move according to the ghost’s will. A character can also be depossessed, which means it is no longer being directly controlled by a player controller. Someone might think, “Hey, wait a minute, why aren’t you just doing this in the player controller class?” The answer is that player controllers cannot be instanced as disembodied actors within the game world, and I need the capability to be a disembodied observer.
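In UE4 terms, the command channel between the VRHead and its puppet can be an interface class. A minimal sketch (the interface and method names are my own illustration, not the exact ones from my project):

//Sketch: a command interface the VRHead uses to drive any possessed character.
UINTERFACE(BlueprintType)
class UVRPuppet : public UInterface
{
	GENERATED_BODY()
};

class IVRPuppet
{
	GENERATED_BODY()

public:
	//Move the given hand toward a world-space transform supplied by a motion controller.
	UFUNCTION(BlueprintNativeEvent, Category = "VR Puppet")
	void MoveHand(bool bLeftHand, const FTransform& TargetTransform);

	//React to a grab event; the character decides what "grab" means for it.
	UFUNCTION(BlueprintNativeEvent, Category = "VR Puppet")
	void OnGrab(bool bLeftHand);
};

The VRHead only ever talks to the interface, so any character that implements it is controllable, regardless of which hardware generated the command.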

The end effect is that I can control any character within my game by possessing it with my VRHead. I can possess and depossess any character at will (which is great for debugging). All of the VR hardware implementation lives within the VRHead, so if new VR hardware comes out, I just have to modify the VRHead class to support its capabilities, and every controllable creature automatically becomes compatible. I only have to change my VR hardware implementation in one class, QA and testing are much easier, and duplication of logic and the human error factor are drastically reduced.

I hope I have convinced some people of the value of decoupling VR hardware from characters. I believe it is the best practice and everyone should do it. The current VR template in UE4 does not do this and should eventually be changed.

Thank you <3 <3 <3

Keeping it separate is definitely a good practice - it is the same reason the standard PlayerController is separate from the Pawn and can possess different types of pawns.