How do you figure out the orientation of the player torso?

The issue is that a player could be turning their head to look over their left or right shoulder, but if the player wants to walk forward, they should walk in the direction their torso faces, not the direction their head faces. This way, you can be looking over your shoulder as you’re running away from a monster.

I’m wondering if anyone has figured out a way to derive the player’s torso orientation from the motion controller positions and orientations plus the HMD position and orientation?

I’ve got a 70% working solution at the moment, but the missing 30% bothers me a lot because there are a lot of edge cases I don’t account for.

I’ve started looking at the extremes of physical movement to get an idea of the torso direction. For example, your head can’t yaw more than about 90 degrees left or right, so the torso yaw will always be within 90 degrees of the head yaw. I also looked at the hand position extremes of the left and right hands, but couldn’t really make any generalized inferences from those. Elbows cause complexity because you can reach your hand behind your back by bending your elbow. However, at the far extremes, a hand can trace out a hemisphere centered on the shoulder socket. There are some physical limitations though: for example, you can’t touch your elbows together behind your back, your left arm can’t reach across further than your right arm, etc. I don’t know how to make use of these facts to derive torso rotation, however (why can’t VR just come with a belt or something?).

Here’s my current code implementation.



void AWizard::GetPlayerTorso(FVector& WorldLocation, FVector& DirVec, FRotator& Rotation)
{

	FVector BHForward = GetActorRotation().Vector();	//fallback: the actor's forward vector, used if we can't get a hand midpoint
	BHForward.Z = 0;
	FVector Gaze = GazeYaw();	//the HMD gaze direction, flattened to yaw

	//if we're not tracking both hands, we can't get a hand midpoint!
	if (LeftMC->IsTracked() && RightMC->IsTracked())
	{
		FVector LMC = GetMCPos(true);
		FVector RMC = GetMCPos(false);
		//We draw a line from one hand to the other and then divide it in half to get the midpoint on that line.
		FVector HandMidPos = (LMC + RMC) / 2.0f;

		BHForward = GetActorLocation() - HandMidPos;
		BHForward.Z = 0;
		BHForward.Normalize();

		//if this vector points against the gaze (the hands are in front of the actor center point), flip it so it faces forward
		if (FVector::DotProduct(BHForward, Gaze) < 0)
		{
			BHForward *= -1;
		}
	}

	//get the forward facing vector for each motion controller. If a controller points backward
	//relative to the gaze (e.g. it's flipped upside down), we have to flip the forward vector.
	FVector LMCForward = FVector::ZeroVector;	//zero-initialize so an untracked hand contributes nothing
	if (LeftMC->IsTracked())		//if we lost tracking, we ignore this hand until it comes back
	{
		LMCForward = LeftMC->GetForwardVector();
		if (FVector::DotProduct(LMCForward, Gaze) < 0) LMCForward *= -1;
		LMCForward.Z = 0;		//remove pitch
		LMCForward.Normalize();
	}
	

	FVector RMCForward = FVector::ZeroVector;	//zero-initialize so an untracked hand contributes nothing
	if (RightMC->IsTracked())
	{
		RMCForward = RightMC->GetForwardVector();
		if (FVector::DotProduct(RMCForward, Gaze) < 0) RMCForward *= -1;
		RMCForward.Z = 0;		//remove pitch
		RMCForward.Normalize();
	}
	

	//weighted blend of all the signals; LastTorso gets the heaviest weight to smooth out frame-to-frame jitter
	//FVector TorsoVec = RMCForward + LMCForward + BHForward + Gaze + LastTorso;
	//FVector TorsoVec = BHForward * 3 + Gaze;
	FVector TorsoVec = BHForward * 0.1f + Gaze + RMCForward * 0.1f + LMCForward * 0.1f + LastTorso * 5.f;
	TorsoVec.Z = 0;			//remove any pitch
	TorsoVec.Normalize();

	//run a sanity check against the gaze yaw vector: We know it can't be more than 90 degrees due to human physical constraints
	if (FVector::DotProduct(Gaze, TorsoVec) < 0)
	{
		//houston, we got a problem... either the player became an owl or our math is wrong
		//let's just take the vector average between the gaze vector and the last good torso vector
		TorsoVec = Gaze + LastTorso;
		TorsoVec.Z = 0;
		TorsoVec.Normalize();
	}

	LastTorso = TorsoVec;
	WorldLocation = GetActorLocation();	//fill the out-param; assume the torso sits at the actor's location
	DirVec = TorsoVec;
	Rotation = FRotator(0, UKismetMathLibrary::MakeRotFromX(TorsoVec).Yaw, 0);
}

You will need at least one more sensor, on the chest or belt, to track the torso separately. You can test this by fastening one of the Vive controllers to your belt and using it to get the forward vector.

Yeah, that’s not gonna happen. You have to develop for the lowest common denominator in terms of hardware configurations. Nobody would buy my game if they needed to buy a third controller or belt.

I’ve improved my solution above to get to about 80%, which is probably good enough for now. Still not perfect, but it’s workable. I’m certain that it’s possible to derive the torso orientation of the player by examining the controller and HMD positions over time and, using what we know about human physical constraints, coming up with a decent approximation.

If we could get some crazy dude who knows how to use deep learning, and just use mocap of normal human movement (not crazy yoga people), we could potentially have a good solver that can work out the upper body position with just the controller and HMD tracking info.

It’s likely that you’ll have to massage the torso based on the head tracking unless you want to allow your player to go all Exorcist on you and have their head on backwards relative to their body. If your player turns around, there should be a threshold where, once the head turns far enough, the body follows. That arc should be just about where a real person can turn their head sideways, just in front of their shoulder.
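
Something like this might be a starting point for that rule (a rough sketch only; the yaw values are plain angles in degrees, and the 80 degree neck limit is just a guessed threshold):

//Sketch: if the head yaw drifts past a threshold relative to the body yaw,
//drag the body yaw around so the head stays inside the allowed arc.
float FollowHeadPastThreshold(float HeadYaw, float BodyYaw, float MaxNeckYaw = 80.f)
{
	float Delta = FRotator::NormalizeAxis(HeadYaw - BodyYaw);	//signed head offset from the body, -180..180

	if (FMath::Abs(Delta) > MaxNeckYaw)
	{
		//the head has turned further than a neck allows, so the body follows the difference
		BodyYaw += Delta - FMath::Sign(Delta) * MaxNeckYaw;
	}
	return FRotator::NormalizeAxis(BodyYaw);
}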

Also, it entirely depends on how you’re controlling the movement of the body. If you’re just using Xbox controllers for now, then it’s basically just FPS controls: your Player Character’s rotation and position are based on the controller movement, but the “camera” head rotation is based on the HMD rotation within the body.

(Meaning if you rotate using the Xbox controller, you’ll rotate the camera as well.)
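
As a rough sketch of that setup (hypothetical pawn class, axis binding, and TurnRateDegPerSec member), the stick yaws the whole pawn while the HMD-driven camera rides along inside it:

//Sketch: gamepad yaw rotates the pawn like a regular FPS; the camera component is
//driven by the HMD, so turning with the stick also swings the view.
void AMyVRPawn::TurnRight(float AxisValue)
{
	float YawDelta = AxisValue * TurnRateDegPerSec * GetWorld()->GetDeltaSeconds();
	AddActorLocalRotation(FRotator(0.f, YawDelta, 0.f));
}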

Do be careful with this as motion often makes a player nauseous. You’re going to have to slow movement way down compared to conventional FPSes.

I’m basing my algorithm for deriving the torso orientation on normative human physical constraints (assuming my own physical constraints are normative).

I found my physical limits on how far I can turn my head to the left and right without turning my torso. If my torso is oriented at 0 degrees, then my head can yaw between 90 and -90 degrees, relative to my torso orientation. This is a good helper for approximating torso rotation.
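
As a sketch of how that constraint can be applied (yaw-only, with a hypothetical helper; the 90 degree limit is the measurement above):

//Sketch: keep an estimated torso yaw within MaxNeckYaw degrees of the HMD yaw,
//since the head physically can't turn further than that relative to the torso.
float ClampTorsoYawToHead(float TorsoYaw, float HeadYaw, float MaxNeckYaw = 90.f)
{
	float Delta = FRotator::NormalizeAxis(TorsoYaw - HeadYaw);	//signed torso offset from the head
	Delta = FMath::Clamp(Delta, -MaxNeckYaw, MaxNeckYaw);
	return FRotator::NormalizeAxis(HeadYaw + Delta);
}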

The second thing I look at is the position of the player’s arms relative to the HMD position. If you draw a line connecting one hand to the other, find the midpoint of that line, and then draw a line from the body position to that midpoint (flattening the Z value), you get an additional direction vector. If the player moves both of their hands behind their back, we can detect this by running a dot product against the head orientation vector: if the dot product is negative, we know the hands are behind the back and we can flip the vector’s direction.
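
Distilled into a standalone helper (hypothetical names; this mirrors the HandMidPos block in the code above, just with the vector pointing from the body toward the hands):

//Sketch: direction from the body out toward the hand midpoint, flipped to agree with the gaze.
FVector HandMidpointDirection(const FVector& LeftHand, const FVector& RightHand,
	const FVector& BodyLocation, const FVector& FlatGaze)
{
	FVector Mid = (LeftHand + RightHand) * 0.5f;	//midpoint of the line connecting the hands
	FVector Dir = Mid - BodyLocation;
	Dir.Z = 0.f;		//flatten to yaw only
	Dir.Normalize();
	//a negative dot product means the hands are behind the back, so flip to face forward
	if (FVector::DotProduct(Dir, FlatGaze) < 0.f)
	{
		Dir = -Dir;
	}
	return Dir;
}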

The third thing I look at is the orientation of the wrists themselves. Generally, we know the forward direction of the motion controllers is suggestive of the forward direction of the torso, particularly if we only look at the yaw and invert it when the motion controller is being held upside down.

So, generally, I combine all of these various data points to get an 80% approximation to the torso rotation. You can move your arms anywhere and turn your head anywhere, and generally, the derived torso will be approximately within 20 degrees of your own. It could be improved though, and that’s why I’m asking around to see if anyone else has figured it out. Here is some vague hand waving on how I think it could be improved:

**Not implemented bonus:** This part can probably be improved a lot and I don’t do this yet. Generally, if you hold a hand controller straight forward and then pitch your wrist up and down, there are physical constraints on how far you can pitch your wrist. We also know the hand position relative to the HMD, so we know whether the player’s hand is fully extended or not. If the hand is fully extended, then the elbow joint has no rotation, and any pitch we see is influenced only by the wrist rotation and shoulder rotation. Somehow, there could be some fancy math done to get a pretty good approximation of the shoulder socket position, which is directly connected to the torso.
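
A very rough way this might be sketched (hypothetical helper; MaxReachFromHmd and ArmLength would have to come from some calibration step, and the tolerance is a guess):

//Sketch: if the hand is near its maximum reach from the HMD, treat the arm as fully
//extended and back-project along the controller's forward vector to guess the shoulder socket.
bool TryEstimateShoulder(const FVector& HandPos, const FVector& HandForward, const FVector& HmdPos,
	float ArmLength, float MaxReachFromHmd, FVector& OutShoulderPos)
{
	const float Tolerance = 10.f;	//assumed slack (in cm) before we call the arm "fully extended"
	float HandToHead = FVector::Dist(HandPos, HmdPos);

	if (FMath::Abs(HandToHead - MaxReachFromHmd) > Tolerance)
	{
		return false;	//the elbow is probably bent, so this guess would be unreliable
	}
	//with a straight arm, the shoulder sits ArmLength behind the hand along the controller's forward vector
	OutShoulderPos = HandPos - HandForward.GetSafeNormal() * ArmLength;
	return true;
}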

Also not done: we know the head doesn’t rotate more than 90 degrees from the torso rotation, but the arms also have rotational constraints. I suppose if we know the length of a player’s arm and use what we know about human anatomy and proportions, we can approximate the elbow position and derive it from the motion controller position and rotation. If our elbow position is accurate, we can get a pretty good idea of the shoulder position based on the measured arm length and head position. And if we can get good positions for the left and right shoulders, we can run a vector from the neck position to each shoulder, take a cross product against the up vector, and choose between the resulting vectors based on the HMD gaze vector.
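
If shoulder positions can be estimated that way, the last step might look roughly like this (hypothetical helper; the shoulder positions and flattened gaze are assumed inputs, and it uses the shoulder-to-shoulder line rather than the neck):

//Sketch: torso forward from the shoulder line, using a cross product against the up vector
//and the gaze to pick between the two possible facing directions.
FVector TorsoForwardFromShoulders(const FVector& LeftShoulder, const FVector& RightShoulder, const FVector& FlatGaze)
{
	FVector AcrossShoulders = (RightShoulder - LeftShoulder).GetSafeNormal();
	FVector Forward = FVector::CrossProduct(AcrossShoulders, FVector::UpVector);	//horizontal normal of the shoulder line
	Forward.Z = 0.f;
	Forward.Normalize();
	//of the two possible normals, keep the one that roughly agrees with where the head is looking
	if (FVector::DotProduct(Forward, FlatGaze) < 0.f)
	{
		Forward = -Forward;
	}
	return Forward;
}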

Have the end user “calibrate” by touching various parts of their body with the motion controllers. That’s probably the easiest way. Another way would be to use average lengths of human anatomy, but that would still require the user to calibrate (T-pose, A-pose, arms down, arms extended up, arms extended forward). This would give you at least the positions of the body parts. Then it’s just a matter of using IK and FK to get orientation.
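
A minimal version of that calibration pass might look like this (hypothetical struct and fields; the 40% arm fraction is just an assumed proportion):

//Sketch: record the measurements needed later, taken while the player holds a T-pose.
struct FBodyCalibration
{
	float ArmSpan = 0.f;	//controller-to-controller distance in the T-pose
	float ArmLength = 0.f;	//rough per-arm length derived from the span
	float EyeHeight = 0.f;	//standing HMD height, useful for neck/shoulder guesses
};

FBodyCalibration CalibrateFromTPose(const FVector& LeftHandPos, const FVector& RightHandPos, const FVector& HmdPos)
{
	FBodyCalibration Calib;
	Calib.ArmSpan = FVector::Dist(LeftHandPos, RightHandPos);
	Calib.ArmLength = Calib.ArmSpan * 0.4f;	//assumption: roughly 40% of the span per arm, the rest is shoulder width
	Calib.EyeHeight = HmdPos.Z;
	return Calib;
}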


The benefit of this is that you don’t have to be perfect; as long as you’re in the ballpark, people will feel it. It will look right to others, and when you’re in the HMD, you’re not seeing your own elbow… Although if you pay attention, you could move your own elbow without moving your controller position and notice that the elbow doesn’t move… but really, no one pays that much attention to their elbows.

[QUOTE]
If we could get some crazy dude who knows how to use deep learning, and just use mocap of normal human movement (not crazy yoga people), we could potentially have a good solver that can work out the upper body position with just the controller and HMD tracking info.
[/QUOTE]

Funny… that’s nearly what the research I’ve been doing for the last couple of months is about. I’m presenting it tomorrow, and will probably be refining it for weeks to come. I’m using the HMD + controllers plus some extra state and prior frames’ data fed into an ANN, training it to predict foot movement for full body tracking with only 3 tracked sensors. Will update, but the future looks promising. (what else is new?)

Link to your research

Can I get a link to your research paper? I’m trying to work in the same area.

Very interesting stuff going on here. Anyone got some videos to share of how it’s looking so far?

I’ve got it working in my VR game (Spellbound). There’s about a 15 degree margin of error, so turning your head will cause your torso to rotate slightly. I still need to do more refinements on the weighting of data to reduce the margin of error. Ideally, people should be able to look over their shoulder as they’re running forward.

I did some experiments in the car park this morning.
Judging by my footprints in the snow, I had a real hard time maintaining a straight forward line while looking backwards over my shoulders. Hope no one saw me is all…

There is a good reason why controller facing is generally used when head-facing movement isn’t wanted. If you lock the body to the head within some degree threshold, you lose full rotational freedom of movement, and if you don’t, then you can’t move and look somewhere else at the same time.

Yes, but you might not have to worry about maintaining a very exact, unchanging forward vector… There’s a natural tendency to wobble when changing head directions.

The challenge is in best responding to the input of the end user and their intent. I want to be able to run forward in the direction of my torso while looking over my shoulder as something is chasing me, and I still need to refine the precision / technique of my approach. A 15 degree margin of error may not be good enough for what I’m trying to do because it also means that I would change my direction of travel by 15 degrees, which could mean that I’d run into something if I’m running down a straight hallway.