AddTorqueInDegrees() in Tick function

Hi all!

I am making a helicopter game and I am getting inconsistent results depending on the game's framerate.

Here is my input for the helicopter:

void AHelicopter::Yaw(const FInputActionValue& Value)
{
	// Heli cannot do anything if the engine is not running
	if (EngineState != EEngineState::Running) return;

	float YawInput = Value.Get<float>() * (bIsFreelooking ? 0.0f : 1.0f);
	TargetRotation.Yaw = YawInput * RotationSensitivity;
}

void AHelicopter::Pitch(const FInputActionValue& Value)
{
	// Heli cannot do anything if the engine is not running
	if (EngineState != EEngineState::Running) return;

	float PitchInput = Value.Get<float>() * (bIsFreelooking ? 0.0f : 1.0f);
	TargetRotation.Pitch = PitchInput * RotationSensitivity;
}

void AHelicopter::Roll(const FInputActionValue& Value)
{
	// Heli cannot do anything if the engine is not running
	if (EngineState != EEngineState::Running) return;

	float RollInput = Value.Get<float>() * (bIsFreelooking ? 0.0f : 1.0f);
	TargetRotation.Roll = RollInput * RotationSensitivity;
}
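
For context, these handlers are bound through Enhanced Input. A minimal sketch of the binding, assuming YawAction, PitchAction, and RollAction are UInputAction properties on the helicopter (the names are placeholders):

void AHelicopter::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
	Super::SetupPlayerInputComponent(PlayerInputComponent);

	if (UEnhancedInputComponent* EIC = Cast<UEnhancedInputComponent>(PlayerInputComponent))
	{
		// YawAction / PitchAction / RollAction are assumed UInputAction* UPROPERTYs
		EIC->BindAction(YawAction, ETriggerEvent::Triggered, this, &AHelicopter::Yaw);
		EIC->BindAction(PitchAction, ETriggerEvent::Triggered, this, &AHelicopter::Pitch);
		EIC->BindAction(RollAction, ETriggerEvent::Triggered, this, &AHelicopter::Roll);
	}
}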

Here is how that input gets used in the Tick() function:

// Smoothly interpolate the current rotation towards the target rotation
CurrentRotation = UKismetMathLibrary::RInterpTo(CurrentRotation, TargetRotation, DeltaTime, RotationSpeed);

// Build a local-space torque from the interpolated rotation, convert it to
// world space, and apply it (bAccelChange = true, so the body's mass is ignored)
const FVector LocalTorque = FVector(-CurrentRotation.Roll, -CurrentRotation.Pitch, CurrentRotation.Yaw);
const FVector WorldTorque = UKismetMathLibrary::TransformDirection(GetActorTransform(), LocalTorque);
HelicopterMesh->AddTorqueInDegrees(WorldTorque, NAME_None, /*bAccelChange=*/ true);

I have noticed that if the game runs at 200+ fps, the observed rotation is slow; by contrast, at 30 fps the rotation is noticeably faster.

My intuition says that because I am applying this in Tick(), I should multiply the torque by DeltaTime, but that would slow the rotation down even more as the framerate goes up. I am not sure what is happening and would greatly appreciate any help.

I have found the source of the problem.

It turns out that the engine scales mouse input by DeltaTime. So in my input functions, the input value grows as the framerate drops, because DeltaTime gets larger at lower framerates (the same mouse movement is multiplied by roughly 0.033 at 30 fps but only 0.005 at 200 fps). Mouse movement is already a per-frame delta, so this extra scaling makes it framerate-dependent and felt really bad.

I ended up solving the issue by creating an input modifier and adding it to every mapping that uses the mouse in my Input Mapping Context. Here is a sketch of my InputModifier (the class and property names are illustrative; the ModifyRaw override is what matters):
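
// ScaleByInverseDeltaTimeModifier.h
#pragma once

#include "CoreMinimal.h"
#include "InputModifiers.h"
#include "ScaleByInverseDeltaTimeModifier.generated.h"

/**
 * Cancels the engine's DeltaTime scaling of mouse input by dividing the raw
 * value by DeltaTime, then re-scales the result with an exposed Scale factor.
 */
UCLASS(meta = (DisplayName = "Scale By Inverse Delta Time"))
class UScaleByInverseDeltaTimeModifier : public UInputModifier
{
	GENERATED_BODY()

public:
	// Exposed so it can be tweaked per mapping in the Input Mapping Context
	UPROPERTY(EditInstanceOnly, BlueprintReadWrite, Category = Settings)
	float Scale = 0.01f;

protected:
	virtual FInputActionValue ModifyRaw_Implementation(const UEnhancedPlayerInput* PlayerInput,
		FInputActionValue CurrentValue, float DeltaTime) override
	{
		// Avoid dividing by zero on the very first frame
		if (DeltaTime <= SMALL_NUMBER)
		{
			return CurrentValue;
		}

		// Undo the DeltaTime multiplication, then bring the value back into a
		// sensible range with Scale
		return FInputActionValue(CurrentValue.GetValueType(),
			CurrentValue.Get<FVector>() / DeltaTime * Scale);
	}
};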

I divide the incoming input by DeltaTime and then scale it back down with the Scale variable, which is exposed so I can tweak it per mapping in the Input Mapping Context (I used 0.01 for Scale).

I know this is an imperfect solution, but I don't know how to get the mouse's raw delta every frame from the input action itself.
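
For reference, the raw per-frame mouse delta is available outside Enhanced Input from the player controller. A sketch, assuming it runs in the pawn's code (this bypasses the input action pipeline entirely, so no modifiers apply):

// Reads the raw mouse delta for the current frame from the player controller
float MouseDeltaX = 0.0f;
float MouseDeltaY = 0.0f;
if (APlayerController* PC = Cast<APlayerController>(GetController()))
{
	PC->GetInputMouseDelta(MouseDeltaX, MouseDeltaY);
}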

I would love to know if there is a better solution!
