Hello. I have a game with a custom pawn that simulates physics and moves via SetPhysicsLinearVelocity with bAddToCurrent set to true. I've noticed that my pawn's actual in-game velocity doesn't match the velocity I set. In fact, the velocity is framerate-dependent, but only when the framerate drops below 37.5 FPS (i.e. DeltaTime above 1/37.5 s). Above that, the velocity is accurate.
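For context, the movement code boils down to something like the following (a simplified sketch, not my exact code; the class and member names here are just illustrative):

```cpp
// Simplified sketch (UE4 C++): MeshRoot is the simulating UPrimitiveComponent root,
// CurrentAcceleration is an FVector member computed elsewhere from input (cm/s^2).
void AMyPawn::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    // Scale acceleration by DeltaTime so the velocity change per second
    // should be framerate-independent.
    const FVector VelocityDelta = CurrentAcceleration * DeltaTime;

    // Add that per-frame change on top of the body's current velocity.
    MeshRoot->SetPhysicsLinearVelocity(VelocityDelta, /*bAddToCurrent=*/ true);
}
```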
When the framerate is below this value, the actual velocity, which I measure by taking the difference in GetActorLocation() between frames and dividing by DeltaTime, differs from the velocity I set on the pawn in the following way:
ActualVelocity = SetVelocity * (37.5 / FrameRate)
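Concretely, the measurement I'm doing each frame looks roughly like this (again simplified; PreviousLocation is just a cached member and MeshRoot is the simulating root component):

```cpp
// Finite-difference velocity measurement: compare GetActorLocation() between frames
// and divide by DeltaTime, then compare against what the physics body reports.
void AMyPawn::MeasureVelocity(float DeltaTime)
{
    const FVector CurrentLocation = GetActorLocation();

    // PreviousLocation is an FVector member cached at the end of the last frame.
    const FVector MeasuredVelocity = (CurrentLocation - PreviousLocation) / DeltaTime;
    PreviousLocation = CurrentLocation;

    UE_LOG(LogTemp, Log, TEXT("Measured: %s  Reported: %s"),
        *MeasuredVelocity.ToString(),
        *MeshRoot->GetPhysicsLinearVelocity().ToString());
}
```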
I have taken the usual measure to make sure the movement is framerate-independent by multiplying my acceleration by DeltaTime before adding it to the velocity. I used to have physics substepping enabled, but I turned it off and still have the same issue. Does anyone know where this 37.5 number comes from, or is there perhaps a better way to do this?