Hello,
I am trying to write my own simple gravity that is frame rate independent, so I decided to multiply my Force by Delta Seconds to compensate.
However, on closer inspection, there is an issue with using the AddForce function like this:
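(The Blueprint graph itself is in the attachment; as a Python stand-in, the tick logic is roughly this, with `GRAVITY` standing for my Gravity float and `add_force` for the AddForce call:)

```python
GRAVITY = 1000.0  # my Gravity float (cm/s^2 scale)

def on_tick(delta_seconds, add_force):
    """What my Event Tick graph does: scale Gravity by Delta Seconds
    and feed the result to AddForce as a downward force."""
    add_force(GRAVITY * delta_seconds)
```

So at 60 FPS each tick passes roughly 1000 / 60 ≈ 16.7 to AddForce.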
To test whether this is really frame rate independent, I started the game with the object at a high point and measured the time it takes to reach the floor at different frame rates, using the console commands:
t.maxfps 60, t.maxfps 45, t.maxfps 30, t.maxfps 15
I timed it with my phone's stopwatch while watching the PC screen, and the results are inconsistent:
60 FPS - 15.5 seconds
45 FPS - 13.7 seconds
30 FPS - 11.4 seconds
15 FPS - 15.5 seconds
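These timings are consistent with a toy simulation, if I assume (this is only my guess at the engine internals) that AddForce already integrates the force over the frame's delta time, so my extra Delta Seconds multiply gets applied twice. A Python sketch, with the drop height (about 20 m) being an assumption chosen to match the 60 FPS case:

```python
def fall_time(fps, gravity=1000.0, height=2000.0):
    """Time (s) to fall `height` cm when the per-tick velocity change is
    (gravity * dt) * dt, i.e. delta time applied twice: once by my graph,
    once (assumed) inside AddForce's own integration."""
    dt = 1.0 / fps
    v = z = t = 0.0
    while z < height:
        v += (gravity * dt) * dt  # effective acceleration is gravity * dt
        z += v * dt
        t += dt
    return t
```

This gives roughly 15.5 s at 60 FPS, 13.4 s at 45 FPS and 11.0 s at 30 FPS, close to my stopwatch numbers, but it does not reproduce the 15 FPS result, which is part of my confusion.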
Now, of course, timing by hand on a phone introduces some error, but that error should be under a second, and the differences here are clearly larger. So next, I calculated the actual acceleration of the object in the Blueprint editor as well, using this graph attached to Event Tick:
Then I measured the acceleration like this:
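(In words, since the graph is an attachment: each tick I take the current velocity, subtract the previous tick's velocity, and divide by Delta Seconds. A minimal Python stand-in of that measurement, with the class name being my own:)

```python
class AccelMeter:
    """Per-tick acceleration estimate: (v_now - v_prev) / delta_seconds."""

    def __init__(self, initial_v=0.0):
        self.prev_v = initial_v

    def tick(self, v_now, delta_seconds):
        accel = (v_now - self.prev_v) / delta_seconds
        self.prev_v = v_now
        return accel
```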
Finally, measuring the acceleration during this free fall, I get these numbers:
60 FPS - 15.5 seconds, Acceleration = 16.7 cm/s²
50 FPS - 14.5 seconds, Acceleration = 20.0 cm/s²
45 FPS - 13.7 seconds, Acceleration = 22.2 cm/s²
30 FPS - 11.4 seconds, Acceleration = 33.3 cm/s²
15 FPS - 15.5 seconds, Acceleration = 16.7 cm/s²
10 FPS - 19.1 seconds, Acceleration = 11.1 cm/s²
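In the 30 to 60 FPS range, the measured acceleration is simply my Gravity float multiplied by Delta Seconds (equivalently, divided by the frame rate); the 15 and 10 FPS rows break this pattern, which is the part I don't understand. The numeric check:

```python
GRAVITY = 1000.0  # my Gravity float

# (fps, measured acceleration in cm/s^2) for the rows that fit the pattern
for fps, measured in [(60, 16.7), (50, 20.0), (45, 22.2), (30, 33.3)]:
    predicted = GRAVITY / fps  # == GRAVITY * delta_seconds
    print(f"{fps} FPS: Gravity/fps = {predicted:.1f} cm/s^2, measured {measured}")
    assert abs(predicted - measured) < 0.1
```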
Down to 30 FPS, the acceleration increases with the frame time, and it matches the frame time in milliseconds eerily closely (my Gravity float is 1000.0). But when I go down to 15 FPS, it jumps back to the same number as at 60 FPS.
What is going on? Am I missing something very simple? How can we fix this?