TLDR: timers only keep the correct scaling between 1.0 and 0.05 (or any other rates) at 60 fps. Why is that, and how do I get timers whose results stay constant at any FPS?
I created multiple timers with different numbers of calls per second and expected them to stay proportional to each other at any FPS, since they should all represent "time".
But it seems that different call rates give different results at different FPS.
I expected the 0.05 timer to always fire 20 times as often as the 1.0 timer, but that is only true when the FPS is 60.
I used the t.maxFPS console command to limit FPS in all tests.
Maybe my own understanding is fundamentally wrong (most likely). It seems like you can only use timers if all of them across the game run at the same number of calls per second; otherwise the FPS changes and their results change as well.
Each test ran for 20 seconds, measured with my phone's stopwatch (0.2-0.5 s error from my finger press).
Print string:
1.0 - purple, 1.0 s timer
0.1 - green, 0.1 s timer
0.05 - orange, 0.05 s timer
Is this about limiting the precision of floating-point values(?)
If (and most likely this is the case) the error does not accumulate, then everything is correct.
The problem (or rather peculiarity) is that the timer will not be called exactly after the specified time; it will be called on the nearest frame after at least the specified time has passed.
At 60 frames per second, a frame is about 0.016 s.
So a 0.05 timer will actually go off on frame 4, i.e. after 0.064 seconds. The next one will go off on frame 7 (0.112 seconds from the start).
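To make that quantization concrete, here is a minimal sketch (plain C++, not engine code; the frame time, interval, and test length are assumptions taken from the numbers above) of a retriggering 0.05 timer that fires on the first frame at or after the interval and then restarts. Over a 20-second test it produces noticeably fewer than the expected 400 triggers:

```cpp
#include <cstdio>

int main() {
    const double FrameTime = 0.016;  // ~60 fps, as in the numbers above
    const double Interval  = 0.05;   // the "0.05" timer
    const double TestLen   = 20.0;   // 20-second test, as in the original post

    double Elapsed = 0.0;            // time since the timer was (re)started
    int    Fires   = 0;

    for (double t = 0.0; t < TestLen; t += FrameTime) {
        Elapsed += FrameTime;
        if (Elapsed >= Interval) {   // first frame at or after the interval
            ++Fires;
            Elapsed = 0.0;           // restart; the 0.064 - 0.05 overshoot is discarded
        }
    }
    // Expected 20 / 0.05 = 400 fires; the quantized version fires every 4 frames
    // (every 0.064 s), giving roughly 312.
    std::printf("fires: %d (expected 400)\n", Fires);
    return 0;
}
```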
In the end this is a sampling problem: your main loop doesn't run at a perfect 60 Hz, so any timer gets quantized to whenever the main loop/tick actually runs.
The most accurate way to deal with this is to take the time elapsed since the last tick, inside Tick, and accumulate it into a variable. Then enter a loop that checks whether the variable is greater than 1/60; if so, subtract 1/60 from the variable, run your "one tick" function, and check the loop condition again. If a particular frame takes longer to run, you'll run your "one tick" function more than once in a single game-loop tick.
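A minimal sketch of that accumulator idea, assuming a generic per-frame delta time rather than UE's actual Tick signature (the names and the 30 fps simulation below are illustrative only):

```cpp
#include <cstdio>

struct FixedStepTimer {
    double Accumulator = 0.0;
    double Step;                          // e.g. 0.05 for the "0.05" timer

    explicit FixedStepTimer(double InStep) : Step(InStep) {}

    // Call once per frame with that frame's delta time.
    // Runs OneTick() as many times as whole steps fit into the accumulated time,
    // so a long frame catches up instead of silently dropping triggers.
    template <typename F>
    void Tick(double DeltaSeconds, F&& OneTick) {
        Accumulator += DeltaSeconds;
        while (Accumulator >= Step) {
            Accumulator -= Step;          // keep the remainder: no drift over time
            OneTick();
        }
    }
};

int main() {
    FixedStepTimer Timer(0.05);
    int Fires = 0;

    // Simulate 20 seconds of 30 fps frames; the fire count stays ~400
    // regardless of the frame rate used here.
    for (double t = 0.0; t < 20.0; t += 1.0 / 30.0) {
        Timer.Tick(1.0 / 30.0, [&] { ++Fires; });
    }
    std::printf("fires: %d (20 / 0.05 = 400)\n", Fires);
    return 0;
}
```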
You need one timer, the most frequent one (0.05 in this case), and then count how many times it has triggered (see the sketch below).
Every second trigger means that 0.1 s has passed.
Every 20th trigger means that a second has passed.
That way, the number of short calls per long one is always the same.
But if the main goal was only that the number of triggers stays stable, and you accept that the short interval will never be shorter than a frame, then my solution, which is simpler in my opinion, is also valid.
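A minimal sketch of that counting idea (plain C++; in UE this would just be the callback of a single looping 0.05 s timer, and the counter names are hypothetical):

```cpp
#include <cstdio>

struct TimerCounter {
    int Ticks   = 0;   // 0.05 s base-timer triggers
    int Fires01 = 0;   // derived "0.1 s" events
    int Fires10 = 0;   // derived "1.0 s" events

    void OnBaseTimer() {                  // bound to the single looping 0.05 s timer
        ++Ticks;
        if (Ticks % 2  == 0) ++Fires01;   // every 2nd trigger  -> 0.1 s has passed
        if (Ticks % 20 == 0) ++Fires10;   // every 20th trigger -> 1.0 s has passed
    }
};

int main() {
    TimerCounter Counter;
    for (int i = 0; i < 400; ++i) {       // 400 base triggers == 20 s of game time
        Counter.OnBaseTimer();
    }
    // The 400 : 200 : 20 ratio holds no matter when the base timer actually fires.
    std::printf("base: %d, 0.1: %d, 1.0: %d\n",
                Counter.Ticks, Counter.Fires01, Counter.Fires10);
    return 0;
}
```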
Thank you, I will test that implementation. Right now I've linked the attack speed to the number of calls, which gives me "ok" results, better than before, but I want more control.
Hi, as far as I know timers are framerate independent, but the code you have shown in the first image of the original post essentially doesn't use timers; it uses a delay and recursively calls the event (which very likely won't be framerate independent). If you want to use timers, you need to set the timer once and set it to looping (i.e. one event that sets the timer and one event that gets called every X seconds).
So as far as I know, a timer set to looping every 0.05 seconds may not be called exactly every 0.05 seconds, but it will be called every 0.05 seconds on average, even if that means it is called several times in a row in a single frame (e.g. if one frame ends at 0.04 s and the second at 0.12 s, the second frame taking 0.08 s, then the timer would be called twice after the second frame). Whereas in your current setup it would only be called once after the second frame (you get an error for every frame that is longer than 0.05 s, so your current setup only works if 0.05 s is an exact multiple of the frame time, which it is at 60 fps: 3 × 1/60 s = 0.05 s).
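To illustrate the difference, here is a minimal sketch (assumptions only, not UE source) of the two behaviours on the exact frame times from the example above: a looping timer measured against accumulated game time catches up and fires twice in the long frame, while the delay-style retrigger fires at most once per frame and loses the overshoot:

```cpp
#include <cstdio>

int main() {
    const double Interval = 0.05;
    // Example from the post: frame 1 ends at 0.04 s, frame 2 at 0.12 s.
    const double FrameEnds[] = { 0.04, 0.12 };

    // Looping timer: the next due time always advances by a fixed interval.
    double NextDue = Interval;
    int LoopingFires = 0;
    for (double Now : FrameEnds) {
        while (Now >= NextDue) {        // catches up: fires twice after frame 2
            ++LoopingFires;
            NextDue += Interval;
        }
    }

    // Delay-style retrigger: at most one fire per frame, remainder discarded.
    double Elapsed = 0.0, Prev = 0.0;
    int DelayFires = 0;
    for (double Now : FrameEnds) {
        Elapsed += Now - Prev;
        Prev = Now;
        if (Elapsed >= Interval) { ++DelayFires; Elapsed = 0.0; }
    }

    std::printf("looping: %d fires, delay-style: %d fire(s)\n",
                LoopingFires, DelayFires);   // 2 vs 1 in this example
    return 0;
}
```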
Note that using a delay loop or a non-looping timer is essentially the same thing, as chrudimer pointed out above. Your case looks like a good example of a small overshoot error accumulating with every trigger. A looping timer would fix that right away.
Oh, I already did that in a different post, because of the use of an interface. I was forced to design it that way instead of connecting both the execution and the delegate to the event.
I didn't know it would "normalize" the time and compensate for missed frames.