Are Timers frame dependent? Or no?

This is more than a bit surprising. Also, I cannot reproduce it.

As far as I can tell, a looping Timer will always trigger more often than a Delay-loop of the same interval.
This is because looping Timers look back in time, can trigger multiple times per frame if necessary, and apply the overtime to the next interval. For example, with a looping Timer of 0.1 s, if a frame takes 0.4 s the Timer will trigger 4 times at once.
Delays, on the other hand, are like Timers with no looping, so they trigger once the interval has passed, and the extra frame time is lost, as jwatte said in the first post. So little by little the Delay-loop will drift away from the looping Timer.
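To make the difference concrete, here's a minimal standalone C++ sketch that simulates both behaviors over a few hundred frames. This is not UE's actual FTimerManager code; the names and the fixed 33 ms frame time are made up for illustration.

```cpp
#include <cstdio>

int main() {
    const double Interval = 0.1;   // 100 ms, as in the examples below
    const double FrameDt  = 0.033; // pretend every frame takes ~33 ms

    double TimerAcc = 0.0; // looping Timer: keeps the remainder on trigger
    double DelayAcc = 0.0; // Delay-loop: resets to zero on trigger

    int TimerFires = 0, DelayFires = 0;
    for (int Frame = 0; Frame < 300; ++Frame) { // ~9.9 s of simulated time
        TimerAcc += FrameDt;
        while (TimerAcc >= Interval) { // can fire multiple times per frame
            TimerAcc -= Interval;      // overtime carries into the next interval
            ++TimerFires;
        }

        DelayAcc += FrameDt;
        if (DelayAcc >= Interval) {
            DelayAcc = 0.0;            // leftover frame time is lost here
            ++DelayFires;
        }
    }
    // Prints roughly "Timer fires: 99, Delay fires: 75": the Delay-loop's
    // effective period stretched from 100 ms to 4 whole frames (132 ms).
    std::printf("Timer fires: %d, Delay fires: %d\n", TimerFires, DelayFires);
    return 0;
}
```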

Example Timer:

[image: log of looping Timer trigger timestamps]
As you can see, the first couple of Timers trigger after 100 ms plus a bit of jitter, as expected; however, from Timer 2 to Timer 3 only 0.093 s have passed. That's because a looping Timer looks back in time to catch up on lost time whenever it has the chance, trying to stay true to the original setup time (0.722 in this case, so it will never trigger at a time ending in .x21, for example).
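A hedged sketch of that anchoring behavior (hypothetical names, not engine internals): trigger times are scheduled at `SetupTime + K * Interval`, so a fire that comes in late is followed by a shorter-than-interval gap, exactly like the 0.093 s gap above.

```cpp
#include <cstdio>

int main() {
    const double SetupTime = 0.722; // setup time from the log above
    const double Interval  = 0.1;
    const double Eps       = 1e-9;  // tolerance for float rounding

    // One slow frame (0.107 s) between two normal ones.
    const double FrameDts[] = { 0.100, 0.107, 0.093 };

    double Now = SetupTime;
    int    K   = 1; // index of the next scheduled trigger
    for (double Dt : FrameDts) {
        Now += Dt;
        // Fire every trigger whose scheduled time has already passed.
        while (SetupTime + K * Interval <= Now + Eps) {
            std::printf("fire #%d at %.3f (scheduled %.3f)\n",
                        K, Now, SetupTime + K * Interval);
            ++K;
        }
    }
    // Output: fires at 0.822, 0.929, 1.022 -- the late 0.929 fire is
    // followed by a 0.093 s gap because 1.022 stays on schedule.
    return 0;
}
```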

Example Delay:

[image: log of Delay-loop trigger timestamps]
With Delay, as you can see, it just keeps slowly drifting because it lacks the catch-up handling that looping Timers have: it triggers once the time has passed, then it's considered done, and another delay is created for the next iteration. The extra frame time at the moment of trigger is lost.
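Assuming a constant frame time for simplicity (an idealization, since real frame times jitter), the Delay-loop's effective period is the interval rounded up to a whole number of frames, which is where the drift comes from. A tiny illustrative calculation with assumed values:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double Interval = 0.1;         // 100 ms Delay
    const double FrameDt  = 1.0 / 144.0; // ~6.9 ms at a steady 144 fps

    // The Delay can only complete on a frame boundary, so each cycle takes
    // a whole number of frames and the overshoot is thrown away.
    const double EffectivePeriod = std::ceil(Interval / FrameDt) * FrameDt;
    std::printf("effective period: %.4f s (%.2f ms lost per cycle)\n",
                EffectivePeriod, (EffectivePeriod - Interval) * 1000.0);
    // Prints ~0.1042 s: the Delay-loop falls ~4 ms further behind a true
    // 100 ms looping Timer on every single cycle.
    return 0;
}
```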

Running both gives me the same expected results - I cannot reproduce your issue.

[image: logs from running both setups side by side]