Are timers not independent of FPS? (screenshots inside)

TL;DR: only at 60 FPS do my timers keep the correct scaling between 1.0 and 0.05 (or any other interval). Why is that? How do I get a timer whose result is always constant?

I created multiple timers with different numbers of calls per second and expected them to keep the same ratio under any FPS, since they should all represent “time”.

But it seems that different call rates give different results under different FPS.
I expected the 0.05 timer to always fire 20 times for every firing of the 1.0 timer, but that is only true at 60 FPS.



I used the t.MaxFPS console command to limit the frame rate in all tests.

Maybe my own understanding is fundamentally wrong (most likely). It seems like you can only use timers if all of them across the game use the same number of calls per second; otherwise the FPS changes and their results change as well.

Each test ran 20 seconds, measured with my phone’s stopwatch (0.2-0.5 s error from my finger press).
Print String colors:
1 - purple, 1 sec timer
0.1 - green, 0.1 sec timer
0.05 - orange, 0.05 sec timer




Limiting the precision of floating point values(?)

If the error does not accumulate (most likely this is the case), then everything is correct.

The problem (peculiarity) is that the timer will not be called exactly after the specified time; it will be called on the nearest frame after at least the specified time has passed.

60 frames per second is roughly 0.016 s per frame.
So a 0.05 timer will actually go off on frame 4, i.e. after 0.064 seconds. The next one goes off on frame 7 (0.112 seconds from the start).
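A tiny stand-alone C++ sketch of that arithmetic (illustration only, assuming a rounded 0.016 s frame and, as above, that the error does not accumulate):

```cpp
// Illustration: quantizing a 0.05 s timer to frame boundaries at ~60 fps.
#include <cmath>
#include <cstdio>

int main()
{
    const double FrameTime = 0.016; // rounded frame time used in the post above
    const double Interval  = 0.05;  // requested timer interval

    // Non-accumulating case: each firing happens on the first frame boundary
    // at or after the k-th multiple of the interval.
    for (int k = 1; k <= 4; ++k)
    {
        const int    Frame = static_cast<int>(std::ceil(k * Interval / FrameTime));
        const double Time  = Frame * FrameTime;
        std::printf("firing %d: frame %d, t = %.3f s\n", k, Frame, Time);
    }
    // Prints firings on frames 4, 7, 10, 13 (0.064 s, 0.112 s, 0.160 s, 0.208 s).
}
```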


In the end this is a sampling problem – your main loop doesn’t run at a perfect 60 Hz, so any timers are quantized to whenever the main loop/tick actually happens to run.

The most accurate way to deal with this is to take the frame’s delta time inside Tick and accumulate it into a variable. Then enter a loop that checks whether the variable is greater than 1/60; if so, subtract 1/60 from the variable, run your “one tick” function, and check the loop condition again. If a particular frame takes longer to run, you’ll end up running your “one tick” function more than once in a single game-loop tick.


Thank you for your detailed answer!
Is it possible to see an example of that? I’m not really sure how to approach building such a blueprint.

So I have to have a higher FPS than the number of calls per second on my timer?

You need one timer, the most frequent one (0.05 in this case). Then count how many times it has triggered.
Every second trigger means that 0.1 s has passed.
Every 20th trigger means that a second has passed.

In this case, the number of short intervals within one long one will always be the same.

This is exactly the wrong way to do it, because if your frames take longer than 50 ms to render, you will get fewer than 20 triggers in a second.

The way to do it is:

MyActorClass

  • TimeAccumulator: float (starts at 0)

  Tick:
  • TimeAccumulator += DeltaTime
  • while (TimeAccumulator >= TimePerOneRun):
    • RunOnce()
    • TimeAccumulator -= TimePerOneRun

If you want to run 200 times per second, TimePerOneRun is 1.0/200 or 0.005 for example.
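For reference, here is a minimal sketch of that pattern as an Unreal C++ actor (the class and function names are illustrative, not from this thread); in Blueprint the same thing is Event Tick feeding a float variable and a While loop:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyTickingActor.generated.h"

UCLASS()
class AMyTickingActor : public AActor
{
    GENERATED_BODY()

public:
    AMyTickingActor()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Accumulate real elapsed time, then consume it in fixed-size steps.
        TimeAccumulator += DeltaSeconds;
        while (TimeAccumulator >= TimePerOneRun)
        {
            TimeAccumulator -= TimePerOneRun;
            RunOnce(); // may run several times after one long frame
        }
    }

private:
    void RunOnce()
    {
        // Whatever should happen a fixed number of times per game second.
    }

    float TimeAccumulator = 0.0f;
    float TimePerOneRun = 1.0f / 200.0f; // 200 runs per second, as in the example above
};
```

The point of the while loop is that a long frame simply leaves more time in the accumulator, so RunOnce is called extra times to catch up instead of triggers being lost.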


Your solution looks more correct than mine. :kissing_face:

But if the main goal was just that the number of triggers stays stable, and we agree that the short interval will never be shorter than a frame, then my solution, which in my opinion is simpler, is also viable.

Thank you, I will test that implementation. Right now I have linked the attack speed to the number of calls; it gives me “ok” results, better than before, but I want more control.

Your timers in the original post are not looping. What is calling them repeatedly?

Hi, as far as I know timers are framerate independent, but the code you have shown in the first image of the original post essentially doesn’t use timers; it uses a Delay and recursively calls the event (which very likely won’t be framerate independent). If you want to use timers, then you need to set the timer once and set it to looping :slight_smile: (i.e. one event that sets the timer and one event that gets called every X seconds).

So as far as I know, e.g. a timer set to looping every 0.05 seconds may not be called exactly every 0.05 seconds, but it will be called every 0.05 seconds on average, even if that means being called several times in a row in a single frame (e.g. if one frame ends at 0.04 s and the next at 0.12 s, the second frame taking 0.08 s, then the timer would be called twice after the second frame). Whereas in your current setup it would only be called once after the second frame: you accumulate an error every time the 0.05 s delay does not end exactly on a frame boundary, so your current setup only works if the interval is an exact multiple of the frame time (which it is at 60 FPS, where 0.05 s is exactly 3 frames).

I don’t know, I just call them all from BeginPlay, one after the other.

Thank you sir, can you show an example that demonstrates the difference? It would help me a great deal.

I detailed this a bit in another post, see

Note that using a delay loop or a non-looping timer is essentially the same thing, as chrudimer pointed out above. Your case looks like a good example of a small overtime error accumulating with every trigger. A looping timer would fix that right away.

Sure, here is an example of a looping timer:
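Roughly the same setup sketched in Unreal C++ (the actor and callback names are placeholders; in Blueprint this is the Set Timer by Event node with Looping checked):

```cpp
// Sketch only: assumes an actor class whose header declares
//   FTimerHandle FastTimerHandle;
//   void RunOnce();
void AMyTimerActor::BeginPlay()
{
    Super::BeginPlay();

    // Register the timer once; with bLoop = true it keeps firing every
    // 0.05 s of game time, catching up if a frame runs long (as described above).
    GetWorldTimerManager().SetTimer(
        FastTimerHandle,            // handle used later to pause or clear it
        this,
        &AMyTimerActor::RunOnce,    // callback fired every interval
        0.05f,                      // interval in seconds
        true);                      // bLoop = true -> looping
}
```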


Oh, I already did that in a different post because of the use of an interface. I was forced to design it that way instead of connecting both the execution and the delegate into the event.

I did not know it would “normalize” the time and compensate for missed frames.

Bro deleted 2 folders and the timers went accurate, isn’t that scary?

Guys, can I reset the timer back to the start?

Let’s say it’s at 50% of a second; can I force it back to zero at a given moment? How do I do that in Blueprint?

Apologies for the double reply; do any of you know how to reset the timer back to 0 at a given moment? (frame?)

@Chatouille @RedGrimnir @jwatte @PREDALIEN

Go download the LE Extended Standard Library | Fab, it’s free. It comes with a very handy bunch of nodes, including a “DelayFrames” node.

Delay however many frames you need, then for the timer: use Clear and Invalidate Timer and run the timer again.
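As a sketch in Unreal C++ (reusing the hypothetical AMyTimerActor from above), resetting the timer to zero means clearing it and setting it again; in Blueprint that is Clear and Invalidate Timer by Handle followed by Set Timer by Event:

```cpp
// Sketch only: FastTimerHandle and RunOnce are the placeholder members
// introduced in the earlier snippet.
void AMyTimerActor::RestartFastTimer()
{
    FTimerManager& TimerManager = GetWorldTimerManager();

    // C++ equivalent of "Clear and Invalidate Timer by Handle":
    // stops the running timer and invalidates the handle.
    TimerManager.ClearTimer(FastTimerHandle);

    // Setting the timer again restarts the 0.05 s countdown from zero
    // at the moment this function is called.
    TimerManager.SetTimer(FastTimerHandle, this, &AMyTimerActor::RunOnce,
                          0.05f, true);
}
```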