Are Timers frame-dependent or not?

Hello,

I've read different opinions in various sources/answers about whether timers are frame-dependent or not.
However, when I'm running my timer on the server, it actually seems to depend on the frame rate. I've compared it with a Delay node, which looks frame-independent.

image

So my question is: am I doing something wrong, or does the Timer's "Time" input not actually use seconds like it says?
If it's really frame-dependent, what alternatives can I use besides Delays?

Hello,

Timers and Delays are not the same thing, and they shouldn’t be.

Timers are Frame-Dependent pauses.
Delays are Frame-Independent pauses.

That’s it.

Use Timers when the sequence of events matters.
Use Delays for random, untethered events.

(Aside: I’ve tested this, and these are the results I’ve come up with, because the documentation doesn’t seem to confirm or deny this.)


This is not true. The Delay node, which only works in Blueprint event graphs, is a shallow wrapper on top of the underlying Unreal Engine timer system.

The real cause of the observed behavior is that the Unreal game loop is single-threaded, and it will only execute your code "between" frames, during ticks. It renders a frame, then updates time to whatever "now" is, and then expires/fires any timer or delay that would have expired up to this new "now" time. So the jitter of when your timer/delay happens depends on the frame time of the frame right before it fires. However, the overall duration is not frame-rate dependent: if you set it to 3.5 seconds, it will happen after 3.5 seconds plus maybe a little bit of jitter, no matter whether you run at 10 fps or 100 fps.
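To make that model concrete, here is a tiny standalone sketch (plain C++, not engine code, with made-up numbers) of the "render a frame, advance time, then fire whatever has expired" loop described above. Only the jitter changes with the frame rate; the overall duration does not:

```cpp
#include <cstdio>

// Simulate one pending timer inside a single-threaded, fixed-frame-rate loop:
// render a frame, advance "now", then fire anything that expired up to the new "now".
static void simulate(double fps, double timer_seconds) {
    const double frame_time = 1.0 / fps;
    for (int frame = 1; ; ++frame) {
        const double now = frame * frame_time;   // time only advances between frames
        if (now >= timer_seconds) {              // expired timers fire here, never mid-frame
            std::printf("%6.1f fps: %.2fs timer fired at %.3fs (jitter %.3fs)\n",
                        fps, timer_seconds, now, now - timer_seconds);
            return;
        }
    }
}

int main() {
    simulate(10.0, 3.45);    // low frame rate: up to one frame (0.1s) of jitter
    simulate(100.0, 3.45);   // high frame rate: far less jitter, same overall duration
}
```

With these numbers the 10 fps run fires at 3.5s and the 100 fps run at 3.45s, which matches the "requested duration plus a little jitter" behaviour described above.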


And would this occur with a Delay as well? Because that would mean I misinterpreted the jitter as frame dependence.

I didn’t have that issue with the delay, but I want to know if you have.

Delays just run inside the main game loop, like any invoked event.
If you set a delay for 0.01 seconds, and your frame rate is 10 fps, your delay will actually continue after 0.1 seconds.

Okay, then I stand corrected.

Thank you, but how would you explain that 1 second of Delay equals 1.7 seconds of Timer in my example? And when I changed the engine's fixed FPS, the timer was also affected by this, but the Delay was not.

Hi, are you sure your TESTDelay variable is not set to 1 by default? I would definitely expect a 1-second timer to correspond to something around 7 x 0.1s Delays, but not 17 x 0.1s Delays.
This is because of the thing explained above: Delays end a frame after the requested time has passed.
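To spell out the arithmetic behind that expectation (using an assumed fixed frame rate purely for illustration, since the actual one wasn't posted): if each Delay iteration costs the requested 0.1s plus roughly one frame, then at ~30 fps (about 0.033s per frame) an iteration takes about 0.133s, so while the looping timer counts up by 1.0 the Delay-loop only counts up by about 0.75. Getting the reported 1 : 1.7 ratio this way would mean losing roughly 0.07s per iteration, i.e. a frame rate closer to 14 fps.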


Yes, I'm sure. I've recorded a video comparing a looping Timer with a 0.1s interval vs a Delay. Both increment their values by 0.1 every 0.1 seconds.


The timer and the delay may run in different run queues, but they are both run on the main thread, serially, between individual rendered frames. The exact way they calculate the game time at which to run seems to vary. I agree it's a bit surprising, but neither timers nor delays are real-time accurate features.


Thanks for this, because I thought I was losing my mind.

Again, I’ll take J’s word for it, but I had this same result.

Ah well.


This is more than a bit surprising. Also, I cannot reproduce it.

As far as I can tell, a looping timer will always trigger more often than a Delay-loop of the same interval.
This is because looping timers look back in time, can trigger multiple times per frame if necessary, and apply the overtime to the next interval. For example, with a looping timer of 0.1s, if a frame takes 0.4s then the timer will trigger 4 times at once.
Delays, on the other hand, are like timers with no looping, so they trigger once the interval has passed, and the extra frame time is lost, as jwatte said in the first post. So little by little the Delay-loop will drift away from the looping timer.
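Here is a minimal standalone sketch of that difference (plain C++, not engine code, assuming a fixed ~30 fps frame time purely for illustration): the looping timer keeps an absolute schedule and can catch up within a single frame, while the Delay-loop only re-arms itself after it fires, so whatever part of the frame overshot the deadline is lost each cycle.

```cpp
#include <cstdio>

int main() {
    const double interval   = 0.1;    // both loops want to run every 0.1s
    const double frame_time = 0.033;  // assume a fixed ~30 fps, for illustration only

    double now        = 0.0;
    double timer_next = interval;     // looping timer: next fire time on a fixed schedule
    double delay_next = interval;     // Delay-loop: re-armed relative to "now" after each fire
    int timer_count = 0, delay_count = 0;

    for (int frame = 1; now < 3.0; ++frame) {
        now = frame * frame_time;

        // Looping timer: fire for every interval that has elapsed (possibly several
        // in one long frame) and advance the schedule by exactly `interval` each time.
        while (now >= timer_next) {
            ++timer_count;
            timer_next += interval;
        }

        // Delay-loop: fire at most once, then re-arm relative to "now",
        // so the overshoot of this frame is simply lost.
        if (now >= delay_next) {
            ++delay_count;
            delay_next = now + interval;
        }
    }

    std::printf("after ~%.1fs: looping timer fired %d times, Delay-loop fired %d times\n",
                now, timer_count, delay_count);
}
```

With these made-up numbers the looping timer fires 30 times in about 3 seconds while the Delay-loop manages only 22, which is exactly the kind of drift described above.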

Example Timer :


image
As you can see, the first couple of timers trigger after 100 ms plus a bit of jitter, as expected. However, from Timer 2 to Timer 3, only 0.093s has passed. That's because the looping timer looks back in time to catch up on lost time whenever it has the chance, trying to stay true to the original setup time (.722 in this case, so it will never trigger at .#21, for example).
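In other words (using the .722 setup time from the screenshot and an invented late fire purely as an illustration): with a setup time of .722, the schedule is .822, .922, .022 and so on. If a long frame makes one fire land at, say, .829, the next target is still .922, so the measured gap between those two fires is only about 0.093s. The timer is snapping back to its fixed schedule instead of drifting.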

Example Delay :


image
With Delay, as you can see, it just keeps slowly drifting because it doesn't have the handling that looping timers have, i.e. it triggers once the time has passed, then it's considered done, and another timer is created for the next Delay. The extra frame time at the moment of triggering is lost.

Running both gives me the same expected results - I cannot reproduce your issue.


image


Thank you,
I tested on a Listen Server and was playing with the engine's Fixed Frame Rate in Project Settings.

OK, I deleted the "Intermediate" and "Saved" folders and the issue is gone :no_mouth:
I don't know if it was some setting stored there or just some bug. But now the timer is even more accurate than the Delay, and the timer no longer depends on the frame rate.


Wuh?

Happy for you, but my brain is not making the connection.

