A few questions about "delta seconds"

Hello all,

I am very new to UE, and am having a real issue understanding aspects of delta seconds. After reading several explanations for delta time I still have a few questions. Maybe I need the ELI5 version.

  1. Am I correct in that delta time is the time in between ticks, and a tick is equal to fps?

  2. I saw a tutorial that said if you multiply the value of whatever I want “speed” to be by delta seconds, the resulting distance traveled is the same even when fps is higher or lower. How is this?

  3. Is there a difference between getting delta seconds from the “delta seconds” pin on the Event Tick node versus the Get World Delta Seconds node? When should I use one over the other?

Thank you!

I never used world delta seconds, and I also have no idea what it is.
Delta seconds from Tick is very important for everything that needs to be updated every frame, for example object location in multiplayer. Let’s say you are playing at 60 FPS but your friend at 30 FPS. In your script, you write code that has to move an object from A to B, like a falling rock. Where do you draw the rock each tick? You can’t just move the rock -10 cm each tick, because someone who has 30, 40, or 150 FPS will end up with a different location… But with delta time: if you are playing at 1 FPS, your delta time is 1; at 50 FPS it is 0.02; at 100 FPS, 0.01. So if you multiply everything by delta time, you will get the right values for anything that needs to be updated on tick.
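The idea above can be sketched outside the engine. This is a plain-Python toy (no UE API; the speeds and frame rates are made-up numbers): a position advanced once per frame by speed * delta time ends up the same regardless of frame rate.

```python
def simulate(fps, speed_cm_per_s, duration_s):
    """Move a position forward once per frame, scaling the step by delta time."""
    delta = 1.0 / fps                        # seconds per frame: 1 FPS -> 1.0, 50 FPS -> 0.02
    position = 0.0
    for _ in range(int(duration_s * fps)):   # one iteration = one tick
        position += speed_cm_per_s * delta   # the per-frame step shrinks as fps grows
    return position

# A 30 FPS machine and a 60 FPS machine cover the same distance in one second:
print(round(simulate(fps=30, speed_cm_per_s=100, duration_s=1), 6))  # 100.0
print(round(simulate(fps=60, speed_cm_per_s=100, duration_s=1), 6))  # 100.0
```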


Name368 has a pretty good example there. Basically, delta seconds is there to make things frame-rate independent. In order to use delta seconds for something like a movement speed, you would first need to establish a movement rate like 10 m/s. Let’s say your frame rate is 60 fps (0.01666 s per frame); you’d then multiply your 10 m/s by 0.01666 s, the seconds cancel out, and you’re left with 0.1666 m moved in that frame. You also use it with nodes like “FInterp To Constant”; nodes like that tend to have an interp speed that works pretty much the same as the movement rate I was talking about. I’m not 100% positive, but I think the non-constant “interp to” nodes have a ramp up/down effect on them, so keep that in mind.

The main difference between delta seconds and world delta seconds is that world delta seconds accounts for time dilation. So, using my movement example earlier: if you were in a bullet-time mode that slows time to 1/10, the moving object would only travel 0.01666 m per tick, instead of the 0.1666 m that it normally would.
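The time-dilation point can be modeled the same way. A minimal sketch, assuming (as described above) that world delta seconds is simply the raw frame time scaled by the current time dilation; the 10 m/s speed and 1/10 bullet-time factor are the numbers from this post:

```python
def world_delta_seconds(raw_delta, time_dilation):
    """Model world delta as the real frame time scaled by global time dilation."""
    return raw_delta * time_dilation

raw = 1.0 / 60                  # frame time at 60 fps, ~0.01666 s
speed = 10.0                    # movement rate in m/s

normal = speed * world_delta_seconds(raw, 1.0)   # no dilation
bullet = speed * world_delta_seconds(raw, 0.1)   # bullet time at 1/10 speed
print(round(normal, 5), round(bullet, 5))        # 0.16667 0.01667
```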

  1. You’re correct.

  2. As Name368 explained, multiplying your float variable representing your speed per second by the World Delta Time will give you approximately the same speed on every machine running your code, regardless of its performance.
    If you have a speed of 100:
    0.03 s to render a frame -> 30 FPS -> 100 * 0.03 = 3 -> each tick, the object will move by 3 uu
    0.01 s to render a frame -> 100 FPS -> 100 * 0.01 = 1 -> each tick, the object will move by 1 uu
    Therefore the object will move as much on all machines.

  3. As far as I know there is no difference between getting the output of the tick event or using GetWorldDeltaTime. I often use GetWorldDeltaTime just to avoid dragging the output tick pin all across my blueprint.


But I don’t understand how the result is the same…
as @ said:
in 30 fps machine the object will move by 3 uu every tick = 90 uu every second.
in 100 fps machine the object will move by 1 uu every tick = 100 uu every second.
then how is the result the same?

Nope, 30 FPS means every frame lasts 1/30 s = 0.033333… s, therefore with a speed of 100 centimeters per second (cm/s) you would get 100 cm/s * 1/30 s = 100/30 cm = 3.33333… cm per frame (and not 3).

If the velocity is constant, then the formula is s = v*t (traveled distance = velocity * time). You know the velocity (here, 100 centimeters per second), and the time is the time since the last update. So if you update the position every frame, the time is that frame’s delta time.
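Plugging the thread’s numbers into s = v*t shows where the 3.33 comes from and why both machines cover the same distance per second (plain Python, illustrative numbers only):

```python
v = 100.0                            # constant velocity in cm/s

for fps in (30, 100):
    dt = 1.0 / fps                   # t: time since last update = frame time
    per_frame = v * dt               # s = v*t for a single frame
    per_second = per_frame * fps     # one second's worth of frames
    print(fps, round(per_frame, 4), round(per_second, 4))
# 30 3.3333 100.0
# 100 1.0 100.0
```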

Delta Time is calculated by the World at the beginning of each frame. It is the elapsed time since the last frame. It represents the time it took to render the previous frame.

One thing to remember is that the frame rate does not determine the frame time. Rather, the individual frame times determine the frame rate. The actual time to render a frame depends on the scene and the performance of the hardware. Because of this, the number of frames per second varies, and Delta Time is different from frame to frame.
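A game loop’s delta-time bookkeeping can be sketched with a monotonic clock. This is an illustrative stand-in (not engine source); `time.sleep` plays the role of the variable per-frame rendering work:

```python
import time

def run_frames(num_frames):
    """Record the elapsed time between loop iterations, like an engine's delta time."""
    deltas = []
    last = time.perf_counter()
    for _ in range(num_frames):
        time.sleep(0.005)              # stand-in for rendering; real work varies per frame
        now = time.perf_counter()
        deltas.append(now - last)      # delta time = time since the last frame
        last = now
    return deltas

# The deltas hover around the frame's work time but are not identical:
print([round(d, 4) for d in run_frames(5)])
```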

Things get a little more complicated when you consider V-Sync, which locks frame presentation to the monitor’s refresh rate (adaptive tech like G-Sync and FreeSync solves this).

Sometimes it is desired to artificially cap the framerate to keep a consistent fps:
By default, the Frame Rate is smoothed https://docs.unrealengine.com/en-US/…ate/index.html
You can also specify a fixed framerate.



No, a tick is more like spf - seconds per frame (how many seconds - or rather, how large a fraction of one - it took to render the frame).

And basically fps * tick = 1 (unless the game slows down or speeds up), so it is easy to calculate one when you know the other.
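That identity is easy to check in a couple of lines (plain Python; ignoring slowdowns and time dilation as noted above):

```python
def fps_from_delta(delta_seconds):
    """fps * tick = 1, so fps = 1 / tick."""
    return 1.0 / delta_seconds

def delta_from_fps(fps):
    """...and tick = 1 / fps."""
    return 1.0 / fps

print(fps_from_delta(0.02))              # 50.0
print(delta_from_fps(100))               # 0.01
print(fps_from_delta(0.02) * 0.02)       # 1.0
```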


Since this is the first search engine result for “delta tick vs world delta seconds”, I wanted to bring up something that I didn’t see mentioned:

If you change the “Tick Interval (secs)” parameter on an actor, the “Delta Seconds” value of said actor’s Event Tick node will account for that, while GetWorldDeltaSeconds will not.

So if you have actor A with their tick interval set to 1 second, Event Tick will fire every second and output a Delta Seconds value of approximately 1. GetWorldDeltaSeconds, on the other hand, will still output the time since the last frame as it normally would.
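A plain-Python model of that behavior (a sketch of the description above, not engine code): the actor accumulates frame time and only ticks once its interval has elapsed, receiving the accumulated time as its delta.

```python
class TickIntervalActor:
    """Toy actor whose tick honors a 'Tick Interval (secs)' setting."""

    def __init__(self, tick_interval):
        self.tick_interval = tick_interval
        self.accumulated = 0.0
        self.tick_deltas = []            # what Event Tick's Delta Seconds pin would report

    def advance_frame(self, world_delta):
        """Called once per engine frame with the per-frame world delta."""
        self.accumulated += world_delta
        if self.accumulated >= self.tick_interval:
            self.tick_deltas.append(self.accumulated)   # time since *this actor's* last tick
            self.accumulated = 0.0

world_delta = 1.0 / 60                   # GetWorldDeltaSeconds keeps returning this
actor = TickIntervalActor(tick_interval=1.0)
for _ in range(130):                     # simulate a bit over two seconds of frames
    actor.advance_frame(world_delta)

print([round(d, 3) for d in actor.tick_deltas])   # entries near 1.0, not near 0.017
```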
