I have a very simple slow-motion system in my game that triggers when I press the space bar; doing so sets the global time dilation in my game to something like 0.5.
That works fine, but I noticed that variables are changing as if time isn't slowed down. For instance, elsewhere in my game I apply damage to an object every tick, and it applies damage every tick whether time is slowed down or not.
I think I understand why? I've been reading a little about things like delta time and timers, but I'm completely unsure how to work them into my game, if that's even the way to go…
Any clarity or examples on how to go about this would be a huge help.
When you apply damage, multiply the damage by the return value of "Get Global Time Dilation." So if your dilation is 0.5, your damage will be cut in half too, and it will take twice as long to reach the same total as it would at a time dilation of 1.
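In Blueprint this is just a multiply node between the damage value and the "Get Global Time Dilation" result. As a minimal framework-agnostic sketch (not Unreal code; the `global_time_dilation` argument stands in for the value that node would return):

```python
# Hypothetical sketch: scale per-tick damage by the current time dilation,
# so slowing down time also slows down the damage rate.
def damage_this_tick(base_damage_per_tick, global_time_dilation):
    # global_time_dilation stands in for Unreal's "Get Global Time Dilation"
    return base_damage_per_tick * global_time_dilation

# At normal speed the full damage is applied each tick...
print(damage_this_tick(10.0, 1.0))   # -> 10.0
# ...and at half-speed slow motion, only half per tick.
print(damage_this_tick(10.0, 0.5))   # -> 5.0
```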
Even if you multiply your damage by the time dilation value, you will get different results at different performance levels: since it ticks twice as often, a game thread running at 60 fps will do more damage than one running at 30 fps.
If you want to do 200 damage per second, you need to multiply 200 by your tick delta time (from your Tick event, or from the "Get World Delta Seconds" function).
When you run at 30 fps your delta time will be around 0.033 → 6.6 damage per tick → 200 damage per second.
When you run at 60 fps your delta time will be around 0.016 → 3.3 damage per tick → 200 damage per second.
Edit: that way you don't need to worry about time dilation, because the delta time itself will be higher or lower.
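The arithmetic above can be sketched as a small simulation (a hedged, engine-free illustration; the function name and numbers are made up, and each frame's damage is just `damage_per_second * delta`):

```python
# Hypothetical sketch: apply "damage per second" scaled by each frame's
# delta time, so the total over one second is frame-rate independent.
def total_damage_over_one_second(damage_per_second, fps):
    delta = 1.0 / fps                     # frame time in seconds at this frame rate
    total = 0.0
    for _ in range(fps):                  # simulate one second's worth of ticks
        total += damage_per_second * delta
    return total

# 30 fps and 60 fps both land on the same per-second total:
print(round(total_damage_over_one_second(200.0, 30), 3))  # -> 200.0
print(round(total_damage_over_one_second(200.0, 60), 3))  # -> 200.0
```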
Delta time is how many seconds it took the game to render the last frame. If your game runs at 60 fps then delta will be around 0.016 seconds, at 30 fps around 0.033 seconds, and so on.
When you multiply a number by delta and apply it every frame, you’re essentially saying, “I want this value to change by this much every second.” Delta fractionalises the value based on your frame rate, so unless your frame rate varies wildly you’ll get consistent results.
Say you want to move your character by 1 unit. If you do this every frame, then over the course of 1 second, computer A running the game at 60 fps has moved the character 60 units. Computer B running at 30 fps has moved the character 30 units in that same second.
If you multiply 1 by delta, computer A won't move the character 1 unit every frame, but instead about 0.0167 units (1 unit / 60 frames) every frame. Over the course of 60 frames (1 second for computer A), it has moved the character 1 unit.
I didn't mention this before because I thought you were already doing it.
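The computer A / computer B comparison can be sketched like this (a hypothetical illustration, not engine code; the function names are invented):

```python
# Hypothetical sketch contrasting a fixed per-frame step with a
# delta-scaled step over one second of frames.
def distance_unscaled(step_per_frame, fps):
    # Naive: move a fixed amount every frame, so faster machines move farther.
    return step_per_frame * fps

def distance_scaled(units_per_second, fps):
    # Delta-scaled: each frame's step is speed * delta, so the per-second
    # distance is the same regardless of frame rate.
    delta = 1.0 / fps
    return sum(units_per_second * delta for _ in range(fps))

print(distance_unscaled(1.0, 60))           # -> 60.0 (computer A, unscaled)
print(distance_unscaled(1.0, 30))           # -> 30.0 (computer B, unscaled)
print(round(distance_scaled(1.0, 60), 3))   # -> 1.0  (computer A, delta-scaled)
print(round(distance_scaled(1.0, 30), 3))   # -> 1.0  (computer B, delta-scaled)
```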
I know that this problem already seems to have been solved, but I thought that I would mention one thing for posterity…
According to the description of the GetWorldDeltaSeconds function, it returns Delta Seconds adjusted by the Global Time Dilation value. So the bottom image is probably the most accurate. The one above it might have seemed to do the same thing, but it should actually have been reducing the damage output even more, since it was scaling the damage by the global time dilation and THEN scaling it by a time value which is ALSO being scaled.
So - correct me if I am wrong - if the time dilation is set to 0.5, the bottom example would take twice as long in real time (real time × 0.5) to apply the same amount of damage as it would at full speed, which matches game time passing at half speed. The top example would take four times as long in real time (real time × 0.5 × 0.5), even though game time is still only passing at half speed. Therefore, the bottom example would deal the same amount of damage per game-second whether slow-mo is running or not, while the top example would only deal half the damage per game-second in slow motion.
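Working the numbers out explicitly, under the assumption that the engine's delta time is already multiplied by the global time dilation: this is a hedged sketch, not Unreal code, and `double_scale` stands in for the wiring that multiplies by the dilation a second time.

```python
# Hypothetical sketch comparing the two wirings at 0.5x time dilation,
# assuming the reported delta time is already scaled by the dilation.
def damage_per_game_second(damage_per_second, dilation, fps, double_scale):
    real_delta = 1.0 / fps
    dilated_delta = real_delta * dilation   # what the dilation-adjusted delta would be
    per_tick = damage_per_second * dilated_delta
    if double_scale:                        # "top" wiring: also multiply by dilation
        per_tick *= dilation
    damage_per_real_second = per_tick * fps
    game_seconds_per_real_second = dilation
    return damage_per_real_second / game_seconds_per_real_second

# "Bottom" wiring: the full 200 damage per game-second, dilated or not.
print(round(damage_per_game_second(200.0, 0.5, 60, False), 3))  # -> 200.0
# "Top" wiring: double-scaled, so only 100 per game-second in slow motion.
print(round(damage_per_game_second(200.0, 0.5, 60, True), 3))   # -> 100.0
```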