I did some very basic performance tests, just to sate my curiosity on the topic:
I created three different actors that all perform the same logic every tick: they add DeltaSeconds to a float until it hits 1, then subtract DeltaSeconds from it until it hits 0, then start adding again, and so on.
The first uses conventional Event Tick logic, the second uses a DelayUntilNextTick loop, and the third uses a DelayUntilNextTick loop with Gates. All of these Blueprint setups can be viewed under the dividers below.
Normal Event Tick Loop
DelayUntilNextTick Loop
DelayUntilNextTick Loop Using Gates
This one was hard to fit into a single clear screenshot, so just trust me that the logic works. While it’s less readable, the benefit of this one is that it doesn’t require a “Growing” bool to determine whether the tick should be adding to or subtracting from the float.
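For anyone who finds text easier to parse than Blueprint graphs, here’s a rough C++ sketch of the per-tick logic all three actors share. To be clear, the actual tests used the Blueprint versions above; the class and variable names here (AMyTestActor, Value, bGrowing) are just placeholders I made up for illustration.

    // MyTestActor.h -- hypothetical C++ equivalent of the Blueprint test actors
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "MyTestActor.generated.h"

    UCLASS()
    class AMyTestActor : public AActor
    {
        GENERATED_BODY()

    public:
        AMyTestActor()
        {
            // Same as the Blueprint actors: tick every frame
            PrimaryActorTick.bCanEverTick = true;
        }

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);

            if (bGrowing)
            {
                Value += DeltaSeconds;
                if (Value >= 1.0f)
                {
                    bGrowing = false; // hit 1, start subtracting
                }
            }
            else
            {
                Value -= DeltaSeconds;
                if (Value <= 0.0f)
                {
                    bGrowing = true; // hit 0, start adding again
                }
            }
        }

    private:
        float Value = 0.0f;   // the float being ping-ponged between 0 and 1
        bool bGrowing = true; // whether the tick is currently adding or subtracting
    };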
The test involved spawning 10000 of one type of actor, and letting the game run for a minute with ‘stat GPU’ enabled. Here are the results:
- No Actors Spawned: ~120 FPS, 5.82 average GPU ms
- 10000 Event Tick Actors Spawned: ~25 FPS, 14.79 average GPU ms
- 10000 DelayUntilNextTick Loop Actors Spawned: ~27 FPS, 15.33 average GPU ms
- 10000 DelayUntilNextTick Loop With Gate Actors Spawned: ~15 FPS, 16.09 average GPU ms
NOTE: I don’t really know what I’m doing, because I’ve never done any formal UE profiling before, haha. Maybe I should be doing CPU profiling instead of GPU profiling, but I didn’t want to spend too much time researching this topic for what was supposed to be a quick little investigation.
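If I ever redo this, I’d probably watch the CPU-side stats instead, since Blueprint tick logic runs on the game thread rather than the GPU. As far as I know, the built-in console commands for that are:

    stat unit   (splits frame time into Game, Draw, and GPU, so you can see which one is the bottleneck)
    stat game   (game-thread timings, including tick costs)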
With that in mind, the results were kinda inconsistent, likely because I’m not doing something right. Sometimes my FPS was 16 when the test started, sometimes it was 25, sometimes 20… the point is that it never really moved from whatever it was when the game started running.
Similarly, the ‘stat GPU’ average MS value behaved fairly strangely. For all of the tests with actors, it would frequently start at about 6 ms, and after some seemingly random amount of time (between 5 and 30 seconds), it’d jump up to 15-20 ms or so and then stay there indefinitely.
One thing I can assert with 100% confidence, though, is that the Gate method had a noticeable negative impact on performance, as indicated by the FPS. While I do think it’s nice not to need a bool for conditionally running the tick logic, that performance decrease definitely makes me much more cautious about using it in the future.
I didn’t observe any evidence that a non-gated DelayUntilNextTick loop causes a performance hit, though, which is nice, but I’m still not sure whether it’s bad practice to do so.