I’m just getting started with Blueprints and have been looking into the Content Examples Project lately.
In the Blueprints_Overview Level, I’ve stumbled upon example 1.6 (Simple Math Example).
It’s supposed to count and display the play time in seconds using the following event graph:
I feel like the first section of this doesn’t really do what it’s supposed to, though. As far as I understand, the Event Tick sends out an execution signal on every frame of gameplay. The Delay then passes that signal on after a one-second delay. As long as that delay is still underway, it won’t accept a new signal input.
Therefore I think the output won’t come after every second, but after 1 second plus the time remaining in the current frame, creating a slight error that depends on your frame rate.
If you had a frame rate of 4.5 fps (to keep the numbers simple), the following would occur:
0 s: 1st frame starts, 1st Delay starts
ca. 0.89 s: 5th frame starts
1 s: 1st Delay ends, output signal is created (integer set to 1)
ca. 1.11 s: 6th frame starts, 2nd Delay starts
2 s: 10th frame starts
ca. 2.11 s: 2nd Delay ends (integer set to 2)
ca. 2.22 s: 11th frame starts, 3rd Delay starts
… and so on.
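To sanity-check that reasoning, here’s a little Python sketch of the loop. It’s a deliberately simplified model (the Delay’s output is processed at the start of the next frame, which is not necessarily how Unreal schedules latent actions internally), and the names like `simulate` are just mine:

```python
# Simplified model of the Tick -> Delay(1 s) -> increment loop at a
# fixed 4.5 fps. Only an illustration of the timing argument, not a
# model of Unreal's actual latent-action scheduling.

FPS = 4.5
FRAME_TIME = 1.0 / FPS

def simulate(duration_s):
    """Return (time, counter) pairs for each counter increment."""
    increments = []
    delay_end = None          # when the pending Delay will finish
    t = 0.0                   # current frame's start time
    counter = 0
    while t < duration_s:
        # A Delay that finished before this frame fires its output now.
        if delay_end is not None and t >= delay_end:
            counter += 1
            increments.append((t, counter))
            delay_end = None
        # This frame's Tick starts a new 1-second Delay if none pending.
        if delay_end is None:
            delay_end = t + 1.0
        t += FRAME_TIME
    return increments

for t, n in simulate(5.0):
    print(f"counter set to {n} at ca. {t:.2f}s")
# increments land at ca. 1.11 s, 2.22 s, 3.33 s, 4.44 s -- each cycle
# takes a bit more than one second, so the counter drifts behind.
```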
Obviously this error gets smaller at higher frame rates, but it would still add up over some runtime (at around 120 fps it would still amount to roughly 15 s per hour).
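Here’s the back-of-the-envelope arithmetic behind that figure, assuming the Delay restart slips by half a frame per cycle on average:

```python
# At 120 fps, each ~1 s cycle overshoots by up to one frame time,
# i.e. half a frame time on average.
frame_time = 1.0 / 120        # ~8.3 ms per frame
cycles_per_hour = 3600        # one Delay cycle per counted second
error_per_hour = cycles_per_hour * frame_time / 2
print(error_per_hour)         # 15.0 seconds of drift per hour
```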
I think accumulating the Delta Seconds output of the Event Tick every tick and converting the sum to an integer (e.g. using Floor) would have been more accurate and avoided this error.
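In Python-as-pseudocode, the alternative I have in mind looks like this (`on_tick` and `delta_seconds` are just my stand-ins for the Event Tick and its Delta Seconds pin):

```python
# Accumulate Delta Seconds each tick and floor the sum to get elapsed
# whole seconds -- the sum tracks real time, so nothing drifts.
import math

elapsed = 0.0

def on_tick(delta_seconds):
    """Called once per frame; returns play time in whole seconds."""
    global elapsed
    elapsed += delta_seconds
    return math.floor(elapsed)

# e.g. at a steady 4.5 fps:
frame_time = 1.0 / 4.5
for _ in range(10):           # 10 frames, ca. 2.22 s of play time
    seconds = on_tick(frame_time)
print(seconds)                # 2
```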
Can anyone explain why this kind of architecture was used? Did I get anything wrong about how the elements work? Am I mixing up different kinds of frame rates here?
Thanks in advance.