Hi Stephane,
Could you please clarify what you mean by a fixed time delta? Are you referring to a value that is always calculated based on the target FPS, regardless of the actual frame time?
From what I understand, this isn't the case: when low-end devices fail to maintain the target FPS, this metric almost exactly matches the default FrameTime metric from the CSV profile, with a one-frame offset:
[Image Removed]
As far as I can see in UEngine::UpdateTimeAndHandleMaxTickRate, the stat unit FrameTime metric (FApp::CurrentTime - FApp::LastTime) covers: actual game logic + waiting for render + the delay of the next frame to fit t.MaxFPS. In contrast, the FrameTime from the CSV profile covers: the delay of the current frame to fit t.MaxFPS + actual game logic + waiting for render.
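Just to make sure we are talking about the same thing, here is a minimal, self-contained sketch (not the actual engine code) of how I picture the two measurement windows in a simplified main loop. PlatformSeconds, WaitToFitMaxFPS, RunGameLogic and WaitForRenderThread are placeholder stubs I made up for illustration:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

static double PlatformSeconds()
{
    using namespace std::chrono;
    return duration<double>(steady_clock::now().time_since_epoch()).count();
}

// Placeholder stubs: in the engine this work happens inside UEngine::Tick and
// UEngine::UpdateTimeAndHandleMaxTickRate; here they just sleep for fixed amounts.
static void WaitToFitMaxFPS()     { std::this_thread::sleep_for(std::chrono::milliseconds(10)); }
static void RunGameLogic()        { std::this_thread::sleep_for(std::chrono::milliseconds(15)); }
static void WaitForRenderThread() { std::this_thread::sleep_for(std::chrono::milliseconds(5)); }

int main()
{
    double LastTime = PlatformSeconds(); // analogue of FApp::LastTime

    for (int Frame = 0; Frame < 3; ++Frame)
    {
        // The CSV FrameTime window opens here, before the t.MaxFPS delay.
        const double CsvFrameStart = PlatformSeconds();

        // Delay of the current frame (in the engine it is sized from the
        // previous frame's duration; the stub just sleeps a fixed amount).
        WaitToFitMaxFPS();

        // The stat unit window closes (and reopens) right after the delay,
        // analogous to FApp::CurrentTime - FApp::LastTime: it covers the
        // previous frame's game logic + render wait plus the delay that just finished.
        const double CurrentTime = PlatformSeconds();
        const double StatUnitFrameTime = CurrentTime - LastTime;
        LastTime = CurrentTime;

        RunGameLogic();        // actual game logic
        WaitForRenderThread(); // waiting for render

        // The CSV FrameTime window closes here: delay + game logic + render wait
        // of the same frame.
        const double CsvFrameTime = PlatformSeconds() - CsvFrameStart;
        std::printf("Frame %d: stat unit %.1f ms, CSV %.1f ms\n",
                    Frame, StatUnitFrameTime * 1000.0, CsvFrameTime * 1000.0);
    }
    return 0;
}
```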
The issue is that the delay needed to fit t.MaxFPS is calculated from the duration of the game logic in the previous frame. So the engine waits for a certain amount of time, but if the actual logic of the current frame takes longer than in the previous one, the current frame exceeds the t.MaxFPS target. This overshoot is only corrected in the next frame by reducing the delay before the game logic starts; if the game logic then happens to be shorter, that frame ends up shorter than the t.MaxFPS target. This is what causes the inconsistency in the CSV FrameTime metric. The stat unit FrameTime metric doesn't have this issue because it pairs each game logic duration with the delay that was calculated from that same execution, not from the previous iteration. Please correct me if I'm wrong.
[Image Removed]
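To put concrete numbers on this, here is a small arithmetic sketch with made-up values: assuming a 33.3 ms target (t.MaxFPS 30) and game logic + render wait alternating between 20 ms and 30 ms, the CSV-style value oscillates around the target while the stat-unit-style value stays on it:

```cpp
#include <cstdio>

int main()
{
    const double Target = 1000.0 / 30.0;              // t.MaxFPS target in ms
    const double Work[] = { 20.0, 30.0, 20.0, 30.0 }; // per-frame game logic + render wait (ms)

    double PreviousWork = 20.0; // duration of the frame just before this window
    for (int Frame = 0; Frame < 4; ++Frame)
    {
        // The delay of the current frame is sized from the previous frame's work.
        const double Delay = (Target > PreviousWork) ? Target - PreviousWork : 0.0;

        // CSV FrameTime pairs this frame's work with a delay derived from the previous frame.
        const double CsvFrameTime = Delay + Work[Frame];

        // stat unit FrameTime pairs the previous frame's work with the delay derived from it,
        // so it lands on the target as long as that work fits into the budget.
        const double StatUnitFrameTime = PreviousWork + Delay;

        std::printf("Frame %d: delay %5.1f ms, CSV %5.1f ms, stat unit %5.1f ms\n",
                    Frame, Delay, CsvFrameTime, StatUnitFrameTime);
        PreviousWork = Work[Frame];
    }
    return 0;
}
```

With these numbers the CSV value comes out as roughly 33.3, 43.3, 23.3, 43.3 ms while the stat unit value stays at 33.3 ms, which is the kind of inconsistency I mean.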