Timer rate faster than it should be

So I’ve come across something weird, and I’m not sure if it’s a bug or something I’m missing. Look at this screenshot:

VSync is off in this picture. Despite that, ‘Frame’ reads 16.67 ms, and DeltaTime reflects it, but I’m actually getting a frame time of 4.43 ms. So DeltaTime seems to be calculated from the ‘Frame’ value, while the actual framerate is much higher. This speeds my game up roughly threefold. Turning VSync on fixes it (though I’m not sure whether it would break again if I dropped under 60 fps with VSync on).

Simple way to reproduce this:

1. Create a blank project

2. Create a timer

3. Print the remaining time to screen with AddOnScreenDebugMessage

4. Switch VSync on and off while running the game. You will see the remaining time decrease much faster than it should with VSync off.

Can anyone else confirm this is happening as well, or if I’ve missed something to make it frame independent?

UE4 also has frame smoothing; you can disable it in your project’s General settings.

That might be the issue.

Timers are already frame-rate independent. I’m confused, though: surely the timer must still be running at the correct rate, unless you’re multiplying the time you give it by delta seconds? You set it up with a constant float and it does the rest.

Where are you getting the 4.43 ms frame time from? The frame time is still 16.67 ms; 4.43 ms is just how long the GPU took to draw the frame, but the framerate is currently capped to 60, so it’s actually idle for most of that time. You probably have a maximum framerate set elsewhere. Try entering t.MaxFPS 250 in the console, which should raise the cap to 250, and make sure VSync is off as well.

I never use VSync because the frame-smoothing is horrible and causes juddering for me, so I recommend just clamping fps using the above method instead.

You pointed me in the right direction Pumpy Bird, thanks. The problem was that ‘Use fixed frame rate’ was enabled.

I did some testing with this. My timers are all constant values and not based on delta time, as they should be. Once I disabled that option, everything worked, regardless of vsync/actual fps. However, with that option on, the problems still occur. I’ll make a video in a second demonstrating what happens.

Here’s an example. I made a 180-second timer on BeginPlay, and added an on-screen debug message with the timer’s remaining time each tick. It’s not as obvious in the recording, because the frame rate is closer to 60 than when I’m not recording, but you can see the change most clearly at about 2:25 in the video. Everything speeds up: not only the objects (which are entirely scaled by DeltaTime), but also the timer, if you watch the debug messages and count along as the time goes by. Since timers are supposed to run in real time, this should never happen.

Interesting… if that’s the case this should definitely be flagged as a bug!

EDIT: Could you post the code you’re testing with so I can file a report (unless you want to!)

I can make a simple project to replicate the bug, just gimme a sec.

I made a basic code project called FPS_Bug_Test, and just added this code to the GameMode .cpp and .h:

FPS_Bug_TestGameMode.h


// Fill out your copyright notice in the Description page of Project Settings.

#pragma once

#include "GameFramework/GameMode.h"
#include "FPS_Bug_TestGameMode.generated.h"

/**
 * 
 */
UCLASS()
class FPS_BUG_TEST_API AFPS_Bug_TestGameMode : public AGameMode
{
	GENERATED_BODY()

	virtual void BeginPlay() override;

	virtual void Tick(float DeltaTime) override;

	FTimerHandle TestHandle;
};


FPS_Bug_TestGameMode.cpp


// Fill out your copyright notice in the Description page of Project Settings.

#include "FPS_Bug_Test.h"
#include "FPS_Bug_TestGameMode.h"


void AFPS_Bug_TestGameMode::BeginPlay()
{
	Super::BeginPlay();

	GetWorld()->GetTimerManager().SetTimer(TestHandle, 180.f, false);
}

void AFPS_Bug_TestGameMode::Tick(float DeltaTime)
{
	Super::Tick(DeltaTime);

	GEngine->AddOnScreenDebugMessage(-1, 3.f, FColor::Yellow, FString::Printf(TEXT("Time left: %f"), GetWorld()->GetTimerManager().GetTimerRemaining(TestHandle)));
}

I ran it through the VS2013 debugger, if that matters. Then I changed a few project settings: I set the GameMode to FPS_Bug_TestGameMode, went to the general settings, enabled ‘Use Fixed Frame Rate’, and set it to 60 fps. Then I just used PIE. While it was running, I opened the console and typed r.VSync 1 and r.VSync 0, switching between the two and watching the difference in the speed of the timer counting down.

You can post the bug report for this. Is it done through AnswerHub?

Yes, please report it along with the reproduction steps in the bugs section of AnswerHub so it can get fixed.

Here it is for reference:

I’ve just skimmed through this quickly, so apologies if I’m missing something, but I don’t see any bug here.

Timers operate in game time, not real time: if you set a timer and then pause the game, you don’t want it counting down. The fixed frame rate setting is, I believe, intended for debugging and benchmarking. All it does is pass a fixed value for delta seconds to the tick function, regardless of the frame’s real duration. The engine will still pump out as many frames as it can manage, up to the cap. So as the time taken to process a frame varies, objects and timers will appear to speed up and slow down.

Is that really intended, though? I would expect ‘Fixed frame rate’ to actually fix the frame rate, not just set DeltaTime to what it would be at that frame rate. I would also expect my timers not to be based on DeltaTime. As it stands, the engine pumps out frames as fast as possible, right past the cap set by that option: DeltaTime is forced to a value derived from the cap, but the frame rate itself is never limited. So if you set the cap in the editor to 60 fps, DeltaTime will be 1/60, but the actual frame rate is not constrained by this option in any way.

And if it is working as intended, then all it’s doing is manually setting DeltaTime, without relying on what the actual framerate is. In that case ‘Use Fixed Frame Rate’ seems like an inaccurate representation of that option, and should be something like ‘Use Fixed DeltaTime’.

Yes, it’s intended, though as I said, I think it’s meant as a benchmarking setting and not for use in a released game. I agree completely that it has been misnamed.

And when I spoke of the frame rate cap, I meant the max fps setting, which is distinct from the fixed frame rate. That said, I’m not actually certain whether and how the former is implemented when the latter is enabled.

You have to keep in mind that, while it may be possible to cap a frame rate by just idling for a while after doing necessary frame processing, it’s not possible to cap it at the other end of the scale - if for whatever reason the frame can’t be processed in a given amount of time, then you just have to wait for it. That’s why a fixed frame rate isn’t really viable, since you could never guarantee that the player wouldn’t experience weird slowdown when the engine struggled to keep up. It would also be completely useless for networked multiplayer games.

Why you wouldn’t expect timers to be based on delta time I don’t understand. They would be useless for controlling in-game events if they weren’t synced to in-game time.

Yeah, I was also wondering about the other end of the spectrum: you can’t really fix a frame rate if the engine takes longer than the budget to process a tick. I thought maybe the option was akin to an internal VSync or something like that.

It’s not that I don’t expect timers to be based on delta time; it’s that I expect them to be based on real in-game time. If I give a timer a 10-second count, I would expect it to go off after 10 seconds (unless time dilation is changed, which I’m guessing it also accounts for). But in this case DeltaTime is manipulated independently of the actual frame rate, so it’s no longer tied to real time. It just seemed weird that with this option the timer suddenly became frame-rate dependent, which shouldn’t happen under normal circumstances. But if the option were renamed to reflect that it’s supposed to operate this way, then yeah, that’s fine.