Different results replicating WorldTimeSeconds in GameState and PlayerController


I have spent the whole day on this and it is driving me crazy… I am trying to synchronize the world time seconds between server and client. I am using the exact same code inside the PlayerController and the GameState, but with different results. Let me explain:

Sending a replicated timestamp to the clients:

void AArenaGameState::PostInitializeComponents()
{
	Super::PostInitializeComponents();

	GetWorldTimerManager().SetTimer(TimerHandle_USTS, this, &AArenaGameState::UpdateServerTime, 5.0f, true);
}

void AArenaGameState::UpdateServerTime()
{
	if (GetLocalRole() == ROLE_Authority)
	{
		ServerTimeReplicated = GetWorld()->GetTimeSeconds();
	}
}
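For completeness (the post doesn't show it): the variable has to be marked for replication with an OnRep notify and registered in GetLifetimeReplicatedProps, roughly like this (names match the snippets; the exact header layout is an assumption):

// AArenaGameState.h
//   UPROPERTY(ReplicatedUsing = OnRep_ServerTimeReplicated)
//   float ServerTimeReplicated;

// AArenaGameState.cpp
#include "Net/UnrealNetwork.h"

void AArenaGameState::GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const
{
	Super::GetLifetimeReplicatedProps(OutLifetimeProps);
	DOREPLIFETIME(AArenaGameState, ServerTimeReplicated);
}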

On the receiving end, correct the server time by player ping:

void AArenaGameState::OnRep_ServerTimeReplicated()
{
	APlayerController* PC = GetWorld()->GetFirstPlayerController();
	if (PC == nullptr || PC->GetPlayerState<APlayerState>() == nullptr)
	{
		return;
	}

	// GetPing() returns the compressed ping in 4 ms units, hence * 4.f to get milliseconds.
	float RoundTripTime = PC->GetPlayerState<APlayerState>()->GetPing() * 4.f / 1000.f;
	float AdjustedTime = ServerTimeReplicated + (RoundTripTime * 0.5f);
	ServerTimeOffset = AdjustedTime - GetWorld()->GetTimeSeconds();
}

Then whenever current server time is required on the client I use this:

float AArenaGameState::GetServerWorldTimeSeconds() const
{
	return GetWorld()->GetTimeSeconds() + ServerTimeOffset;
}

Now when I send a reliable RPC call to my client with the current server timestamp I should be able to calculate how long ago this happened in the past on the server:

void ACharacter2DBase::DoSomething_Implementation(float ServerTimeStamp)
{
	float TimeDelay = (GetWorld()->GetGameState()->GetServerWorldTimeSeconds() - ServerTimeStamp) * 1000.f;
}

When I implement the code for syncing the server time in my PlayerController, everything works as expected and TimeDelay in my character function is roughly the same as the player's ping. HOWEVER, when I implement the same code in the GameState, where I feel it belongs, TimeDelay (in the character class) will often be a three-digit negative number, as if the event happened in the future. I have been scratching my head over this for a while now. The only explanation I have is that whenever I synchronize ServerTimeReplicated over to my clients, it actually takes longer for the updated value to be received by the clients than when doing it in the PlayerController, while the RPC is fast in both cases.

So am I right that replicating a value in the GameState just takes longer to arrive at the clients? Maybe because the GameState is replicated to EACH client, while the PlayerController is only replicated to ONE client?

I am only using one client on a ListenServer and one remote client to test this, so I would not expect too much of a difference, but still. It's weird; any help is appreciated.

One more note: if I set the interval at which to replicate the value super low, e.g. changing the timer from 5.f to 0.001f, the results suddenly become more accurate. This leads me to believe that the value replicates quite slowly on the GameState. But why?

Well well well, looks like in AInfo (the base class of AGameStateBase) the variable NetUpdateFrequency is lowered from the AActor default of 100.f to 10.f. Setting it back to 100.f does the trick.
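For anyone landing here, a minimal sketch of that fix in the GameState constructor (class name taken from the snippets above):

AArenaGameState::AArenaGameState()
{
	// AInfo lowers this to 10.f; restore the AActor default so property
	// changes replicate with less delay.
	NetUpdateFrequency = 100.0f;
}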

Searching around, it is also possible to leave NetUpdateFrequency at 10.f and instead call ForceNetUpdate() inside the UpdateServerTime() function to make sure the value is sent across promptly.
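A sketch of that alternative, reusing the UpdateServerTime() function from above:

void AArenaGameState::UpdateServerTime()
{
	if (GetLocalRole() == ROLE_Authority)
	{
		ServerTimeReplicated = GetWorld()->GetTimeSeconds();
		// Schedule replication right away instead of waiting for the
		// next scheduled net update at the low NetUpdateFrequency.
		ForceNetUpdate();
	}
}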