Best Practices: Server Timer in Multiplayer Game

Hi All,

just a quick question about a server timer for a multiplayer game.

In the respective Game Mode, I have set up a server timer that triggers events like “Round Start”, “Preparation Time”, etc. Each event calls an interface on each PC in the level and runs a script. Since this only happens once per round, I see no problem with that. (Is there a better way of doing it?)
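
Roughly, the setup boils down to something like this (a hypothetical C++ sketch only; the class name, the callback and the 30-second value are placeholders, not my actual code):

```cpp
// MyGameMode.h -- hypothetical sketch of a Game Mode phase timer
#include "CoreMinimal.h"
#include "GameFramework/GameModeBase.h"
#include "MyGameMode.generated.h"

UCLASS()
class AMyGameMode : public AGameModeBase
{
	GENERATED_BODY()

protected:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();

		// Fire once when the preparation phase ends; no per-second ticking needed.
		GetWorldTimerManager().SetTimer(
			PreparationTimerHandle, this, &AMyGameMode::OnPreparationTimeEnded,
			PreparationTime, /*bLoop=*/false);
	}

	void OnPreparationTimeEnded()
	{
		// Notify the PCs in the level that the round has started,
		// e.g. via an interface call or an event on the GameState.
	}

	UPROPERTY(EditDefaultsOnly, Category = "Round")
	float PreparationTime = 30.f; // placeholder phase length

	FTimerHandle PreparationTimerHandle;
};
```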

But the thing is: I want the client UI to display the server time properly, and I don’t want to set up individual timers in the corresponding clients’ widgets. So far, I have been running an individual timer event on each client to retrieve the current time from the game mode. Now I am thinking of letting the game mode send the current server time to the clients each time it changes: basically the same setup as mentioned above, but now happening every second.

Is the latter option feasible performance-wise? I mean, if the game has 100 players, will the GM keep up with sending the time every second? What if I want the timer to tick faster than once per second?
What would be the common way of retrieving the server time properly?

Looking forward to hearing your experiences!

I stumbled across this the other day; it might be useful:

A very nice article, thanks a lot for pointing it out.

The main takeaways for me were:

  • Someone else confirmed that my approach is efficient (yeah!)
  • I might be able to save some performance by requesting the time from the GM only once every X seconds and letting the client do the “count down” in between the requests itself. The question is whether the additional setup really saves enough to be worth it.

It’s really nice to see that I had a lot of similar thoughts regarding this topic. Thanks again!

Hi!

Maybe I misunderstand, but why would you send the server time every second? Time is one of the most predictable things ever: if you retrieve the server time just once on the client, you can let the client count forward by itself. It will diverge over time of course, so you should sync periodically, but there is really no need to send the time every second.

Maybe I am missing something, but the game state already sends the server time to the client periodically; you just need to retrieve it. The net connection also measures the average lag to the server, so you just need to do ServerTime + Lag every now and then and you have perfectly synchronised clocks. No need to make things more complicated than that.
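
Roughly, in C++ terms (the class name and the RoundEndServerTime property are hypothetical; GetServerWorldTimeSeconds and the PlayerState ping are what the engine provides):

```cpp
// Hypothetical GameState subclass: the server writes RoundEndServerTime once
// per round (in server world time), and it replicates to every client.
#include "CoreMinimal.h"
#include "GameFramework/GameStateBase.h"
#include "GameFramework/PlayerController.h"
#include "GameFramework/PlayerState.h"
#include "Net/UnrealNetwork.h"
#include "MyGameState.generated.h"

UCLASS()
class AMyGameState : public AGameStateBase
{
	GENERATED_BODY()

public:
	UPROPERTY(Replicated)
	float RoundEndServerTime = 0.f; // set once by the server at round start

	void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
	{
		Super::GetLifetimeReplicatedProps(OutLifetimeProps);
		DOREPLIFETIME(AMyGameState, RoundEndServerTime);
	}

	// Called from the client UI: no extra timers, no extra RPCs.
	float GetRoundTimeRemaining() const
	{
		// Engine-provided: server world time, periodically re-synced on clients.
		float ServerNow = GetServerWorldTimeSeconds();

		// Optional: compensate for transit delay with half the measured round trip.
		// (GetPingInMilliseconds exists in recent engine versions; older ones expose ExactPing.)
		if (const APlayerController* PC = GetWorld()->GetFirstPlayerController())
		{
			if (const APlayerState* PS = PC->PlayerState)
			{
				ServerNow += PS->GetPingInMilliseconds() * 0.001f * 0.5f;
			}
		}

		return FMath::Max(0.f, RoundEndServerTime - ServerNow);
	}
};
```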

Hi Grimtech,

thanks for the input and sorry for being imprecise.

“It will diverge over time of course”

This is exactly the problem.
That is why I originally set up one server timer and another timer on every client’s device to retrieve the time. But then I have two timers running (from each client’s point of view), which seemed a little unnecessary (or inelegant) to me. As stated in the article linked above, though, this is actually the best practice.

“you just need to do ServerTime + Lag every now and then”

Yes, that’s what I took away from the article as well. But then I have three timers running: one on the server, one on the client that retrieves the server time every 5 seconds, and another one that does the counting and gets adjusted by the 5-second timer.

Any idea how to compare the performance of both options? I really only need the timer to be precise when the round is nearly over, so I could write a big chunk of code to get that, but I am not sure if it’s really worth it performance-wise.

Maybe we are talking past each other, but I don’t understand where all your timers come from; you only need one on every machine. The server has the authoritative timer, which counts ahead as it wants. Every client has its own timer, which it periodically synchronises by overwriting its own time with the replicated server time plus its own average lag, both of which the engine already provides for you. I don’t think you really have to synchronise more often than once every 5-10 seconds if you want millisecond accuracy.
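
If you did want to do that synchronisation by hand, it boils down to something like this (a purely illustrative sketch; as said, the replicated server time in the game state already does essentially this for you):

```cpp
// Hand-rolled client clock: overwrite periodically, count locally in between.
class FClientClock
{
public:
	// Call every few seconds (e.g. from a looping timer) with the replicated
	// server time plus the measured one-way lag.
	void Resync(float ServerTimePlusLag, float LocalWorldTime)
	{
		OffsetFromLocal = ServerTimePlusLag - LocalWorldTime;
	}

	// Call as often as you like (every tick, every widget update); it only
	// reads the local clock, so there is no network cost between resyncs.
	float GetEstimatedServerTime(float LocalWorldTime) const
	{
		return LocalWorldTime + OffsetFromLocal;
	}

private:
	float OffsetFromLocal = 0.f; // server time minus local time at last resync
};
```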

As for performance: it’s highly unlikely that you will be bottlenecked by this, so regarding framerate you will not see a difference with either approach. But from a purely logical standpoint: in the approach I described, you just read some values the engine provides anyway and do one write operation to overwrite the client time every 10 seconds or so. That’s practically nothing compared to an extra RPC and replicating the time every second or more often, which would be overkill (less for performance reasons than because it would generate a lot of unnecessary traffic). I only glanced over the article, but to me it seems they are just doing by hand what the engine already does anyway.

My apologies, I misunderstood.

So the bottom line is: correct approach, but wrong implementation. Use the engine’s built-in timer and do not create a custom one.

I can follow your reasoning very well and have to agree with it: sending the time every second does indeed generate unnecessary traffic. Thanks very much for your patience in explaining it to a dummy like me!