Also, my approach uses physics substepping, as I wrote above.
I don't really see how server-side resimulation could work out.
Let’s say, Client A has a ping of 200 milliseconds. Client B has a ping of 20 milliseconds.
At TIME 0, the game starts.
With pre-game syncing, both clients' local games start at the exact same time, and both players input movement immediately.
Client B's input would be on the server after 20 ms, Client A's after 200 ms.
At this moment the server has to recalculate the past 200 milliseconds for Client A's movement. Okay, so at TIME 200 the server finally has Client A's input for TIME 0. But whoops, what about A's inputs for the 200 ms it now has to recalculate (TIME 0 to TIME 200)? Not there yet, still on their way. That means that to be able to recalculate everything, the server would have to run constantly 200 ms in the past, and sync Client B's input by delaying it 180 milliseconds. Otherwise the clients are not in sync and ping can influence the game mechanics, which is a no-go.

Now when the server runs 200 ms in the past, it replicates locations from a point in time that has locally already passed, and it needs an additional 20 ms / 200 ms to get the resimulated, corrected data back to the clients. That leads to input latency: Client B's input will not be applied until 220 milliseconds after the keypress (200 ms server delay + 20 ms return trip), even though B only has a 20 ms ping.
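To make that timeline concrete, here is a toy calculation of my scenario (plain Python, all names and numbers are mine, and "ping" is treated as the one-way trip time, as in the example above):

```python
# Toy timeline for the scenario above; all times in milliseconds.
# Assumption: "ping" here means the one-way trip time.
PING_A = 200  # Client A
PING_B = 20   # Client B

# Both clients press a movement key at simulation TIME 0.
input_arrives_a = 0 + PING_A  # A's input reaches the server at t = 200
input_arrives_b = 0 + PING_B  # B's input reaches the server at t = 20

# To apply both inputs at the same simulation time, the server must
# wait for the slowest client, i.e. run its simulation this far behind:
server_delay = max(input_arrives_a, input_arrives_b)  # 200

# B's input has to be buffered until the server simulates TIME 0:
buffer_b = server_delay - input_arrives_b  # 180

# The corrected state for TIME 0 then travels back to each client:
applied_at_b = server_delay + PING_B  # 220
applied_at_a = server_delay + PING_A  # 400

print(server_delay, buffer_b, applied_at_b, applied_at_a)
```

So under this model the low-ping client pays the high-ping client's latency plus its own return trip, which is exactly the problem I'm describing.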
Correct me if I got this whole concept wrong.