While running some tests for a network project I'm working on, I found some strange results that I can't explain or wrap my head around, so I've come looking for help.
- The first test I set up (which I will call Method A) consists of calling a Server Only Event from Remote when I press a key. The Event takes a Vector3 input, which is the position of the player at the moment the event is called. Then on the server side, when I receive the event, I print the difference between the position that was transmitted and the current position of the player on the server.
What I was expecting to find was a difference of 0, as this was the theory I thought I had understood from looking at the Unreal Tournament source code. However, with the player moving in a straight line, I found the difference to be sometimes 0 and sometimes a small (and slightly inconsistent) value, which I'll call "u", without anything that seemed to explain why I would get one value or the other. I then figured out that this u value was basically the distance my player covers in one frame. With the way I've set up the test, that means the event is either processed on time (with the position synchronized between client and server), or one frame too early (the server needs one more frame for the player to reach the same position).
- My second test (Method B) was to introduce a buffer so that inputs are processed after the movement update has been done. This buffer is implemented in C++, mostly by copying UT code, and works both on the client and on the server. That means that on the client side, after the key is pressed, the event will only fire once the movement is done, and on the server side, once the event is received, it will wait for the movement to be done before printing anything.
The result was values varying in the same manner as in Method A, though this time between 0 and -u, meaning one frame too late (one frame after the frame where the positions would have been synchronized).
- I then tried to test the two parts of the buffer separately. Method C is the buffer activated only on the client side.
The result varies between u and 2u, so one or two frames too early.
- Method D is the buffer activated only on the server side.
The result varies between -u and -2u, so one or two frames too late.
At this point I was pretty much lost, so I ran some more tests on the side:
- Test n°1 was to measure the difference introduced by the buffer on the client. I printed the difference between the location when the key was pressed and the location when the buffer fired. I found a consistent difference of -u. That was absolutely expected; it was a sanity check more than anything. Because the buffer waits for the next movement update before firing, it's logical that we get a difference of one frame's worth of movement. This explains the difference between the results of Method A and Method C.
- Test n°2 was to measure the difference introduced by the buffer on the server side. Same idea as Test n°1, except this time it's between when the Event is received on the server and when the buffer fires. This time the difference I found was consistently -2u, so two frames' worth of movement. That's something I do not understand: one frame of difference is a result I would have understood, but I can't wrap my head around why it had to wait another frame. That said, it does explain the difference between Methods A and D, and together with the result of Test n°1, the difference between Methods A and B.
- Test n°3: On a whim, I decided that instead of waiting for the buffer, the Event would open a Gate in OnTick that prints the result, so as to wait for the beginning of the next frame. With this test, the inconsistency between results came back, as I got results varying between 0 and -2u (not -u! Only 0 and -2u!).
As I was running the test, Method B was also active, and I noticed that its inconsistency was synchronized with Test n°3's results. When I would get 0 with Method B, I would get -2u with Test n°3, and when I would get -u with B, I would get 0 with 3.
My main problem is the inconsistency between the results, which I heavily suspect is due to where in the frame replicated events are handled. So my question is: does anyone know anything about that, and have an idea of how to fix it?
Aside from that, I absolutely do not understand the results of Tests n°2 and n°3. Test n°2 might be due to how the buffer is coded (though that seems weird to me since it's supposed to work the same way on the client side), but I absolutely cannot understand the results of Test n°3. Why do I only get 0 and -2u? Why never -u? How does it just skip a frame?
I've tried running the tests with some latency introduced through Clumsy; however, that changed neither the results nor the value of u (as expected).
TL;DR: Does anyone know at what point in the frame a replicated event is handled upon reception?