Client running ahead of server?

Context: Testing in a 2-player listen server in standalone. This video is of the client window. Every tick the server updates a vector variable with "get actor position" and then rep notifies the client. The client then spawns a debug sphere at that location. The unit movement speed is replicated and both are using the same speed.

Why is the server falling behind the client? Why does it appear that the client unit just has a faster movement speed?

I was expecting the debug sphere to be ahead of the client because of the time it takes to send the unit spawn request to the server and then replicate the unit back to the client. I feel like this is some oddity with the standalone window that I'm unaware of. Video below.

I'm having a similar issue; my question is below this one in the list. For some reason I can't get the location of an object to replicate for host and client.

I just read through your post and replied there. My issue is different from yours - my location is replicating fine and my client-side prediction is pretty smooth, but I'm not getting the expected behavior.

Anyone else have suggestions for my post?

Hi

I’m not sure what you mean. Are you replicating speed or location?

Also, you say the client spawns the debug sphere and then you say it’s the server - it’s a little bit confusing.

Maybe you can share the code? It’s hard to tell what could be happening otherwise.

Sorry I’m not at my computer at the moment.

I was trying to test actor position on the server compared to client so I could work on keeping clients in sync.

The server is updating a variable with the actor's position every tick. This variable is a rep notify.

When the rep function is triggered on the client, the client draws the debug sphere at that replicated position.
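For reference, here is a minimal C++ sketch of that pattern - a RepNotify vector the server writes each tick, with the client drawing a debug sphere when the notify fires. This is only an assumed reconstruction of the Blueprint setup; the class and variable names are made up.

```cpp
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "DrawDebugHelpers.h"
#include "DebugSyncActor.generated.h"

UCLASS()
class ADebugSyncActor : public AActor
{
    GENERATED_BODY()

public:
    ADebugSyncActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        bReplicates = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (HasAuthority())
        {
            // Server writes its view of the position; the change replicates to clients.
            ServerReportedLocation = GetActorLocation();
        }
    }

protected:
    // RepNotify: OnRep_ServerReportedLocation runs on clients when the value arrives.
    UPROPERTY(ReplicatedUsing = OnRep_ServerReportedLocation)
    FVector ServerReportedLocation;

    UFUNCTION()
    void OnRep_ServerReportedLocation()
    {
        // Client-side visualization of where the server thinks the actor is.
        DrawDebugSphere(GetWorld(), ServerReportedLocation, 25.f, 12, FColor::Red, false, 0.2f);
    }

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(ADebugSyncActor, ServerReportedLocation);
    }
};
```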

I was expecting the server to be ahead or behind at a consistent rate, but as you can see the client is pulling away like it has a faster move speed.

When the actor is first created the actor move speed is replicated so they should share the same speed. I’ve tested whether they have the same move speed and that the timeline components are the same - and they are.

When I’m focused on the client's test window I can see the actor in the server's window stuttering - that’s what led me to believe it’s appearing this way because of how Unreal handles the standalone test environment.

Anyone else have any suggestions? I’ve exhausted everything I can find online.

I can’t move forward tuning my client prediction without solving this because I suspect it’s just UE standalone.

The other thing I’m thinking is that the standalone window that isn’t focused seems to have frame rate issues (the actor stutters). When I focus it again the stuttering stops. So I’m wondering whether not multiplying the actor movement speed by delta seconds is what’s causing the appearance of a slower movement speed?

Sounds like you have the client moving itself? If so, yes, it’ll be ahead of the server.

Otherwise, what calls the movement on the client?

Yes, agreed - I would expect the client to run ahead of the server. I’m pathfinding on the server and sending a serialized path to the client, which takes that path and moves along it.

I was expecting the client to be ahead of the server by a consistent amount. What I didn’t expect (if you watch the whole video) is the client pulling further ahead of the server as if it has a faster movement speed. It’s almost like the timeline isn’t using time and is tied to the framerate.
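For context, a rough C++ sketch of what that "server pathfinds, client receives the path and moves on it" flow could look like, using a RepNotify array rather than whatever serialization is actually in use here. All names (AGridUnit, FindPathOnGrid, BeginClientFollowPath) are hypothetical.

```cpp
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "GridUnit.generated.h"

UCLASS()
class AGridUnit : public AActor
{
    GENERATED_BODY()

public:
    AGridUnit() { bReplicates = true; }

    // Server entry point: find a path and hand it to clients via replication.
    void StartMoveTo(const FVector& Goal)
    {
        if (!HasAuthority())
        {
            return;
        }
        ReplicatedPath = FindPathOnGrid(GetActorLocation(), Goal); // assumed pathfinding helper
        BeginServerFollowPath(ReplicatedPath);                     // server moves on its own copy
    }

protected:
    // Clients get the full path and start their own (predicted) movement in the notify.
    UPROPERTY(ReplicatedUsing = OnRep_Path)
    TArray<FVector> ReplicatedPath;

    UFUNCTION()
    void OnRep_Path() { BeginClientFollowPath(ReplicatedPath); }

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AGridUnit, ReplicatedPath);
    }

    // Assumed helpers, standing in for the actual pathfinding and movement events.
    TArray<FVector> FindPathOnGrid(const FVector& From, const FVector& To);
    void BeginServerFollowPath(const TArray<FVector>& Path);
    void BeginClientFollowPath(const TArray<FVector>& Path);
};
```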

Is that running on client though, or server? Or possibly both by accident?

What is this called from?

image

MovementSpeed is replicated, but maybe it is replicated to the client too late - after the timeline has started? Have you checked by printing the timeline’s GetPlayRate directly, from the Update branch?

No, I’m not allowing the server to run this code. The server has its own movement event, which is essentially exactly the same.

I tested both the server's and client's play rate and they are identical. I’m completely at a loss for what’s happening. The FPS in the standalone window that isn’t focused is 12 FPS (stuttering movement) while the focused window is 130 FPS. It feels like the speed difference is because of this, but I don’t know how to test or fix that because I thought timelines were frame independent.

Huh, the server movement code is not in the same place? Sounds whacky.

There seem to be many points where it can diverge, even if you confirmed the Timelines are the same / same length / same curves / same play rates…

  • How often is UpdateClientPaths called? If the client updates multiple times for a single stretch, ClientCurrentLocation is gonna advance, making the LERP result go faster than originally intended. Does server movement do the same thing?

  • What are XMovementOffset / YMovementOffset?

  • Client path is modified, using the result from FindSpotOnPath. Does the server do the same thing?

  • You are observing that the client goes faster than the server. Which one is correct though? Calculate the time it should take based on your move speed / play rate, and measure which one is correct (see the sketch below). Also, try printing LocationAlpha from the Update branch, and you’ll clearly see whether the problem comes from the timeline or elsewhere.
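A quick C++ sketch of the timing check from that last bullet, purely for illustration - the functions and members (BeginSegment, SegmentStartTime, SegmentLength, MoveSpeed) are assumptions, not code from this thread.

```cpp
// Stamp a segment when it starts, then compare actual vs. expected duration when it ends.
void AGridUnit::BeginSegment(const FVector& Start, const FVector& End)
{
    SegmentStartTime = GetWorld()->GetTimeSeconds();
    SegmentLength = FVector::Dist(Start, End);
}

void AGridUnit::EndSegment()
{
    const float Elapsed = GetWorld()->GetTimeSeconds() - SegmentStartTime;
    const float Expected = (MoveSpeed > 0.f) ? SegmentLength / MoveSpeed : 0.f;
    UE_LOG(LogTemp, Log, TEXT("Segment took %.3fs, expected %.3fs"), Elapsed, Expected);
}
```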

Look at this video - when I focus the desktop, the two standalone windows appear to split the FPS (it shoots to 60 FPS) and the server's actor becomes consistently spaced with the client. It’s a framerate issue - how the hell is that happening with a timeline?

As for why I have separate code for server and client - the server code is a bit more involved, but the movement piece is exactly the same. I slapped the client movement together just for testing (the end product will have them using the same code).

I’ve tested your entire list of bullet points. It’s most definitely a frame rate issue - see my post above. How do I fix this?

Any chance you have a MaxDeltaTime set?

It should be in one of the engine .ini files.
image

Nope, it's 0. How can I rework the timeline movement to instead use delta-time-adjusted movement?

In the Blueprint snippets above, can you see any part that would cause the timeline to be framerate-dependent?

Nope, the timeline should already be progressing off of delta time, resulting in the same overall length on both machines even with different frame times.

This is definitely not a standard movement implementation, but I don’t see anything fundamentally wrong in this code - lerping from A to B using a timeline alpha going from 0 to 1.

The only thing I can think of that could induce frame-time dependency is, as I mentioned earlier (first bullet point): if ClientFollowPath or ClientContinueFollowPath are repeatedly called, each time resetting the timeline and starting a new lerp segment, then differences in framerates could cause an accumulation of small errors, eventually producing the noticeable drift.
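For what it's worth, one way to take the per-segment timeline resets out of the picture entirely is to drive the movement from Tick with delta time and carry leftover time across tile boundaries. This is only a sketch under assumed member names (PathPoints, CurrentPathIndex, MoveSpeed), not the implementation discussed in this thread.

```cpp
// Advance along the path by MoveSpeed * DeltaSeconds, carrying unused time into the
// next segment instead of restarting a timeline from zero at every tile.
void AGridUnit::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (CurrentPathIndex >= PathPoints.Num())
    {
        return; // no active path
    }

    float RemainingTime = DeltaSeconds;
    while (RemainingTime > 0.f && CurrentPathIndex < PathPoints.Num())
    {
        const FVector Target = PathPoints[CurrentPathIndex];
        const FVector Current = GetActorLocation();
        const float Distance = FVector::Dist(Current, Target);
        const float Step = MoveSpeed * RemainingTime;

        if (Step < Distance)
        {
            // Move part-way along this segment and consume all the remaining time.
            SetActorLocation(Current + (Target - Current).GetSafeNormal() * Step);
            RemainingTime = 0.f;
        }
        else
        {
            // Reached the tile: snap to it and carry the unused time into the next segment.
            SetActorLocation(Target);
            RemainingTime -= (MoveSpeed > 0.f) ? Distance / MoveSpeed : RemainingTime;
            ++CurrentPathIndex;
        }
    }
}
```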

Dang! Thanks for working with me. How would you personally handle actor movement along a grid? On the server I have all of my pathfinding and collision tied to grid position. Actors lerp from grid tile to grid tile, reporting in as they move. The event ‘Continue follow path’ is called for each subsequent tile.

Edit: I should add that it needs to be extremely lightweight, as the project will have 2K units at max intensity.
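The thread doesn't settle on an answer, but purely as a sketch of one lightweight direction for very large unit counts: disable per-unit ticking and advance every unit from a single manager, reusing the delta-time logic sketched above. Everything here - the class, the AdvanceAlongPath helper, and how units register - is an assumption.

```cpp
#include "GameFramework/Actor.h"
#include "GridUnit.h" // assumed header for the AGridUnit sketched earlier
#include "UnitMovementManager.generated.h"

UCLASS()
class AUnitMovementManager : public AActor
{
    GENERATED_BODY()

public:
    AUnitMovementManager()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // One tick drives all units, so 2K units don't each pay for their own tick/timeline.
        for (AGridUnit* Unit : ManagedUnits)
        {
            if (IsValid(Unit))
            {
                Unit->AdvanceAlongPath(DeltaSeconds); // assumed: the carry-over logic above
            }
        }
    }

    UPROPERTY()
    TArray<AGridUnit*> ManagedUnits; // registered by units on spawn (assumed)
};
```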