I have a simple but important question (for me).
What do the values in Stat Net actually mean? Don't get me wrong, I know what they're supposed to mean, but what data, and whose, do they show?
Since I am programming an MMO strategy game, I have to be very careful about replication: don't replicate data to clients who aren't allowed to see it, only replicate when necessary, offload moving objects with predictable paths to the clients, and so on. So I came up with an Agent system that keeps track of everything each player is allowed to see. Currently, every moving object exists only on the server (not replicated) and on the owning client (as a manually spawned representation); the Agent passes the data from the server to the target client via a Client function. But the In Rate (bytes) in Stat Net increases on all clients in the same way. I don't know if I implemented something wrong, or if the In Rate means something different than I thought.

Whose In Rate is that? The server shouldn't have any additional In Rate for this task, so it can't be the server; but since the server is dedicated, all game instances are clients. Which one does Stat Net represent? Or is it the sum over all clients? Also, in one place I read that the In Rate is calculated per second, and in another that it is calculated per tick and then averaged.
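To make clear what I mean, here is a simplified sketch of what my Agent does, in plain C++ with made-up names (not my actual Unreal code, and not engine API): the server forwards an object's state only to clients that were granted visibility, so only those clients' In Rate should grow.

```cpp
#include <cstddef>
#include <map>
#include <vector>

// One serialized state update for a moving object (illustrative layout).
struct StateUpdate { int objectId; float x, y; };

// Server-side agent: tracks which client may see which object and
// forwards updates only to those clients.
class VisibilityAgent {
public:
    // Allow clientId to see objectId.
    void Grant(int clientId, int objectId) { visible[clientId].push_back(objectId); }

    // Forward one update to one client. Returns the bytes that client
    // would receive (its "In Rate" contribution), 0 if it may not see it.
    std::size_t Forward(int clientId, const StateUpdate& u) {
        auto it = visible.find(clientId);
        if (it == visible.end()) return 0;
        for (int id : it->second) {
            if (id == u.objectId) {
                inBytes[clientId] += sizeof(StateUpdate);
                return sizeof(StateUpdate);
            }
        }
        return 0;
    }

    // Total bytes this client has received so far.
    std::size_t InBytes(int clientId) const {
        auto it = inBytes.find(clientId);
        return it == inBytes.end() ? 0 : it->second;
    }

private:
    std::map<int, std::vector<int>> visible;  // clientId -> visible object ids
    std::map<int, std::size_t> inBytes;       // clientId -> bytes received
};
```

In this sketch, a client that was never granted visibility of an object receives zero bytes for it; that is the behavior I expected, yet Stat Net shows the In Rate rising on every client.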
And can you give me any estimate of what In Rate is suitable for an MMO (depending on what this In Rate actually displays)? Like an upper bound I should try not to cross at any given time? Of course, since it is an MMO, I try to avoid every bit that is not necessary, but it would give me an idea of whether I have to rework some systems. The goal is to fit as many players into a game as possible. Let's say 500 players: what do you think would be a good, manageable In Rate / Out Rate (for a strategy game, where players don't dynamically move objects (characters) on the map by player input)?
Thanks very much!