Replicating lots of data: Replication, RPC, or custom sockets?

    I am trying to find the quickest way to replicate a voxel world, made up of chunk actors. Each chunk contains a replicated array of visible voxels.

    I have noticed that the built-in replication is slow/throttled for these actors when the array is large. It is especially slow when new players join, as it takes over 1 second to replicate each actor. What are some other techniques for replicating large amounts of data?

    I know the built-in replication has some restrictions, and the throttling is intentional. I've also read it won't replicate arrays that are larger than 2048 entries.

    I have seen some efforts to compress the voxel data before replication with NetSerialize, but I can't seem to find any working examples. I thought that the engine would try its best to compress (gzip?) the data when replicating anyway?
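
    For reference, this is roughly the NetSerialize pattern I have been experimenting with so far. It is completely untested, the struct and field names (FChunkVoxelData, VisibleVoxels) are just placeholders for whatever the chunk actor actually replicates, and the compression is plain zlib through FCompression:

    // ChunkVoxelData.h (placeholder name)
    #pragma once

    #include "CoreMinimal.h"
    #include "Misc/Compression.h"
    #include "ChunkVoxelData.generated.h"

    USTRUCT()
    struct FChunkVoxelData
    {
        GENERATED_BODY()

        // Raw visibility bytes for one chunk (placeholder layout).
        TArray<uint8> VisibleVoxels;

        // Custom serializer: zlib-compress the payload before it goes on the wire,
        // decompress it again on the receiving side.
        bool NetSerialize(FArchive& Ar, class UPackageMap* Map, bool& bOutSuccess)
        {
            if (Ar.IsSaving())
            {
                int32 UncompressedSize = VisibleVoxels.Num();
                int32 CompressedSize = FCompression::CompressMemoryBound(NAME_Zlib, UncompressedSize);

                TArray<uint8> Compressed;
                Compressed.SetNumUninitialized(CompressedSize);
                FCompression::CompressMemory(NAME_Zlib, Compressed.GetData(), CompressedSize,
                                             VisibleVoxels.GetData(), UncompressedSize);

                Ar << UncompressedSize;
                Ar << CompressedSize;
                Ar.Serialize(Compressed.GetData(), CompressedSize);
            }
            else
            {
                int32 UncompressedSize = 0;
                int32 CompressedSize = 0;
                Ar << UncompressedSize;
                Ar << CompressedSize;

                TArray<uint8> Compressed;
                Compressed.SetNumUninitialized(CompressedSize);
                Ar.Serialize(Compressed.GetData(), CompressedSize);

                VisibleVoxels.SetNumUninitialized(UncompressedSize);
                FCompression::UncompressMemory(NAME_Zlib, VisibleVoxels.GetData(), UncompressedSize,
                                               Compressed.GetData(), CompressedSize);
            }

            bOutSuccess = true;
            return true;
        }
    };

    // Required so the replication system actually calls the NetSerialize above.
    template<>
    struct TStructOpsTypeTraits<FChunkVoxelData> : public TStructOpsTypeTraitsBase2<FChunkVoxelData>
    {
        enum { WithNetSerializer = true };
    };

    The idea would be to make this struct a replicated UPROPERTY on the chunk actor instead of the raw voxel array, but I have no idea yet how well it behaves with large chunks.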

    Should I continue to pursue replication with compression, attempt to use RPCs to sync the data, or use sockets?
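
    In case RPCs turn out to be the better route, this is the kind of thing I am picturing: streaming each chunk's bytes to the joining player through their PlayerController in reliable slices. Also untested, and the class and function names (AVoxelPlayerController, SendChunkVoxels, ClientReceiveVoxelSlice) are mine, not engine API:

    // VoxelPlayerController.h (placeholder name)
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/PlayerController.h"
    #include "VoxelPlayerController.generated.h"

    UCLASS()
    class AVoxelPlayerController : public APlayerController
    {
        GENERATED_BODY()

    public:
        // Server side: push one chunk's voxel buffer to this player in slices,
        // so no single reliable RPC gets too large. A real version would probably
        // need to spread the slices across ticks to avoid filling the reliable buffer.
        void SendChunkVoxels(int32 ChunkId, const TArray<uint8>& Voxels)
        {
            const int32 SliceSize = 16 * 1024;
            for (int32 Offset = 0; Offset < Voxels.Num(); Offset += SliceSize)
            {
                const int32 Count = FMath::Min(SliceSize, Voxels.Num() - Offset);
                ClientReceiveVoxelSlice(ChunkId, Offset, Voxels.Num(),
                                        TArray<uint8>(Voxels.GetData() + Offset, Count));
            }
        }

        // Client side: reassemble the slices for a chunk.
        UFUNCTION(Client, Reliable)
        void ClientReceiveVoxelSlice(int32 ChunkId, int32 Offset, int32 TotalSize, const TArray<uint8>& Slice);

    private:
        TMap<int32, TArray<uint8>> PendingChunks;
    };

    // VoxelPlayerController.cpp
    #include "VoxelPlayerController.h"

    void AVoxelPlayerController::ClientReceiveVoxelSlice_Implementation(
        int32 ChunkId, int32 Offset, int32 TotalSize, const TArray<uint8>& Slice)
    {
        TArray<uint8>& Buffer = PendingChunks.FindOrAdd(ChunkId);
        if (Buffer.Num() != TotalSize)
        {
            Buffer.SetNumZeroed(TotalSize);
        }
        FMemory::Memcpy(Buffer.GetData() + Offset, Slice.GetData(), Slice.Num());
        // Once Offset + Slice.Num() == TotalSize, hand the buffer to the matching
        // chunk actor and rebuild its mesh (left out here).
    }

    The server would call SendChunkVoxels for the chunks near a player when they join (e.g. from the GameMode's PostLogin), which would at least sidestep the replicated-array size limit.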

    #2
    Instead of replicating so much data, why not just serialize a single uint8* property that stores all the voxels' visibility bits?!
    I think the engine also has a "BitField" property or something like that.
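
    Something like this is what I mean: one bit per voxel, packed into a single byte array that gets replicated as one property. Names are made up for the example; a 32x32x32 chunk would fit in 4096 bytes this way:

    // ChunkVisibility.h (example name)
    #pragma once

    #include "CoreMinimal.h"
    #include "ChunkVisibility.generated.h"

    USTRUCT()
    struct FChunkVisibility
    {
        GENERATED_BODY()

        // One bit per voxel instead of one replicated array element per voxel.
        UPROPERTY()
        TArray<uint8> PackedBits;

        void SetNumVoxels(int32 NumVoxels)
        {
            PackedBits.SetNumZeroed((NumVoxels + 7) / 8);
        }

        void SetVisible(int32 VoxelIndex, bool bVisible)
        {
            uint8& Byte = PackedBits[VoxelIndex / 8];
            const uint8 Mask = static_cast<uint8>(1 << (VoxelIndex % 8));
            Byte = bVisible ? static_cast<uint8>(Byte | Mask) : static_cast<uint8>(Byte & ~Mask);
        }

        bool IsVisible(int32 VoxelIndex) const
        {
            return (PackedBits[VoxelIndex / 8] & (1 << (VoxelIndex % 8))) != 0;
        }
    };

    Then the client just reads the bits back out when it rebuilds the chunk mesh.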
