Replicating Large Arrays and Data Sets

Article written by Alex K.

The replication system in Unreal Engine was built and optimized for the relatively small amounts of real-time data required to maintain the simulation state of an action game. Unfortunately, this means the system isn't well suited to some other use cases, such as synchronizing larger data sets or larger arrays, and we don't recommend relying heavily on Unreal replication for this purpose. Even so, some projects will run into this use case, and there are settings and limits that can be adjusted to help.

First, if you’re using a version before 4.26, raising the bandwidth limits is recommended. These are configurable in the *Engine.ini file:



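As a sketch, the bandwidth limits in question are the NetDriver rate settings, which can be raised in DefaultEngine.ini (property names assumed to be the standard UNetDriver rate limits; the values shown match the 100,000-byte figure cited for 4.26):

```ini
; DefaultEngine.ini
[/Script/Engine.NetDriver]
; Maximum bytes per second the server will send to each client
MaxClientRate=100000
; Same limit, applied to non-LAN (internet) clients
MaxInternetClientRate=100000
```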
These values were all raised to 100,000 bytes in 4.26. It’s also worth noting that you may have to specify the NetDriver settings for a subclass, such as [/Script/OnlineSubsystemUtils.IpNetDriver]. If the same values are set for the subclass, those values will take precedence.
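For example, a project that uses the IP net driver through the online subsystem would set the values on the subclass section; since subclass settings take precedence, these override anything configured under the base NetDriver section (property names assumed to match the base class):

```ini
; DefaultEngine.ini
[/Script/OnlineSubsystemUtils.IpNetDriver]
MaxClientRate=100000
MaxInternetClientRate=100000
```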

You should also increase the value of the net.PartialBunchReliableThreshold CVar. With this threshold set, if replicating a property or RPC parameters produces more partial bunches than the threshold allows (each partial bunch is roughly the size of the maximum transmission unit, or MTU), those partial bunches are sent reliably, improving robustness when larger arrays or other data structures are replicated. The default is 0 (disabled); a value of 4 is a reasonable starting point.
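CVars like this can be set from the console, on the command line, or persistently in config. A minimal sketch of the config route, assuming the standard [SystemSettings] section is used for CVars:

```ini
; DefaultEngine.ini
[SystemSettings]
; Send partial bunches reliably once a bunch splits into more than 4 pieces
net.PartialBunchReliableThreshold=4
```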

If you're using version 4.25 or earlier, you can raise the maximum replicated array size limits in the *Engine.ini file or in the editor under Project Settings → Engine → Network. In 4.26, these settings were deprecated and are no longer used; the engine now always behaves as if they were set to their previous maximum values.
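For 4.25 and earlier, the array limits correspond to the net.MaxRepArraySize and net.MaxRepArrayMemory CVars (names assumed from the engine's replication code; the values below are illustrative, not recommendations):

```ini
; DefaultEngine.ini
[SystemSettings]
; Maximum number of elements allowed in a replicated array (illustrative value)
net.MaxRepArraySize=4096
; Maximum memory, in bytes, a replicated array may occupy (illustrative value)
net.MaxRepArrayMemory=131072
```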

You may also get errors in your log saying “Received a partial bunch exceeding max allowed size” or “Attempted to send bunch exceeding max allowed size.” When sending or receiving a bunch, UChannel will check its size against NetMaxConstructedPartialBunchSizeBytes. If your bunches are too large, you may want to adjust the net.MaxConstructedPartialBunchSizeBytes CVar. It’s worth noting that this value is already quite large, so increasing it is not recommended unless completely necessary.
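If you do determine that raising this limit is necessary, the same [SystemSettings] config mechanism applies (the value below is illustrative only; check your engine version's default before changing it):

```ini
; DefaultEngine.ini
[SystemSettings]
; Allow larger constructed partial bunches before the channel errors out
net.MaxConstructedPartialBunchSizeBytes=131072
```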



With regard to Steam networking, can you address or advise on the issue of sending larger arrays and data sets — specifically, how to flush the socket?

Otherwise you are stuck only able to send fewer than 500k bytes at 10x/sec before you blow out your connection. Increasing the size does nothing; moreover, it seems to cause real issues with Steam, notwithstanding the claim that you can send up to 100 MB in a single message!

I have had to resort to using an RPC that sends only one record at a time, and only every 0.1 seconds. Fortunately it does work, and fortunately I don't usually have to update more than about 2–3 MB at a time. But my players NEED to be able to transfer and download data resulting from gameplay.