Article written by Alex K.
The replication system in Unreal Engine was built and optimized for the relatively small amounts of real-time data required to maintain the simulation state of an action game. Unfortunately, this means the system isn't well suited to some other use cases, such as synchronizing larger data sets or larger arrays, and we don't recommend relying heavily on Unreal replication for this purpose. Even so, some projects may run into this use case, and there are settings and limits that can be adjusted to help in that situation.
First, if you’re using a version before 4.26, raising the bandwidth limits is recommended. These are configurable in the *Engine.ini file:
[/Script/Engine.Player]
ConfiguredInternetSpeed=
ConfiguredLanSpeed=

[/Script/Engine.NetDriver]
MaxClientRate=
MaxInternetClientRate=
These default values were all raised to 100,000 bytes in 4.26. It's also worth noting that you may have to specify the NetDriver settings for a subclass, such as [/Script/OnlineSubsystemUtils.IpNetDriver]: if the same values are set on the subclass, the subclass values take precedence.
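For illustration, a DefaultEngine.ini sketch of these settings (the values shown are examples, not tuned recommendations for any particular project):

```ini
[/Script/Engine.Player]
ConfiguredInternetSpeed=100000
ConfiguredLanSpeed=100000

[/Script/Engine.NetDriver]
MaxClientRate=100000
MaxInternetClientRate=100000

; If your project uses IpNetDriver, values set here take
; precedence over the NetDriver section above.
[/Script/OnlineSubsystemUtils.IpNetDriver]
MaxClientRate=100000
MaxInternetClientRate=100000
```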
You should also increase the value of the net.PartialBunchReliableThreshold CVar. With this threshold set, if property or RPC parameter replication exceeds a certain multiple of the maximum transmission unit (MTU) size, the resulting partial bunches are treated as reliable, which improves robustness when larger arrays or other data structures are replicated. The default value is 0, and 4 is a recommended starting point.
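As a sketch, this CVar can be applied at startup via the [SystemSettings] section of DefaultEngine.ini (the value shown is the recommended starting point from above, not a tuned recommendation):

```ini
; DefaultEngine.ini – apply the CVar at engine startup.
[SystemSettings]
net.PartialBunchReliableThreshold=4
```

The same CVar can also be changed at runtime from the in-game console for experimentation before committing a value to the config file.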
If you're using version 4.25 or lower, you can set the maximum replicated array size limits in the *Engine.ini file or in the editor under Project Settings → Engine → Network. In version 4.26, these settings have been deprecated and are no longer used, as the engine now always uses the maximum values these settings previously allowed.
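For 4.25 and earlier, a minimal sketch of raising these limits in *Engine.ini, assuming the limits are exposed through the net.MaxRepArraySize and net.MaxRepArrayMemory console variables (verify the exact names and maximums against your engine version):

```ini
; Illustrative values only – confirm the CVar names and the
; allowed maximums for your specific engine version.
[SystemSettings]
net.MaxRepArraySize=65535     ; max number of elements in a replicated array
net.MaxRepArrayMemory=65535   ; max memory a replicated array may use
```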
You may also see errors in your log saying "Received a partial bunch exceeding max allowed size" or "Attempted to send bunch exceeding max allowed size." When sending or receiving a bunch, UChannel checks its size against NetMaxConstructedPartialBunchSizeBytes. If your bunches are too large, you may want to adjust the net.MaxConstructedPartialBunchSizeBytes CVar. Note, however, that this value is already quite large by default, so increasing it is not recommended unless absolutely necessary.
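If you do need to raise it, a sketch of the config entry (the value shown is illustrative, not a recommendation):

```ini
[SystemSettings]
; Only raise this if you are actually hitting the
; "exceeding max allowed size" errors described above.
net.MaxConstructedPartialBunchSizeBytes=131072
```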