Hi there,
We are currently migrating our dedicated server infrastructure from x86 to ARM, but during performance testing we have seen that the ARM machines are roughly 30% slower on the CPU, even though the x86 and ARM machines should be comparable in terms of performance.
Looking at the Unreal Insights trace captures, the difference seems to be mainly related to GameNetDriver.Tick: on x86 its duration is roughly constant, while on ARM it shows huge spikes lasting many times longer than the "normal" duration.
These spikes seem to occur near the outgoing replication of a struct that contains only a byte buffer (approx. 200 bytes) which we use to transfer physics prediction data for each pawn. Due to our game logic, this struct is replicated at quite a high frequency, which obviously impacts overall performance.
Is there any known issue with UE replication performance on ARM? Could this be caused by our buffer replication being unoptimized on ARM?
Do you have any suggestions on how to profile this further?
Regards,
Fabio Segantin