I am currently learning about hard and soft references and trying to optimize memory for my UE 5.6 multiplayer FPS game. I have ended up in one of those classic situations where I have a massive dependency chain, so every blueprint now has a memory size map of over 3.5 GiB and basically loads the entire game into memory. Everything is always loaded, and nothing ever gets unloaded. Let's just say I've learned the hard way why overusing the cast node and hard references is bad: every cast loads the referenced blueprint into memory, and the dependency graph can explode very quickly if not handled carefully.
As far as I understand, the best way to break dependency chains is to use a mix of native references, blueprint interfaces, event dispatchers, and soft references wherever hard references can be avoided.
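For anyone reading along, here is a rough sketch of the difference in native C++ terms (the class and property names are made up for illustration, not from any real project). A hard reference like `TSubclassOf` pulls the referenced class and everything it depends on into memory as soon as the owning asset loads; a `TSoftClassPtr` stores only the asset path until you explicitly load it:

```cpp
// Illustrative sketch only — AMyWeaponSpawner and its properties are
// hypothetical names, not part of the original post.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyWeaponSpawner.generated.h"

UCLASS()
class AMyWeaponSpawner : public AActor
{
    GENERATED_BODY()

public:
    // Hard reference: the weapon class (and its whole dependency chain)
    // is loaded into memory as soon as this spawner loads.
    UPROPERTY(EditAnywhere, Category = "Weapons")
    TSubclassOf<AActor> HardWeaponClass;

    // Soft reference: only the asset path is stored; the weapon class
    // stays out of memory until it is explicitly loaded.
    UPROPERTY(EditAnywhere, Category = "Weapons")
    TSoftClassPtr<AActor> SoftWeaponClass;
};
```

The blueprint-side equivalent of the soft property is a "Soft Class Reference" (or "Soft Object Reference") variable, which behaves the same way: the size map only counts the path, not the asset.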
But for a multiplayer FPS, I’m not sure what should be soft references and what should remain hard references. I have heard mixed advice, but it seems that if the object/class should always be active all the time, and is critical to the game (like the player), then it should be a hard reference. If it is something that you spawn in and destroy and does not need to be in memory 100 % of the time, then it should be a soft reference.
But soft references also take time to load into memory, so I was wondering: how does this rule apply to a multiplayer FPS, where reaction time and instant feedback are very important?
What can realistically be turned into soft references?
You have a good understanding of it, so really the answer depends. For one, what issue, if any, are you actually having? If your whole game fits in memory, then there is no need to change anything.
It really comes down to two things: memory, which is only an issue if you run out of it, and dependencies, which are only an issue if you want to reuse logic between projects.
Aside from that, my advice would be to use soft references for effects (sounds, materials, Niagara, etc.). The reason is that, one, they are usually the big memory hogs anyway, and two, they don't affect gameplay directly; if an impact effect fails to fire once, no one is likely to notice.
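To make that concrete, here is a hedged C++ sketch of loading a soft-referenced effect on demand (`AMyWeapon`, `SoftImpactEffect`, and `PlayImpactEffect` are hypothetical names; the pattern assumes `SoftImpactEffect` is a `UPROPERTY` of type `TSoftObjectPtr<UNiagaraSystem>` on the class):

```cpp
// Sketch: async-load a soft-referenced Niagara effect just before use.
// All names here are illustrative, not from the original thread.
#include "Engine/AssetManager.h"
#include "Engine/StreamableManager.h"
#include "NiagaraFunctionLibrary.h"
#include "NiagaraSystem.h"

void AMyWeapon::PlayImpactEffect(const FVector& Location)
{
    // Nothing assigned in the editor — nothing to play.
    if (SoftImpactEffect.IsNull())
    {
        return;
    }

    // Kick off a non-blocking load; the delegate fires once the asset
    // is in memory (immediately, if it is already loaded).
    FStreamableManager& Streamable = UAssetManager::GetStreamableManager();
    Streamable.RequestAsyncLoad(
        SoftImpactEffect.ToSoftObjectPath(),
        FStreamableDelegate::CreateWeakLambda(this, [this, Location]()
        {
            if (UNiagaraSystem* Effect = SoftImpactEffect.Get())
            {
                UNiagaraFunctionLibrary::SpawnSystemAtLocation(
                    GetWorld(), Effect, Location);
            }
        }));
}
```

In a blueprints-only project like the OP's, the equivalent is the "Async Load Asset" node: the effect arrives a frame or two late on a cold load, which for cosmetic impacts is usually imperceptible — which is exactly why effects are the safest candidates.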
My issue is mostly multiplayer performance. It's a 12-player PvP FPS on a medium-large map, made entirely in blueprints with a listen server architecture (one of the players is the host). Right now, performance is CPU-bound, and I could imagine part of the reason is that the CPU is spending a lot of time waiting on memory.
When testing with a full lobby of players, performance starts to drop a few minutes into the game, and noticeable hitches and stuttering appear. Some of it might still be related to network bandwidth and replication, but I have already made a lot of optimizations there. When I profiled a bit deeper, I noticed that every blueprint has an absolutely enormous memory size map and dependency graph, and nothing ever seems to get unloaded from memory because of this entanglement and the circular dependencies.
So that got me thinking: does poor memory management affect multiplayer performance in a setup like this? I know that replication cost scales with the number of players, and I could imagine the same applies to garbage collection. If everything hard references everything else, I suppose garbage collection becomes more expensive and time-consuming, and cannot free up memory because of those hard references. And if objects with replication logic get stuck in memory, I could imagine that is part of why performance starts to suffer.
Also, if I want to improve performance on less powerful computers, this seems like a good area to optimize. I know, of course, that you can only get so far with multiplayer performance on a listen server with a blueprints-only architecture, but is my assumption correct, and do you think optimizing memory usage will improve multiplayer performance in a setup like this?