Could anyone please tell me why the compile time progressively increases, getting slower and slower over a session? This happens only in my player's blueprint, which has a lot going on. I've already tried creating as many functions as I could and collapsing nodes (though I'm not sure that makes a difference). I have an i7-10700F and an RTX 3070.
Restarting the engine fixes the problem, but I have to do this every once in a while, which gets very annoying when dealing with fixes and all sorts of things.
Hard to tell from just a description, but I'm guessing you are probably using hard references in your blueprint. A hard reference causes the engine to load every referenced blueprint into memory, which drives up memory consumption.
Check whether you are hitting your max RAM capacity; if so, the system may be forced to rely on virtual memory, which is far slower than physical RAM.
Edit: If memory is the bottleneck causing the slowdown, then look into class and actor soft references and the mechanics of how to load them at runtime.
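To illustrate the idea: a minimal C++ sketch of a soft class reference that is loaded on demand. All names here (`AMyCharacter`, `WeaponClass`, etc.) are made up for illustration; the same pattern applies to Blueprint "Soft Class Reference" variables combined with the "Async Load Class Asset" node.

```cpp
#include "GameFramework/Character.h"
#include "Engine/AssetManager.h"
#include "Engine/StreamableManager.h"
#include "MyCharacter.generated.h"

UCLASS()
class AMyCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // Soft reference: stores only the asset path, so the weapon blueprint
    // is NOT pulled into memory just because the character is loaded.
    UPROPERTY(EditDefaultsOnly, Category = "Weapons")
    TSoftClassPtr<AActor> WeaponClass;

    void SpawnWeapon()
    {
        // Request an async load; the class only exists in memory once the
        // asset has actually streamed in.
        FStreamableManager& Streamable = UAssetManager::GetStreamableManager();
        Streamable.RequestAsyncLoad(
            WeaponClass.ToSoftObjectPath(),
            FStreamableDelegate::CreateUObject(this, &AMyCharacter::OnWeaponLoaded));
    }

    void OnWeaponLoaded()
    {
        // Get() returns a valid UClass* only after the load completed.
        if (UClass* LoadedClass = WeaponClass.Get())
        {
            GetWorld()->SpawnActor<AActor>(LoadedClass);
        }
    }
};
```

The key trade-off: a hard `TSubclassOf`/object reference is loaded with its owner, while a soft reference defers that cost until you explicitly load it.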
I’ve noticed that in the “Memory” tab, the “Cached” section shows 5.1GB, which has increased over time. Considering I only have 16GB of RAM, I suspect I might need an upgrade. I plan to upgrade to 64GB of RAM next week.
Could you please confirm if this is indeed the issue?
Another thing that comes to mind is memory usage from casting to other objects.
If you cast to many classes inside a blueprint, you are actually loading every class you cast to into memory. You could try replacing the casts with interfaces if this is impacting your project.
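As a rough sketch of the interface alternative in C++ (the Blueprint Interface asset works the same way): calling through an interface avoids the hard class dependency that a "Cast To" node creates, so the target class is not dragged into memory by the caller. The names `IInteractable` and `Interact` are hypothetical.

```cpp
#include "UObject/Interface.h"
#include "Interactable.generated.h"

UINTERFACE(MinimalAPI, Blueprintable)
class UInteractable : public UInterface
{
    GENERATED_BODY()
};

class IInteractable
{
    GENERATED_BODY()

public:
    // BlueprintNativeEvent so both C++ and Blueprint classes can implement it.
    UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Interaction")
    void Interact(AActor* InstigatorActor);
};

// Caller side: no cast to a concrete class, just a capability check.
void TryInteract(AActor* Target, AActor* InstigatorActor)
{
    if (Target && Target->Implements<UInteractable>())
    {
        IInteractable::Execute_Interact(Target, InstigatorActor);
    }
}
```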
5.1 GB usage isn’t bad considering Windows also eats up its fair share of memory. Unreal can use a lot of memory if you let it. This will of course be reduced once the game is packaged and the editor is no longer taking up RAM (only the game), but you need to be able to get to that stage.
A pretty good talk regarding asset dependency chains:
I do use interfaces. I have only a few casts to the player, which I will also get rid of, so I’ll be down to zero casts soon.
However, I noticed a while ago that when I collapsed nodes and made two more event graphs to spread the code around, the compile time was slightly better, but still annoying.
Is it also a viable choice to move a lot of the code into components?
The character uses the components, so they are loaded with it directly; this probably won’t reduce memory consumption. I haven’t tested this directly, though.
Perhaps if more of the code lives in components, then during a BP compile only the main character graph will be evaluated, so the engine will have less code to check. Just don’t go overboard with components, as they too have a performance impact (it’s best to use basic components that don’t even have their own transforms and just hold information/functions).
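For reference, a "logic only" component of the kind described above might look like this in C++ (a Blueprint `ActorComponent` subclass is the equivalent). `UActorComponent` has no transform, unlike `USceneComponent`, so it stays cheap; the `UHealthComponent` name and fields are invented for the example.

```cpp
#include "Components/ActorComponent.h"
#include "HealthComponent.generated.h"

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UHealthComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Health")
    float MaxHealth = 100.f;

    UPROPERTY(BlueprintReadOnly, Category = "Health")
    float CurrentHealth = 100.f;

    // Plain function/state holder: no ticking, no transform, just logic
    // moved out of the character blueprint.
    UFUNCTION(BlueprintCallable, Category = "Health")
    void ApplyDamage(float Amount)
    {
        CurrentHealth = FMath::Clamp(CurrentHealth - Amount, 0.f, MaxHealth);
    }
};
```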
On a side note, I’ve actually ordered 2x 32 GB of RAM (64 GB in total) based on our discussion yesterday.
Tonight I will test them out to see if there are any improvements.
It will help, but in the end, this is going to cause you problems.
To be clear, this might not be the cause of the blueprint lag, but it definitely won’t help
I once helped a guy with BP lag; when I finally got a copy of his project, the BP was just MASSIVE. No wonder there were problems. You can only stuff a certain amount of code into one BP.
He had literally written the whole game in one BP.
PS: Both subjects ( soft ref / interfaces ) are a bit subtle, so maybe come back if you get stuck.
Thank you, I will deep dive into both your video and the very first video from 3dRaven (from above). I will devour everything regarding soft references and interfaces and will come back. Thank you for your help!
So, I’m working my way through converting to soft references.
The size of the player’s Size Map is starting to decrease.
I am doing this by converting many of the variables I use into soft reference variables.
Now, my question is: will I run into any other kind of issue from having so many soft reference variables, or am I good to go?