Crash after multiple changes in PCG script

When working in the PCG script editor, each change triggers a “prepare PCG tasks” process. After a number of edits, Unreal crashes with “pagefile too small”. The system has 64 GB of internal memory and 3 x 64000 MB of pagefile (on different disks). Investigation of memory usage shows the following:
Open tabs: The world editor, Blueprint editor, PCG editor.
Unreal uses about 8 GB of memory.
While editing in the PCG editor, after each “prepare PCG tasks” process ends, the Unreal Editor uses:
Added one connection between nodes: over 21 GB
Added one node to the PCG script and connected it to another: over 31 GB
Removed another connection: just over 12 GB
Added a connection: over 23 GB
Added a node: over 34 GB
Disconnected nodes: over 45 GB
Connected the last added node's input: over 40 GB
Connected the last added node's output: freezes the system at up to 99% memory usage, or Unreal crashes.
The memory usage seems to depend on the type of node(s) involved. In some cases Unreal crashes, in others it just freezes the system, and in some cases the only solution is to reset the system or power it off.
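For anyone who wants to reproduce this growth pattern outside of Unreal, the behaviour described above is what you get from a cache that keeps every generation's data alive. A minimal Python sketch (the `cache` and `generate` names are illustrative, not Unreal APIs), using the standard `tracemalloc` module to watch the heap grow:

```python
# Hypothetical sketch: memory grows monotonically when per-"generation"
# data is cached and never released between runs.
import tracemalloc

cache = []  # stands in for per-generation point/scatter data kept alive

def generate(points=100_000):
    # each "generation" allocates fresh data and keeps a strong reference
    cache.append([float(i) for i in range(points)])

tracemalloc.start()
sizes = []
for _ in range(3):
    generate()
    current, _peak = tracemalloc.get_traced_memory()
    sizes.append(current)
tracemalloc.stop()

# old generations stay referenced, so traced memory keeps climbing
assert sizes[0] < sizes[1] < sizes[2]
```

If the editor behaved like a cache that evicted the previous generation, the traced size would plateau instead of climbing with every edit.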

An additional note: I noticed that this happens if I’m editing the script while the level that contains the output is open. It’s a big level, so that may also contribute to the issue.
I noted, but am unsure, that if there is another PCG script in the level, its task is also started but has no issues.
If I edit the script in another (almost empty) level without the output, it doesn’t happen; the task is not started.

The culprit is UnrealEditor - there must be a massive memory leak in PCG.
I had one project with PCG open (5.3) and opened a second with PCG open (5.2); the system became very, very slow. I killed the second project by closing all its windows from the taskbar, then stopped the first one the same way, so no UnrealEditor was active according to the taskbar. But Task Manager still showed about 50% of memory in use, and Resource Monitor showed the reason:

This machine has 64 GB of internal memory and 3 x 64 GB of pagefile on 3 disks.

I know it might be caused by hard-stopping Unreal, but my feeling is that somewhere in preparing the PCG tasks there is a huge memory leak. If I restart UnrealEditor without restarting the machine, it is obvious that this memory stays locked away from other processes and performance is degraded.

I found that the PCG editor has a memory leak problem: every time content is generated, it doesn’t release memory. I’m not talking about the memory the static meshes should take up, but about the cache created during generation. Every time you generate PCG, memory usage increases. But once you have generated what you want, save the scene and exit the editor; reopen the scene, and the memory usage is back to normal.

I found more proof of this.
In the Static Mesh Spawner, I replaced static meshes in the “mesh entries” list. After regenerating (cleaning and generating the PCG volume in the scene that contains the graph), both static meshes were showing. I didn’t try it, but I guess if I changed it again, the last and all previous meshes would show up. The condition remains until UE is stopped (and garbage collection has finished; this process can take minutes in the background!).
It means that references to replaced meshes are NOT removed/freed/deleted.
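The stale-reference behaviour described above can be sketched in a few lines of Python (the `MeshAsset`, `mesh_entries`, and `generated_cache` names are illustrative, not PCG APIs): replacing an entry does not free the old object as long as a generation cache still holds a strong reference to it.

```python
# Hypothetical sketch: a replaced entry stays alive until the cache that
# references it is cleared (analogous to restarting the editor).
import gc
import weakref

class MeshAsset:
    def __init__(self, name):
        self.name = name

mesh_entries = []     # the spawner's "mesh entries" list (illustrative)
generated_cache = []  # per-generation cache holding strong references

old = MeshAsset("SM_Rock")
mesh_entries.append(old)
generated_cache.append(old)  # generation caches a strong reference
old_ref = weakref.ref(old)

# "replace" the entry, as in the Static Mesh Spawner repro above
mesh_entries[0] = MeshAsset("SM_Tree")
del old
gc.collect()

# the replaced mesh is still alive: the generation cache kept a reference
assert old_ref() is not None

# only clearing the cache frees it
generated_cache.clear()
gc.collect()
assert old_ref() is None
```

If the generation cache dropped its reference when an entry is replaced, the old mesh would be collectable immediately instead of lingering until a full teardown.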


Hey, I confirm the same problem on 5.3.2. When PCG loads the scatter point data to apply on a landscape or any surface, there is indeed a memory leak: an amount of memory is added after each generation until you save or close the editor, or it crashes when too much memory stacks up!