GPU Memory leak after rendering with movie render queue

Hi everyone,

I’ll get right into it:
Every time I start a render job from Movie Render Queue, I can watch my GPU memory climb to near max (I have a GTX 1070 Ti, so for me that amounts to 8 GB of dedicated GPU memory). So far so good, and the render completes fine. But after the render is done, the engine, and in fact every application on my PC, becomes super choppy, even though there was no problem before. In the Task Manager resource tab I can see that the GPU memory is still near max, and it stays there until I close UE. If I then boot it up again, everything is back to normal, with very little GPU memory allocated.

This is 100% repeatable and happens every time I render for the first time. After that first render, working on the project is next to impossible; it feels like I'm on a 1990s machine. Even subsequent renders of the exact same scene and shot take way longer. Is this a feature where some of the render data is kept in video memory to speed up subsequent renders, and it just doesn't work as intended on my machine, or is this a bug? Any help is much appreciated!
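In case it helps anyone narrow this down: these are stock UE console commands (nothing project-specific assumed) that can show where the VRAM is actually going after a render finishes:

```
stat RHI
MemReport -full
r.Streaming.PoolSize 2000
```

`stat RHI` overlays RHI memory counters (texture and buffer allocations), `MemReport -full` writes a detailed memory report under Saved/Profiling/MemReports so you can compare before and after a render, and `r.Streaming.PoolSize` caps the texture streaming pool in MB; lowering it is a common workaround to keep streamed textures from pinning VRAM at max, though whether it helps here depends on whether the leaked memory is actually streaming-pool allocations.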

Specs:
UE 4.27.2
Running in D3D11 mode
x64 Win 10 B19041
i7-7700 8x3.6GHz
32 GB RAM
GTX 1070 Ti w/ 8 GB VRAM

Cheers, Andreas


Yep, I’m facing the same issue and can confirm that it is present in UE 5.1 as well. I’ll try to find a solution and post it here (if I manage to find one).

Hey,
Did anyone find a fix for this issue?
I have a large archviz scene to render out and MRQ crashes EVERY SINGLE TIME.

I run an RTX 2080 Super with an i7, and UE 5.0 is pretty much unusable due to D3D crashing.
On 5.1 everything looked like it was fixed, but MRQ is failing constantly.

Tried on a beefier 3090 with an i9 and I get exactly the same issue. The scene itself runs flawlessly at 35 fps.