Video memory has been exhausted

Thanks changing to dx11 fixed it! :slightly_smiling_face:

Be mindful that changing to DX11 disables Nanite. Just a heads up!

2 Likes

[screenshot: Capture]

Can someone please assist? I have tried to update my streaming pool and, as you can see in the right column of the screenshot, Pool Capacity is 5000 but the Streaming Pool shows 0. I have updated the setting both in the project and in the UE program files, but I still get pool-over-budget messages.

Can someone please explain this to me? It should read 5000, not 0, shouldn't it? And yes, I keep getting over-budget messages.

Ryzen 9
RTX 3070
32 GB RAM

Edit: DirectX 12 is causing most of the issues; 30-series video cards apparently have this problem.

But this doesn't tell me why my screen grab still shows my pool at “0”.

Any thoughts?

Thanks

Dean

Going to guess that it's because you have over 5 GB of non-streaming mips consuming all of your texture memory, leaving no room left to stream anything.
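A rough way to check this yourself, using the in-editor console (the stat group name is the standard one; the exact line labels vary slightly between engine versions):

```
stat streaming
```

Look at the non-streaming mips figure in the readout. If it is close to your pool capacity, almost nothing is left for streamed textures, which would match the over-budget messages.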

Thank you :) Very insightful and a great reminder. This is the solution.

Thanks for your suggestion.
I updated to the latest NVIDIA driver.
It didn’t work for me.
Just to let other users know.

Take this with a grain of salt, as it's opinion more than fact, but I have found that having Photoshop open, especially with multiple files, will cause the message to appear. It seems to have only started after using Nanite, though I have not tried disabling it. Unlike the “over max streaming budget” error, this one does not seem to come with any performance issues. For me, the only worry going forward is that it will cause issues with the final product, though from what I have read it seems that won't be the case.

It has been like this for years. The released version of the engine is just to showcase “amazing new stuff” to get games' and studios' eyes on it, no matter whether it works and whether you will NEVER see games shipped with that “amazing new stuff”.
Most of those features are simply deleted after years (like displacement/tessellation), despite never having been close to production-ready.
After 8 years of using UE, I'm convinced that big AAA studios have access to a whole different product than the public release, which is not really made to create real games. It's more to showcase and test things and see the public's reaction.

4 Likes

So I have a 13980HX Intel CPU with 32 GB DDR5 RAM and a 4090 mobile GPU with 16 GB VRAM…

I created an empty scene with an untextured landscape based on a height map, added one single Megascans rock, and added a Cine Camera for the Sequencer, animating it moving away from the rock slightly just to have some movement. I clicked render to render out a sequence using the deferred renderer (i.e. Unreal's in-game render engine). The scene renders out fine. I then put the landscape on a layer, the rock on its own layer, and the sky on its own layer, so that I could enable stencil rendering (for rendering layers). Still using the deferred renderer, it crashes with an out-of-video-memory error.

Now, this is pretty much an empty scene, and I have extremely good hardware. I tried the same thing on my desktop PC with a 3090 24 GB video card, and the same issue happens. Obviously video memory isn't the issue. Maybe we need video cards that don't exist yet? In which case, how does Epic do in-house testing?? LOL

I spent the entire day and night on Friday, till around 3 a.m., trying to troubleshoot this, and I couldn't find a solution. I even switched to DX11 and it didn't help. As long as I try to render stencil layers using the Movie Render Queue, it runs out of video memory no matter how little is in my scene. Honestly, I would love to know how others are rendering elaborate cinematics out of Unreal using stencil layers (Unreal's version of render layers). So far I cannot get it to work on any hardware, laptop or desktop, and I really do have top-of-the-line hardware that should be more than capable.

Have you tracked the GPU usage while opening Unreal?

I always turn virtual textures off, because I can never use the textures that way. I have virtual textures turned on in the project settings. How do I make it so I can use the virtual textures in my materials? Thanks!

SETTINGS > SCALABILITY SETTINGS > MEDIUM?

Worked for me!

1 Like

Enable virtual texture streaming on the texture asset.
Assign a virtual texture to the texture node in the material. Then “virtual color” will be an option on the node instead of “color” (and “virtual normal” instead of “normal”). The virtual options only appear if the node is set to a virtual texture.

But as I understand it, VT trades memory for CPU, and may help especially with big textures (4K+), but I wouldn't turn it on for everything. As soon as you do, you can also see that it increases disk size. So disk space and CPU in exchange for memory. And I assume it only helps if the whole texture isn't visible all at once.

It isn’t meant as an improvement over regular textures but as an additional option.
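For reference, project-wide virtual texture support is a renderer setting that can also be set in config, not just through the editor UI. A minimal sketch, assuming the standard `DefaultEngine.ini` layout (the per-texture streaming flag above still has to be enabled on each texture asset):

```ini
; Config/DefaultEngine.ini
[/Script/Engine.RendererSettings]
r.VirtualTextures=True   ; enable virtual texture support for the project
```

Changing this typically requires an editor restart, since it affects shader compilation.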

I also suffered memory-exhausted errors in 5.3 when using about 16 GB of the 24 GB on my RTX 3090.

Is this still happening in 5.4?

I fixed the issue in my project by lowering the streaming pool size using this command: r.Streaming.PoolSize xxxx

Replace the “xxxx” with the amount in MB. 3000 worked well for my project, but it'll depend on your project and graphics card.
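To make the setting persist across editor sessions rather than re-entering it in the console each time, the same cvar can go in config. A sketch, assuming the standard `[SystemSettings]` section of `DefaultEngine.ini` (3000 is just the example value from above, not a recommendation):

```ini
; Config/DefaultEngine.ini
[SystemSettings]
r.Streaming.PoolSize=3000   ; texture streaming pool size in MB, tune per project/GPU
```

Note that a value of 0 has a special meaning for this cvar (unlimited pool), which may explain readouts that show 0 where you expected a capacity.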

2 Likes