VRAM optimization for a low-end PC target in a walking horror game

I am developing a walking horror game for a general audience and aiming to optimize it for low-end PCs.

Currently, my project shows VRAM (VMem) usage of 11.70 GB in the editor. While it runs smoothly on my development PC, I am concerned that this may be too high for users with lower-spec hardware once the game is packaged.

My system memory (RAM) usage is well-controlled at 2.46 GB.

  1. What is a reasonable VRAM target for a general indie game aiming for low-end compatibility?

  2. Does the VRAM usage shown in the editor typically decrease or stay the same after the project is packaged into a Shipping build?



I would say 4 GB (or 8 GB) is a typical low-end VRAM capacity. The Steam Hardware Survey shows the most common configurations.

It’s worth noting that the editor will just soak up all your VRAM until it’s gone. No cleanup runs during editing and gameplay.

However, in a packaged game, things are handled much more efficiently. GPU memory will be cleared for new assets / levels if at all possible. So as long as your worst level fits in 4GB, you’re good to go :slight_smile: You can also limit your game’s VRAM by setting the streaming pool size with a console command, to see how well the distribution version behaves.

An easy way to tell is to restart the editor and immediately run the worst level with the stats showing.
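For example, these are the standard console commands I'd watch after the restart (which one matters most depends on what you're chasing):

```
stat streaming
stat memory
stat unit
```

`stat streaming` shows the texture streaming pool usage, `stat memory` gives a high-level memory breakdown, and `stat unit` shows frame / game thread / draw thread times.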

Apart from that, you’d need to use Unreal Insights, which can apparently also be used on a packaged game.


Thank you very much for your reply.

So, to be honest, does that mean the Mem and VMem values shown there are not entirely reliable?

It seems that the most accurate way to measure would be to actually package the project, or at least run it in Standalone mode and analyze it with Unreal Insights, since that would give numbers much closer to the packaged build. Is that correct?

Also, does stat Streaming show the actual VRAM usage?

I have been developing the game with the console command:

r.Streaming.PoolSize 2000

Since the PoolSize is set to 2000 and I am not receiving any warnings, does that mean the game’s memory usage is relatively light and within a safe range?
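For reference, I set it in DefaultEngine.ini so it applies every session rather than typing it each time. I believe the standard [SystemSettings] section works for this (the value is in MB):

```ini
; Config/DefaultEngine.ini
[SystemSettings]
r.Streaming.PoolSize=2000
```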

I would appreciate your clarification.


Figures in the editor mean very little, that’s why profiling the packaged game is the way to go.

The only exceptions (as a rule of thumb) are running the level immediately after a restart (and watching the stats), and possibly the CPU load.

You see, when the game is packaged, everything is baked, which means materials are much cheaper to load, all meshes are converted to instances etc. Quite a big difference.

If your whole thing runs in under 2GB, it sounds like you’re fine :smiley:

Another thing you can do is upload the game to Steam, but don’t release it yet. That way, you can give people keys to test-run it for you. Also, if you rummage around there, you can find the right ‘curators’ who will be interested in playing your game and can give you feedback.


Thank you very much. Yes, I will use the numbers right after restarting as a reference point for optimization.

I had never thought about letting people test the game through Steam before release — that’s a very helpful idea. I’ll definitely consider it. Buying an inexpensive used PC with minimum-spec hardware for testing also seems like a good option.

Now, I have another question, this time about the Game value.

Right after restarting, the FPS stays stable at 60 as shown in the image, and there is a clear difference between the Frame and Game values.

However, after playing several times in the Selected Viewport from the editor, the game gradually starts to stutter. Eventually, the Frame and Game values become almost the same, and the FPS drops to around 30–40.

When I play in Standalone mode, it runs much smoother (around 50–60 FPS).
In the editor, it becomes somewhat heavier, and during Play In Selected Viewport it becomes significantly heavier.

Restarting every time is a bit tedious. In UE5, is it unavoidable that cache or other data accumulates over time while repeatedly playing in the editor?

My PC specs are RTX 5070 and 32GB RAM, so hardware performance should not be an issue.

Here is part of the Unreal Insights capture.

I also found some articles suggesting that Slate might be the culprit. I’m not entirely sure why it would cause this after doing some research, but I’ll include the information here just in case.


Of course this will tell you a lot :slight_smile:

Apart from that, it looks like you have two things going on:

1. Something is using too much CPU. Is there much on tick?

2. It’s building up. So whatever is using the CPU is not being destroyed or re-used.

Does this sound like any system in your game? I see a lot of widget activity, for instance.

I also see what looks like the actor count rising. While you’re playing, the actor count (in the world outliner) should stay roughly the same (unless you’re spawning enemies etc, which I think you’re not). If the number just keeps going up, what is causing it? Try playing for a while, then press F8 and take a look at the outliner: is there a lot of something that should not be there?
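If digging through the outliner gets tedious, you can also dump object counts from the console; `obj list` is a standard command (the class names below are just examples, substitute whatever you suspect is leaking):

```
obj list class=Actor
obj list class=UserWidget
```

Each dump should end with a count and memory summary, so running it before and after a play session shows what’s accumulating.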


Thank you so much for your thoughtful help! I finally figured out the cause. I spent the whole day investigating and was completely exhausted, but now I think I can finally sleep well.

The cause turned out to be something very basic: I was simply opening too many tabs.

I often use “New Editor Window (PIE)” to play the game. While doing that, the Blueprint nodes I was working on remain visible, right? Since I use multiple monitors, I usually keep large Blueprint graphs open on another screen while playing.

It seems that this was the cause of the slowdown. Just having the Blueprint open is not necessarily a problem, but the real issue was having a large number of Blueprint nodes visibly rendered on screen at the same time.

When I leave code like in this screenshot open on the side, stat slate shows results like this.

And when I zoom in like in this other screenshot, the results look like this.

So opening too many tabs is not good after all. However, even after things became lighter, there are still around 1000 widgets showing in the stats. I assume this is unavoidable due to things like the Content Browser files and other editor UI elements, correct?


Ok, that’s better :slight_smile: CPU is still pretty high.

I don’t know where those widgets are coming from; I don’t know your setup. Like I say, watch the number of actors in the outliner. Does that tell you anything? Does the count just keep creeping up?

Also, try cutting large sections out of the level and saving as a new temporary level. What can you leave out that makes a big difference to this temp level?

Also, try with only one window open.