Summary: To increase performance for my UMG-only project (no 3d graphics needed), I have tried disabling world rendering and limiting frame rate to 30 FPS, and still find that my packaged blank project uses 20-25% of my RTX 3090 at runtime. Any tips to bring this usage down?
Hello,
I am working on a UE4.27 project that only needs to use UMG for UI graphics, with absolutely no 3d graphics required. This project is not a 2d game that might make use of Paper 2D, or anything like that. Rather, it is more like a series of UI menus, all using widgets.
My objective is to make the packaged project as lightweight as possible, in terms of GPU resources.
Starting with the "Blank" starter project, I have tried the following:
Using the "Set Enable World Rendering" function (with a boolean input of FALSE to disable rendering). This definitely helps, lowering GPU usage by 2-3x in my testing.
Setting and lowering the frame rate limit using the console command "t.MaxFPS x" (where x is the max FPS). I've set my limit to 30 FPS for the sake of this discussion.
Lowering the screen resolution scale. This doesn't make any difference since I've already disabled world rendering. That makes sense.
Changing actual screen resolution. Makes only a slight difference since, again, world rendering is already disabled.
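For reference, the frame-rate cap from step two can also be baked into the packaged build rather than set per-session from the console. A minimal sketch of a DefaultEngine.ini fragment, assuming UE4's standard [SystemSettings] section (which applies console variables at startup, including in packaged builds):

```ini
; Config/DefaultEngine.ini — a sketch, not a complete config.
; [SystemSettings] entries set console variables at engine startup,
; so the cap also applies to the packaged game.
[SystemSettings]
t.MaxFPS=30
; With world rendering disabled, vsync mostly adds latency here,
; so it can be turned off explicitly (optional).
r.VSync=0
```

Whether r.VSync should be on or off depends on the tearing/latency trade-off you want; the t.MaxFPS line is the one that matters for capping GPU work.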
My issue is that after taking the "Blank" UE4 project and disabling world rendering and limiting FPS to 30, I am still using more GPU than I would expect or like to (20-25% on an Nvidia RTX 3090, as shown by MSI Afterburner). I am confused how rendering a black screen at 30 FPS takes this much GPU.
My question to you all is:
Would anyone be willing to share ways to further limit GPU usage? Remember that I need absolutely no 3d graphics.
I have tested this node in one of my test scenes (a Paper2D world with jump-and-run stuff), and that alone dropped the usage from 43% to 7-9% at most, while the HUD was still rendered and functional. That makes this node really interesting for pure 2D games ^.^
However, my game has vsync (or smooth framerate) enabled, because without it I ran into the issue that the frame rate was capped to 120 FPS in the editor, but in a test build it ran completely uncapped, pushing the GPU to 99% usage.
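The uncapped-in-packaged-builds problem described above can also be addressed in config rather than relying on vsync. A hedged sketch of the standard UE4 smooth-framerate settings in DefaultEngine.ini (the exact bounds are placeholders, pick whatever range suits your project):

```ini
; Config/DefaultEngine.ini — sketch of the smooth-framerate clamp.
; These settings apply in packaged builds as well as the editor.
[/Script/Engine.Engine]
bSmoothFrameRate=True
SmoothedFrameRateRange=(LowerBound=(Type=Inclusive,Value=30.0),UpperBound=(Type=Inclusive,Value=60.0))
```

Note that smooth framerate and t.MaxFPS can interact; in practice you generally want one mechanism as the authoritative cap.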
Thank you for the suggestion! Unfortunately, I just tested this out and it does not seem to have made any noticeable difference. I'll keep learning about optimization and see if I can find another applicable method.
Edit: actually it raised the frame rate to the refresh rate of my monitor (60 Hz), and now it draws slightly more resources compared to when I had it set to 30 FPS.
OK, for a test I created a completely blank (and therefore black) new level, with just an event and a retriggerable delay coupled to a FlipFlop that turns this world-rendering node on and off, just to see how much an empty level draws on my machine (using a 2070).
Framerate is set just like in my screenshot.
Whelp, I thought I suck at coding, but even that completely black and empty level starts at 25-30%, with absolutely nothing in it.
The moment the FlipFlop turns off world rendering, it drops to 5-8%.
Yeah, even blank scenes with nothing in them, not even a camera, seem to be pretty power hungry o.O
Just to provide an update here: I've looked at this topic a bit more and done some more testing, and it is pretty clear to me that it's more complicated than just saying "Unreal Engine is taking up 20-25% of my GPU when rendering a blank screen" like I did earlier. There are caveats here.
If I run two instances of my packaged game simultaneously, my GPU usage does not double compared to running only one instance. It instead rises by substantially less than double; e.g. going from 25% total usage for one instance to 33% total usage for two, so the second instance appears to consume only about 8%.
The other Windows process to pay attention to here is the "Desktop Window Manager," whose consumption appears to rise and fall according to the framerate and resolution of your UE4 project. However, Desktop Window Manager's GPU usage does not increase by much (and may even decrease) when you open additional instances of your UE4 packaged project.
What's really interesting to me is that disabling world rendering in your UE4 project drops the GPU consumption of the UE4 process (OK, that makes sense) BUT raises the consumption of the Desktop Window Manager (photo attached). I have no explanation for any of this.
Also, I tried this same experiment (blank project, limited framerate, unrendered world + black screen) on a much older GPU with around one-third the power of my RTX 3090, and it used less of that GPU (only 16-18%) than my RTX 3090 did. So, without digging into the optimization knowledge base, it's not at all clear how the way UE4 handles rendering relates to the reported GPU usage.
In my recent experience, the practical bottom line is this: limiting your framerate and turning off world rendering really does lower your resource consumption. Even though running one instance of the game may keep your GPU at a higher-than-expected usage level, most of that level is attributable to the Desktop Window Manager. Because Desktop Window Manager's GPU usage does not really scale with the number of running UE4 instances, each additional instance of the UE4 game only adds a relatively small amount of GPU usage. That means you can run more instances before overloading your GPU than you would predict from the observed resource consumption of a single instance.