Hey all! I’ve been working on my first UE5 project, and for my birthday I nabbed myself a new graphics card: an MSI RTX 3090 Ti Suprim. Previously in my project I was getting about 28 FPS average. It hovered between 22 and 29 on my Gigabyte 2080 OC with FPS smoothing off and screen percentage set to 100%.
I just got my new 3090 Ti in there and the drivers installed. When I fired up the project, I noticed my viewport performance didn’t jump up like I was hoping. I was actually somehow getting the same magical average of about 28 FPS. So, that was odd. Things I have tried:
A few restarts and no dice.
I looked at my CPU usage to see if it was somehow bottlenecking me, and it never went above 52% utilization.
Then I set screen percentage to 200% to see if my FPS would drop on the 3090 Ti, and it did not. It actually went up a tiny bit, and I started seeing more 30 FPS readings in the range, which is weird as hell.
Then I ran the command t.MaxFPS 144 to match my monitor’s refresh rate, and it changed nothing. Setting it back to t.MaxFPS 0, which should unlock the frame rate, also changed nothing.
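In case it helps anyone diagnosing the same thing: Unreal’s built-in stat commands (typed into the editor console, opened with the backtick key) show whether a frame is CPU-bound or GPU-bound:

```
stat unit
stat fps
stat gpu
```

In the `stat unit` readout, Frame is the total frame time, Game is the CPU game thread, Draw is the CPU render thread, and GPU is the graphics card. Whichever of Game, Draw, or GPU tracks closest to Frame is the bottleneck; if GPU time sits well under the frame time, a faster card won’t raise FPS.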
That’s all I could find to try from Google searching and such. I am going to try uninstalling and reinstalling Unreal to see if that may possibly help. Let me know if you have any ideas; it’s all really appreciated. Thanks, y’all!
Your system may be rendering on your built-in (integrated) graphics instead of your card. You can try resetting your Nvidia global 3D settings. Here is a non-Epic-affiliated link to instructions from Nvidia themselves: To set a default 3D setting.
If that does not work, how much RAM do you have, and do you have any background processes running?
Any additional specifics and information you provide can go a long way in solving your problem!
Thanks! I’ll give that a go right now. I have yet to uninstall and reinstall Unreal. It’s just quite a large download. But I should give that a go too. Will report back
I will double-check the power settings, but I believe it is! I also forgot to mention that I disabled the integrated GPU as part of my checks a moment ago and used only one monitor connection to the GPU. It STILL didn’t work, which is crazy. I thought the same thing you did!
Good thinking! I was watching a lot of YouTube videos and turning things on and off that I didn’t really understand. I’ve been following William Faucher’s videos on UE5 setup and debugging issues with Lumen and Nanite. Let me check my settings; I believe I have hardware ray tracing turned off because of a Nanite issue I was getting, where shadows on far-off objects were lower quality because they used the base Nanite mesh as the source for the shadow.
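For anyone checking the same settings: those toggles map to console variables that can be pinned in Config/DefaultEngine.ini. A rough sketch, assuming UE5’s standard cvar names (verify against your engine version before relying on it):

```
[/Script/Engine.RendererSettings]
r.RayTracing=True
r.Lumen.HardwareRayTracing=True
r.Shadow.Virtual.Enable=1
```

The first enables hardware ray tracing support, the second lets Lumen use it when available, and the third enables Virtual Shadow Maps, the Nanite-aware shadow path, which may help with the far-off shadow quality issue.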
Well, the odd part is that Unreal isn’t even using much over 55% of the GPU’s power. It is odd that the benchmark score is below average; if I had to suspect something, it would be heat, since the peak temperature on it was high. I can’t water-cool it like I used to with my old GPU because my water-cooling block doesn’t fit, so I’ll have to invest in a new one. But I’d like to figure out why Unreal isn’t even using all of the GPU; the temps weren’t high during my testing. You can see it in the video above.
It’s likely my CPU throttling me at that point if I’m looking to get into the top 10, lol. It’s a couple of generations old now. Also, those guys would have overclocked their systems, so I don’t expect to get anywhere near them.
In the Nvidia Control Panel, under 3D settings, set it to use the Nvidia GPU as the global default. I’ve found that setting it per program still doesn’t get as good performance as having it as the global default.