Hello everyone!
I am facing a problem that is completely unclear to me and I don’t even know where to start. I will be very grateful for any help or advice.
I have two computers.
The older one has an i5-13400 and an RTX 3060, and the newer one has an i7-14700K and an RTX 4090. Both computers have 48 GB of RAM.
The project was originally developed on the older computer, but after I bought the second one I decided to continue working on it there. I was surprised to find that the FPS of the packaged project under Windows 11 barely changes between the two computers.
In my game the RTX 3060 gives 30 FPS, while the RTX 4090 gives 43 FPS, on the same monitor and with the same settings. This already looks pretty strange. At the same time, in the editor on the 4090 I see about 100 FPS at Epic quality, but as soon as I start Play in Editor it immediately drops to 30-40 FPS, the same as in the packaged project.
The game also behaves strangely when I change the graphics settings (I use the standard Game Settings).
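For reference, here is a minimal sketch of how those settings get applied, assuming the engine's stock UGameUserSettings class (which is what I mean by the standard Game Settings); this is illustrative rather than my exact code:

```cpp
// Sketch: applying scalability settings through the stock UGameUserSettings.
#include "GameFramework/GameUserSettings.h"

void ApplyTestSettings()
{
    UGameUserSettings* Settings = UGameUserSettings::GetGameUserSettings();

    // Overall quality preset: 0 = Low ... 3 = Epic
    Settings->SetOverallScalabilityLevel(3);

    // Individual scalability groups can be overridden, e.g. foliage down to 0
    Settings->SetFoliageQuality(0);

    // Resolution and window mode used for the tests below
    Settings->SetScreenResolution(FIntPoint(1280, 720));
    Settings->SetFullscreenMode(EWindowMode::Fullscreen);

    // Save and apply (false = don't re-check command-line overrides)
    Settings->ApplySettings(false);
}
```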
I monitor the video card's behavior through MSI Afterburner while running the packaged project.
On the RTX 4090 at 1280x720 I get 38 FPS (GPU power draw is about 80 W); at 3840x2160 I get 43 FPS (about 150 W). If I set Foliage = 0, the frame rate does not change at all, even though there is a lot of vegetation in the scene. What is telling is that the GPU power draw changes in line with the settings and resolution, but the FPS somehow does not. In FurMark the card draws up to 450 W, so the GPU is never fully loaded in my game; MSI Afterburner shows GPU load at around 50-60%.
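In case it helps, this is the kind of check I can run from code in a Development (non-Shipping) packaged build to confirm the foliage setting actually takes effect and to see how the frame time splits between the game thread, render thread, and GPU. The helper function name is made up for illustration; the console commands (sg.FoliageQuality, stat unit, stat gpu) are the standard engine ones:

```cpp
// Sketch: forcing the foliage cvar and showing frame-time stats from code,
// so the same test can be reproduced in a packaged Development build.
#include "Kismet/KismetSystemLibrary.h"

void RunPerfChecks(UObject* WorldContext)  // hypothetical helper
{
    // Set the foliage scalability group to 0 directly via its cvar
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("sg.FoliageQuality 0"));

    // Per-frame timings: Game thread, Draw (render thread), GPU
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("stat unit"));

    // Per-pass GPU timings
    UKismetSystemLibrary::ExecuteConsoleCommand(WorldContext, TEXT("stat gpu"));
}
```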
On the 3060 the frame rate is slightly lower, but the picture is the same: changing the settings produces no change in FPS.
I am in desperate need of help :(
I'm ready to provide any additional info and answer any questions.
Thanks in advance to everyone!