Does anyone know if Nvidia automatically calibrates a packaged game’s rendering to optimal settings through the driver app, like it does for all other published games in my library?
Or does the game company set the suggested defaults themselves?
I doubt it's done per game. Nvidia most likely tests a set of popular games (or categories of games) with automated runs across several different PC configurations in a cloud data centre, then uses those results as a baseline to compare against a specific game.
The optimise button will look for an .ini file; if it doesn't exist or is read-only, it will say 'cannot optimise'.
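For reference, here is a minimal sketch of the kind of .ini file the optimise button would target, assuming an Unreal Engine title (the path and keys vary per engine and game; the section and keys below are UE's standard scalability groups, not anything Nvidia documents):

```ini
; Typical location for a packaged UE game on Windows (assumption; varies per title):
;   <GameName>/Saved/Config/Windows/GameUserSettings.ini
[ScalabilityGroups]
sg.ResolutionQuality=100
sg.ViewDistanceQuality=3
sg.AntiAliasingQuality=3
sg.ShadowQuality=3
sg.PostProcessQuality=3
sg.TextureQuality=3
sg.EffectsQuality=3
```

Whatever the actual file is for your game, it has to exist and be writable, or you'll get the 'cannot optimise' message.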
Yes, I figured as much, but had trouble finding any documentation on the side of Nvidia and/or Epic.
You helped point me in the right direction, and I came across a list of supported titles:
So, any title not on this list will not show an optimise option.
I’ll contact them to see if there’s a way around this, because I am CERTAIN the settings I’m using in my unfinished packaged game (packaged for debugging purposes) are not optimal: the frame rate is lower in-game than in the editor (only slightly, but it should be higher).
I’ve been tinkering with the settings, and though I do see improvement, I’m sure there’s a switch I’m missing.
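One thing worth ruling out (an assumption on my part, not something confirmed above): a package built in the Development or DebugGame configuration carries overhead that a Shipping build doesn't, which alone can explain a packaged game running slightly slower than the editor. For comparing the two, a few stock Unreal console commands help (opened with the backtick key in a non-Shipping build):

```
stat fps      (show frame rate)
stat unit     (split frame time into Game / Draw / GPU thread)
t.MaxFPS 0    (remove any engine frame-rate cap while measuring)
```

If `stat unit` shows the same thread as the bottleneck in both editor and package, the gap is probably build configuration rather than a graphics setting.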