How to get native display resolution

Hi,

I would like to make a simple initialization script that reads the native resolution of the display (or, alternatively, reads it and then offers the user a few resolutions to pick from based on it).

My problem is that on a Mac with a Retina display, I can only get at most half the resolution in each dimension (a quarter of the pixels). On my iMac Pro, for example, when I launch an Unreal Engine build, it starts at 2560x1440, even though the screen’s native resolution is 5120x2880.

When I do the same using Unity, Godot or SceneKit, the window gets initialized at the native resolution of the display. Only Unreal Engine seems to pick its maximum resolution based on points (logical dots) instead of pixels, which leads to the discrepancy.
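To illustrate the point-vs-pixel split, here is a minimal AppKit sketch (plain Swift, nothing Unreal-specific). On this iMac Pro the frame comes back as 2560x1440 points while the backing store is 5120x2880 pixels:

```swift
import AppKit

// Minimal sketch: the same screen reported in points (logical dots) vs. backing pixels.
if let screen = NSScreen.main {
    let points = screen.frame.size                          // e.g. 2560x1440 on a 5K iMac
    let pixels = screen.convertRectToBacking(screen.frame)  // e.g. 5120x2880
    let scale  = screen.backingScaleFactor                  // 2.0 on Retina displays

    print("Points: \(Int(points.width)) x \(Int(points.height))")
    print("Pixels: \(Int(pixels.width)) x \(Int(pixels.height))")
    print("Backing scale factor: \(scale)")
}
```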

Using the console and running r.SetRes 5120x2880 doesn’t work.
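For anyone wanting to reproduce this, the standard r.SetRes forms with a window-mode suffix (f = fullscreen, wf = windowed fullscreen) would be the following; I’m assuming the suffix isn’t the issue, since the cap appears to be in how the display metrics are read:

```
r.SetRes 5120x2880f
r.SetRes 5120x2880wf
```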

I know there’s a workaround: using a tool like RDM to set the desktop resolution to 5120x2880 non-Retina. After launching the app in this mode, it runs fine at native resolution, but I can’t expect users to do that every time they launch an Unreal Engine application.

Is there a way to teach Unreal Engine how to get the native display resolution?
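For comparison, the real pixel modes are queryable from Core Graphics. This is a rough sketch (plain Swift, not Unreal code) of what a native app can see, including the 5120x2880 mode that Unreal apparently never considers:

```swift
import CoreGraphics

// Sketch: list the display modes of the main display, comparing each mode's
// point size with its true pixel size.
let displayID = CGMainDisplayID()

// Ask for the full mode list, including the scaled / HiDPI variants.
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary

if let modes = CGDisplayCopyAllDisplayModes(displayID, options) as? [CGDisplayMode] {
    for mode in modes {
        print("\(mode.width) x \(mode.height) points -> \(mode.pixelWidth) x \(mode.pixelHeight) pixels")
    }
}
```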

HiDPI works fine in the editor, even when I use the “new editor window” setting. This is only a problem with standalone builds.

Many thanks!

Just wondering, have you found a fix for this? I’m also stuck… :slight_smile: