Why is *window size*, not *screen DPI*, used for DPI calculations in Slate and UMG?

For desktop platforms in windowed mode (and possibly also mobile platforms that allow non-fullscreen apps; I haven't tested this), the DPI scaling curve isn't driven by the DPI of the screen itself, which depends on the screen's resolution relative to its physical size. Instead, it treats changes in window size as changes in DPI.
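As I understand it, the default behaviour is roughly equivalent to evaluating a resolution-to-scale curve at the viewport's shortest side (the "Shortest Side" DPI scale rule in Project Settings → Engine → User Interface). A simplified sketch, not Unreal's actual code, with illustrative curve keys rather than the engine's exact defaults:

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Piecewise-linear curve: (shortest viewport side in px) -> UI scale.
// The key values are illustrative, not Unreal's exact defaults.
static const std::vector<std::pair<float, float>> kDpiCurve = {
    {480.0f, 0.444f}, {720.0f, 0.666f}, {1080.0f, 1.0f},
    {1440.0f, 1.333f}, {2160.0f, 2.0f}};

// Linear interpolation between curve keys, clamped at both ends.
float EvaluateCurve(float x) {
    if (x <= kDpiCurve.front().first) return kDpiCurve.front().second;
    if (x >= kDpiCurve.back().first) return kDpiCurve.back().second;
    for (std::size_t i = 1; i < kDpiCurve.size(); ++i) {
        if (x <= kDpiCurve[i].first) {
            const auto [x0, y0] = kDpiCurve[i - 1];
            const auto [x1, y1] = kDpiCurve[i];
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0);
        }
    }
    return kDpiCurve.back().second;
}

// Window-size-based scale: note that the monitor's physical DPI
// never enters the calculation, only the viewport dimensions.
float UiScaleFromWindowSize(int width, int height) {
    return EvaluateCurve(static_cast<float>(std::min(width, height)));
}
```

With these illustrative keys, a 1920×1080 full-screen window gets scale 1.0, while the same app resized to a 1280×720 window drops to about 0.67: every widget shrinks by a third even though the monitor's DPI never changed.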

All other software I've used, as well as UI frameworks like Qt, use the former, which makes sense: DPI is a property of the screen, so it's constant for a given screen regardless of window size. Unreal is the only software I've seen that does the latter. It seems counter-intuitive and goes against reasonable expectations. Using an application in windowed mode, you expect the elements to stay the same size as in full-screen, not scale down. Likewise, using the same application on an actual high-DPI screen, you expect the elements to be larger in pixels but the same physical size.

So, if you have a regular-DPI (1.0 scale) screen and display Unreal in a window smaller than full-screen, the scaling curve sees a smaller resolution and scales the UI down accordingly. Elements on screen shrink, making text harder to read, buttons harder to press, and so on. On a high-DPI screen, windowed mode shrinks the UI even further, so the problem persists on any screen of any DPI.

So, for Unreal, the only workaround is setting the DPI curve to a constant 1 so that elements don't change size. That fixes the strange behaviour when resizing the window, but it also makes the application DPI-unaware, so elements appear too small on high-DPI screens.
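For contrast, the rule I'd expect, and what Qt, Windows, etc. apply, derives the scale from the monitor alone, so it is constant regardless of window size. A minimal sketch, assuming the conventional 96 DPI baseline used on Windows (the engine can query the OS per-monitor scale, e.g. via `FPlatformApplicationMisc::GetDPIScaleFactorAtPoint`, so the information seems available; it just isn't wired into the UMG scaling curve as far as I can tell):

```cpp
// Screen-DPI-based scale: a property of the monitor, independent of
// window size. 96 DPI is the conventional 100% baseline on Windows.
float UiScaleFromScreenDpi(float monitorDpi) {
    return monitorDpi / 96.0f;
}
```

Under this rule a 96 DPI monitor always yields 1.0 and a 192 DPI ("200%") monitor always yields 2.0, whether the app is windowed or full-screen.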

Why is there no setting to link Unreal's DPI calculation not to the game window size but to the actual DPI of the screen, as every other UI framework and application does?