Unreal Engine 5.5 Released

This one:

Why would you even need to do that though? It's 2025, not 2005, just set the resolution scale higher, since it's going to be downsampled to fit your screen anyways… Want 4K on a 1080p monitor? r.screenpercentage 200 and you're done. Want 720p on a 1080p monitor? r.screenpercentage 66 and so on.

And why not?

I'm sorry, but I'm not a fan of conformist arguments that ask us to be happy with having one (or many) fewer options because "it was useless, thank goodness they broke it."

I'm working on a PC benchmark, so running it at the desired resolution is important, even though some people might think Unreal is only meant to create what they have in mind.

I doubt that setting "4K" via screen percentage yields the same performance as setting a real 4K resolution. Plus, 200% for a 1080p screen isn't the same as for a 720p or 1440p screen, not to mention ultrawide displays. Am I supposed to also program an algorithm to calculate the screen percentage equivalent based on screen size? Are you telling me that games offering a dropdown menu with resolution options are doing so through screen percentage? I'm also not sure whether screen percentage can go beyond 200%, which would mean a 1080p screen couldn't run the benchmark at 8K, for example (yes, I can already imagine your perspective: why would you want to run it at 8K if we're happy with fewer options?).

Sorry if I seem rude, but I'm really tired of Unreal breaking things and users taking a defeatist stance, saying, "it's better this way, with broken options that limit their use." Honestly, I can't make any sense of that attitude.

Anyway, I just tested my packaged benchmark in native UHD and in QHD at 150% (scaled to UHD), and, indeed, the performance is not the same, with a difference of about 5% in this test. Why is this important? Because a user with a 1080p screen who wants to run the benchmark in UHD should get the same performance as when they buy a UHD monitor and run the benchmark at its native resolution. It's that simple: I want realistic numbers.

I have also tried 1080p fullscreen on my QHD monitor, scaled to 200%, and the image quality was much worse and blurrier than when running it at UHD directly.

So, honestly, I'd much rather have the option to set the native resolution I want instead of relying on scaling tricks that don't correspond 1:1 with reality. And not just for the sake of an accurate benchmark: if it were a regular game, anyone should still prefer it, based on the tests I've just done.

It does within margin of error. Try it with a previous version of UE5 where you can still use setres without it bugging out.

Yeah, you use a simple equation to calculate the % you need. Just get the display resolution; then the equation is: resolutionscale = 100 * (target resolution / display resolution). Want to go from 1440p to 1080p? (1080/1440)*100 = 75. Want 4K on a 1440p display? (2160/1440)*100 = 150, and so on.
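If you'd rather do that from code than by hand, here's a minimal sketch, assuming you already know the display's native vertical resolution; the function name is just illustrative, the rest is the standard console variable API:

```cpp
#include "HAL/IConsoleManager.h"

// Compute the screen percentage needed to render a target vertical resolution
// on a display with a given native vertical resolution, then apply it through
// the r.ScreenPercentage console variable.
void ApplyTargetRenderHeight(int32 TargetHeight, int32 NativeDisplayHeight)
{
    // resolutionscale = 100 * (target resolution / display resolution)
    const float Percentage = 100.f * static_cast<float>(TargetHeight) / static_cast<float>(NativeDisplayHeight);

    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        CVar->Set(Percentage);
    }
}

// Examples from above:
// ApplyTargetRenderHeight(1080, 1440); // -> 75
// ApplyTargetRenderHeight(2160, 1440); // -> 150
```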

Most modern DX12/Vulkan games all do this under the hood, even when they give you resolution options in the settings. Exclusive fullscreen for DX12 really isn't a thing anymore, and if you do force it by disabling FSO/MPO and messing with the present methods, it can have some performance drawbacks with modern drivers and builds of Windows. So there's no need to actually cause the monitor to blank and change to a separate resolution. Some games still give the option, but the DWM still treats it like borderless fullscreen (no screen blanking when alt+tabbing back to the desktop) when it composites the frames with MPO. Basically, there's no advantage to using exclusive fullscreen anymore, assuming you're on a build of Windows from within the past few years.

Plus, straying away from a monitor's native resolution introduces artifacts in the actual image displayed on the physical screen. Displays should always be run at native resolution. You can use filtering options like bilinear/bicubic for down/upscaling an image, and it will result in a better image on the native-resolution screen than manually changing the display resolution to a non-native resolution on the monitor. And by native resolution, I mean at the hardware level: how many pixels are physically built into the display.

And again, it's 2025… People alt+tab a lot and tons of people use multi-monitor setups. Though I'm sure they will eventually get around to fixing the setres issue in some way, shape or form.

Hi @IronicParadox,

Thank you for trying to help and for the advice!

Yes, I tried it in 5.3, where setres works as expected.

In my case, I was lucky enough to enforce the custom resolution in forced fullscreen mode (with a black flicker when changing resolutions), and no one observed artifacts or compatibility issues with Windows. It even locked out resizing the window when switching applications with Alt+Tab, and the benchmark kept running at its custom resolution (I was very careful to prevent cheating). I use three screens: two at 1080p and the main one at QHD.
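For reference, if setres stays broken, the route I would try instead is GameUserSettings; this is only a sketch using standard engine calls (the function name is mine), and I haven't verified whether it runs into the same 5.5 regression:

```cpp
#include "GameFramework/GameUserSettings.h"

// Request a specific fullscreen resolution through the game user settings
// instead of the setres console command.
void RequestFullscreenResolution(int32 Width, int32 Height)
{
    if (UGameUserSettings* Settings = UGameUserSettings::GetGameUserSettings())
    {
        Settings->SetScreenResolution(FIntPoint(Width, Height));
        Settings->SetFullscreenMode(EWindowMode::Fullscreen);
        Settings->ApplySettings(false); // false: don't re-check command-line overrides
    }
}
```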

But if this is now, at least for the time being, the only way… Are you saying the GPU internally renders the same number of pixels in the case of, for example, 1080p@200% as it does with native UHD? If so, and to maintain equivalence across different monitors (but with the same PC hardware), should I always use TAA, which should be more consistent across different resolutions, instead of TSR, which I believe would cause more performance variation in the case of running at, for example, 1080p@200% vs. UHD?

By the way, the bilinear filter you mentioned: this isn't something that can be configured in Unreal, right?

Thanks!

As far as I know, any resolution scaling will automatically use algorithms like those when up/downsampling, usually something like bilinear/bicubic/Lanczos. Your monitor doesn't*, though, so if you're setting a 1080p monitor to run at, say, 720p, there are still physically 1920x1080 pixels built into the screen. Those 1280x720 pixels that you want to map to your display aren't going to line up 1:1 with the physical pixels built into the display. This leads to hardware-level aliasing issues. So it's always better to run the display at its native resolution, like 1080p, and if you still want 720p quality, you upscale it to 1080p. There will be far fewer artifacts/aliasing issues by doing it that way.

*And yes, I'm sure a lot of monitors have their own algorithms for upscaling non-native resolutions to native, like running 720p on a 1080p-native panel, but it would be a blind, raw pixel-by-pixel upscale with zero access to things like frame buffers.
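And on the earlier question about configuring that filter in Unreal: if I remember right, the spatial upscale filter is exposed through the r.Upscale.Quality console variable (it should only matter when the legacy spatial upscaler performs the final upscale, not when TSR handles it). A minimal sketch, with an illustrative function name:

```cpp
#include "HAL/IConsoleManager.h"

// Pick the spatial upscale filter; check the cvar's help text for the exact
// value meanings (lower values are nearest/bilinear, higher values are
// bicubic/Lanczos-style kernels).
void SetSpatialUpscaleFilter(int32 Quality)
{
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(TEXT("r.Upscale.Quality")))
    {
        CVar->Set(Quality);
    }
}
```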

Oh and forgot to mention that resolution scaling also lets you keep things crisp, like UI/text, and scale the 3d content. So your UI could stay at 1080p, but the 3d content would be at 720p if you wanted.

But anyways, this is all derailing the hell out of the thread, so I'll leave it at that.


@dcy_shadowfall
Hello,

Do you think you would have time to submit your camera acceleration change as a PR, so that we could have a cvar that disables camera acceleration to resolve the lag issue?

It's still an issue with UE 5.5 and UE 5.6, it seems.

The aforementioned post:

https://github.com/ShadowfallStudios/UnrealEngine/commit/85798d0f9d7378f7d3d41f9f31cf22340c42abfd
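Just to illustrate the kind of cvar toggle I mean (this is not the code from your commit, and the variable name is made up):

```cpp
#include "HAL/IConsoleManager.h"

// Hypothetical console variable that would let users bypass the camera
// acceleration/smoothing path to avoid the added lag.
static TAutoConsoleVariable<int32> CVarDisableCameraAcceleration(
    TEXT("Camera.DisableAcceleration"), // made-up name
    0,                                  // 0 = keep current behaviour
    TEXT("If 1, skip the camera acceleration/smoothing path."),
    ECVF_Default);

// The camera update could then guard the acceleration branch with something like:
// if (!CVarDisableCameraAcceleration.GetValueOnGameThread()) { /* existing acceleration logic */ }
```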

Thanks!

I'm testing a handheld AR app (Android ARCore) and am afflicted by bug UE-223981, "passthrough camera image missing". As with the OP of that bug report, I find that the camera itself is working and scene geometry is being tracked.

My concern is that the projected fix for this is 5.5.3, which seems like a long wait for a bug that literally prevents handheld AR from functioning on the Android platform. I've upvoted the issue, but I'd like to politely express my concern to the Epic team here as well.

In the meantime, I can continue to develop on 5.4 and will compare the relevant source code and see if I can make a local workaround.