“Released UE5 games: the 60 fps performance monitoring of games using UE5.”

Where is this coming from, and what are you basing it on? There is more to “performance” in the consumer sense than the GPU. Everything matters, from the OS down to the PCIe version the card is running at. CPUs matter too. A 3060 will never reach 16.6 ms with a Pentium II in any 3D title from the last 7-10 years. (I am sure there is an exception, but you get the point.)

60 fps has never been the “gold standard.” 30 Hz, or 33.3 ms per frame, has been a baseline for decades. Even modern home gaming consoles have titles that run at 30 Hz. I am sure many of your favorite games are and were developed for 30 frames per second. 60 FPS is nice; so is 120, so is 500.
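Quick aside on those numbers, since frame rate and frame time get mixed up constantly: frame time in milliseconds is just 1000 divided by FPS. A minimal sketch of the conversion (hypothetical helper name, nothing engine-specific):

```cpp
#include <cstdio>

// Frame time in milliseconds: one second (1000 ms) divided by frames per second.
double FrameTimeMs(double fps) { return 1000.0 / fps; }

int main() {
    std::printf("30 fps  -> %.1f ms per frame\n", FrameTimeMs(30.0));  // 33.3 ms
    std::printf("60 fps  -> %.1f ms per frame\n", FrameTimeMs(60.0));  // 16.7 ms
    std::printf("120 fps -> %.1f ms per frame\n", FrameTimeMs(120.0)); // 8.3 ms
}
```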

After reading this thread, it also seems like you have strong opinions on TAA/DLSS/FSR/TSR/XeSS, or really any temporal solution. So much so as to call it a cancer! But is it? From Wikipedia:

" Temporal anti-aliasing (TAA) is a spatial anti-aliasing technique for computer-generated video that combines information from past frames and the current frame to remove jaggies in the current frame. In TAA, each pixel is sampled once per frame but in each frame the sample is at a different location within the pixel. Pixels sampled in past frames are blended with pixels sampled in the current frame to produce an anti-aliased image. Although this method makes TAA achieve a result comparable to supersampling, the technique inevitably causes ghosting and blurriness to the image"

Ghosting (the downside) is bad; no disagreement there. But in this image comparing no AA to SSAA, I think it's pretty clear that any solution comparable to SSAA is the best choice. Other solutions are older and less efficient.
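To make that blend concrete, here is a rough sketch of a TAA resolve step in C++. This is my own illustration, not UE's or anyone's actual implementation; the neighborhood clamp is the standard trick engines use to fight exactly the ghosting described above:

```cpp
#include <algorithm>

struct Color { float r, g, b; };

// Blend the current frame's jittered sample with the accumulated history.
// Clamping the history to the current frame's local neighborhood
// (min/max of nearby pixels) rejects stale history and reduces ghosting.
Color TaaResolve(Color current, Color history,
                 Color neighborhoodMin, Color neighborhoodMax,
                 float blendFactor /* small, e.g. ~0.1 */) {
    history.r = std::clamp(history.r, neighborhoodMin.r, neighborhoodMax.r);
    history.g = std::clamp(history.g, neighborhoodMin.g, neighborhoodMax.g);
    history.b = std::clamp(history.b, neighborhoodMin.b, neighborhoodMax.b);

    // Exponential moving average: mostly history, a little current frame.
    return { history.r + (current.r - history.r) * blendFactor,
             history.g + (current.g - history.g) * blendFactor,
             history.b + (current.b - history.b) * blendFactor };
}
```

Each frame the sample position is jittered within the pixel, which is how the accumulated history approaches a supersampled result over time.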

From Wikipedia (again):
“TAA effectively computes MSAA over multiple frames and achieves the same quality as MSAA with lower computational cost”

This is why MSAA has lost favor in real-time rendering in some products: it's more expensive. Any YouTube video explaining what the NVIDIA control panel MSAA options did would have given you this information.

And FXAA, or Fast Approximate Anti-Aliasing, is a primitive solution that I would recommend to no one. It was developed at NVIDIA a long, long time ago.

Then there is DLAA (an NVIDIA-only solution, for now). From Wikipedia:
“DLAA handles anti-aliasing with a focus on visual quality. DLAA runs at the given screen resolution with no upscaling or downscaling functionality”

DLAA is also “trained to identify and fix temporal artifacts, instead of manually programmed heuristics,” which Wikipedia states is similar to a blur filter.

But enough about anti-aliasing; what about upscaling? Modern effects are expensive, and consumers demand higher-quality visuals with each product iteration, which in turn requires those same consumers to acquire more powerful hardware over time. The cycle repeats, over and over, leading to today. Hardware has a limit; we are not quite there yet, but we are getting closer, even as microchip performance has increased dramatically over the past 10 years (in both server and consumer applications and deployments). Upscaling, if done well, offers a way to extend this process. Why do you think NVIDIA, AMD, and Intel have promoted it as much as they have? 4K rendering is hard, 1440p is becoming the new baseline, and 1080p is going the way of 768p and 720p. People will just stop buying 1080p displays, and the manufacturers will stop making them.
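As a rough illustration of what an upscaler buys you (my own example numbers; real DLSS/FSR/XeSS quality modes use their own scale factors), the renderer draws the scene at a fraction of the output resolution and the upscaler reconstructs the rest:

```cpp
#include <cstdio>

struct Resolution { int width, height; };

// Internal render resolution for a given per-axis scale factor.
Resolution InternalRenderResolution(Resolution output, double scale) {
    return { static_cast<int>(output.width * scale),
             static_cast<int>(output.height * scale) };
}

int main() {
    Resolution fourK = { 3840, 2160 };
    // Rendering at ~67% per axis shades well under half the pixels of native 4K.
    Resolution internal = InternalRenderResolution(fourK, 0.67);
    double pixelRatio = (double)(internal.width * internal.height) /
                        (double)(fourK.width * fourK.height);
    std::printf("Render %dx%d, upscale to %dx%d (%.0f%% of the pixels)\n",
                internal.width, internal.height,
                fourK.width, fourK.height, pixelRatio * 100.0);
}
```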

Engines need to be forward-thinking and account for the technology of their time. There will come a day when you can play (insert your game here!) at 4K 300 FPS. It happened to Quake! (That game simultaneously runs at 500 FPS, and at 19 with RTX on! :joy:)

Nanite is one of those advanced new technologies I was describing! It's cool, it's new, and it means I don't need to make 5 or 10 LODs per mesh. Same with Lumen: now I don't need to worry about static meshes and the number of real-time lights in an area, or, for that matter, about baking lightmaps for hours! Yes, all of this comes at a small performance hit, but I saved 5 or 15 hours!!! And the customer? Well, they will upgrade if they want to.
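For context on what those hand-authored LODs involve: without something like Nanite, you typically author several meshes per asset and pick one by camera distance every frame. A minimal sketch of that selection logic (hypothetical thresholds, nothing UE-specific):

```cpp
#include <vector>

// One hand-authored detail level: triangle budget plus the distance
// at which it becomes acceptable to display.
struct LodLevel {
    int triangleCount;
    float minDistance;  // use this LOD once the camera is at least this far away
};

// Pick the coarsest LOD whose distance threshold has been passed.
// Levels are assumed sorted from most to least detailed.
int SelectLod(const std::vector<LodLevel>& lods, float cameraDistance) {
    int selected = 0;
    for (int i = 0; i < (int)lods.size(); ++i) {
        if (cameraDistance >= lods[i].minDistance) selected = i;
    }
    return selected;
}
```

Multiply that authoring effort across every mesh in a game, and the appeal of a system that handles detail automatically is obvious.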

If Unreal doesn't suit your needs, then use another engine, or make your own! Bungie made their own engine; DICE, same thing. There are other game engines: Unity, CryEngine, Unigine, Godot! The project tells you what engine you should use. The engine should never hamper the creative freedom of the project.

Apologies if I jump around, but this is a long post!

I want to jump back to some of your initial screenshots. The Fortnite ones are meaningless (let's not even get into the settings you chose :joy: medium textures on a 12 GB 3060); they show two different parts of the map, looking in two different directions, with different assets on screen. A better comparison would be an averaged benchmark over multiple runs across multiple hardware configurations. Digital Foundry and outlets like it are the benchmark for this kind of stuff.
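For what I mean by an averaged benchmark, something along these lines (a bare-bones sketch; real benchmarking also controls for clocks, thermals, and scene determinism):

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Summarize one benchmark pass from per-frame times in milliseconds:
// the mean, plus the "1% low" (the frame time only 1% of frames exceed),
// which captures stutter that an average alone hides.
void Summarize(std::vector<double> frameTimesMs) {
    double total = 0.0;
    for (double t : frameTimesMs) total += t;
    double meanMs = total / frameTimesMs.size();

    std::sort(frameTimesMs.begin(), frameTimesMs.end());
    double onePercentLowMs = frameTimesMs[(size_t)(frameTimesMs.size() * 0.99)];

    std::printf("avg %.1f fps, 1%% low %.1f fps\n",
                1000.0 / meanMs, 1000.0 / onePercentLowMs);
}

int main() {
    // Each repeated run of the same scene would feed its samples in here.
    Summarize({16.1, 16.4, 15.9, 33.5, 16.2, 16.0, 16.3, 17.0, 16.1, 16.2});
}
```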

I have never heard of the “Rasterization to resolution scale” you're referring to; please provide a link?
Quick history lesson: as far as I am aware, rasterization is a solution for approximating what a ray-traced scene looks like! :tada:

That's all. If I need to add anything here, I will!
Kudos
