NEW (3/22/2024) UE5.5+ Feedback: Please Invest In Actual PERFORMANCE Innovations Beyond Frame Smearing For Actual GAMES.

I made my entire first game with precomputed lighting, so tone it down a bit and drop the condescending attitude; I know the system’s limits and advantages.

Time Breaker: TIME BREAKER on Steam - Full bake with some stationary lights.

I also made another one with Lumen as the default system: DESORDRE : A Puzzle Game Adventure on Steam

And the new one (Lumen): A Silent Desolation on Steam

So it’s pretty funny to see all these claims based on nothing but air and feelings.

3 Likes

Funny how all beginner asset flippers struggle with the same things.

  • Your environments look sharp, boxy, and lack cohesion.
  • You underestimate the importance of bevels and imperfection.
  • You cannot stick to a single art style.
  • You don’t realize that Light must support Form, and Form must direct Light - these are the very 101 basics of composition.
  • You underestimate the importance of atmospheric effects.
  • And post-processing too (lens effects and so on - eye effects apply too).
  • Lack of complex shaders beyond what’s on the Marketplace.
  • Lack of variation in detail scale.

But that’s just a quick look at your environment, and it’s beside the point in this conversation, except that you shouldn’t voice your opinion when you lack the expertise to do so.


EVERYTHING that you have said in your last posts has ABSOLUTELY NOTHING to do with the missing functionality I’ve been talking about in mine. You just ignored it.
There are A TON of things that you CAN NOT bake in UE5, and you probably wouldn’t even realize it without the list in my previous posts, since you are a beginner.

To me, it appears that you just want to talk and are not really focused on the subject. And that’s fine, we all have our problems in life and need someone, but please do it elsewhere; what we are discussing here is just too important and will eventually benefit everyone - developers and players.

2 Likes

The original topic is pointless, based on completely biased benchmarks and totally clueless analyses by some guy pushing his agenda and his crappy YouTube channel.

Editor-based “analyses” (seriously, who even does that in 2024? profiling in the editor…), tests with cubes, nothing resembling a real situation with an actual game - pure garbage.

Why are you even trying to discuss serious topics, poor guy? This thread has been garbage from the start, with no real or interesting analysis whatsoever.

By the way, you’re ridiculous and triggered over nothing - probably a buddy of TheKJ, who criticizes everything and accomplishes nothing. We’re used to it.

Oh, by the way, I just thought of it: send us your Steam page so we can check out your work! (This isn’t related to the topic, but since you’re all talk, back it up and show us :) ).

2 Likes

That’s not an “agenda,” and I don’t know him personally. You just can’t accept reality for whatever reason.

I’ve been voicing my opinion regarding UE optimization flaws since 2018, before you even started developing games, and many of the things I and other people talked about have been implemented with time. The most recent one is the CPULM support for World Partition, which is coming in UE 5.5, thanks to the community, as Epic said themselves.

However, as many developers and most gamers have noticed, Epic has tended to choose a different path lately. Instead of continuing to develop optimization tools, they make one-click universal solutions aimed at ease of use for Fortnite kids and people unrelated to the GameDev/real-time industries.

These modern tools often result in extremely poor performance and sometimes in bad image quality.


I’ve been using a lot of engines since 2011 - GameMaker 6 through Studio 2, Unity3D, CryEngine, Unigine 2, and finally Unreal since 2017. Each engine has/had different problems, but almost all of them perform better than Unreal out of the box.

Look at the Unigine 2 landscape solution, for example, where you get a stable 400 fps on a large landscape with dynamic voxel GI - which, by the way, Tim Sweeney wanted to purchase around 2019, but was refused by the Unigine developers because of how deeply it was tied to their engine.
In Unreal, such a scene would give you 40 fps, at best.

Look at the insane real-time GI optimization techniques in Rockstar’s engine, achieved with partial baking, which is basically impossible in Unreal. Look how beautiful and performant their volumetrics are compared to Unreal’s.

Look at Frostbite’s baking capabilities, where you can bake specular reflections and much more, and where you can have dynamic shadows from foliage at 240 fps!

I could give you a lot more examples, but is there a point, if you simply lack the ability to understand and are not willing to accept it?

1 Like

It seems a bit unproductive to keep circling back to the same complaints about Unreal’s optimization as if nothing has changed. Optimizing in Unreal Engine in 2024 is entirely achievable; the tools are there for those who know how to use them effectively. Take Stray, for instance: the game demonstrates that with careful planning - efficient culling, baked lighting, and well-managed LODs - it’s possible to achieve excellent performance without even needing upscaling, which is rare these days. The problem isn’t a lack of tools but rather a lack of proper utilization by many developers.

That said, I completely acknowledge that the baking workflow in Unreal has its limitations, especially compared to some other engines. Baking lights, for example, is essential for performance in large, complex scenes, but it can be quite restrictive. You can’t blend between different baked scenarios, so adapting dynamic elements can become cumbersome. Plus, baking for large, open worlds or scenes with intricate geometry requires significant time and storage, which isn’t ideal for every project. When it comes to foliage, blending is particularly challenging since you can’t bake it effectively, meaning performance often takes a hit with dense vegetation.

Yes, there are dynamic GI methods in other engines - whether voxel-based or probe-based - that offer interesting alternatives with their own strengths and weaknesses. I’d be fully in favor of Epic developing other methods beyond the outdated CPU Lightmass. Expanding the options for dynamic GI could open up creative possibilities and help overcome some of the limitations we’re currently facing with Lumen and the need for baked lighting in more complex scenarios.

And yes, the newer systems like Nanite, Lumen, and VSM still have optimization challenges. Nanite can be costly with masked materials (making foliage especially demanding), and animated vertices only increase the resource load. Epic is actively addressing these issues, though, as can be seen in their GitHub commits, and there’s also room for Lumen and VSM to improve in terms of efficiency.

So rather than recycling biased points or rehashing the same arguments, it might be more productive to focus on constructive discussions around what’s already working and where Unreal can realistically improve. If Epic hasn’t responded to some threads while addressing others, it might simply suggest they’re prioritizing feedback that’s offering genuine value and actionable insights. Unreal has come a long way, and for those willing to adapt and work with what’s available, the possibilities are definitely there.

Thank you for finally admitting it. Anyway, I must correct you: this “optimization” too often amounts to rewriting the engine, which shouldn’t be the case.

And I am not saying that “nothing has changed” - I said the opposite:

The problem is that it is happening way more slowly than in other engines, and it has become way worse than it once was. Instead, they are focused on UEFN, tech for filmmakers, launching Fab without wishlists and text reviews, and so on.

Of course, not every game would benefit from every type of baking, but a good engine should have all of these tools, or at least the possibility to seamlessly integrate them via plugins. That is what a game engine is - a toolset.


Stray, while on the more performant side of the Unreal games spectrum, is not even that impressive quality-wise (the art style is great, though). Here is a comparison of Star Wars Battlefront (2015) and Stray (2022):

Screenshots




In my opinion at least, Battlefront still looked better after 7 years. AND it renders at least 2 times faster while having more advanced tech such as tessellation, more dynamic objects, and more VFX.

2 Likes

Of course, we definitely need more flexible methods - maybe something between the old CPU Lightmass and Lumen. There’s nothing in between, which is a real shame.

Battlefront was indeed an impressive feat for its time, and it still looks stunning even in 2024. It’s a pity we no longer see this level of performance/visual quality, even with Frostbite.

Let’s see if future developments will bring more flexibility while also improving existing systems.

1 Like

For those interested in lighting solutions cheaper than Lumen, I would like to point out this topic:
SSGI is worse in unreal 5? - Development / Rendering - Epic Developer Community Forums (unrealengine.com)

In UE4, I managed to get some nice SSGI lighting through the use of console commands. It was slightly noisy SSGI, but smear-free and very cheap. However, in UE5, I cannot achieve such a good result. In UE5, SSGI is noisier, locked to a lower resolution, and more expensive for me due to certain performance-enhancing settings having been removed.
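The post doesn’t list the exact commands, but for anyone who wants to try the UE4-era approach, a minimal sketch of the screen-space GI cvars (names as they existed around UE 4.24+; treat them as assumptions and verify against your engine version, since several were removed or changed in UE5):

```ini
; DefaultEngine.ini - enable UE4-style screen-space GI (experimental in 4.24+)
[SystemSettings]
r.SSGI.Enable=1   ; turn on screen-space global illumination
r.SSGI.Quality=3  ; 1-4, higher values trace more rays per pixel (more cost, less noise)
r.SSGI.HalfRes=1  ; trace at half resolution - cheaper, slightly noisier
```

The same cvars can be typed into the in-game console for quick A/B testing before committing them to the ini file.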

Why are you comparing an in-game screenshot of a 10-million-dollar AA cartoon cat game to an offline-rendered promotional image of a 100-million-dollar AAA Star Wars game? Battlefront 2015 I think was the first game to make use of Photogrammetry, which is why it looks so realistic. Also, Andrew, who did the Photogrammetry for Endor, now works at Epic. Isn’t that ironic?

2 Likes

Lol, did you play the game yourself? Because I did. The graphics in the actual game are exactly the same as in the screenshots here, if not better.
Can you imagine that there was a time when screenshots didn’t lie and the game ran at 120 fps on average hardware?

And why am I comparing these? Because of the 7-year difference.
15 years ago, a single person with zero budget could only make a Super Mario-style game; now everyone can make something that looks like an AAA production (except for the performance, lol).

Not really. Photogrammetry is great and all, but it’s more about lighting and engine performance.
No amount of Quixel assets can grant you such performance.

2 Likes

Battlefront 2015 I think was the first game to make use of Photogrammetry, which is why it looks so realistic

We’ve had photogrammetry in PS3 games.

The problem is the time it takes to optimize them, and last gen limited the quality. Nanite is one approach, but a poor one, since it basically doubles, if not triples, geometry timings compared to quad/material-efficient LODs, and also promotes too much subpixel aliasing.

1 Like

Yes. I played the game on PS4 when it came out.

No. The image with the “AT-ST” is NOT an in-game screenshot. That image shows no gameplay, no HUD, and is rendered at an extremely high resolution, like 7680 × 4320, to remove aliasing. It’s literally an ad for the game (scroll down to the image gallery).

For contrast, here are some *in-game* screenshots:



You don’t like Unreal Engine?

So you think the lighting looks bad in Stray?

Stray in-game screenshots



2 Likes

I’m really not sure what your point is here.
You compare a compressed screenshot from an alpha version of a 2015 game that performs at 240 fps, has more dynamic VFX, more movable objects, and dynamic shadows, to a 2022 title that runs at 60 fps while being much more static.
You compare two completely different lighting scenarios - one is daylight with nothing but sun + sky; the other is an urban night scene with lots of small colored lights, dense volumetric fog, and RT reflections and shadows. So?

And even if Stray performed as well as Battlefront, it’s kind of beyond the scope of the conversation here, which is about missing baking/optimization tools, Nanite/Lumen/VSM lags, TAA, TSR, and so on…

No, I didn’t say that, I said that it’s not impressive in terms of quality/performance ratio.

Yes I am a Unity spy.

2 Likes

So I have a few conflicts with performance in Unreal Engine as well. While I am able to resolve a major portion of this and get to over 400 fps (yes, it’s possible in UE5), the major conflict seems to be in the engine itself.
Some of my forum posts have gone unanswered as well when mentioning the performance drop in Groom, missing DX12 solutions that end up as DX11 recommended settings on Steam, for example, and overall Vulkan-on-Linux support. It isn’t too bad to max out the settings for performance; however, I wish they would have a presentation covering the best performance using Vulkan with Groom and effects in Unreal. A lot of the Linux, Steam, Groom, and DLC documentation requires an update.

  • Groom cards usage between 5.2.1 and 5.4 crashes.
  • Games for Steam crash and require setting back to SM5/DX11 rather than DX12 initially. Though you can get DX12 to work, most leverage DX11. (By the way, Proton uses Vulkan drivers.)
  • Setting Groom per platform causes the Groom to disappear entirely unless Project Settings are on Default.
  • Hitching occurs during RHI thread render passes.
  • Little to no updated Vulkan documentation for items added between UE4 and UE5 in general.
  • There are also typos in the Pixel Streaming code on GitHub, which cause connection and performance disconnects.
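For the Steam/DX12 crash point above, a minimal sketch of defaulting a project to the DX11/SM5 path while still cooking DX12 shaders (setting names per stock UE5 `DefaultEngine.ini`; verify the exact keys against your engine version):

```ini
; DefaultEngine.ini - boot with DX11/SM5 by default, keep DX12/SM6 available
[/Script/WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX11
+D3D11TargetedShaderFormats=PCD3D_SM5
+D3D12TargetedShaderFormats=PCD3D_SM6
```

Players who want DX12 can still opt in with the `-dx12` command-line flag, which keeps the stable path as the default Steam launch.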

I don’t think comparing Stray to a Star Wars wallpaper proves that Unreal Engine is unoptimized.

Comparing Stray to Battlefront is comparing apples to oranges.

Stray doesn’t use ray tracing at all. It relies on very well-placed cubemaps and SSR.

That’s why I compared in-game graphics only.

That’s why I have chosen the most similar lighting scenarios for both games.

That’s cool, SSR really tricked me with that camera angle.

“EPiC GaMEs DoNt ImPRoVe pErFORmanCes”

Meanwhile, the UE 5.5 patch notes:

  • Optimized MetaHumans (less than 100 MB vs 2-3 GB)
  • Faster hardware RT targeting 60 fps support on consoles
  • New denoiser for lumen reflections hugely reducing shimmering, artefacts, ghosting, a lot more stable
  • Megalights, similar to Nvidia RTXDI reducing performance cost with many lights (but have some drawbacks for now)
  • Nanite now supports DX12 Work Graphs, improving performance: “Nanite compute materials now can use work graph shader bundles on D3D12 when r.Nanite.AllowWorkGraphMaterials and r.Nanite.Bundle.Shading are both set. Both of these default to off at the moment.”
  • Render Parallelization improvements
  • Many QoL improvements
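Per the quoted note above, the work graph path is off by default; a quick way to try it (cvar names taken verbatim from the note, behavior subject to change while the feature is experimental) is via the console or an ini override:

```ini
; DefaultEngine.ini - opt in to Nanite work graph shading (off by default in 5.5)
[SystemSettings]
r.Nanite.AllowWorkGraphMaterials=1
r.Nanite.Bundle.Shading=1
```

Profile before and after with `stat gpu`, since the win will depend on scene and hardware.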
1 Like
  • Megalights, similar to Nvidia RTXDI reducing performance cost with many lights (but have some drawbacks for now)

Megalights are an incredible performance killer nobody asked for. The reveal for Megalights, and every other environment showcase for them, has been an overhyped joke.

  • New denoiser for lumen reflections hugely reducing shimmering, artefacts, ghosting, a lot more stable

And Lumen GI is 20+% slower than 5.4 (which is already much slower than 5.3) and still has noise, splotching, ghosting, and terrible disocclusion.

Nanite now supports DX12 Work Graphs, improving performance

Still can cost x3 more than LODs.

  • Render Parallelization improvements

So they could make room for more pointless “improvements”.

Many QoL improvements

Good, but this is off topic. The engine needs to be rewritten with independence from TAA and aggressive frame smearing from garbage algorithms like DLSS/AA.

1 Like

Megalights are an incredible performance killer nobody asked for.

I’m sorry, what? Do you have proof or sources to back this claim? Because in my project, which is a sandbox game where players can place lights, Megalights is a life-saver that allows for player creative freedom, and I can only imagine how great it could be for more linear games.

And Lumen GI is 20+% slower than 5.4

Again, sources? For me, Lumen has been significantly faster, and it seems like there are more options to tweak it for performance; even options that say “Increased GPU cost when enabled” are not that bad in initial tests.

Still can cost x3 more than LODs.

Nanite has a base cost and, for me, has not been a disappointment. I’m utilizing Nanite Tessellation as well, which is apparently production-ready now, and I cannot imagine how bad performance and visual quality would be with regular LODs.

There’s also context: I don’t know how you tested Nanite, but for detail-heavy, large environments it is very much faster than LODs in every way possible.

The engine needs to be rewritten with independence from TAA and aggressive frame smearing from garbage algorithms like DLSS/AA

Are you even reading what you write before posting? Do you really think the engine is going to be rewritten any time soon? The improvements made are consistently praised by gamers and developers, while people like you, who are behind videos like “Unreal Engine 5 is ruining games”, are spreading useless hate against the engine. What you should do instead is appreciate that there is free software out there offering you a massive set of features to create the content you want.

The tools are there, for free; use them if you want - you aren’t forced to - but please don’t hate developers who decide to use tools that rely on different techniques, which may or may not look better than others.

In the end, this topic is not about the engine but about your personal hate toward developers using Nanite and Lumen to create quality dynamic experiences. You can’t stand a little bit of ghosting or a small decrease in performance, and you can’t appreciate the actual improvements Epic has made to create a better engine.

5 Likes