Funny how beginner asset flippers all struggle with the same things.
- Your environments look sharp, boxy, and lack cohesion.
- You underestimate the importance of bevels and imperfection.
- You cannot follow a single art style.
- You don't realize that Light must support Form, and Form must direct Light - these are the very 101 basics of composition.
- You underestimate the importance of atmospheric effects.
- And post-processing too (lens effects and so on - eye effects apply too).
- A lack of complex shaders besides what's from the Marketplace.
- A lack of variation in detail scale.
But that's just a quick look at your environment, and beside the point in this conversation, except that you shouldn't voice your opinion when you lack the expertise to do so.
EVERYTHING that you have said in your last posts has ABSOLUTELY NOTHING to do with the missing functionality I've been talking about in mine. You just ignored it.
There is A TON of things that you CANNOT bake in UE5, and you probably wouldn't even realize it without the list in my previous posts, since you are a beginner.
To me, it appears that you just want to talk and are not really focused on the subject. And that's fine, we all have our problems in life and need someone to talk to, but please do it elsewhere; what we are doing here is just too important and will eventually benefit everyone - developers and players.
That's not "Agenda" and I don't know him personally. You just can't accept reality, for whatever reason.
I've been voicing my opinion regarding UE optimization flaws since 2018, before you even started developing games, and many of the things I and other people talked about have been implemented over time. The most recent one is CPULM support for World Partition, which is coming in UE 5.5 thanks to the community, as Epic said themselves.
However, as many developers and most gamers have noticed, Epic tends to choose a different path lately. Instead of continuing to develop optimization tools, they make one-click universal solutions for ease of use by Fortnite kids and people unrelated to the GameDev/realtime industries.
Even though these modern tools often result in extremely poor performance and sometimes in bad image quality.
I've been using a lot of engines since 2011 - GameMaker 6 through Studio 2, Unity3D, CryEngine, Unigine 2, and finally Unreal from 2017. Each engine has/had different problems, but almost all of them perform better than Unreal out of the box.
Look at the Unigine 2 landscape solution, for example, where you get a stable 400 fps on a large landscape with voxel dynamic GI - which, by the way, Tim Sweeney wanted to purchase around 2019 but was refused by the Unigine developers because of how deeply it was tied to their engine.
In Unreal, such a scene would give you 40 fps, at best.
Look at the insane real-time GI optimization techniques in Rockstar's engine, achieved with partial baking, which is basically impossible in Unreal. Look how beautiful and performant their volumetrics are compared to Unreal's.
Look at Frostbite's baking capabilities, where you can bake specular reflections and much more, and where you can have dynamic shadows from foliage at 240 fps!
I could give you a lot more examples, but is there a point, if you just lack the ability to understand and aren't willing to accept it?
It seems a bit unproductive to keep circling back to the same complaints about Unreal's optimization as if nothing has changed. Optimizing in Unreal Engine in 2024 is entirely achievable; the tools are there for those who know how to use them effectively. Take Stray, for instance: the game demonstrates that with careful planning, efficient culling, baked lighting, and well-managed LODs, it's possible to achieve excellent performance without even needing upscaling, which is rare these days. The problem isn't a lack of tools but rather a lack of proper utilization by many developers.
That said, I completely acknowledge that the baking workflow in Unreal has its limitations, especially compared to some other engines. Baking lights, for example, is essential for performance in large, complex scenes, but it can be quite restrictive. You can't blend between different baked scenarios, so adapting dynamic elements can become cumbersome. Plus, baking for large, open worlds or scenes with intricate geometry requires significant time and storage, which isn't ideal for every project. When it comes to foliage, blending is particularly challenging since you can't bake it effectively, meaning performance often takes a hit with dense vegetation.
Yes, there are dynamic GI methods in other engines, whether voxel-based or probe-based, that offer interesting alternatives with their own strengths and weaknesses. I'd be fully in favor of Epic developing other methods beyond the outdated CPU Lightmass. Expanding options for dynamic GI could open up creative possibilities and help overcome some of the limitations we're currently facing with Lumen and the need for baked lighting in more complex scenarios.
And yes, the newer systems like Nanite, Lumen, and VSM still have optimization challenges. Nanite can be costly with masked materials (making foliage especially demanding), and animated vertices only increase the resource load. Epic is actively addressing these issues, though, as can be seen in their GitHub commits, and there's also room for Lumen and VSM to improve in terms of efficiency.
So rather than recycling biased points or rehashing the same arguments, it might be more productive to focus on constructive discussions around what's already working and where Unreal can realistically improve. If Epic hasn't responded to some threads while addressing others, it might simply suggest they're prioritizing feedback that's offering genuine value and actionable insights. Unreal has come a long way, and for those willing to adapt and work with what's available, the possibilities are definitely there.
Thank you for finally admitting it. Anyway, I must correct you: this "optimization" too often leads to rewriting the engine, which shouldn't be the case.
And I am not saying that "nothing has changed" - I said the opposite:
The problem is that it is happening way more slowly than in other engines, and it has become way worse than it once was. Instead, they are focused on UEFN, tech for filmmakers, launching Fab without wishlists and text reviews, and so on.
Of course, not every game would benefit from every type of baking, but a good engine should have all these tools, or at least the possibility to seamlessly integrate them via plugins - this is what a game engine is: a toolset.
Stray, while being on the more performant side of the Unreal games spectrum, is not even that impressive quality-wise (the art style is great, though); here is a comparison of Star Wars Battlefront (2015) and Stray (2022):
In my opinion, at least, Battlefront still looked better after 7 years. AND it's rendering at least 2 times faster while having more advanced tech such as tessellation, more dynamic objects, and more VFX.
Of course, we definitely need more flexible methods - maybe something between the old CPU Lightmass and Lumen. There's nothing in between, which is a real shame.
Battlefront was indeed an impressive feat for its time, and it still looks stunning even in 2024. It's a pity we no longer see this level of performance/visual quality, even with Frostbite.
Let's see if future developments will bring more flexibility while also improving existing systems.
For those who are interested in cheaper lighting solutions than Lumen, I would like to point out this topic:
SSGI is worse in unreal 5? - Development / Rendering - Epic Developer Community Forums (unrealengine.com)
In UE4, I managed to get some nice SSGI lighting through the use of console commands. It was slightly noisy SSGI, but smear-free and very cheap. However, in UE5 I cannot achieve such a good result. In UE5, SSGI is noisier, locked to a lower resolution, and more expensive for me because certain performance-enhancing settings were removed.
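For context, UE4's screen-space GI was driven entirely by console variables, so a setup like the one described above can be sketched in `ConsoleVariables.ini`. The cvar names below are from UE4's `r.SSGI.*` group (introduced around 4.24); the specific values are illustrative, not the poster's actual settings:

```ini
; ConsoleVariables.ini - enable UE4's experimental screen-space GI
; Values are illustrative; tune per project.
[Startup]
r.SSGI.Enable=1   ; turn screen-space global illumination on
r.SSGI.Quality=2  ; 1-4, trades ray/sample count against cost
r.SSGI.HalfRes=1  ; trace at half resolution for cheapness (source of the slight noise)
```

The same cvars can be tried at runtime from the in-game console to compare cost in `stat gpu` before committing them to config.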
Why are you comparing an in-game screenshot of a $10 million AA cartoon cat game to an offline-rendered promotional image of a $100 million AAA Star Wars game? Battlefront 2015 I think was the first game to make use of Photogrammetry, which is why it looks so realistic. Also, Andrew, who did the photogrammetry for Endor, now works at Epic. Isn't that ironic?
Lol, did you play the game yourself? Because I did. The graphics in the actual game are exactly the same as in the screenshots here, if not better.
Can you imagine that there was a time when screenshots didn't lie and the game ran at 120 fps on average hardware?
And why am I comparing this? Because of the 7-year difference.
15 years ago, a single person with zero budget could only make a Super Mario game; now everyone can make it look like AAA production (except performance, lol).
Not really - Photogrammetry is great and all, but it's more about Lighting and Engine Performance.
No amount of Quixel assets can grant you such performance.
Battlefront 2015 I think was the first game to make use of Photogrammetry, which is why it looks so realistic
Weāve had photogrammetry in PS3 games.
The problem is the time it takes to optimize them, and last gen limited the quality. Nanite is an approach, but a poor one, since it basically doubles, if not triples, geo timings against quad/material-efficient LODs and also promotes too much subpixel aliasing.
Yes. I played the game on PS4 when it came out.
No. The image with the "AT-ST" is NOT an in-game screenshot. That image shows no gameplay, no HUD, and is rendered at an extremely high resolution, like 7680 × 4320, to remove aliasing. It's literally an ad for the game (scroll down to the image gallery).
You donāt like Unreal Engine?
So you think the lighting looks bad in Stray?
I'm really not sure what your point is here?
You compare a compressed screenshot from an alpha version of a 2015 game, which performs at 240 fps, has more dynamic VFX, more movable objects, and dynamic shadows, to a 2022 title which runs at 60 while being much more static.
You compare 2 completely different lighting scenarios - one is daylight with nothing but sun + sky, the other an urban night scene with lots of small colored lights, dense volumetric fog, RT reflections and shadows. So?
And even if Stray performed as well as Battlefront, it's kind of beyond the scope of the conversation here, which is about missing baking/optimization tools, Nanite-Lumen-VSM lags, TAA, TSR and so on…
No, I didn't say that; I said that it's not impressive in terms of quality/performance ratio.
Yes I am a Unity spy.
So I have a few conflicts with performance in Unreal Engine as well. While I am able to resolve a major portion of this and get to over 400 fps (yes, it's possible in UE5), the major conflict seems to be in engine parts.
Some of my forum posts have gone unchecked as well when mentioning the performance drop in Groom, DX12 missing solutions that become DX11 recommended settings on Steam, for example, and overall Vulkan-on-Linux support. It isn't too bad to max out the settings for performance; however, I wish they would have a presentation that encompasses the best performance using Vulkan with Groom and effects in Unreal. A lot of the Linux, Steam, Groom and DLC documentation requires an update.
Groom cards usage between 5.2.1 and 5.4 crashes.
Games for Steam crash and require setting back to SM5/DX11 initially, not DX12. Though you can get DX12 to work, most leverage DX11. (BTW, Proton uses Vulkan drivers.)
Setting Groom per platform causes the Groom to disappear entirely unless Project Settings are on Default.
Hitching occurs during RHI thread render passes.
Little to no updated Vulkan documentation covering items added between UE4 and UE5 in general.
Also, there are typos in the Pixel Streaming code on GitHub, which cause connection and performance disconnects.
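On the DX11/SM5 fallback mentioned in the list above: the usual workaround is forcing the default RHI per project rather than relying on launch flags alone. A minimal sketch, assuming the stock UE5 `WindowsTargetSettings` config keys:

```ini
; DefaultEngine.ini - force D3D11 (SM5) as the default RHI on Windows,
; matching the "set back to DX11" workaround described above.
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX11
```

For one-off testing, the same switch can be made at launch with the `-dx11` (or `-dx12`) command-line argument instead of editing config.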
I donāt think comparing Stray to a Star Wars wallpaper proves that Unreal Engine is unoptimized.
Comparing Stray to Battlefront is comparing apples to oranges.
Stray doesnāt use ray tracing at all. It relies on very well-placed cubemaps and SSR.
That's why I compared in-game graphics only.
That's why I have chosen the most identical lighting scenarios for both games.
That's cool; SSR really tricked me with that camera angle.
"EPiC GaMEs DoNt ImPRoVe pErFORmanCes"
Meanwhile, the UE 5.5 patch notes:
- Optimized Metahumans (less than 100MB vs 2-3GB)
- Faster hardware RT targeting 60 fps support on consoles
- New denoiser for Lumen reflections, hugely reducing shimmering, artefacts and ghosting; a lot more stable
- Megalights, similar to Nvidia RTXDI, reducing performance cost with many lights (but with some drawbacks for now)
- Nanite now supports DX12 Work Graphs, improving performance: "Nanite compute materials now can use work graph shader bundles on D3D12 when r.Nanite.AllowWorkGraphMaterials and r.Nanite.Bundle.Shading are both set. Both of these default to off at the moment."
- Render Parallelization improvements
- Many QoL improvements
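For anyone wanting to try the Nanite work-graph item from the list above, the quoted note names the two cvars and says both default to off, so opting in is just flipping them, e.g. in `ConsoleVariables.ini` (section name is the standard one; D3D12 only per the note):

```ini
; ConsoleVariables.ini - opt in to Nanite work graph shader bundles (UE 5.5)
; Both cvars default to 0 per the release note quoted above; requires D3D12.
[Startup]
r.Nanite.AllowWorkGraphMaterials=1
r.Nanite.Bundle.Shading=1
```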
- Megalights, similar to Nvidia RTXDI, reducing performance cost with many lights (but with some drawbacks for now)
Megalights are an incredible performance killer nobody asked for. The reveal for Megalights, and every other environment showcase for it, have been an overhyped joke.
- New denoiser for Lumen reflections, hugely reducing shimmering, artefacts and ghosting; a lot more stable
And Lumen GI is 20+% slower than 5.4 (which is already much slower than 5.3) and still has noise, splotching, ghosting, and terrible disocclusion.
Nanite now support DX12 Work Graph improving performances
It can still cost 3x more than LODs.
- Render Parallelization improvements
So they could make room for more pointless "improvements".
Many QoL improvements
Good, though this is off topic. The engine needs to be rewritten with independence from TAA and the aggressive frame smearing from garbage algos like DLSS/AA.
Megalights are an incredible performance killer nobody asked for.
I'm sorry, what? Do you have proof or sources to back up this claim? Because in my project, which is a sandbox game where players can place lights, Megalights is a life-saver that allows for player creative freedom, and I can only imagine how great it could be for more linear games.
And Lumen GI is 20+% slower than 5.4
Again, sources? For me, Lumen has been significantly faster, and it seems like there are more options to tweak it for performance; even options that say "Increased GPU cost when enabled" are not that bad in initial tests.
Still can cost x3 more than LODs.
Nanite has a base cost and, for me, has not been a disappointment. I'm utilizing Nanite Tessellation as well, which is apparently production-ready now, and I cannot imagine how bad performance and visual quality would be with regular LODs.
There's also context: I don't know how you tested Nanite, but for detail-heavy, large environments it is very much faster than LODs in every way possible.
The engine needs to be rewritten with independence from TAA and aggressive frame smearing from garbage algos like DLSS/AA
Are you even reading what you write before posting? Do you really think the engine is going to be rewritten any time soon? The improvements made are consistently praised by gamers and developers, and people like you, who are behind videos like "Unreal Engine 5 is ruining games", are spreading useless hate against the engine, when what you should do is appreciate that there is free software out there offering you a massive set of features to create the content you want.
The tools are there, for free; use them if you want - you aren't forced to - but please don't hate developers who decide to use tools that rely on different techniques that may or may not look better than others.
In the end, this topic is not about the engine but about your personal hate against developers using Nanite and Lumen to create quality dynamic experiences; you can't stand a little bit of ghosting or a small decrease in performance, and you can't appreciate the actual improvements Epic has made to create a better engine.