NEW (3/22/2024) UE5.5+ feedback: Please Invest In Actual PERFORMANCE Innovations Beyond Frame Smearing For Actual GAMES.

Thank you to ALL who voted for this topic; in only 11 months it has overtaken the previous #1 feedback thread, which was made 3 years ago. As of now, 5.4 does not tackle most of the issues stated here. We did get a major performance improvement in Lumen reflections and a slight bump in traditional raster (the base pass), but Lumen GI and TSR have canceled out those gains because they became much more expensive. Here is a small but recent stress test.

Who's posting this.

I’m a writer and director working on a game title. I have independently studied game development for 5 years and began working with Unreal a couple of years ago to learn it like the back of my hand. I see clear issues and questionable ideals stemming from the top dogs in charge of UE, and they need to be addressed before the next version release. I’m not a graphics programmer, but I know enough about pipelines and production, and I have many references that show the serious issues.

For better or worse, Unreal surpasses any public game engine (and even some in-house ones) thanks to years of documentation, years of optimizations, a huge library of free assets and plugins, and the ease of finding UE experts to hire (if you are a big-budget studio) due to its massive popularity.

If an engine as popular as Unreal doesn’t fix MAJOR problems like the ones explained below, gamers will be (and already are) the ones who suffer the consequences, and to some extent so will game studios.

The current path of UE5 is focused on virtual production and “games” made to run on insanely expensive hardware nowhere near what consumers actually buy (i.e. 4090s, or even overkill 4070s).
Too much of UE5’s workflow is completely backwards specifically for game development (which is overshadowed by the many other uses of UE today)…

This is not meant to offend anyone, but to offer completely sincere, genuinely frustrated, and logical feedback. It is harsh, but not offensive.


The main things that need to be fixed ASAP for the sake of all games using Unreal.

The unethical reliance on temporal dependency and upscalers like TSR.

All temporal upscalers and AA methods undo the whole point of visual breakthroughs in games, like impressive FX, textures, and object detail. These only remain stable in the rare moments of stillness.
STOP making shaders depend on TAA: contact shadows, SSAO (whose compute “smooth” option glitches and doesn’t work), hair, Lumen, soft shadows, bloom, and SSR (which again requires jumping through hoops to get working correctly, and in many games defaults to the broken non-temporal version). THIS IS RUINING visuals for gamers. You haven’t even updated TAA in Unreal. You gave devs another BLURRY, insanely expensive upscaler that ruins all the visuals in motion/gameplay.
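For anyone who wants to poke at this dependency themselves, here is a minimal sketch (not an official workflow) of switching the AA method and logging a few of the related cvars from C++. The cvar names are what recent UE 5.x builds expose, but treat them as assumptions and verify them against your engine version with the in-editor console.

```cpp
// Minimal sketch: switch the anti-aliasing method at runtime and log a few of
// the cvars behind the effects mentioned above. Cvar names are assumptions
// based on recent UE 5.x builds -- verify them in your engine version.
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

static void SetAntiAliasingMethod(int32 Method)
{
    // UE 5.x mapping: 0 = None, 1 = FXAA, 2 = TAA, 3 = MSAA, 4 = TSR
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.AntiAliasingMethod")))
    {
        CVar->Set(Method, ECVF_SetByGameSetting);
    }
}

static void LogTemporallyDependentEffects()
{
    // Effects that tend to fall back to dithered/noisy output when no temporal
    // filter is running -- the dependency this post is complaining about.
    const TCHAR* CVarNames[] = {
        TEXT("r.ContactShadows"),
        TEXT("r.AmbientOcclusion.Compute"),
        TEXT("r.SSR.Temporal")
    };
    for (const TCHAR* Name : CVarNames)
    {
        if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
        {
            UE_LOG(LogTemp, Log, TEXT("%s = %d"), Name, CVar->GetInt());
        }
    }
}
```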

There is no point in making a beautiful game that performs so badly it needs to upscale from 720p, producing eyesore blurry/artifact-ridden visuals. That makes NO SENSE.
It’s an oxymoron.
Path-traced-like visuals don’t make blur and heavy temporal artifacts any prettier, and FSR3’s added lag isn’t the answer either.

Stop targeting the PS5 and Series X. Target 20-series hardware playing at its recommended resolutions instead.

The PS5 and Series X are more powerful and get platform-specific optimizations that 1080p PC hardware like 3060s and 6500s (regularly priced GPUs) doesn’t get. Yet currently, you encourage studios to upscale from 1080p to 4K.

So when someone buys a regularly priced GPU like a 3060, Epic’s design has them temporally upscaling from dithered 540p to play at an upscaled “1080p” at 60fps.
Absolutely disgusting.
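For reference, the resolution figures above are just the screen-percentage relation; a trivial sketch of the math (the 50% value here is only an illustrative "performance"-style upscale factor, not an Epic default):

```cpp
// Internal render resolution vs. output resolution under a temporal upscaler:
// internal = output * (ScreenPercentage / 100) on each axis.
constexpr float ScreenPercentage = 50.0f;   // illustrative "performance" upscale
constexpr int   OutputWidth      = 1920;
constexpr int   OutputHeight     = 1080;

constexpr int InternalWidth  = int(OutputWidth  * ScreenPercentage / 100.0f); // 960
constexpr int InternalHeight = int(OutputHeight * ScreenPercentage / 100.0f); // 540

// So a "1080p" presentation at 50% screen percentage actually shades 960x540,
// roughly a quarter of the pixel work, and leans on temporal history for the rest.
```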

The problem with “Just buy a console, you get better hardware for the price”.
Okay, true. But then you are stuck with CRAP software that doesn’t let people control settings like AA preference, motion blur, or post-processing such as ugly film grain; that control is the whole point of PC gaming.
Yet because Unreal forces so many important temporally dependent shaders, devs force blurry TAA and upscalers on players.

You are wasting the potential of these consoles by spending the power they offer on things that are not important, when you should be building workflows that deliver the best base performance for the things that make a bigger impact. Optimizing for generic PC GPUs and common hardware features (RT cores, mesh shaders, etc.) will only be even more beneficial on consoles, where such major optimizations can be applied across the board.

Nanite is not what games needed

STOP promoting Nanite as God’s gift to rendering meshes. It clearly has worse performance than optimized LODs, yet you continue to spread misinformation about it.
Nanite performance is not better than LODs [TEST RESULTS]. Fix your documentation, Epic; you’re endangering optimization.
Pop-in will always be present anyway due to shadows and skeletal meshes.
Epic Games also needs to STOP building more important features like VSMs and Lumen (and, I hear, PCG?) around it. VSMs and Lumen are by far more important for dynamic games.
For a game, there is no point in adding so much detail to a mesh that it needs a blurry upscaler to fix its performance and extreme subpixel issues. The very detail becomes irrelevant once temporal motion smearing kicks in during basic camera or scene movement (i.e. gameplay).

This idea could replace Nanite: an AI LOD workflow for preserving detail intelligently.
Most LODs today are made like trash by deforming, mesh-collapsing auto-LOD “algorithms”.
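For what it’s worth, the hook such a smarter reducer would plug into already exists: per-LOD reduction settings on the static mesh. A minimal editor-only sketch, with property names taken from recent UE 5.x editor builds (treat the exact fields as assumptions to verify):

```cpp
// Editor-only sketch: drive the built-in per-LOD reduction settings from code.
// A smarter ("AI") reducer would replace the flat percentages used here.
#if WITH_EDITOR
#include "Engine/StaticMesh.h"

void SetupSimpleLODChain(UStaticMesh* Mesh)
{
    if (!Mesh)
    {
        return;
    }

    Mesh->SetNumSourceModels(3);
    const float TrianglePercent[3] = { 1.0f, 0.5f, 0.15f }; // LOD0..LOD2

    for (int32 LODIndex = 0; LODIndex < 3; ++LODIndex)
    {
        FStaticMeshSourceModel& SourceModel = Mesh->GetSourceModel(LODIndex);
        SourceModel.ReductionSettings.PercentTriangles = TrianglePercent[LODIndex];
        // Screen size at which each LOD takes over (smaller = further away).
        SourceModel.ScreenSize.Default = 1.0f / float(LODIndex + 1);
    }

    // Trigger a rebuild and mark the asset dirty so the change is saved.
    Mesh->PostEditChange();
    Mesh->MarkPackageDirty();
}
#endif // WITH_EDITOR
```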

Lumen is the biggest step up from lightmaps, but stop catering to virtual production and fix its issues before moving on to HWRT optimizations

Lumen, on the other hand, is almost there. As of now it is much more expensive, and even on High, or at the known 60fps-mode settings, it has a very bad performance-to-visual ratio on current-gen GPUs. It still has light leaking and splotchiness that you hand over to TSR to hide. Part of the reason is that it caters to game designs like Fortnite, where caching isn’t an option.

  • It needs to be temporally(TAA, TSR, DLSS) independent.
    Action/Common gameplay motion and temporal algorithms do NOT mix.
    Especially YOUR temporal algorithms with inherent flaws.

  • Major optimizations need to be provided at a scene-specific level for developers, e.g. static (buildings), stationary (destructibles), and movable meshes, to save major calculations. If the vast majority of projects perform very similarly, that is a clear sign of a lack of scene-specific optimization tools. This could be done with bakeable volumes with developer-controlled interpolation logic, and probe-based Light/Direct Light/Neither visibility for dimming (see the sketch after this list for the only hint the engine really exposes today).

  • Here is the concern when comparing Fortnite with the City Sample: both perform very similarly yet are vastly different projects. That points to a shortfall in per-scene optimization tools. We should be getting much more caching and less GPU computation, rather than near-identical performance in every UE5 project.

  • Not everyone is making a stylized game (Fortnite) where a temporal upscaler may not interfere with the presentation. Third-person games with real camera rotation are still having major issues.
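As promised above, a minimal sketch of the main per-object caching hint that exists today, component mobility, which systems like VSM page caching can take advantage of. The point of the bullet is that far finer-grained, per-scene hints than this are needed:

```cpp
// Sketch: mobility is the main per-object caching hint available today.
// Static geometry lets cached shadow pages and surface-cache style data be
// reused across frames; Movable geometry is re-evaluated far more often.
#include "Components/StaticMeshComponent.h"

void MarkForCaching(UStaticMeshComponent* Component, bool bCanEverMove)
{
    if (!Component)
    {
        return;
    }

    // Stationary would be the natural home for the "destructible but usually
    // still" meshes described above, but the engine gives you little per-scene
    // control beyond these three coarse buckets.
    Component->SetMobility(bCanEverMove ? EComponentMobility::Movable
                                        : EComponentMobility::Static);
}
```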


We, the gamers and game developers of the UE community, want Fortnite’s 6-billion-dollar revenue
invested in common-sense game performance innovations for Unreal, such as
micro-optimizations, much better caching algorithms, effect algorithms that do not hog computation for a 10% difference in visuals, and developer-friendly workflows that promote performance.
Look at other games and modern trends, and surpass them.
Most games are NOT Fortnite, where caching the entire scene is not an option.

Not random crap like Substrate and path tracing, or dedicated UE5 releases for tiny visual changes that don’t fix the core problems.

Stop investing in things that will not help regular gamers who don’t own 4090s or even 4070s.


Post closing—

Thank you for your time and continue to vote for this thread.
I am VERY glad people agree about these issues, and about the number of votes we now have.
SHOW Epic Games that WE, the gamers and the GAME developers, care about this topic.
Just clicking the vote button shows we care together.

Some people think performance gains are an obvious goal for Epic Games, but plenty of inexperienced or casual people do not understand how concerning UE5’s current performance is. And after 4 releases it has been a tug of war over performance between the different effects departments, settling into an equilibrium of constant poor performance.

Fast links to my posts and replies related to this topic (including innovations that would actually help game performance and regular consumers on reasonable budgets):
Post #2–Definition of optimizing.
Post #3–AI workflow for UE5 for optimizing static meshes. Possibly the best investment Epic could make.
Post #4–Why 30fps is not acceptable and where Fortnite’s 6 billion in revenue should go.
Post #5–STOP relying on TAA and upscalers. A full recap of more issues that hurt gamers, beyond the lack of optimization-focused updates.

Phew, I thought I was the only one who noticed that UE5 is simply not suited for games at all, only for things like movie production; that’s why there are only visual demos, and all the actual games require a PC from 2030. Because of that I had to roll back to 4.27. And the companies I know (which do interactive work besides games) are still working with UE4 because of UE5’s low performance and general instability.

10 Likes

I think we will have to wait a long time for performance. Taking into account the new solutions inside the engine (Lumen, virtual shadow maps, Nanite, and others), I am not at all surprised by the huge drop in performance at this stage of development. That is no justification, and it is certainly a criticism of anyone who can call the current version game-ready. I think that even within the engine, not much more can be done about this (performance). As we know, many factors beyond the engine itself are involved, and our only hope is that in the years to come these factors will line up in a direction that suits us; by that I mean that progress in performance will actually happen at the hardware level (GPU, CPU). We can only hope it all comes together. Practically for the first time, we will have a clearer picture after the release of the next generation of consoles, maybe even a little later when the announcements of games for them (i.e. games that fully support their new capabilities) begin. That will be in the next 2 years, maybe even later. If all these innovations survive the development experiences that await them, we can realistically expect the comfort we dream of in 3-4 years. Until then, those who survive will tell :).

1 Like

But there have been interviews with AAA studios shipping games made with UE 5.1, and they had been using UE 5.0 long before it was made public.
So the obvious question is whether they are using a better, less buggy version of the engine; otherwise, how could they ship any complex project at all?
Even the marketplace rule requiring sellers to support only the last three UE releases is not good at all; although I recently saw some sellers being allowed to add UE 4.x versions as old as 4.24, most are no longer adding UE 4.27 support.
As a poor solo indie developer I am stuck on UE 4.27 for the first two games I am developing right now (I just finished creating the soundtracks, but a lot more work is left), because UE 5.0, 5.1, and 5.2 are far from usable, at least in the public versions available to everyone.
UE 4.27 is not free of bugs either, but the UE 5.x releases have major reported bugs that make them really unstable.
I would pay $100/month for a stable premium UE release with reported bugs fixed in a matter of weeks, not years.
But it seems that Epic Games doesn’t even care to add an LTS branch to the UE 5.x releases, as requested by many people besides myself.

The business model is:

provide awesome features, then get paid by game companies to make them work properly and perform perfectly.

If you need that extra 20% to get to 100%, you need to pay.

From my understanding, the way this entire game-engine industry works is best thought of like this:

They give you all the ingredients to make a perfect 5-star meal, but they hide the recipe.

Almost every piece of the engine has some ideal setup where it works perfectly, as it should. Multiply that by the tweaks you need to make to get multiple engine parts working together, again with an ideal but unknown combination of settings.

Things quickly get out of hand when you don’t know what you are doing, like me, learning game design from YouTube videos.

5 Likes

It’s not all that black. When I commented about low performance (not game-ready), it was all with Nanite turned on, Lumen on, and (not necessarily) VSM turned on; in that case we can expect game-ready comfort (fps budgets) on average GPUs in 3-4 years, for example. Performance with the old setup is not that bad at all; actually it is very arguable (no Nanite, no GI, cascaded shadow maps, no TSR). I even get better performance in various cases with UE 5.2 than with UE 4.23 on some old templates. But then there is another problem, or it is subjective: visual fidelity (quality). In all those (old-pipeline) scenarios UE 5.2 is often faster than UE 4.23 on high settings (scalability 2), but gets slower on medium (scalability 1), though without much difference, and my major complaint is the big hassle of getting UE 5.2’s medium-settings picture quality to match UE 4.23. So there it is…

1 Like

Yeah, I mean, if you want to make a game you should ALWAYS take into account that many people don’t have hardware as good as yours, so if the game runs well on your setup, it will always run worse on the setups of the people who buy your game, and you want other people to play your game; the more the merrier. And here UE5 stumbles and falls in all the worst ways possible.

But in cases like CAD it’s a blessing. I was talking about companies leaving UE5 for general usage, but when it comes to taking on some big CAD project, and the client (like everyone who brings their own CAD files) wants it ready fast, we just throw it into UE5 and let it do its magic; boom, more than half the job is done. Similar thing with conferences, when you build a project for a single powerful setup that will stand in the main hall and run an interactive presentation.

So, yeah, just like the post says: it’s not a gaming engine. We wish it were, but it’s not. So at least it shouldn’t be promoted as a good-for-everything tool, especially for games.

1 Like

I’ve been a developer for 18 years and I have to agree. UE3 was fine. UE4 was fine from 4.6, when it started to be usable, to 4.19; there were a lot of unfinished features, but at least after two years they finished layered materials :smiley: And UE5 is fine too, but they keep making new experimental features while polishing almost nothing, and the documentation is lacking.

But there’s no point in telling them. You try writing it here, then on UDN, and they just don’t care. In fact, it will later demotivate you from working, the engine becomes more complex, and Epic is proud that it has recruited a lot of inexperienced people. Just a mess without logic.

2 Likes

You mean riots, right? :smiley:

Nope. This forum has been dead for several years, and UDN is almost dead now, like this forum was several years ago. Now, even if someone does answer you, it will probably be someone from PR, and if it is someone from Epic, you won’t solve anything anyway. But they used to come here in the years after the engine was made public.

On the other hand, I understand them. The reason Epic releases such semi-finished products may be that they are in pre-production on a new project; you also prepare tools that others can then try out for you. And the difference between Epic and Crytek was that Epic always focused on tools, not technological demos (though sometimes I doubt that now); that was Crytek’s problem. It also helps that Epic is an active game developer itself. And if you’re a smart indie developer, you can make a small single-player game where you can use it all, but you will be overwhelmed with loooots of issues.

On the other hand, when you look at their direction, it’s ironic. Valve criticized Microsoft for its corporate practices and years later became the same. Epic criticized Valve and Apple, and now it’s pretty much the same thing. They’re just not interested in money, which could be worse, because you are still dependent on their tools, whose technical condition can cause you to miss a milestone and even go bankrupt.

I understand that they don’t have more extensive documentation about Nanite; UV seams affect cluster LODs, as does vertex paint… it’s WIP, I get it. But you don’t have any control over how and when clusters get LOD’d, or over distance culling; you should still have that option, for example to reduce small props, even at the cost of more draw calls, etc.

But the problem is that they lack even basic tools in vanilla UE:

  • The physical-material mask does not work well and cannot be used in a material instance.
  • There is no way to batch-change a basic setting on material instances (e.g. the physical material); Epic scrapped it in UE4 due to crashes, if I remember correctly.
  • If you want to find out which level instances lack World Partition support, you have to go through them one by one.
  • Deferred decals don’t work properly with instanced static mesh components, which can be a blocker because Nanite does not support level vertex paint; I found it in the bug tracker, and the target fix is UE 5.5 (winter 2024? really?!).
  • Errors with World Partition, for example with groups and data layers (when you accidentally put a model in another data layer and unload it).
  • Moiré on models, which is quite relevant because Nanite allows you to render more detail.
  • Problems with nested structs, data layers, and Blueprints.
  • Ten or more fps lost when you have windows open on another monitor and they are not docked in the main window.
  • You can’t have a prefab with local overrides/exposed variables (a packed level actor gets repacked, and a level instance can’t do it by its nature).
  • Broken shader-complexity view with DX12.
  • Memory problems in the editor (even though the game build is fine).
  • No Pivot Painter for Nanite (and problems with the vertex interpolator or custom UVs with Nanite).
  • The World Outliner is almost useless with a large map and lots of folders (you can’t show only the current folder).
  • I have even had problems with overlapping UV vertices or offsets with Nanite (for tileable materials).
  • Problems with Chaos. And more and more… especially with their plugins (e.g. Water).

And this is a really big problem, because they bet everything on their own workflow, which you may not have access to (Houdini, UDIMs + virtual texturing, the Matrix city without translucency and with actors converted to ISMs, foliage without masked materials), and the worst thing is that it is all tied together: you need money and experienced, open-minded people, because it means a lot of changes. You’ll probably turn off virtual shadow maps first. Then you’ll find that Nanite isn’t that effective anymore, and if you have low-poly assets, you’ll probably turn that off too. Then you’ll realize that Lumen is already taking a lot of performance, and that’s a problem… and then you don’t see the point of UE5 anymore. Enhanced Input is okay until you start having problems (for example with menus; exiting UI mode didn’t work well some time ago).

And a nice example of their carelessness is the map check: if you have a scene with ten thousand models and turn off virtual shadow maps, you get a freeze for tens of seconds and then ten thousand reports that you have turned off virtual shadow maps while using Nanite (thanks to which you start thinking about turning off Nanite too), not to mention stationary lights, which their new features have a big problem with even though they are the default in the editor. It would be enough to let you select in the config whether you want static or dynamic lights, and to show the Nanite warning only once per map check.

All the nice details that let you focus on your creative work are gone and nobody will answer your questions…

3 Likes

BUMP.

But there’s no point in telling them. You try writing it here,

You’re wrong. I have already seen some of my ideas reach the main UE5 devs and get implemented in 5.3 Preview 1. This thread is important; we have already surpassed the “Lumen in the Land of Nanite” release request. Negativity will never fix anything. We, the creative developers/non-engine coders, are our players’ only hope for good performance in UE5 games. We, as Epic Games customers and third-party UE5 distributors, must demand performance over new features in UE5 version releases.

you don’t see the point of UE5 anymore

I heard input lag was reduced, and the code paths for non-next-gen workflows (the UE4 workflow) are better optimized, plus there is newer plugin support. Or you can use it if you are targeting an unacceptable 30fps.

1 Like

What? How?!

Our brains produce the moiré effect with dense pixel information and in real-life scenarios too. I cannot stand it when people complain about moiré like it’s inherently a bad thing.
It’s a real-life phenomenon.
But I will admit, Nanite doesn’t tackle the problematic meshes that exaggerate this issue.
TAA, FSR, and DLSS are not so-called “solutions”, because all they do is promote lazy optimization, blur the entire image, cause ghosting, and force developers to depend on them (so if you find a way to turn off a forced temporal solution, everything looks ugly and temporally dithered).
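To make the “temporally dithered” point concrete, here is an engine-agnostic sketch of what the accumulation step in any TAA-style filter does; it is not Epic’s code, just the general idea. Stochastic or dithered effects (hair coverage, screen traces, stochastic SSR) only look smooth because this blend averages their per-frame noise, which is exactly the dependency being criticized:

```cpp
// Engine-agnostic sketch of exponential temporal accumulation (not UE source).
// Dithered/stochastic effects converge only because this blend averages their
// noise over many frames; remove it and the raw dither pattern is visible.
struct FLinearColor3
{
    float R, G, B;
};

FLinearColor3 TemporalAccumulate(const FLinearColor3& History,  // reprojected previous frame
                                 const FLinearColor3& Current,  // noisy sample this frame
                                 float BlendWeight = 0.1f)      // ~5-10% new data per frame
{
    // history' = lerp(history, current, BlendWeight)
    return {
        History.R + (Current.R - History.R) * BlendWeight,
        History.G + (Current.G - History.G) * BlendWeight,
        History.B + (Current.B - History.B) * BlendWeight,
    };
    // Ghosting and smearing come from the same mechanism: when the reprojected
    // history is wrong (fast motion, disocclusion), stale data gets blended in.
}
```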

If you watch a 4K or 8K video of a city, you will see moiré patterns.

It’s distracting and pulls the eye away from what’s important in the scene. That is reason enough why in film it’s avoided on set and dealt with in post. The same applies to professional photography and video games.

With such cynical views I imagine you will skip the next decade of real-time graphics; there is a lot of effort being put into upscaling, and with good reason. :upside_down_face:

2 Likes

Temporal solutions like TAA and DLSS are disgusting. My studio will not force them on players like the vast majority of games do.

I imagine you will skip the next decade of real time graphics

No, my studio and I are going to stop this blurry crap from ruining more games.

there is a lot of effort being put into upscaling and with good reason

Yeah, this is the problem. They should be working on better, more optimized rendering methods instead. They don’t want to optimize anything anymore and hand the work off to an upscaler. Intel is the only one trying to optimize important features
instead of pushing “buy a better graphics card”.

EDIT: Death Stranding is a perfect example of a WELL-optimized game with NO TAA-dependent features. Even in a large urban scene at 1080p, a 3060 only needs about 40% of its compute to reach 60fps. With only 40% being used, ray tracing and GI could easily run at 60fps using the rest of the GPU.
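The headroom claim is just frame-budget arithmetic; a quick sketch using the post’s own 40% figure (the linear-scaling assumption is a simplification):

```cpp
// Frame-budget arithmetic behind the headroom claim above.
constexpr float TargetFPS        = 60.0f;
constexpr float FrameBudgetMs    = 1000.0f / TargetFPS;              // ~16.7 ms
constexpr float ObservedGPUShare = 0.40f;                            // figure from the post
constexpr float UsedMs           = FrameBudgetMs * ObservedGPUShare; // ~6.7 ms
constexpr float HeadroomMs       = FrameBudgetMs - UsedMs;           // ~10.0 ms
// i.e. roughly 10 ms of GPU time per frame would remain for RT/GI before 60fps
// is at risk -- assuming utilization scales linearly, which is a simplification.
```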

I really feel like this post says a lot about UE5’s current state.
Scaling down quality is NOT optimizing!

https://forums.unrealengine.com/t/lumen-gi-and-reflections-feedback-thread/501108/1111

Real optimization is computing SMARTER, not HARDER.

Scaling down quality isn’t optimizing; that’s the same crappy DLSS mentality.

And thanks to the embarrassing public performance of Fortnite on 5.1 with bare-bones settings, as a game produced by the UE5 programmers themselves, it’s more than apparent that performance needs to be the #1 priority over new features.
The main developers of the UE5 source code need to provide many more ways of optimizing the specific, unique parts of our scenes, for each of the three major systems: Nanite, Lumen, and VSMs.

Games are being bottlenecked by the engine’s source-code engineers, not by game studios (for the most part). The appropriate UE5 teams need to find ways to let developers genuinely save performance and computing power by making sure we can tune these systems for all types of scenes and objects.

I don’t care if it requires an AI algorithm to scan our environments before we release our games (not AI-dependent for players or the shipping build, only AI-dependent for developers, with the AI saving the optimizations to an offline structure).
Or more tags we need to put on actors so these systems know the most efficient way to render them, etc. (Static mobility is a good step in this direction, but it’s obviously not good enough.)
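A rough sketch of what the “more tags” idea could look like using the actor tags that already exist. To be clear, the tag names here are hypothetical developer conventions for an offline scan, not anything the engine actually reads:

```cpp
// Hypothetical sketch: developer-authored performance hints stored in the
// existing AActor::Tags array, which an offline (editor/cook-time) scan could
// read. The engine does not interpret these tag names -- they are made up here.
#include "GameFramework/Actor.h"

static const FName PerfHint_NeverMoves(TEXT("Perf.NeverMoves"));
static const FName PerfHint_DistantOnly(TEXT("Perf.DistantOnly"));

void TagAsNeverMoving(AActor* Actor)
{
    if (Actor && !Actor->Tags.Contains(PerfHint_NeverMoves))
    {
        Actor->Tags.Add(PerfHint_NeverMoves);
    }
}

bool HasPerfHint(const AActor* Actor, const FName& Hint)
{
    return Actor != nullptr && Actor->Tags.Contains(Hint);
}
```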

Scaling down quality, like DLSS or lowering the Lumen settings, is the EXACT OPPOSITE of optimization.

That is called settling.

2 Likes

Try making a scene, a favela for example, with corrugated metal roofs, and observe the result. It hasn’t changed in the last decade and is now much worse with Nanite. Don’t get me wrong, I really like Nanite, but it will only be good if someone fixes it:

(Moiré Problems)

1 Like

Another big issue is culling. People basically don’t understand how it works with Nanite instancing, so they end up merging meshes, with bigger memory costs, wrong instancing, bad placement in the level… and where is Epic for them? :wink:
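For the instancing point, a minimal sketch of the standard alternative to merging meshes: one instanced static mesh component per mesh type, many transforms. How well per-instance culling then interacts with Nanite in a given engine version is exactly the kind of thing that needs proper documentation:

```cpp
// Sketch: instance repeated meshes instead of merging them into one big mesh.
// One ISM component holds one mesh and many transforms, so instances can still
// be culled individually, unlike a merged mesh whose bounds span the whole set.
#include "Components/InstancedStaticMeshComponent.h"

void BuildInstances(UInstancedStaticMeshComponent* ISM,
                    UStaticMesh* Mesh,
                    const TArray<FTransform>& Transforms)
{
    if (!ISM || !Mesh)
    {
        return;
    }

    ISM->SetStaticMesh(Mesh);
    ISM->ClearInstances();
    for (const FTransform& Transform : Transforms)
    {
        ISM->AddInstance(Transform);
    }
}
```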

And yes, negativity won’t solve anything. But Epic is selfish as hell; they deserve it.

BTW: try using custom primitive data with a packed level actor. Or check actor bounds in World Partition after editing level instances that have folders or parented objects in the Outliner, and observe the result. There are too many issues like this :wink:
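For anyone unfamiliar with the feature being described, this is what custom primitive data normally looks like from code; the complaint above is that it misbehaves once the component lives inside a packed level actor. The data index must match the Custom Primitive Data index used in the material:

```cpp
// Normal usage of custom primitive data: a cheap per-primitive parameter that
// the material reads via a Custom Primitive Data node at the same index.
#include "Components/PrimitiveComponent.h"

void SetTintStrength(UPrimitiveComponent* Component, float Strength)
{
    if (Component)
    {
        Component->SetCustomPrimitiveDataFloat(/*DataIndex=*/0, Strength);
    }
}
```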

and where is Epic for them?

In all fairness, the UE5 documentation is an underrated resource, and the public performance of Fortnite shows how iffy the new UE5 features can be.

The engine’s new features just appeal to lazy developers/studios.
The first major third-party AAA titles using Lumen, Nanite, and VSMs perform like absolute trash, like Immortals of Aveum… and Remnant 2, which doesn’t even have Lumen.

Although I really do blame the Remnant 2 studio for their poor performance.
Absolutely incompetent studio/developers.

But many people couldn’t care less about new features and would rather have the money and innovation spent on optimizing the processing code of the existing headline features: Lumen, VSMs, and Nanite.