It seems the direction Epic is heading in is to make Unreal a modeling, animation, and video-editing tool.
That's false. You might argue that they can't achieve the right balance, but saying that they don't care is blatantly false.
Correct, but the engine is only 2 years old and development likely started on UE4, so it's way too early to make that kind of statement.
That's because those games were in-between generations. Besides, Cyberpunk 2077 in Path Tracing mode is clearly a step beyond anything ever done, and it can only run on high-end GPUs. That's a glimpse of what can be achieved with a mature engine. It took almost 10 years for the team to develop the RED Engine to that level.
Sure, just add 1-2 years to the whole development of the project, who cares, right?
Plus, LOD pop-in will always be visible, no matter how skilled you are, especially with vegetation. Oh, and LODs also take up more disk space than Nanite models. Oops!
The "no issues" part is quite baffling if you've actually followed the games industry in recent years, especially AAA games.
Definitely. But he's here to complain, not to acknowledge the positive things the devs have made.
That's false. You might argue that they can't achieve the right balance, but saying that they don't care is blatantly false.
There are thousands of shaders and cheap techniques that have already been proven, but major studios (including Epic Games) with the budget to implement them fail to invest in bringing those better technologies forward.
Instead, they offer an unstable engine that requires ugly, blurry temporal visuals to cover those stability issues, and that allows big companies like Bandai Namco to produce crap.
Why would big, greedy studios invest in developing smarter/faster techniques when TSR and DLSS are "free" performance fixers?
Also, why the hell are you responding to/quoting me? I recall you "blocking" me.
Sure, just add 1-2 years to the whole development of the project, who cares, right?
GAMERS care. CUSTOMERS CARE.
People who spent $400+ for a reasonable standard of 60 fps care!
Plus, LOD pop-in will always be visible, no matter how skilled you
WE WILL ALWAYS have pop-in because skeletal meshes and shadows still pop, as I already stated in the MAIN POST.
Pop-in can also be mitigated with transitional effects, but now everything is temporally dithered and smeared!
Nanite's features, like reduced draw calls, no pop-in, and smaller disk size, have 0% value when the game runs like DOG crap.
I don't care if a game is path traced, 100 MB, with zero pop-in anywhere;
it still looks like, and plays like, unplayable crap when it's being upscaled from 540p, even though it's rendering on efficient hardware beyond next-gen consoles.
The more I learn about UE, the more DISGUSTED I am with its obvious issues.
It would be dang near worthless without all the plugin, company, and basic asset support.
THE ONLY reason I'm using this engine is to FIX it with a public fork, so other major companies and studios can utilize the other things I mentioned without using a disgustingly over-dithered engine.
Look up games using UE5; people are disgusted with UE5's performance.
YouTube benchmarkers are showing how unacceptable UE5's performance is.
EDIT, A DAY LATER:
Glad to see two more votes on this feedback thread.
That is definitely not my experience with Chaos Vehicles.
This thread has become the 2nd most-voted feedback thread to Epic Games and UE5 within only 177 days.
Big thanks to all who voted.
As for Epic Games, you guys need to STOP providing an engine that promotes blurry temporal graphics because of performance issues (which can stem from a lack of performance-enhancing workflows). Use common-sense features, meaning do not strive for 100% accuracy if it means games can only run on 40-series GPUs with upscaling and frame-gen bullcrap.
Stop assuming companies are going to invest in custom solutions for their games, because if they were going to create "custom solutions", they would use a proprietary engine.
Several games I was personally waiting for that use UE5 look horrible due to the blurry temporal templates you offer, and I know they won't even run well because you forced Nanite-centered workflows.
At least I care about my game, and I'm using UE because I can't afford to build a proprietary engine, but it is disgusting how much depends on TAA. I've had to jump through so many ridiculous hoops to get UE5 to look stable without TAA.
It's funny to see how popular this thread is. If you're complaining so much, why use Unreal Engine 5 in the first place? You scream about Nanite: disable it, stop complaining about it, and let the devs that use it use it as they want. UE5 is built with the future in mind, and its big features are clearly not meant for 10-year-old hardware.
It's funny (actually hilarious) to read your complaint repeated over and over again; I think you need to breathe.
Anyway, just to say: if you complain about one specific UE5 feature, why use it?
Oh, also, Epic Games is forcing NO ONE to use UE5 features; they're promoting the great improvements they worked on for a next-gen engine, and next-gen games are using the next-gen features on next-gen hardware.
If you're complaining so much, why use Unreal Engine 5 in the first place?
There are so many problems with the way you see the point of this thread.
UE5 has an advantage only acquirable through time and long-term support. I already explained why UE has to be fixed: it affects several titles, not just mine.
For better or worse, Unreal surpasses any public game engine due to years of documentation, years of optimizations, a huge library of free assets and plugins, and how easy it is to find UE experts to hire (if you are a big-budget studio) thanks to its massive popularity. (From the main post.)
You scream about Nanite: disable it, stop complaining about it
Two things: #1, you clearly didn't read the main post, as it explains that Epic is forcing people to use Nanite if developers want to use other features such as VSMs.
#2, they are also lying about Nanite's performance, which again affects OTHER STUDIOS. Meshes start off with high detail, and companies promote Nanite like it's "next-gen detail" when it's not: it's called not optimizing meshes for affordable GPUs, and more triangles = more aliasing problems, which promotes MORE blurry upscaler dependency.
UE5 is built with the future in mind, and its big features are clearly not meant for 10-year-old hardware.
I'm not talking about GTX cards/equivalents. When I say affordable, I mean GPUs in the $300-400 range (which is a lot in this economy) that have released within the last 4 YEARS.
It's funny (actually hilarious
I'm glad that games turning into mush during basic gameplay, and only 4090 owners being able to achieve bearable visual quality in modern games, is "actually hilarious" to you.
Not everyone wants mushy games.
That's his whole personality. He's a complainer, he doesn't create anything, he's just here for the attention. I muted him, but sometimes I enjoy reading his tantrums.
Even gamers are OK with using upscalers at this point. They are a tool that can help deliver better graphics on lower-end hardware; there's no point in NOT using them.
even gamers are OK with using upscalers at this point
That is a LIE. Just because some are doesn't mean this downgraded crap should become standardized.
They are a tool that can help deliver better graphics on lower-end hardware; there's no point in NOT using them.
Yes, there IS a point. They look like crap during basic gameplay versus actually lowering the resolution independently (with more performance) and without temporal frame blurring.
They never look better during actual gameplay. You just continue to ignore the public data provided by me and plenty of others.
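If anyone wants to test that comparison themselves, here is a minimal sketch (my own example, not something from Epic or anyone in this thread) of trading temporal upscaling for a plain resolution drop. It assumes the stock UE5 console variables r.ScreenPercentage and r.AntiAliasingMethod; double-check the values against the engine version you are on.

```cpp
// Minimal sketch: trade temporal upscaling for a plain spatial resolution drop.
// Assumes the stock UE5 CVars r.ScreenPercentage and r.AntiAliasingMethod;
// verify both against the engine version you are using.
#include "HAL/IConsoleManager.h"

static void SetCVar(const TCHAR* Name, const TCHAR* Value)
{
	if (IConsoleVariable* Var = IConsoleManager::Get().FindConsoleVariable(Name))
	{
		Var->Set(Value, ECVF_SetByCode);
	}
}

void UseSpatialDownscaleInsteadOfTSR()
{
	SetCVar(TEXT("r.ScreenPercentage"), TEXT("75"));  // render at 75% resolution, no history reuse
	SetCVar(TEXT("r.AntiAliasingMethod"), TEXT("1")); // 0 = None, 1 = FXAA, 2 = TAA, 3 = MSAA, 4 = TSR
}
```

Compare that in motion against TSR at a similar cost and judge for yourself.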
@vfXander's comments summarize it accurately:
"It's okay because other big companies do it, stop complaining, everything is great."
Even though it clearly is NOT. Other companies simply get away with it because they own massively popular, exclusive IPs. You are appealing to a crap standard upheld by their stronghold on customers.
Epic is forcing people to use Nanite if developers want to use other features such as VSMs.
Well no, it's the same thing again: you're not forced to use VSMs or Nanite, but if you want to use VSMs then yes, you need Nanite, and you can enable Nanite on lower-poly meshes without any performance issues (in my experience).
I already explained why UE has to be fixed: it affects several titles, not just mine.
I follow a lot of UE5 games, and yes, they are terribly optimized; the upscaling is crap, but that's not coming from UE5. Unreal Engine is not just a game engine, and it's also not a simple game engine. Recent games feel rushed because not a lot of UE5 games are out yet (except indie horror games, surprisingly), so my guess is that AAA studios are completely skipping optimization just to release games fast before UE5 becomes a standard. I think we'll see better games in the future after the first rush, but it's definitely not an issue on the UE side (just my opinion).
That one is interesting. Epic never lied about Nanite performance; they said it's a new feature that allows a near-infinite number of polygons to be rendered on screen, optimized by the system, while also reducing draw calls, and they're right. Nanite is incredible and allows super-high-quality models, which removes the need for a normal map (which in my experience is more expensive than Nanite). Of course it's going to be less performant than traditional LODs no matter the number of triangles, since the system is the same, and honestly, for 1 ms I won't complain…
It's called not optimizing meshes for affordable GPUs
Again, affordable GPUs can easily handle Lumen and Nanite with reasonable performance; I had a 3060 laptop and was able to run UE5 with no issues (except the massive tech demos). And next-gen hardware is affordable if you look at Nvidia's 3060 or 4060, or AMD, which is even cheaper. But again, UE5 is a next-gen engine, and studios that choose UE5 probably don't care about an old 1080. Gamers will have to switch to a newer GPU at some point; I believe there's a slow hardware transition happening if you look at Steam's data.
And not everyone wants ugly 300-poly models and baked lighting; it's time to evolve. There's always going to be two sides fighting, but the truth is that UE5 is meant for the future, and its current formula suits a lot of developers, so if a minority doesn't like it, the engine will move on without them.
so if a minority doesn't like it, the engine will move on without them.
I don't see it as a minority issue. I think it's a serious problem that games and studios are saying "games can only look good with smeary frame blenders on."
Saying a game looks good with Path Tracing while it turns into mush is an oxymoron.
Epic is not the only one doing this, but they are definitely lending strength to this horrible future for games.
And not everyone wants ugly 300-poly models and baked lighting; it's time to evolve
The thing is, you are oversimplifying the problem. You're acting like 300-poly or 5-million-poly meshes are the only choices, when that exact exaggeration is what is causing the problem.
It's about overdraw.
That is what's efficient and reasonable: with assets using a logical poly count, RT effects like RT AO and RT shadows become way more reasonable. Nanite is not the answer to the investment issues of common-sense assets.
and baked lighting; it's time to evolve
We are devolving. MGSV, an 8-year-old game with a highly dynamic world made up mostly of static objects (like most games today), had interpolated baked lighting for every hour of the day, and dynamic objects were lit similarly to UE's volumetric lightmaps (which, by the way, don't offer that interpolation).
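To make the interpolated-baked-lighting idea concrete, here is a rough, hypothetical sketch (my own illustration, not MGSV's or Unreal's actual code): bake one lighting snapshot per in-game hour and blend between the two nearest snapshots at runtime, so a mostly static world gets time-of-day lighting for roughly the cost of a lerp. AmbientProbe and the 24-snapshot layout are assumptions made for the example.

```cpp
// Hypothetical illustration of hour-interpolated baked lighting.
// "AmbientProbe" and the 24-snapshot-per-day layout are assumptions, not any engine's data format.
#include <array>
#include <cmath>

struct AmbientProbe { float R, G, B; };           // one baked ambient value

using BakedDay = std::array<AmbientProbe, 24>;    // one snapshot per in-game hour

AmbientProbe SampleBakedDay(const BakedDay& Day, float Hour) // Hour in [0, 24)
{
    const int   A = static_cast<int>(Hour) % 24;  // snapshot before the current time
    const int   B = (A + 1) % 24;                 // snapshot after (wraps at midnight)
    const float T = Hour - std::floor(Hour);      // blend factor between the two

    return { Day[A].R + (Day[B].R - Day[A].R) * T,
             Day[A].G + (Day[B].G - Day[A].G) * T,
             Day[A].B + (Day[B].B - Day[A].B) * T };
}
```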
UE's newest designs were made for lazy, cheap studios. I don't have a problem with trying to achieve that, but they shouldn't make workflows that HURT and deprive gamers of BASIC standards like clarity during gameplay, reasonable input lag, and 60 fps.
Separate issue: You said this:
And next-gen hardware is affordable if you look at Nvidia's 3060 or 4060, or AMD, which is even cheaper. But again, UE5 is a next-gen engine, and studios that choose UE5 probably don't care about an old 1080.
When my last post literally said this:
I'm not talking about GTX cards/equivalents. When I say affordable, I mean GPUs in the $300-400 range (which is a lot in this economy) that have released within the last 4 YEARS.
Released within the last 4 years, not GTX cards, around $300-400: the same exact cards you mentioned.
Saying a game looks good with Path Tracing while it turns into mush is an oxymoron.
Alright, well, the only games I've seen using Path Tracing are of insanely good quality, and I don't see what mush you're talking about. I'm specifically talking about Alan Wake 2 and the new Cyberpunk.
Epic is not the only one doing this, but they are definitely lending strength to this horrible future for games.
I feel like you just hate the technology. It has issues, YES, but the issues you are pointing out are irrelevant.
We have enough GPU power to go well over 300 polys, and not enough for 5 million and/or Nanite pretending to solve 5-million-tri meshes
But we do have enough power for over a billion; Nanite is not pretending to solve over 5M polygons, it does solve it, and I don't see what's wrong with that.
We need basic logic on each asset: if it's small, it shouldn't need 5 million tris; we need to compute just enough geometric detail and let textures/shaders take care of the rest.
Again, Nanite does just that: it renders only what's needed. Of course a pebble doesn't need a million tris, but with Nanite it can have enough geometry to avoid using OLD, DEPRECATED texturing and expensive shaders.
UE's newest designs were made for lazy, cheap studios.
I guess that proves my point: you're just a hater. UE's newest designs are meant for film, virtual production, and new-generation games, and the studios that use them are studios that chose the future instead of looking at the past. I don't think AAA studios are moving to Unreal just because they are lazy; it's economically a better choice, the technologies available are great and improve visual fidelity as well as development time, and having easier tools is not laziness, it's evolution.
but they shouldn't make workflows that HURT and deprive gamers of BASIC standards like clarity during gameplay, reasonable input lag, and 60 fps.
I don't see what's hurting gamers with UE workflows. Games can clearly achieve 60 fps and relatively good input lag, as shown in Fortnite, and the UE5 games I know of are mostly non-competitive cinematic games, and they run well while using some of UE5's big features.
Currently, UE5 is perfectly capable of providing 60 fps with reasonable upscaling and relatively unnoticeable blurriness on "cheap" hardware, and I only see it improving in the future.
I honestly don't know what to say; there's no right or wrong here. UE5 is mostly production-ready, and I've been using it for production since the very first Early Access. The technologies available (Lumen, Nanite, VSM, and TSR) work fine and can provide a 60 fps experience with some tweaking. Yes, there are some issues: Nanite on masked materials is quite expensive, and VSMs are also very expensive with something like a time-of-day system, but on my current hardware (RTX 4070) I'm not noticing a huge fps or ms impact. There is room for improvement, but saying it's "pretending" to handle billions of tris at 60 fps is just refusing the truth.
I don't know what you expect from your complaints, but honestly, I don't care. I originally just replied for fun because reading the whole thing was funny, but I only see it looping over and over again, and I don't want to be a part of this thread's "success".
I'm using UE5 as it is, and I have no issues with it currently, so I'll leave you and all the other people that complain about the engine to talk about how wrong we are.
Alright, well, the only games I've seen using Path Tracing are of insanely good quality, and I don't see what mush you're talking about. I'm specifically talking about Alan Wake 2 and the new Cyberpunk.
I'm saying that no matter how "good" a game looks, it's pointless if it turns into mush in motion.
I feel like you just hate the technology. It has issues, YES, but the issues you are pointing out are irrelevant.
Temporal smearing, insane noise, and Lumen flickering are not irrelevant issues. You are just complacent with crap standards.
But we do have enough power for over a billion; Nanite is not pretending to solve over 5M polygons, it does solve it, and I don't see what's wrong with that.
NO, Nanite is not some magic wand. I guess this proves my point: you ignore both the Nanite test thread and the 60 fps UE5 games monitoring thread.
I guess that proves my point: you're just a hater
The whole way Epic makes money is by attracting large studios run by companies that are focused on maximum returns, and I said I had nothing against Epic Games' logical advertisement of easy workflows. It doesn't matter whether Epic designed UE5 workflows to save money and speed up development; it does, and gamers are paying for it.
I don't see what's hurting gamers with UE workflows. Games can clearly achieve 60 fps and relatively good input lag, as shown in Fortnite
60 fps is pointless when it looks like mush, and 30 fps with amazing visuals is pointless for a game that needs interactivity. We do NOT need to sacrifice both when we already HAD both.
I don't want to be a part of this thread's "success"
No gamer or passionate developer will ever thank you for your complacency with extreme issues in modern games. Nothing about this thread is offensive or funny. I will proudly be a part of this thread if it pushes for a BETTER engine that gamers can benefit from. You clearly do not care whether that happens.
EDIT: This thread got 3 more votes in one day. It is people like you who vote that get us a better tomorrow for games. The people who vote, who take action, are 10 times more powerful than any negative, lazy comment that comes around to troll the mission and the basic logical sense this thread stands for.
I'm going to reply one last time because I need to set things straight: I'm a firm believer that Unreal Engine 5 is an awesome engine for what I do with it and for the games I play that use it. I don't care about haters or "reviewers" that try to prove me wrong with data that has no real evidence of anything going wrong. I am comfortable with how the industry is going, and I'm looking towards the future instead of looking back. I'm no professional, and I don't care about providing data; if it works, great, and if people don't like that it works, I don't care. I'm out of this thread.
Deleted comment, meant to post here:
It is true that Chaos is not very performant, but I hope they will fix this in the future. I cannot get stable 240 Hz Chaos ticks with 20 cars in my project. It's just the number of rigid bodies: in the worst case, my "simple" game has 20+ rigid bodies in the world at the same time, either cars or smackable objects. I need the high Hz for the tire and suspension simulation. It is not even nearly possible to simulate a soft tire in this engine in its current state, but future versions may make some dreams come true.
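Not a Chaos fix, just a generic workaround pattern (my own sketch, with made-up names): run the high-rate tyre/suspension math from a fixed-step accumulator of your own, decoupled from how often the physics solver actually ticks.

```cpp
// Generic fixed-step accumulator at ~240 Hz, independent of the frame rate.
// UpdateTyreAndSuspension is a hypothetical stand-in for the real model.
constexpr float kFixedStep = 1.0f / 240.0f;

void UpdateTyreAndSuspension(float Dt)
{
    // ... integrate spring/damper forces and tyre slip with step Dt ...
    (void)Dt;
}

void TickVehicleModel(float FrameDelta, float& Accumulator)
{
    Accumulator += FrameDelta;
    while (Accumulator >= kFixedStep)   // run as many fixed steps as fit this frame
    {
        UpdateTyreAndSuspension(kFixedStep);
        Accumulator -= kFixedStep;
    }
}
```

It doesn't make the solver itself cheaper, but it keeps the tyre model's update rate stable even when the engine's tick rate isn't.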
I'm hearing rumors about thread parallelization and bindless resource support coming in UE 5.4, possibly giving UE a performance boost. If we finally do get a big performance improvement, it will be a big win for this thread's goal.
Innovations that would benefit developers and players would be to have optional visibility rendering for opaque objects (since that does help performance) without any of Nanite's admitted overhead: Nanite performance is not better than LODS [TEST RESULTS]. Fix your documentation Epic. You're ruining games. - #36 by TheKJ
Straight from the creator of Nanite.
The primary reason to use lower resolution Nanite meshes is to save on disk space. The performance of lower poly meshes may not be intuitive or like you are used to. Commonly that won't improve perf. Sometimes it can even be slower. YMMV.
-
^^From Brian Karis himself, from this comment.
-
Self-accumulating effects that don't need TAA accumulation to resolve properly.
-
Better, much more intelligent LOD-detail baking workflows that use the full GPU and CPU.
-
Aggressive caching and systems that would benefit most game scenarios (such as the mostly static objects that make up a scene).
-
Updated AA methods and maybe even checkerboard rendering from here
There are plenty of plugins that slow down or disable ticking for actors that are too far away or not visible. Tick Optimization Toolkit, for example.
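For anyone who wants the gist without a plugin, here is a minimal sketch of the same idea (not the Tick Optimization Toolkit's actual code; AMyBackgroundActor and SignificanceDistance are names I made up): throttle an actor's tick when it has not been rendered recently or is far from the player.

```cpp
// Rough sketch: tick every frame only while the actor is on screen and near the
// player, otherwise drop to a 4 Hz tick. AMyBackgroundActor and SignificanceDistance
// are illustrative names, not engine types.
#include "GameFramework/Actor.h"
#include "GameFramework/Pawn.h"
#include "Kismet/GameplayStatics.h"

void AMyBackgroundActor::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	const APawn* Player    = UGameplayStatics::GetPlayerPawn(GetWorld(), 0);
	const bool   bOnScreen = WasRecentlyRendered(0.5f);   // drawn within the last 0.5 s?
	const bool   bNearby   = Player &&
		FVector::Dist(Player->GetActorLocation(), GetActorLocation()) < SignificanceDistance;

	SetActorTickInterval((bOnScreen && bNearby) ? 0.0f : 0.25f);
}
```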
Meanwhile, Robocop: Rogue City came out; it uses both Nanite and Lumen… and it looks and performs great. And this is from a small studio.
Weird, right?