Saying a game looks good with Path Tracing while turning into mush is a contradiction.
Alright, well, the only games I’ve seen using Path Tracing are insanely high quality, and I don’t see what mush you’re talking about. I’m specifically talking about Alan Wake 2 and the new Cyberpunk.
Epic is not the only one doing this, but they are definitely lending strength to this horrible future of games.
I feel like you are just hating on the technology. It has issues, YES, but the issues you are pointing out are irrelevant.
We have enough GPU power to go over 300 polys but not enough for 5 million, and/or for Nanite pretending to solve the 5 million tris.
But we do have enough power for over a billion. Nanite is not pretending to solve over 5M polygons, it actually solves it, and I don’t see what’s wrong with that.
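To put rough numbers on that (a back-of-envelope sketch, not a benchmark): with a cluster-LOD renderer, the triangles actually rasterized track the pixels on screen rather than the source triangle count, which is why a billion authored tris can still be real-time. All figures below are illustrative only:

```cpp
// Back-of-envelope: at the limit of roughly one triangle per pixel,
// the drawn triangle count is a tiny fraction of what was authored,
// regardless of how heavy the source scene is.
#include <cstdio>

int main() {
    const long long sourceTris = 1'000'000'000; // scene as authored
    struct Mode { const char* name; int w, h; };
    const Mode modes[] = { {"1080p", 1920, 1080},
                           {"1440p", 2560, 1440},
                           {"4K",    3840, 2160} };
    for (const Mode& m : modes) {
        long long pixels = (long long)m.w * m.h;
        std::printf("%-6s: %10lld pixels -> ~%lld tris drawn (%.3f%% of source)\n",
                    m.name, pixels, pixels,
                    100.0 * (double)pixels / (double)sourceTris);
    }
}
```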
We need basic logic on each asset: if it’s small, it shouldn’t need 5 million tris; we need to compute just enough geometric detail and let textures/shaders take care of the rest.
Again, Nanite does just that: it renders only what’s needed. Of course a pebble doesn’t need a million tris, but with Nanite it can have just enough geometry to avoid using OLD, DEPRECATED texturing and expensive shaders.
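For what it’s worth, here’s a toy illustration of that “just enough geometry” idea: pick the coarsest LOD whose simplification error projects to under one pixel on screen. This is only the general principle, not Nanite’s actual per-cluster implementation, and all the numbers (the pebble’s LODs, FOV, resolution) are made up for the example:

```cpp
// Toy sketch of "render only what's needed": choose the coarsest LOD
// whose geometric error stays below ~1 pixel once projected. Nanite
// does this per cluster, with streaming and culling on top; this is
// just the core idea.
#include <cmath>
#include <cstdio>
#include <vector>

struct Lod {
    int   triangles;  // triangle count of this LOD
    float worldError; // max simplification error in world units
};

// Project a world-space error at 'distance' into pixels, given a
// vertical FOV in radians and the screen height in pixels.
float ErrorInPixels(float worldError, float distance, float fovY, float screenHeight) {
    float pixelsPerUnit = screenHeight / (2.0f * distance * std::tan(fovY * 0.5f));
    return worldError * pixelsPerUnit;
}

// Walk from coarse to fine and stop at the first LOD that is
// visually indistinguishable (projected error under one pixel).
const Lod& PickLod(const std::vector<Lod>& lods, float distance, float fovY, float screenHeight) {
    for (const Lod& lod : lods)
        if (ErrorInPixels(lod.worldError, distance, fovY, screenHeight) < 1.0f)
            return lod;
    return lods.back(); // fall back to the finest LOD
}

int main() {
    // A hypothetical pebble: three LODs, coarse to fine.
    std::vector<Lod> pebble = { {20, 0.05f}, {200, 0.005f}, {2000, 0.0005f} };
    const float distances[] = {0.5f, 5.0f, 50.0f};
    for (float d : distances)
        std::printf("distance %5.1f -> %4d tris\n",
                    d, PickLod(pebble, d, 1.0f, 1080.0f).triangles);
}
```

Up close the pebble gets its 2000 tris; ten meters out it drops to 200, then 20, and nobody can tell the difference.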
UE’s newest designs were made for lazy, cheap studios.
I guess that proves my point: you’re just a hater. UE’s newest designs are meant for films, virtual production, and new-generation games, and the studios that use them are studios that chose the future instead of looking at the past. I don’t think AAA studios are moving to Unreal just because they are lazy; it’s economically a better choice, the technologies available are great, and they improve visual fidelity as well as development time. Having easier tools is not laziness, it’s evolution.
But they shouldn’t make workflows that HURT and deprive gamers of BASIC standards like clarity during gameplay, reasonable input lag, and 60 fps.
I don’t see how UE workflows are hurting gamers. Games can clearly achieve 60 fps and reasonably good input lag, as shown by Fortnite, and the UE5 games I know of are mostly non-competitive cinematic games that run well while using some of the engine’s big features.
Currently, UE5 is perfectly capable of providing 60 fps with reasonable upscaling and relatively unnoticeable blurriness on “cheap” hardware, and I only see it improving in the future.
I honestly don’t know what to say; there’s no right or wrong here. UE5 is mostly production ready, and I’ve been using it for production since the very first Early Access. The technologies available (Lumen, Nanite, VSM, and TSR) work fine and can provide a 60 fps experience with some tweaking. Yes, there are some issues: Nanite on masked materials is quite expensive, and VSMs are also very expensive with something like a time-of-day system, but on my current hardware (RTX 4070) I’m not noticing a huge fps or ms impact. There is room for improvement, but saying it’s “pretending” to handle billions of tris at 60 fps is just refusing to see the truth.
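When I say “with some tweaking,” the budgeting behind it is simple arithmetic: 60 fps leaves a 16.67 ms frame budget, and a temporal upscaler like TSR cuts shaded pixels by the square of the screen percentage, since UE scales each axis by it. A minimal sketch with example values, not measurements from any particular game:

```cpp
// Frame-budget arithmetic for 60 fps at 4K output with upscaling.
// The screen percentages here are examples only.
#include <cstdio>

int main() {
    const double budgetMs = 1000.0 / 60.0; // per-frame budget at 60 fps
    const int outW = 3840, outH = 2160;    // 4K output resolution
    const double percents[] = {100.0, 67.0, 50.0};
    for (double sp : percents) {
        int inW = (int)(outW * sp / 100.0);
        int inH = (int)(outH * sp / 100.0);
        // Fraction of native shading work: the percentage applies
        // per axis, so pixel cost scales with its square.
        double pixelCost = (sp / 100.0) * (sp / 100.0);
        std::printf("screen pct %5.1f%%: render %4dx%4d (~%4.1f%% of native pixels), "
                    "budget %.2f ms\n", sp, inW, inH, pixelCost * 100.0, budgetMs);
    }
}
```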
I don’t know what you expect from your complaints, but honestly I don’t care. I originally just replied for fun because reading the entire thing was funny, but I only see it looping over and over again, and I don’t want to be a part of this thread’s “success”.
I’m using UE5 as it is, and I have no issues with it currently, so I’ll leave you and all the other people complaining about the engine to talk about how wrong we are.