Nanite Performance is Not Better than Overdraw-Focused LODs [TEST RESULTS]. Epic's Documentation is Endangering Optimization.

(post deleted by author)

I’m getting great results with Nanite, Lumen, VSM, and TAA - they look beautiful and perform well above my target framerate :slight_smile: Not sure what you’re doing wrong.

Your behavior of belittling anyone who argues against your point (or just deleting their comments) shows us all we need to know. You’re not mature enough for anyone to give any regard to what you say. All your imaginary friends and fake accounts just make it worse.

3 Likes

So why ask for almost 1 million dollars if you’ve already implemented new methods? Why not let the entire industry benefit if you’ve developed new anti-aliasing techniques that surpass what’s been done so far? I think everyone will be attentive to that and it will push the industry forward. So, when’s the talk happening?

No, you’re not the most valuable person for Epic. The issues with TAA aren’t new, and the industry doesn’t care because, in the end, it’s a very good solution that eliminates all the flickering problems that older forms of AA suffer from and covers a wide variety of geometry. If the recent introduction of PSSR hasn’t made you realize that this will remain the standard for the next generation, then you have a serious ego problem and a lack of understanding of where things are headed. You should have been against it 10 years ago, when the first temporal anti-aliasing solutions were introduced. Now it’s too late; there’s no going back, given the amount of money invested in it.

All you’re doing, in the end, is like striking a sword into the ocean: completely pointless and ineffective.

You can already start protesting against frame generation, which, with its introduction in the PS6, will without a doubt become the next standard in most games.

(post deleted by author)

There’s always problems along the way :slight_smile:

When there’s a problem - I find a solution, or an alternative way of doing it - I don’t expect the world to change just for me.

I’m just trying to help you pull your head out of your a$$ - put that talent you have to better use. Listen to what Maenavia said about it being here to stay - and it’s just going to get better and better.

Have a bit more vision for the future and accept that the evolution of technology isn’t at the rate we’d like - we need to be patient. It’s not like the engineers working on UE are just sitting around twiddling their thumbs - there’s some amazing tech coming in 5.5 in the way of GPU instancing and GPU light baking, along with Nanite skeletal meshes - the future is looking bright.

I think we all share the same dream of virtual worlds being indistinguishable from the real thing - we just need to be patient, and accept that there are a lot of people working on these things with a lot more depth of experience.

Yelling/Crying at them isn’t going to make the slightest bit of difference.

1 Like

That doesn’t prevent you from showcasing the progress made around a solution in R&D. Who knows, it might even inspire other developers, and maybe even developers at Epic.

Ten years ago, it was already the same issue: some artifacts, blurriness at lower resolutions with TXAA. Not much has changed since then.

That’s good, it’s a start, but it probably won’t change anything. Frame generation will most likely be the standard for a huge portion of games in the future, and it might even be impossible to disable. I wouldn’t be surprised if Epic introduces their own in-house solution with UE6 (in addition to Sony’s and Microsoft’s solutions).

So yes, there are improvements to be made. Personally, I play in 4K, and with or without upscaling, the temporal issues don’t cause any problems; the image is very good. But at lower resolutions, typically 1080p, it’s quite mediocre (depending on the title, but nothing extraordinary). I doubt the industry as a whole will make progress unless there’s a real game changer in anti-aliasing. And since we’re moving towards AI-based solutions (even FSR seems to be heading that way for version 4), we’re not likely to see temporal methods disappear anytime soon, especially with ray-tracing denoising, which is also temporal.

I recently played Star Wars Outlaws, which runs on Snowdrop, and all the GI and reflections are ray-traced. Even while playing in 4K with DLAA, it was blurry, whereas The Division 1 and 2 are sharp.

I also played Alan Wake 2 at launch with no RT (it doesn’t use Unreal but Remedy’s own engine), and it was extremely demanding. I even had to play on balanced mode with frame generation to get a good framerate, especially when the screen was packed with particles and other effects. People talk about Unreal and its heaviness, but all engines are like that. Most games are becoming more and more demanding.

We can also add the Frostbite engine, which is extremely demanding in Dead Space Remake. Good luck running it at native 4K at 60fps, even with an RTX 4080.

Even Naughty Dog’s engine is incredibly heavy on PC; it’s unbelievable.

You’re focusing on Unreal, but it’s no better elsewhere; you just have to deal with it.

I gotta say, I don’t know who has the issues here: the troll, or the ones feeding him.

SMH

2 Likes

Hmm, disabling Nanite gained me 10% to 25% FPS (30% compared to tessellation) with VSM on, and that’s without using it on foliage, which would be even worse. Makes me reevaluate things for the type of project I am making.

1 Like

At least the source code isn’t too long.

Have you tried building a game with Nanite, TSR, and VSMs, where you have control over the systems? I’ve recently been extremely satisfied with Nanite, Lumen, and VSMs, getting above ~94 FPS with TSR enabled at native 1440p with Nanite foliage EVERYWHERE.

I only found optimizations because I’ve been working on a game that I wanted to push some boundaries with.

Nanite isn’t very well documented, it seems. I definitely aired some frustration here and on your other thread. But as it stands right now, I am thoroughly satisfied, with even more optimizations still to be implemented. I’ve been working with it long enough to be certain that I know enough about the system to configure it properly for any project I want to develop with it. This comes from trial and error, reading source code, and watching the Unreal YouTube channel.

One thing you can’t do when working with these systems is expect the Epic devs to provide you with the perfect configuration and setup for your project. My game’s code adjusts the settings for TSR and Nanite based on the screen percentage. This allows me to avoid wasting resources on things that were increased to account for upscaling. It greatly improves native performance while maintaining the level of quality.
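For illustration, here’s a minimal sketch of that idea (the threshold and values are placeholders, not my production settings):

```cpp
#include "HAL/IConsoleManager.h"

// Sketch: relax Nanite's triangle budget at native resolution, since the
// defaults assume an upscaler will be magnifying the extra detail.
static void ApplyResolutionScaledSettings()
{
	IConsoleManager& CVarManager = IConsoleManager::Get();

	IConsoleVariable* ScreenPercentage =
		CVarManager.FindConsoleVariable(TEXT("r.ScreenPercentage"));
	IConsoleVariable* MaxPixelsPerEdge =
		CVarManager.FindConsoleVariable(TEXT("r.Nanite.MaxPixelsPerEdge"));
	if (!ScreenPercentage || !MaxPixelsPerEdge)
	{
		return;
	}

	// At 100% screen percentage there is no upscale pass to feed, so a
	// coarser pixels-per-edge target reclaims GPU time with little visible cost.
	const bool bNativeResolution = ScreenPercentage->GetFloat() >= 100.0f;
	MaxPixelsPerEdge->Set(bNativeResolution ? 2.0f : 1.0f);
}
```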

I also greatly reduced my Nanite memory-usage CVars from the default settings, and enabled or disabled certain functionality where needed to improve performance.

What it all boils down to is understanding that:

  • Nanite isn’t a one-click solution. It requires you to tailor your level assets to fit with Nanite, as well as configuring the internal systems through CVars so that the engine can render YOUR assets efficiently.
  • Lumen isn’t a one-click solution. It requires you to configure the scalability settings to fit your project’s requirements (see the sketch after this list).
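As a rough sketch of what configuring those scalability settings can look like in code (the quality level passed in is whatever your project calls for, not a recommendation):

```cpp
#include "HAL/IConsoleManager.h"

// Sketch: Lumen GI and Lumen reflections respond to the standard
// scalability groups (0 = low ... 3 = epic, 4 = cinematic).
static void SetLumenQuality(int32 QualityLevel)
{
	if (IConsoleVariable* GIQuality = IConsoleManager::Get()
			.FindConsoleVariable(TEXT("sg.GlobalIlluminationQuality")))
	{
		GIQuality->Set(QualityLevel);
	}
	if (IConsoleVariable* ReflectionQuality = IConsoleManager::Get()
			.FindConsoleVariable(TEXT("sg.ReflectionQuality")))
	{
		ReflectionQuality->Set(QualityLevel);
	}
}
```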

All in all, developing a game is the only real way to learn what you can expect from Nanite, Lumen, VSMs, and TSR.

EDIT: Forgot to mention that Epic introduced a way to scale MaxPixelsPerEdge on a per-static-mesh level. This allowed me to greatly decrease the settings on some assets while maintaining higher quality on more important ones.

4 Likes

And who’s to say I’m not already working on it?

Okay, yes, please let us know when your game engine is finished. We are dying to know, mostly because it means we won’t hear further complaints from you.

I’ve been studying game production for more than 5 years and am far from being a kid. I don’t need 10 years of experience to tell if systems are COMPLETE botchery. It’s people like Brian Karis, with those “years of experience,” who brought the insanely smeary TAA into Unreal and excused the massive ghosting in the scenes that revealed his crap.

While I’m at it: I don’t NEED 10 years of experience to tell you that AI isn’t needed for BASIC image quality. I don’t need 10 years of experience to tell you that Death Stranding looks WORSE because it had an engine downgrade in the AA department.

So like…don’t use it? lmao
And TAA doesn’t use AI…
here you are again, complaining about all these OPTIONAL features. Do you understand what a game engine is? It’s a blank slate. It’s a canvas for you to create your game with. It provides tons of OPTIONAL features that you can use, or not use, however you see fit for your specific use case. You aren’t required to use Nanite, TAA, or DLSS, and if you choose to forgo them, you’d be in the exact same shape as you were before they were introduced.

And we shouldn’t have SYSTEMS in Unreal DEPEND on a system that is massively slower.

So turn TAA off if you don’t want it. Easy fix.
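For reference, it really is a one-liner at runtime (a sketch using UE5’s r.AntiAliasingMethod values: 0 = none, 1 = FXAA, 2 = TAA, 3 = MSAA, 4 = TSR):

```cpp
#include "HAL/IConsoleManager.h"

// Sketch: switch the anti-aliasing method off entirely at runtime.
static void DisableAntiAliasing()
{
	if (IConsoleVariable* AAMethod = IConsoleManager::Get()
			.FindConsoleVariable(TEXT("r.AntiAliasingMethod")))
	{
		AAMethod->Set(0); // 0 = no anti-aliasing at all
	}
}
```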

I’m the most valuable person Epic has atm because I’m not here to CODDLE the BS mess they’re causing

:rofl: :rofl: :rofl: :rofl: :rofl:

1 Like

Maybe I have this wrong. Does Nanite in general help with draw calls so much that you don’t need to instance assets when you’re using it? Thanks.

To the best of my knowledge, and based on Epic’s Unreal educational videos, you still want to instance your meshes. “Nanite is super-good at instancing.”
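To make that concrete, here’s a minimal sketch of instanced placement (the mesh and grid layout are hypothetical; a Nanite-enabled mesh goes through the same instancing API as any other static mesh):

```cpp
#include "Components/InstancedStaticMeshComponent.h"
#include "Engine/StaticMesh.h"
#include "GameFramework/Actor.h"

// Sketch: one component, many transforms. The grid shares a single mesh
// resource, and Nanite culls and refines detail per instance.
static void SpawnInstancedGrid(AActor* Owner, UStaticMesh* NaniteMesh)
{
	UInstancedStaticMeshComponent* Instances =
		NewObject<UInstancedStaticMeshComponent>(Owner);
	Instances->SetStaticMesh(NaniteMesh);
	Instances->SetupAttachment(Owner->GetRootComponent());
	Instances->RegisterComponent();

	for (int32 X = 0; X < 10; ++X)
	{
		for (int32 Y = 0; Y < 10; ++Y)
		{
			Instances->AddInstance(FTransform(FVector(X * 500.0f, Y * 500.0f, 0.0f)));
		}
	}
}
```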

Do we still need to virtualize textures? Do I still need master materials? Is reducing draw calls still a thing? Do I still need to pack UVs? Should I still merge meshes into static-mesh chunks in the environment? I’m sure all these methods still exist, but are they worth the time with Nanite and Lumen now a thing?

According to this guy: https://www.youtube.com/watch?v=dj4kNnj4FAQ

The answer is yes. Virtual textures, Lumen, and Nanite are meant to play together.

You don’t NEED to, but it would be better for performance to use virtualization. As far as materials go, you can still use master materials; Nanite doesn’t change that paradigm, but Substrate might when it becomes standard.

For now, you can still use your ‘regular’ material tech on Nanite with the usual caveats around WPO, etc.

As far as merging goes, you should instead build Assemblies. This is basically merging for Nanite, since it can instance the entire assembly, and you can edit the assembly directly post-creation, so it seems ‘better’ on that point.

Nanite is really about capping screen cost so that triangle count can grow without growing the cost. Eventually, the Nanite clustering will cost you just about a screen’s worth of detail, and whatever you throw into the scene will be made to work within that constraint. It costs you more up front, but ultimately tops out to cost you less, much like a fixed cost (which is insane when you think about it).

Also, Nanite is for perfect LODing, or as close as you can get. You don’t need to manually define LODs; the tech instead progressively decimates the mesh as you get farther and farther away. This saves a lot of time.
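As an example of the time saved, the whole manual LOD-chain step collapses to flipping one flag on the mesh (an editor-only sketch; the exact rebuild hooks vary by engine version):

```cpp
#include "Engine/StaticMesh.h"

// Sketch (editor-only): enable Nanite on a static mesh instead of
// authoring LOD0..LODn by hand; the rebuild generates the cluster hierarchy.
static void EnableNanite(UStaticMesh* Mesh)
{
	Mesh->NaniteSettings.bEnabled = true;
	Mesh->PostEditChange();   // kicks off the Nanite build in the editor
	Mesh->MarkPackageDirty(); // so the change gets saved
}
```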

2 Likes

I’ll be using Substrate on my project. So master materials still help with performance and are necessary, got it. What are Assemblies exactly? Most of my assets are coming from ZBrush.

(post deleted by author)

I found a neat Greasemonkey script: Greasemonkey script for phpBB boards that hides posts of certain users · GitHub

@TheKJ I literally provided you with a real game benchmark using Nanite, where we reached over 180 FPS using photogrammetry assets and 8K textures.

That was (to my knowledge) the only valid benchmark in this whole thread.
I also talked about why TAA is a choice nowadays and is redundant with the use of upscalers, as these will anti-alias the image automatically.

Yet you still fall back to the same arguments every time this thread continues.

Multiple people have pointed out that they are working with Nanite in Production and are benefiting greatly from it.

At this point the whole thread (like you said) is just people disagreeing with you and you defending yourself.

Why don’t we just agree that it is just a skill issue on your side and move on?
Because obviously your opinion is in the far, far minority…

3 Likes

That’s amazing! I’d love to know more about the project and what you all did for performance optimizations. I’ve been using all of the big features in UE5 at native 1440p and got above ~94 FPS with loads of Nanite foliage on screen, cars, etc. I can push over 100 FPS at native 1440p with TAA set as the AA method, which is on par with what I see in RDR2. I have yet to finalize my native-resolution settings for TSR and expect to push it even further.

Specs: Ryzen 5950X and a 3090 Ti.

My mentality on all of the new systems in UE5 was completely wrong out of the gate. I didn’t realize Epic cranked everything up to 300% out of the box. I knew Nanite needed to be configured, but I felt there weren’t that many options available. So wrong. I found ways to configure it similarly to VTs.

I’m still working on finalizing performance optimizations for Nanite, Lumen, and TSR. There were some things that could have been improved, and Epic has started on that by allowing control over ‘r.Nanite.MaxPixelsPerEdge’ per static mesh in 5.4.

I would love to know what you’re doing. A few other developers and I have been discussing different tricks and findings for optimizations with Nanite & Lumen here.

3 Likes