Nanite foliage for open world

You ok there?
Despite everything you said, there are plenty of high-performing UE4/5 games already out there.
If you don’t have the skills and knowledge to make your projects performant, might I suggest you lower your expectations of what you should be making on your own?
At least in the AAA space we have in-house render programmers and engineers helping us solve these problems as they come up, and we often have to build custom tech here and there, but the engine still handles 90%+ of all our needs, which beats making our own engine any day.

Epic is very concerned with performance and user experience, but a lot of UE5’s shiny new toys are still EXPERIMENTAL. You might want to look into what that means.

At UE Fest 2024 some Epic devs I talked to mentioned they are actually looking into a new Tree asset type that would handle the common pitfalls of Nanite foliage and foliage movement, which seems really promising. They understand that foliage is currently a big unknown for many studios.

But people, you’re more than welcome to not jump on the new experimental features right away and to continue working with the old proven methods. Foliage with LODs and impostors still works!

Epic is not concerned with performance at all.
If you think otherwise you are either new to the engine (400 or so posts, that’s probably it), drank too much of their Kool-Aid, or are just flat-out delusional.

The statement is a fact.

Just like it’s a fact that Epic steals other developers’ ideas and makes a profit off of them.

Just like it’s a fact that Epic would rather sue little kids (minors) for cheating at Fortnite than patch the holes they cheat with.

Just like it’s a fact that any new shiny thing they release is worked on for about 5 minutes, then dropped and left in beta - case in point, relevant to this topic: Procedural Foliage.

This engine has zero to offer anyone when it comes to performance.

And if you had tried any other engine whose developers are factually concerned about their engine’s usability and performance, you would know/realize as much.

Again, look at company actions;
Not their ■■■■■■■■ press statements and faked talk footage.

And yes, obviously anyone using beta stuff is at fault for being stupid if they expect performance out of it - **but the engine just doesn’t perform, and has decayed into an amazing sh*t show over the past 4 to 5 years**.

Go back to 4.18 and benchmark a scene.
Upgrade to .22 and run the same benchmark.
Move to .25 and experience yet another 25% fps drop.

That’s without even touching the total sh*t show that UE5 (which is released but is really just a beta) brings into play with Nanite, Lumen, and all the other BS they came up with that looks good on paper but is a completely failed execution.

On top of that, remember that you lose about 80% of CPU performance just because they switched from NVIDIA PhysX to their custom sh*t show aptly called Chaos.

And regarding high performing games - just no.
Go have a look at what The Outer Worlds had to do to become playable.

High performance means 224fps at 4K native - at a bare minimum.
Anything short of that is OK performance at best.
And before you say anything silly like that’s an impossible goal, remember that other engines have done this just fine, in 3D, going back nearly 10 years.

Now, if you want to argue that games made with UE4 generally have better-looking graphics - maybe - but it’s hard to make or validate that claim when Crysis is still the benchmark, and its equivalent made in Unreal Engine would run at around 10fps after optimization.

Again, as a TL;DR:
Can one optimize Unreal Engine? Somewhat.
You still won’t get anywhere near any engine that is actually minimally concerned with performance.

And no, this post is not off topic you trolls.
@Mind-Brain please ensure they get dinged for abusing the flag system.

Obviously performance will increase over time (as hardware increases its capacity in general) so that the quality and limits can be improved… This is a random rant in a post that was meant for foliage optimization in a large open world…

No. It won’t. It hasn’t so far. And there’s only so much you can do given an x64 architecture and the bus speed of the GPU, but obviously you don’t know the first thing about anything at all.

And no, it’s not a random rant. It’s a direct answer to a direct question about optimization, which is very much on topic.
Unlike your post, which adds literally nothing to the discussion and is in fact off topic.

I’ve been around for 10+ years working with Unreal, mostly in AAA. Not sure how not being super active on the forums equates to my experience with the engine, though.

All your complaining sounds a lot like you expect things to work automatically and without effort. If you think other engines do that, and have for years, then why are you even using Unreal? You’re just not making any sense.

I’ve optimized content and scenes for a bunch of big AAA games, and I wouldn’t say everything magically works - you really have to know what to do to get the most out of the engine, and there are hundreds if not thousands of values/parameters/settings to tweak - but you can optimize scenes just fine if you know what you’re doing. Probably with some compromise, sure, but you can.

I stand by what I said before: if you can’t optimize a game like a AAA studio can, just aim lower and be thankful you didn’t have to build your own engine from scratch.


I would strongly recommend not continuing to engage with the poster you’re replying to, and putting them on your ignore list. If you can’t tell just from the posts in this thread, they’re both spectacularly ignorant about the games industry (‘there are no AAA games studios in Europe’? That’s got to be news to Rockstar, Remedy, IO Interactive, Ninja Theory, CD Projekt, DICE, roughly half of Ubisoft, and half a dozen others) and they don’t post anything helpful or constructive.


Actually, you can optimize a game way better than an AAA studio can, if you know enough to pull from source, build from source, and discard all the Epic-made trash that doesn’t work.
It just takes you longer on account of having less manpower.

How does that have anything at all to do with both: the reach of the average user, and optimizing unreal engine?
Particularly when what you have to do to get performance is to effectively throw out most of the engine?

Again: use better engines that actually do care about performance, like CryEngine for instance, and that’s that.
Epic does not care. They won’t change the way they conduct themselves. If anything they’ll get worse, as they amply demonstrated by stealing even more developer ideas after getting sued and settling the first time.

Also, quite honestly, as someone who lives in Torino I would really love to know what AAA titles you were involved in.
I’m not dismissing your claim out of hand - but yeah, in practice I am.
The titles Europe makes are never AAA, with minor and rare exceptions.
Italy has nothing that even resembles B titles being released, so there’s that.

And the troll below your reply:
First of all, you are the ignorant one.
Or else you wouldn’t be begging for scraps of knowledge in every other thread AND/OR posting BS about stuff you don’t know anything about and misleading forum users.
Heck, if I had the patience to flag all your posts you’d probably have been banned already.

And lastly, to everyone.
Get back on topic. Aside from praising Epic like sycophants and claiming they’re great, you are really just detracting value from the topic at this point.

If you happen to have actual tips that were not already listed above do share them.
I doubt you will, but you never know.


For a commercial game engine, there’s nothing wrong with using gimmicks to attract more users, but for a new team like ours, it’s easy to put too much faith in the rhetoric.

Lumen and Nanite caused a huge disaster for our team’s project management and we didn’t know until it was too late that something was wrong.

I think this problem has two causes. On the one hand, Unreal Engine’s demos raised our expectations of the visuals, but we did not have an engine team with the technical ability to match them. On the other hand, UE launched these experimental technologies too early, and a development team that adopts these immature technologies will find it difficult to maintain their extended version of the engine. Especially if you run into bugs, you either have to wait for an official fix, or you spend a lot of time and effort fixing the problem yourself, which will be wasted come the next engine release.

In recent years, I feel that Unreal Engine has gradually shifted its focus to the movie and virtual production industry; on the game side, it’s more about taking care of the business needs of Fortnite. On top of that, the Epic team has a huge community to manage, so it’s hard for them to keep up with all the technical details.

Dude, I’ve been following you, it feels like we’re having the same experience with Unreal Engine 5, lol, it feels like we both have a love-hate relationship with it.

Forgive me, my English is very poor, these words are generated by translation software.

To help more people avoid this kind of problem, I wrote this document:


If anyone has tips for foliage optimization in large open worlds, I would be really happy to hear them :smiley:

I don’t think there is much more than what was shared already.

Except for maybe the obvious:
Adapt or change the landscape so as to have less visible foliage. Canyons/high buildings/view blockers of any type including rocks/mountains.

Also,
Dead/leafless trees cost less - use more dead trees.

Since unreal is incapable of handling lush landscapes with decent performance, don’t have lush landscapes… as silly as it may seem.

After the basic changes:

You want to test/bench your worst-case scenario (max # of landscapes - 4, max foliage, max overdraw, max tri count, etc.) on your lowest-spec supported device.
For this, you need to make sure your scalability is at the lowest, and that your materials - and perhaps even instance counts (on things like grass) - also take scalability into consideration.
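One common way to tie foliage/grass density to scalability is through the engine’s scalability config. `foliage.DensityScale` and `grass.DensityScale` are real UE console variables, and the `[FoliageQuality@N]` section naming follows the convention used in the engine’s BaseScalability.ini, but the specific values below are purely illustrative, not Epic’s defaults:

```ini
; DefaultScalability.ini (project-level override) - illustrative values only.
; Each section maps to one Foliage Quality level (0 = Low ... 3 = Epic).

[FoliageQuality@0]
foliage.DensityScale=0.25
grass.DensityScale=0.25

[FoliageQuality@1]
foliage.DensityScale=0.5
grass.DensityScale=0.5

[FoliageQuality@2]
foliage.DensityScale=0.75
grass.DensityScale=0.75

[FoliageQuality@3]
foliage.DensityScale=1.0
grass.DensityScale=1.0
```

With something like this in place, dropping the Foliage Quality scalability group on low-end devices thins out painted foliage and landscape grass automatically, instead of requiring separate content per spec.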

Once that runs OK-ish, you then need to benchmark your worst possible scenario, at the highest resolution you really want to support (hint: with Unreal it’s not 4K), on the device with the max specs you want to support.

Based on that benchmark, you will then take the instance count down until the rendering ms cost of the full base scene leaves you roughly 10 frames per second of headroom above your acceptable target.
That extra frame time will come in handy later, when you add stuff you didn’t anticipate needing, or introduce some gameplay gimmick of some sort that eats up FPS.
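To make the “10 fps of headroom” rule concrete as a frame-time budget, here is a minimal sketch; the 60 fps target and the 10 fps headroom figure are assumptions for the example, and `frame_budget_ms` is just an illustrative helper name:

```python
def frame_budget_ms(target_fps: float, headroom_fps: float = 10.0) -> float:
    """Frame-time budget in milliseconds for the full base scene,
    benchmarked against target_fps plus some fps of headroom for
    features you add later."""
    return 1000.0 / (target_fps + headroom_fps)

# At a 60 fps target with 10 fps of headroom, you tune instance counts
# until the base scene renders in about 14.3 ms per frame,
# not the 16.7 ms that 60 fps alone would allow.
print(round(frame_budget_ms(60.0), 1))  # 14.3
```

The point of budgeting in milliseconds rather than fps is that per-feature costs (foliage, shadows, post) add up linearly in frame time, not in frames per second.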

Ideally, this will only benefit the lower-performing setup you already got running somewhat decently. But you probably want to check and make sure rather than take it on faith.

That’s really all the stuff there is to mention about it - save for some newer engine trash or settings I’m not privy to.

Also worth noting that this testing/bench process would never change regardless of what engine you decide to use…
