There definitely needs to be a popcorn eating emoji on these forums as a result of this thread.
I don't think your eating popcorn benefits this thread in any way.
Thanks! I know that culling is done automatically in UE; it just would be interesting to visualize this and check whether it depends on further factors that could be improved manually.
With the recent versions, landscape performance has already improved. This was mostly due to some fixes to tessellation, but the result is “ok” so far on my landscapes.
Can't wait to test with … 4.17 I guess? (stop video).
As said by , you can use the freezerendering console command coupled with the VisualizeOccludedPrimitives debug feature.
I don’t recall any fixes regarding tessellation, apart from change not to render subsequent landscape LODs with it.
I still see a pretty noticeable and unexpected bottleneck somewhere between the vertex factory and the hull shader.
I went ahead and made a short video explaining the idea a little further because I didn’t feel the post really explained it that well:
your first post explained it properly. what you suggest is to bias the mips based on the layer weight.
to an extent this can already be done now. all you need is to change the texture sampler's mip settings to mip bias, and then hook the value of the landscape layer weight to it with some factoring/multiplying/etc.
I would even question the visual result. I would really suspect that blurry/pixelated textures will show through rather obviously even if it's only 10-20% blended; these things tend to be eye-catching in a negative way.
also this whole thing comes with some limitations:
- biasing the mips disables anisotropic filtering. i.e. makes your textures very noisy at grazing angles.
- biasing the mips by a percentage (based on layer weight) requires knowing how many mips are available (basically ‘detecting’ the texture resolution), which I believe isn't possible in an automatic way.
- this is something that happens in the shader itself, but landscape-specific behavior doesn't fit the general-case usage of the material editor. i.e. the material doesn't know that you're making a landscape and doesn't know that you want to activate this feature (and if so, in which specific samplers, which are tied to which specific landscape layers)
so technically I don’t think this suggestion is viable in the current state of things.
It is pointless to post a 5-minute video explaining an idea that can be laid out in 5-7 words.
Besides, any kind of suggestion should be backed up with something other than personal experience.
- Do you happen to know any shipped titles, confirmed to utilize this method?
If above is no,
- Can you reference a publication with the above-mentioned method?
If above is no,
- Do you have some sort of reliable metrics to prove that suggestion is viable?
If above is no, you should go back to the drawing board and think of something better.
I'd state that of everything discussed in this thread, your suggestion delivers the least performance gain while introducing the most noticeable visual deterioration.
Yeah, I'm well aware that most of it can be done in hacky ways through materials, which I've already been using for quite some time, and it works great. You have to sample the landscape, perform some math to break it up into 0-1-2-3-etc. mips, and then feed those into the textures. You also need to make sure the textures are set to use relative mip bias. Here is an example of the mip filtering math:
What I'm proposing would be done through actual engine code. It would need some new landscape-specific features added for analyzing the landscape and prebaking the numbers out to help the materials. This way it scales better and you don't have to hand-pull that SAME information PER layer of a material.
Actually, you’d be really surprised. There is next to no real noticeable fidelity hit when going from a 4k to a 2k texture, when it’s only at 50% opacity (unless you’ve been made aware of the trickery and are specifically looking out for it).
Hence why it would need to be implemented into the actual engine… And you wouldn't do it by a percentage like that. You would posterize the visibility alpha into discrete steps and make sure the texture is set to mip bias relative (not absolute). So with the posterized alpha, 1 = mip 0, 0.75 = mip 1, etc. Obviously, if you're already at your last mip, it's not going to go any lower. See the example filter from above.
Using that same logic, then why do we have landscape specific nodes and why are there a ton of different material domains, blend modes and shading models? Case specifics get added in…
You guys crack me up with the negative Nancy attitudes
that’s fine and dandy when you use 4k textures. case in point would be using a 2k texture and seeing it as a 512 at 25% opacity.
and this is all relative to the tiling you use on your textures. the entire scenario changes if your 4k texture is tiled to 1 meter or say, 10 meters.
also if you blend 2 4k textures and at 50% opacity they look good as 2k, you most likely don’t need them to be 4k in the first place. this directly ties to the point above (tiling)
easier said than done. I’d suspect some API or hardware limitation to why anisotropic filtering gets disabled in the first place. I might be wrong but I don’t think it’s trivial.
both the MipLevel (absolute) and MipBias (relative) disable anisotropic filtering. I’m curious what your examples look like at grazing angles on far distance.
you can play with the mips all you want and in fact these numbers kinda reinforce my previous point. if at 75% opacity your 50% downscaled texture looks good, you probably don’t need the 100% scale in the first place.
also notice that if you posterize the values you (probably) would get hard lines, which would become more obvious as you pan the camera across the surface
yes you have a good point. we would need a TextureSamplerBasedOnLandscapeWeight node with inputs for mip and mip bias, where you also have to specify the landscape layer it belongs to. maybe also a bias factor because I'd probably want a different behavior than yours.
it’s not negative. I simply refuse to ignore the downsides to it and question the benefit. that’s all.
since you already have a working implementation, it would still be nice if you could provide performance comparisons
This is a very interesting flamewar, and thanks to it I am now more aware of the issues in UE's terrain implementation. Thanks! Here is the point of view of a guy from a different engine:
Just for comparison, I have 22 2048-px terrain layers in Unigine; the engine uses texture arrays (all diffuse textures are in one atlas, all normal maps in a second, etc.). I can blend any texture from the array with the terrain texture (which is very useful for, for example, grass variation; you can save A LOT of textures). I think there is a limit of blending a maximum of 4 textures at once, but in reality you never need more. There is also a texture LOD system, so any layer can be displayed from a defined distance. Textures can height-blend and have displacement. The performance impact is quite small; my guess is circa 4% down per layer. In theory it would also be possible to make a megatexture, split it into pieces and put them on terrain chunks (the terrain is split into smaller parts with independent diffuse textures, which are generated during import from the terrain texture), but we never had a need for it. And this terrain supports a round earth and can be infinite. There are also limitations: the shaders are hardcoded, so you can't have fancy effects on terrain textures. No UV scroll, etc., unless you want to write your own shaders. Anyway, this is proof that it is technically possible to make a performant terrain solution without virtual texturing.
Some screens/videos: https://developer.unigine.com/
Virtual texturing is not salvation; actually I see the benefits of virtual texturing more for standard assets than terrains, because you can achieve very good results with a layered terrain too. The benefit is IMHO somewhere else: you can have all textures from one location in one texture, which, together with proper rendering optimization, can save a huge amount of performance, memory and loading time. Such a solution was written many years ago as a Unity3d plugin, but now it is almost forgotten (probably the best Unity plugin ever). Maybe you can contact this guy and bribe him to port it to UE4; it was heavily optimized C++ code and in theory could be connected to any engine with a reasonable API (render to texture, etc.): http://amplify.pt/unity/amplify-texture-2/
My advice is not to focus on the terrain much; it is better to cover the terrain with vegetation, stones, decals, whatever, and then you don't have to care about the terrain much. For example, covering the terrain with grass looks great and you really don't have to care about the quality of the grass textures.
Just my few cents.
If you're talking about procedurally distributed meshes, well, that's another dead end; we can't even do that without major frame-rate drops.
https://answers.unrealengine.com/que…ance-drop.html
77 votes, and the target fix just keeps getting postponed.
As an artist, I'm not going to be painting each and every foliage asset manually over tens of kilometers; I've got better things to do. I really hope they finally fix that in 4.19.
UE4 will also get texture arrays soon, Epic said 4.19: https://github.com/EpicGames/UnrealEngine/pull/2340
Anyone interested in having them in 4.19 should probably comment there and remind Epic about it, 4.19 isn’t far away any more.
This.
Epic said a lot in their roadmap that things will come “soon”. I wouldn’t get my hopes up until it’s actually in the release, especially considering that “open world” isn’t on their priority list at all.
I really hope we can enjoy built-in texture arrays, as well as a tool to make them in-editor from ordinary textures, hopefully before the Sun burns out
There are some other landscape improvements on the horizon too, allegedly expected for 4.19
Indeed, it is a no-go to paint even 1 square km with basic foliage by hand. Also, performance would be terrible: with thousands of objects in the scene there would be so much CPU overhead that the framerate would soon be zero (the CPU must decide what to render, etc.). Btw, procedural placement should have even better performance than manual placement (for big numbers of objects). That is because you don't store all these instances in the scene all the time; you create them only when visible. So you can have millions of objects in the scene with cost only for what you see. Also there is usually some kind of batching, so these “clutters” are quite cheap.
But I think it is always about what an engine is focused on. Unigine can do open worlds flawlessly, but its FPS features are several years behind CE or UE. CE/UE have great FPS features, but trouble with massive worlds. You probably can't have both without sacrificing something else…
All Epic needs to do is to focus on open world for 1 or 2 engine updates. They will enter their golden age afterwards.
Epic will also need to create their own games which utilize big, open world, dynamic lighting etc - otherwise those features can be integrated (PR) but they will never be properly supported.
This is a typical Unity3d issue. Adding millions of features; on paper it can do everything, but in reality nothing works.
at least Fortnite Battle Royale is picking up some traction, meaning we get some updates to open world and networking (as seen in that new blog post). it's not much but it's something
An Corp. Interesting to visit this thread 4 years later. Nowadays Unigine supports a 20-layer terrain (each layer can have up to 4 textures with an individual blend mask and is automatically tessellated; no normal maps by design). Such a terrain can be technically limitless, and an empty one runs at around 200-300 FPS with a 16k height map on a 1080 Ti (no virtual textures), no matter what blending you do. Light years ahead of Unity, CE and UE.
But I am curious what has changed on the UE side. Virtual texturing is here (though I am not sure it can help for huge terrains; the resolution would probably be too low, at least that is the result of my quick test on a 20*20 km terrain), and UE5 also supports texture arrays (already checked; seems to work fine). Has anybody tried to leverage these for terrain, or to solve the issues mentioned above?