Tessellation deprecated?

I just heard about this today and did some experiments with it; it seems to work pretty well. I found a few issues here and there, but it's definitely better than the current version of tessellation.


The virtual heightfield mesh might even stick around for UE5, as it'll be more compressible than Nanite. Geometry is far more compressible than textures, so I wouldn't worry overly about 100 MB+ meshes; they already said over a year ago that there's compression (I don't know about import times, though). But I can see an entire terrain being a problem for file sizes, so…

Yeah! That’s my issue as well. I think they acquired a compression company in the past, and they released some of that work recently. So we can hope Nanite turns out to be something interesting.

I think they are using Oodle for that. They are using the current master branch on GitHub. I’m going to test that in a couple of days.

I tried to import some high-density meshes in UE4 (100 MB to 6 GB). Import took longer and longer as the size/density increased, and finally it crashed without saying anything :smiley:

Anyway, I hope UE5 has a much more efficient FBX (or similar) import/streaming solution. Otherwise we’ll have to wait ages to import meshes.

I really don’t see why this would be the case, but perhaps you know more than I do.

Epic stated that 1M triangles with a single UV channel is roughly equivalent in size to a 4K normal map. That isn’t terrible, but it also isn’t very good when you consider that a 4K normal map at 80% UV usage can contain normal data from ~13.4M triangles at a density of one triangle per pixel.

The statue in the UE5 demo supposedly had over 33 million triangles. The demo was made before the Oodle acquisition, and assuming they used the texture compression currently available in UE4, a 4K normal map would have been ~21 MB. That puts a single 33M-triangle mesh at around ~690 MB. For one mesh.
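The arithmetic in the two paragraphs above can be sanity-checked with a quick script. It assumes a 4K map is 4096×4096 and takes Epic's stated "1M triangles ≈ one 4K normal map" figure at face value:

```python
# Triangles' worth of normal data a 4K map can hold at one triangle per pixel.
pixels = 4096 * 4096                 # ~16.78M pixels in a 4K texture
triangles_in_map = pixels * 0.80     # 80% UV usage -> ~13.4M triangles

# Rough on-disk size of a 33M-triangle mesh, scaling Epic's
# "1M triangles ~= one 4K normal map" figure with a ~21 MB compressed 4K map.
mb_per_million_tris = 21
mesh_mb = 33 * mb_per_million_tris   # ~693 MB for the statue

print(f"{triangles_in_map / 1e6:.1f}M triangles, ~{mesh_mb} MB")
# -> 13.4M triangles, ~693 MB
```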

Nanite looks incredible, and I plan to use it as much as I possibly can. But I am still expecting file sizes on disk to be extremely large compared to what we’re used to, even after aggressive compression.

Same feeling here too.

Not sure when they acquired Oodle, but I think Epic has been using them in other projects for a long time. I bet they used it for the UE5 demo.

Honestly speaking, I suspect some part of that UE5 demo might have been offline rendered or something similar. Anyway, I hope it doesn’t end up like Cyberpunk.

Yes, while there may be some smarter replacement for hardware tessellation, Nanite, in the sense of importing extremely high-poly meshes, won’t be a replacement for it. It’s just a completely different use case.

For example, you can use HW tessellation to tessellate the up-close section of, say, a 4x4 km landscape, and all it takes is a relatively small landscape heightmap and a relatively small tiling displacement texture. Small in terms of asset/game install size on disk as well as memory footprint.

I am really having a hard time imagining importing an actual landscape mesh at a detail density below 5 centimeters. At 4x4 km with a 5 cm minimum detail size, that would be 12,800,000,000 triangles. Yes, 12.8 billion. Not only would that be huge, but just authoring such a dense mesh would be extremely difficult, and it would probably be challenging to even display in any kind of rasterized viewport.
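That triangle count follows from a quick back-of-envelope calculation, assuming a regular grid with two triangles per quad:

```python
# Rough triangle count for a 4x4 km landscape at 5 cm detail,
# assuming a regular grid with two triangles per grid cell (quad).
side_m = 4000
cell_m = 0.05
cells_per_side = round(side_m / cell_m)   # 80,000 cells along each edge
quads = cells_per_side ** 2               # 6.4 billion quads
triangles = quads * 2                     # two triangles per quad
print(f"{triangles:,} triangles")         # -> 12,800,000,000 triangles
```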

So the idea that Nanite is a replacement for hardware tessellation in all cases is just unrealistic.

That being said, not knowing much about Nanite, I currently assume it’s basically an alternative to the static mesh actor/component. But perhaps Nanite is more than that. Maybe the same tech that seamlessly and gradually streams triangles based on screen-space visibility could also be used to dynamically construct triangles based on the camera view from low-res input geometry and a displacement map. In that case, it could indeed replace HW tessellation.

The case of landscape is already covered by using the new virtual heightfield mesh.
I’m worried about the other potential use cases though: using tiling textures and trim sheets for environment assets and wanting relief detail out of it, but still being able to modify the tiling easily (impossible if the detail is baked into the mesh). And possible effects like snow buildup, though I suspect that will be possible with WPO with dense enough meshes.

Indeed, Nanite is an alternative to the static mesh component, and indeed it gradually streams triangles based on screen visibility. A friend of mine insists it’s like thinking of mesh geometry density the same way you’d think of texture mipmaps (not sure if that’s entirely accurate, but it’s a good analogy).
IIRC, it was stated at the time of the demo that Nanite only worked for static meshes with static mobility, and at the time it also didn’t support WPO animation or meshes with masked alpha (they have plans for that, but we’ll see). Basically a no-go for vegetation, and in retrospect it becomes evident why most things in that demo were fully solid, static objects.

I thought the same, and they showed that in the 4.26 demo.
But then I was confused by this comment from the Epic staff developer.

During UnrealFest they said it supports rigid movable objects.

I expect foliage has problems beyond alpha/WPO, namely that 90-100% of the geometry defines the silhouette, and you can’t easily remove any of it without completely destroying the shape. It’s the same reason foliage doesn’t decimate well.

Tileable displacement workflows could be addressed, though I don’t know how easy it would be to integrate into the engine: basically, allow the user to tessellate meshes in the editor and automatically bake the displacement from the material down into the geometry. You’d probably have to lower the tessellation amount to get near-instant feedback from changes, but you could have a “preview” setting and a “built” setting similar to Lightmass, where you push a button and it builds all the geometry at high resolution.
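The core of that bake step could look something like the sketch below: offset each (already tessellated) vertex along its normal by the displacement sampled at its UV. This is a minimal illustration, not engine code; `sample_displacement` and all names here are hypothetical stand-ins for a real texture lookup and mesh data.

```python
# Hypothetical sketch of baking tiling displacement into geometry:
# push each vertex along its normal by the sampled scalar height.
def bake_displacement(vertices, normals, uvs, sample_displacement, scale=1.0):
    baked = []
    for v, n, uv in zip(vertices, normals, uvs):
        d = sample_displacement(uv) * scale  # scalar height from the tiling map
        baked.append(tuple(v[i] + n[i] * d for i in range(3)))
    return baked

# Toy usage: one vertex, flat displacement of 0.5 along +Z.
out = bake_displacement([(0.0, 0.0, 0.0)], [(0.0, 0.0, 1.0)],
                        [(0.5, 0.5)], lambda uv: 0.5)
print(out)  # -> [(0.0, 0.0, 0.5)]
```

A "built" setting like the one described would just run this over a densely tessellated copy of the mesh, while a "preview" setting would run it over a coarser one.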