Tessellation deprecated?

I just heard about this today and did some experiments with it; it seems to be working pretty well. I found some issues here and there, but it’s better than the current version of tessellation for sure:

1 Like

The virtual heightfield mesh might even stick around for UE5, as it’ll be more compressible than Nanite. Geometry is far more compressible than textures, so I wouldn’t worry overly about 100 MB+ meshes; they already said over a year ago that there’s compression (I don’t know about import times, though). But I can see an entire terrain being a problem for file sizes, so…

Yeah! That’s my issue too. I think they acquired some compression company in the past, and they did release some of that work recently. So we can hope Nanite turns out to be something interesting to see.

I think they are using Oodle for that. It’s in the current master on GitHub; I’m going to test it in a couple of days.

I tried to import some high-density meshes into UE4 (100 MB to 6 GB). It took longer and longer as the size/density increased, and finally it crashed without saying anything :smiley:

Anyway, I think UE5 must have a much more efficient FBX (or similar) streaming solution. Otherwise, we’d have to wait ages to import meshes.

I really don’t see why this would be the case, but perhaps you know more than I do.

Epic stated that 1M triangles with a single UV channel is roughly equivalent in size to a 4K normal map. That isn’t terrible, but it also isn’t very good when you consider that a 4K normal map at 80% UV usage can contain normal data from ~13.5M triangles at a density of one triangle per pixel.

The statue in the UE5 demo supposedly had over 33 million triangles. That demo was made before their Oodle acquisition, and assuming they used the texture compression currently available in UE4, a 4K normal map would have been ~21 MB, putting a single 33M-triangle mesh at around ~690 MB. For one mesh.
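Rough math for anyone who wants to sanity-check those numbers (the 1M-triangles-per-4K-map equivalence and the ~21 MB compressed map size are just the figures quoted above, not verified specs):

```python
# Back-of-envelope size comparison, assuming the "1M triangles + one UV set
# ~= one 4K normal map" statement and a ~21 MB compressed 4K normal map
# (both figures quoted above, not verified).

normal_map_4k_mb = 21            # quoted compressed size of a 4K normal map in UE4
tris_per_normal_map = 1_000_000  # stated equivalence: 1M tris ~= one 4K normal map

# Pixels available in a 4K map at 80% UV usage, one triangle of normal data per pixel
pixels_4k = 4096 * 4096
tris_worth_of_normal_data = int(pixels_4k * 0.8)   # ~13.4M triangles

# Size estimate for the 33M-triangle statue from the UE5 demo
statue_tris = 33_000_000
statue_size_mb = statue_tris / tris_per_normal_map * normal_map_4k_mb  # ~693 MB

print(f"{tris_worth_of_normal_data / 1e6:.1f}M triangles of normal data per 4K map")
print(f"~{statue_size_mb:.0f} MB estimated for the 33M-triangle statue")
```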

Nanite looks incredible, and I plan to use it as much as I possibly can. But I am still expecting file sizes on disk to be extremely large compared to what we’re used to, even after aggressive compression.

Same feeling here too.

Not sure when they acquired Oodle, but I think Epic had already been using it in other projects for a long time. I bet they used it for the UE5 demo.

Honestly speaking, I suspect some part of that UE5 demo might have been offline rendered or something similar. Anyway, I hope it doesn’t end up like Cyberpunk.

Yes, while there may be some smarter replacement for hardware tessellation, Nanite in the sense of importing extremely high-poly meshes won’t be a replacement for it. It’s just a completely different use case.

For example, you can use HW tessellation to tessellate the up-close section of, say, a 4x4 km landscape, and all it takes is a relatively small landscape heightmap and a relatively small tiling displacement texture. Small in terms of asset/game install size on disk as well as memory footprint.

I am really having a hard time imagining importing an actual landscape mesh at a detail density of sub-5 centimeters. At 4x4 km with a 5 cm minimum detail size, that would be 12,800,000,000 triangles. Yes, 12.8 billion. Not only would that be huge, but just authoring such a dense mesh would be extremely difficult, and probably challenging to even display in any kind of rasterized viewport.
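That number is just grid math (a sketch, nothing Nanite- or UE-specific); the raw heightmap size at the same density is included to show why the landscape setup relies on a low-res heightmap plus a small tiling displacement texture instead:

```python
# Triangle count for a regular grid landscape at a given minimum detail size.
# Pure grid math; nothing here is Nanite- or UE-specific.

side_m = 4000        # 4 km per side
detail_m = 0.05      # 5 cm minimum detail size

quads_per_side = int(side_m / detail_m)    # 80,000
quads = quads_per_side ** 2                # 6.4 billion
triangles = quads * 2                      # 12.8 billion

# A raw 16-bit heightmap at the same density would also be enormous, which is why the
# landscape heightmap stays low-res and the fine detail comes from a small tiling
# displacement texture instead.
heightmap_bytes = (quads_per_side + 1) ** 2 * 2   # ~12.8 GB uncompressed

print(f"{triangles / 1e9:.1f} billion triangles")
print(f"~{heightmap_bytes / 1e9:.1f} GB raw heightmap at the same density")
```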

So the idea that Nanite is a replacement for hardware tessellation in all cases is just unrealistic.

That being said, not knowing much about Nanite, I currently assume it’s basically an alternative to the static mesh actor/component. But perhaps Nanite is more than that. Maybe the same tech that seamlessly and gradually streams triangles based on screen-space visibility could also be used to dynamically construct triangles based on the camera view from low-res input geometry and a displacement map. In that case, it could indeed replace HW tessellation.

3 Likes

The case of landscape is already covered by using the new virtual heightfield mesh.
I’m worried about the other potential use cases, though: using tiling textures and trim sheets for environment assets and wanting relief detail out of them, while still being able to modify the tiling easily (impossible if the detail is baked into the mesh). And possible effects like snow buildup, though I suspect that will be possible with WPO on dense enough meshes.

Indeed, Nanite is an alternative to the static mesh component, and indeed it gradually streams triangles based on screen visibility. A friend of mine insists it’s like thinking of mesh geometry density the same way you’d think of texture mipmaps (not sure if that’s entirely accurate, but it’s a good analogy).
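A toy version of that mipmap analogy, just to illustrate the intuition (this is not how Nanite actually works internally; the functions and numbers are made up for the example):

```python
import math

# Toy illustration of the "geometry density as mipmaps" analogy: pick the coarsest
# detail level whose elements (texels or triangles) still stay around one per pixel
# of screen coverage. Not how Nanite works internally -- purely the mip-style intuition.

def mip_level(texture_size_px: int, screen_size_px: float) -> int:
    """Standard mip pick: halve texture resolution until ~1 texel per screen pixel."""
    return max(0, int(math.log2(texture_size_px / max(screen_size_px, 1.0))))

def geometry_level(full_res_tris: int, screen_area_px: float) -> int:
    """Mip-style pick for geometry: halve the triangle count per level until
    roughly one triangle per covered screen pixel remains."""
    level, tris = 0, full_res_tris
    while tris > screen_area_px and tris > 1:
        tris //= 2
        level += 1
    return level

# A 33M-triangle statue covering roughly 500x500 pixels on screen
print(mip_level(4096, 500))                    # -> 3 (the 512px mip is enough)
print(geometry_level(33_000_000, 500 * 500))   # -> 8 (~129k triangles is plenty)
```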
IIRC it was stated at the time of the demo that it was only for static meshes with static mobility, and at that time it also didn’t support WPO animation or meshes with masked alpha (they have plans for it, but we’ll see). Basically a no-go for vegetation, and in retrospect it becomes evident why most things in that demo were fully solid, static objects.

1 Like

I thought the same, and they showed that in the 4.26 demo.
But then I was confused by this comment from the Epic staff developer.

During UnrealFest they said it supports rigid movable objects.

I expect foliage has problems beyond alpha/WPO: namely, the fact that 90-100% of the geometry defines the silhouette, and you can’t easily remove any of it without completely destroying the shape. It’s the same reason foliage doesn’t decimate well.

Tileable displacement workflows can be addressed, though I don’t know how easy it would be to integrate into the engine. Basically, just allow the user to tessellate meshes in the editor and automatically bake the displacement in the material down to the geometry. You’d probably have to lower the tessellation amount to get near-instant feedback from changes, but you could have a “preview” setting and a “built” setting, similar to Lightmass, where you push a button and it builds all the geometry at high res.
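Something like this, conceptually (a bare-bones numpy sketch of the “bake displacement into geometry” step; the tessellation factor and height scale are made-up example values, and real engine code would obviously operate on actual mesh assets):

```python
import numpy as np

# Conceptual sketch of "bake material displacement into geometry", along the lines of
# the preview/built workflow suggested above. Plain numpy on a flat grid -- not engine
# code, and the tessellation factor / height scale are made-up example values.

def bake_displacement(heightmap: np.ndarray, tess: int, height_scale: float):
    """Subdivide a unit quad into a tess x tess grid and push each vertex up by the
    sampled height, i.e. what tessellation + displacement would have done at runtime."""
    h, w = heightmap.shape
    u = np.linspace(0.0, 1.0, tess + 1)
    v = np.linspace(0.0, 1.0, tess + 1)
    uu, vv = np.meshgrid(u, v)

    # Nearest-neighbour sample of the heightmap at each new vertex (bilinear in practice)
    px = np.clip((uu * (w - 1)).round().astype(int), 0, w - 1)
    py = np.clip((vv * (h - 1)).round().astype(int), 0, h - 1)
    heights = heightmap[py, px] * height_scale

    # Vertex positions: XY from the grid, Z baked from the displacement
    verts = np.stack([uu, vv, heights], axis=-1).reshape(-1, 3)
    return verts

# 256x256 tiling height texture, baked at a "built" tessellation of 128 subdivisions
heightmap = np.random.rand(256, 256).astype(np.float32)
verts = bake_displacement(heightmap, tess=128, height_scale=0.05)
print(verts.shape)   # (16641, 3) -> (128+1)^2 displaced vertices
```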

So, what you’re saying is, if I want, say, a large stone or brick wall that would normally have looked wonderful as a tiling texture asset with tessellation, vert blending, and variation, now I’ve got to displace that in ZBrush and bring it in as one bulky, large mesh without any regard for the ability to edit vert colors on the fly to refine my scene???

Surely Epic recognizes the sheer absurdity of that? This isn’t a question of quality; this is a production-time concern. Forgive me, but making all my static assets in ZBrush is unacceptable from a time standpoint.

1 Like

Well, good news, because vertex color painting is unsupported by Nanite, so you can’t do that anyway.

That’s a BIG oversight.

As I described above, tiling/blending/tessellation is a production-speed thing. It’s extremely helpful, and locking Nanite out of doing that seems like an oversight.

I don’t think it’s an oversight so much as a drawback of how Nanite works. Even in UE4, vertex-painted meshes couldn’t be instanced.

With UE4, I used height-based blended tessellation to detail out brick walls. I wonder why they can’t just take that data and convert it to Nanite. Seems like a logical oversight that they missed.

1 Like

lol, plus, who in their right mind under a production time constraint would waste time doing all that in ZBrush? That’s a mega oversight from a time standpoint alone. It’s like, “here’s uber quality, but now your project costs three times more to get it.”

1 Like

I made a comment above about working with ‘preview’ displacement in order to adapt trim-sheet-style workflows to Nanite. I still think it’s possible for Epic to implement. No idea if they will.

Depending on the context, it might end up being a time saver, since it’s no longer necessary to bother with baking, retopologizing, or creating LODs.

true

Yeah, I see that aspect.
What I don’t like is having to commit to an almost un-iterable mesh. You sculpt, and then it’s a pain to go back and fix things if needed without redoing the sculpted data, as opposed to just erasing tessellated vert data and repainting, or reworking a Designer brick, then updating the tessellation and updating it in-game.

Yeah, you get a lot by not baking, but then again I never baked with tessellation. I never had to.

If you’re working from pre-made tiling textures from a library or from Substance Designer that were built without a mesh, then yeah, it’s no problem. But if your source asset was a mesh, you’d have to bake a normal/height map to get the data for the texture.

Anyway, it could be made more convenient; I think everyone recognizes that, including Epic. It’s early access, and there’s still time to adapt the tools to address at least some of the issues.

1 Like