Tessellation deprecated?

From the latest changelog:

  • Tessellation is now Deprecated.

Why? :frowning: It’s such a basic feature; even mobile renderers support it. I don’t understand why anyone would remove it. Maybe I didn’t notice some replacement that is as powerful as tessellation itself?

4 Likes

It still works, but the replacement is Nanite.

Though given the backlash over deprecating it, maybe they will reconsider.

HW tessellation is old tech now; there are better/faster ways of achieving the same thing.

I’m sure it’ll be replaced by something equal or better in UE5. (NB: which won’t involve importing meshes with billions of polygons; nobody wants that.)

I’ve found this quite interesting too. Why would you do that now? On 4.26?

You guys sound like you already know what Nanite is. Is that information already public? I’m intrigued by what it could be.

As for displacement maps, smoothing stuff, and some other concrete uses, of course tessellation is not the best way to go. But it does a really general thing: tessellation lets us quickly add new vertices that are then processed by our materials. We can do anything with them: displace them, change vertex colors, animate them, etc. You can take one quad and turn it into a grid. Does Nanite allow doing just that simple thing efficiently? If so, then it’s all good.
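To be concrete about the “one quad into a grid” part, here is roughly what the hardware does conceptually, written as a CPU-side C++ sketch: generate new vertices, then run each one through whatever displacement/animation logic you want. All names here are made up for illustration; this is not engine code.

```cpp
#include <cstdio>
#include <vector>

// Hypothetical vertex type, for illustration only.
struct Vertex { float x, y, z; };

// Turn a unit quad in the XY plane into an (n+1) x (n+1) grid of vertices,
// then displace each one along Z -- conceptually what HW tessellation does:
// generate vertices, then push them through the material/domain shader.
std::vector<Vertex> TessellateQuad(int n, float (*Displace)(float u, float v))
{
    std::vector<Vertex> grid;
    grid.reserve((n + 1) * (n + 1));
    for (int j = 0; j <= n; ++j)
        for (int i = 0; i <= n; ++i)
        {
            float u = float(i) / n;
            float v = float(j) / n;
            grid.push_back({u, v, Displace(u, v)}); // the "do anything" step
        }
    return grid;
}

int main()
{
    // Any per-vertex logic can plug in here: displacement, animation, etc.
    auto verts = TessellateQuad(4, [](float u, float v) { return 0.1f * u * v; });
    std::printf("1 quad -> %zu vertices\n", verts.size()); // prints 25
}
```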

No public information, but it is pretty easy to anticipate what Nanite is by knowing what it isn’t. By subtracting what it is from what it isn’t or what it isn’t from what it is, you can get a reliable glimpse into what it could be. sound of distant missile

I think extrapolating from the fact that tessellation is going away to not being able to do the above-mentioned things is a bit premature.

1 Like

Why would you care?
IF the engine is now capable of importing said file in little to no time (without the BS it currently goes through with the triangulation process), while also producing perfect mesh tri-count reductions that don’t look like s…t (as they do currently), then what difference would it make to you if you import a good mesh or a piece of crap that was sculpted with dyntopo on and has 1 billion tris?
The end result should be the same, obviously.

There have always been tons of issues with tessellation. I would still like to know who thought it was a good idea to force it on all landscapes.

It increases the install size of the project significantly.

IIRC Epic described Nanite as “streaming virtual geometry”. Streaming virtual textures have a similar problem; it’s the reason (I assume) why Doom 2016’s install size is ~20 GB larger than Doom Eternal’s despite having around half the content.

2 Likes

Yeah, but we are talking text, because an FBX in the end is text. I doubt the engine packs meshes much differently (though I’ve never looked at it, so I’m not sure).
What I mean by that is that meshes can probably be compressed no matter how “bad” they are when you add them to the project.
Sure, the uncompressed size can be huge, but once you package, that shouldn’t be problematic.

I’d be much more concerned about the import process taking forever, as it currently does, even when you pre-triangulate.

In the end it’s all speculation anyway. We don’t know what they will do…

I think you’re greatly underestimating the size of dense meshes.

3 Likes

Maybe. But it’s still 3 numbers per vector. You can reduce that a lot, both by dropping floating-point precision (as the engine already does) and by compression.
I’m not saying it’s good practice. I’m saying that IF the engine is finally capable of handling it, then it should make little to no difference how dense the mesh you put in the engine is.
In fact, the game could even just package the engine-optimized version.
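To put the “dropping floating-point precision” part in concrete terms, here is a minimal sketch of 16-bit position quantization. This is my own illustration of the idea, not how the engine actually packs meshes:

```cpp
#include <cstdint>
#include <cstdio>

// Quantize one position component to 16 bits inside known mesh bounds.
// Halves storage (float -> uint16), at the cost of precision that scales
// with the size of the bounding box: fine for a small prop, rough for a
// mesh dense enough to need sub-millimetre detail over a large extent.
uint16_t Quantize(float value, float boundsMin, float boundsMax)
{
    float t = (value - boundsMin) / (boundsMax - boundsMin); // 0..1
    return static_cast<uint16_t>(t * 65535.0f + 0.5f);
}

float Dequantize(uint16_t q, float boundsMin, float boundsMax)
{
    return boundsMin + (q / 65535.0f) * (boundsMax - boundsMin);
}

int main()
{
    float original  = 123.456f;
    uint16_t packed = Quantize(original, 0.0f, 1000.0f);
    float restored  = Dequantize(packed, 0.0f, 1000.0f);
    // Over a 1000-unit box, 16 bits gives steps of ~0.015 units.
    std::printf("%.3f -> %u -> %.3f\n", original, packed, restored);
}
```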

As things stand, not only is the engine NOT capable of handling it, but things like floating point precision would cripple a mesh dense enough to require it.

I took a random Megascans mesh (rock_assembly_tazj1) to compare the disk size of the FBXs. LOD0 is 706 KB at 16,302 tris, while the highpoly source is 115 MB at 1.999 million tris. That’s 162 times bigger on disk.
Just how much better can their compression and optimizations get? Is the new Zen loader really 100+ times faster?
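As a sanity check on that 115 MB figure, raw geometry alone gets you into the ballpark. A quick back-of-envelope sketch; the attribute sizes and layouts are my assumptions, and real FBX files add metadata on top:

```cpp
#include <cstdio>

int main()
{
    // Back-of-envelope only; real FBX layouts differ and add overhead.
    const double tris  = 1.999e6;
    const double verts = tris * 0.5; // ~2 triangles per vertex in a closed mesh

    // Indexed layout: position (12 B) + normal (12 B) + UV (8 B) per vertex,
    // plus 3 x 32-bit indices per triangle.
    double indexed = verts * (12 + 12 + 8) + tris * 3 * 4;

    // Per-corner layout: FBX often stores normals/UVs per polygon corner.
    double perCorner = verts * 12 + tris * 3 * (12 + 8) + tris * 3 * 4;

    // Prints roughly: indexed: ~56 MB, per-corner: ~156 MB
    std::printf("indexed: ~%.0f MB, per-corner: ~%.0f MB\n",
                indexed / 1e6, perCorner / 1e6);
}
```

Either way you slice it, a single dense prop lands at tens to hundreds of megabytes, which is the point.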

Think of the other implications: everything needs to be 100+ times faster. You still need to save the mesh file to disk from within the editor. There’s mesh destruction/slicing systems (good luck keeping those in memory). There’s vertex painting tools. There’s new in-engine mesh modelling/editing tools. There’s Niagara mesh sampling on CPU. There’s particle meshes. There’s the per-poly collision option. There’s shader effects (snow buildup, footsteps and trails, etc).

And also think outside Unreal. Most other 3D software can’t reasonably handle opening, saving, or viewing million-triangle meshes in the viewport. There’s Source Control upload/download speed. There’s Source Control bandwidth and disk usage.

I think it’s fair to be prudent about replacing the existing tech with Nanite when the rest of the supporting tech isn’t built for what it implies.


In general I’m not a fan of having the choice removed. Multipurpose engines like Unreal should be about choices. Nanite for such usage seems to rely on fast SSDs (we know it was showcased on the PS5, which is ~8 GB/s), but is the adoption on other platforms there yet?

For a Megascans-UE4 workflow this means relying on the highpoly mesh entirely (very questionable), or keeping control by manually subdividing + displacing the lowpoly mesh with the displacement map and re-exporting (a pretty horrible workflow IMO).

What about Landscapes? People, please don’t just say “make a mesh to use with Nanite”; just think it through. I’ll assume the solution for Landscapes will be the new virtual heightfield, but is that enough for closeup detail?

For water rendering I guess this means relying on actual dense meshes. Not terrible, but it still seems wasteful, unless the new water plane rendering can be used outside of their water system.

What about Skeletal Meshes (unsupported by Nanite)? In practice I don’t know how often tessellation is used for characters, but they seem like a really good match.

And for shader effects that make use of displacement, we’ll now have to do all of them in WPO, meaning we’ll be forced to have a dense enough base mesh to achieve anything.

I can see where this is coming from: tessellation was never efficient in UE4, and it probably conflicts with Nanite. However, other engines can handle it (Frostbite, the Gears 5 UE4 branch), so I always hoped UE4 would eventually catch up :frowning:

5 Likes

I don’t think it’s really fair to use Megascans as your benchmark for that. Most multi-million-triangle meshes originate in sculpting/DCC programs and are perfectly fine to work with, because they’re not just a raw dense polygonal mesh like you would get importing a photogrammetry asset. Blender will choke for several minutes if you try to import a 12M-triangle OBJ file, but it has no problem generating a 12M-triangle mesh through a typical subd workflow. The high-resolution asset will (usually) always be exported anyway, because most people do their baking in another application.

Not a fun workflow (especially for surfaces that need to be tileable), but the results are better looking and more efficient than what you would get with tessellation, because you can decimate it to reduce the triangle count and eliminate the triangle shearing you get from tessellation.

My suspicion (given the recent addition of voxel-based modeling tools) is that UE5 will have a way to do this in an automated fashion. Given that so much of the environment art pipeline currently relies on working with tiling textures and trim sheets, which is a super fast way to work, I think Nanite will have an uphill battle if it can’t accommodate that workflow.

I could be wrong though…

That being said, I wouldn’t say tessellation in its current state is a fun way to work either… It’s just the best of what is available…

I agree with that, but Nanite isn’t here yet and tessellation still works. Everything at this point is speculation; the only thing I am confident about is that packaged files will probably be much larger than we’re used to.

I agree with this when it comes to handcrafted art, but the problem remains when working with photogrammetry, which can be a huge chunk of what makes up realistic scenes.
Curiously enough, the UE5 demo had plenty of both worlds: all the rocks and natural parts were Megascans highpoly meshes, while the statues and manmade structures were subd sculpts.

In the context of an environment art pipeline, trim sheets and tiling textures are exactly the reason why you can’t just get rid of tessellation and base everything on highpoly meshes. Just imagine the huge “ok, we’re screwed” moment when someone decides the tiling scales need to be adjusted :stuck_out_tongue:

2 Likes

Just an opinion, but I suspect we are going to see the new, basic water mesh-element-thingie as a preview of Nanite. I’m guessing some precomputation also helps with LODing/tessellation, where you can still have the ‘all/aggregate’ set of data (LOD0?) for lighting vs. what is shown as an actual mesh.

re:

A larger dataset would imply metadata/precomputation overhead?

You can stop enjoying my fortune cookie now.

As for tessellation, I still use it, I like it, and it seems to work well enough to really add in a lot of micro-detail (which is all I really wanted). That aside, what’s currently a better/more performant way to achieve the same effect(s)?

I have never done parallax for a landscape, and I am not sure if it is more performant. How would you get surface extrusion detail in the future, if not with tessellation? From what I understand you can import fully detailed models and avoid creating LODs, normals, displacements, etc. But what about a landscape? Has anyone heard anything about what’s going to replace that feature?

1 Like

Tessellation on landscape has literally never worked well or reliably.

I think they will create a quad tree version of the landscape, like the quad tree system they implemented for the water.

Mind you, this is just speculation right now, but a quad tree seems to be the best reduction technique for sculpting geometry in engine. The landscape already uses a bastardized version of it to some extent with its LOD mesh reduction, but the problem with the landscape is that it’s tied to an initial size and component layout when there really isn’t any need for that anymore (perhaps there never was; it’s just that that’s what the engine had, and it was slightly improved upon rather than scrapped).
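To make the quad tree idea concrete, here is a rough C++ sketch of the selection logic (hypothetical names, nothing to do with the actual water or landscape code): each node covers a square patch and only splits toward the camera, so detail follows the viewer instead of being fixed by an initial size/component layout.

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical terrain quad tree node, for illustration only. Each node
// covers a square patch; it splits into four children only while the
// camera is close enough to warrant more detail.
struct QuadNode
{
    float cx, cy, half; // patch centre and half-size
    int   depth;

    void Select(float camX, float camY, int maxDepth) const
    {
        float dist = std::hypot(camX - cx, camY - cy);
        // Split while the patch is large relative to its distance.
        if (depth < maxDepth && dist < half * 4.0f)
        {
            float h = half * 0.5f;
            QuadNode{cx - h, cy - h, h, depth + 1}.Select(camX, camY, maxDepth);
            QuadNode{cx + h, cy - h, h, depth + 1}.Select(camX, camY, maxDepth);
            QuadNode{cx - h, cy + h, h, depth + 1}.Select(camX, camY, maxDepth);
            QuadNode{cx + h, cy + h, h, depth + 1}.Select(camX, camY, maxDepth);
        }
        else
        {
            std::printf("draw patch at (%.0f, %.0f), size %.0f\n",
                        cx, cy, half * 2.0f);
        }
    }
};

int main()
{
    // 4096-unit terrain with the camera near one corner: fine patches
    // near the camera, coarse patches far away.
    QuadNode{2048, 2048, 2048, 0}.Select(300, 300, 6);
}
```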

Regardless, my point, and pretty much everyone’s hope, is that they’ll scrap the whole system and allow us to generate an ellipsoidal planetary approximation with gravity oriented at its center, or at least a controllable gravity vector for Chaos.

Hope, mind you. Leia putting her faith in Obi-Wan was likely a far more realistic scenario :stuck_out_tongue_winking_eye:

Oh yeah, for sure, most things develop like that; otherwise they would need to rewrite it from scratch, which could take… a bit of time and money. Not sure how they are developing UE5, but I hope it’s something along those lines, and maybe your dreams will come true one day. We will know soon enough. Was UE5 planned for early 2021?

That’s very interesting. Right now I have some visual glitches using some simple RVT blending, so I guess we are still far from seeing this ready for production unless RVT is properly supported in UE5. Maybe I am doing it wrong, but it seems the problem goes away by simply turning the RVT volumes ON/OFF in the outliner. Sometimes it’s about opening the landscape material instance once. Very weird.