Best practices for creating Nanite modular environments

Hi!

I would like to pick your brains about the new Nanite workflow. I’m using Blender for modelling and sculpting and Substance Painter for texturing my models. I’m currently working on a modular quay and can’t get a good workflow going.

Here’s what I got so far:


Some sample wireframes:

Getting to this stage was quite laborious. I modeled several variations of four different stone types and sculpted them to get those rugged edges. Those hero stones were then UV-unwrapped and given a material in Substance, which I imported back into Blender. I then manually placed copies of those hero blocks to build up the quay and pavement.
There are still some pieces missing, like the mortar between the blocks to make it a solid object, as well as more wall below the waterline.
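For reference, placing those copies in Blender is roughly what the bpy sketch below does, using linked duplicates so the sculpted mesh data is stored only once in the .blend file. The object name, spacing and count are made up; it’s only there to illustrate the idea, and whether that data sharing survives the FBX export and UE import is something I still need to check, since it feeds directly into the file-size question further down.

```python
# Rough bpy sketch: place linked duplicates of one hero block.
# Object name, count and spacing are hypothetical.
import bpy

source = bpy.data.objects["Stone_A_01"]  # one of the sculpted hero blocks

for i in range(1, 10):
    copy = source.copy()         # duplicates the object, but shares the mesh datablock
    copy.location.x += i * 1.2   # assumed block width of roughly 1.2 m
    bpy.context.collection.objects.link(copy)
```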

My thinking behind this workflow is that by creating those hero stones, I can share one set of textures across all those elements while the stones are still presented in high definition.

The shader in UE can then layer additional textures on top, based on world position etc., to tie the aesthetics together.
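To make that “layer based on world position” idea a bit more concrete: in UE this would be a node graph (Absolute World Position feeding a mask into a Lerp between the hero-stone texture and an overlay), but the underlying math is something like the Python sketch below. The waterline height, fade distance and colours are made-up placeholder values.

```python
# Illustration only: the kind of blend a world-position-driven material might do.
# In UE this would be a material node graph; the numbers here are arbitrary.

def moisture_mask(world_z, waterline_z=50.0, fade=150.0):
    """1 at/below the waterline, fading to 0 over 'fade' units above it."""
    t = (waterline_z + fade - world_z) / fade
    return max(0.0, min(1.0, t))

def layered_color(base_rgb, wet_rgb, world_z):
    """Lerp the hero-stone base colour toward a darker 'wet' tint near the water."""
    w = moisture_mask(world_z)
    return tuple(b * (1.0 - w) + d * w for b, d in zip(base_rgb, wet_rgb))
```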

In a way, this workflow is similar to a displacement-map workflow, where you’d create a set of universal textures, like trim sheets, and re-use the same elements of the textures for recurring elements in your model. Here, I create a set of universal high-res parts that are then shared and re-used across the overall model.

BUT…

Looking at the statistics and the file size of my asset, I’m beginning to doubt my approach. My quay wall already has over 2 million verts, and the FBX file is a whopping 120 MB. Extrapolating from that, if I build out the rest of my modular wall set, I’ll end up with some 2-4 GB for the completed meshes. To me, that sounds excessive.
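For context, that extrapolation is just linear back-of-envelope scaling from the numbers above, something like the sketch below. The multipliers for the full wall set are pure guesses, and a binary FBX, shared mesh data, or whatever representation UE stores after import would change the per-vertex cost considerably.

```python
# Back-of-envelope scaling of the numbers in this post; the multipliers are guesses.
verts = 2_000_000                      # current quay wall
fbx_bytes = 120 * 1024 * 1024          # current FBX size
bytes_per_vert = fbx_bytes / verts     # roughly 63 bytes per vertex on disk

for multiplier in (20, 30):            # guessed size of the full modular set
    total_gb = verts * multiplier * bytes_per_vert / 1024**3
    print(f"{multiplier}x the geometry -> ~{total_gb:.1f} GB")   # ~2.3 GB and ~3.5 GB
```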

I haven’t yet tested how Unreal handles my mesh; maybe those assets shrink in size when they’re deployed, I don’t know. Either way, I’d still need a gigantic hard drive to store all this information, and it gets even worse once those meshes are checked into version control.

SOOOO…

I would like to hear your opinions on the workflow: what’s good, what’s bad, and suggestions on how to streamline it. Hopefully this thread will be helpful to others who are starting out with Nanite.

Hi Virtus,

I’m planning to test a full Nanite environment, so is there any update on your project? It could be helpful.

Thank you.

As I see it, sculpting all the details is way too time-consuming for something that can be achieved via displacement maps (or tessellation? I saw a video saying it was added back in UE 5.3).

Also, the poly count is indeed too high. I doubt your scene will just be stairs and a wall; if you have more stuff built the same way, it won’t be playable.

Thanks for the replies! At the time of my post, displacement maps weren’t yet available for Nanite meshes. It appears to me that with the introduction of 5.3 the workflow is much closer to how it was before Nanite became a thing, with the added bonus of the high-fidelity, high-performance meshes that the new tech brought.
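In case anyone wants to try it: as far as I can tell, the 5.3 tessellation/displacement path was still experimental and gated behind console variables. The cvar names below are the ones I’ve seen referenced for 5.3 and may well have changed in later versions; setting them from the editor’s Python console would look something like this.

```python
# Experimental Nanite tessellation/displacement in UE 5.3 was gated behind cvars.
# The cvar names are what I've seen referenced for 5.3 and may differ in newer versions.
import unreal

for cvar in ("r.Nanite.AllowTessellation 1", "r.Nanite.Tessellation 1"):
    unreal.SystemLibrary.execute_console_command(None, cvar)
```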

Since I didn’t get an answer back in February, I didn’t really follow up on the asset creation side and focused my time on gameplay elements instead.

So @kanzari, I can’t really give you much of an update. I personally got excited reading about the displacement map options, so I’ll probably go down that route when I get back into 3D assets.

@carmanfer I’m curious about your comment on playability. My understanding of Nanite is that it makes it possible to have high-def meshes virtually everywhere without noteworthy performance issues.
You’re definitely right about the workflow being time-consuming, though. And as I mentioned in my initial post, I also have concerns about build sizes getting way too big with my approach.