Managing per-object dynamic graph instances


I’m working on a project where most of my materials are generated at runtime using Dynamic Material Instances and Dynamic Graph Instances.

I’m doing it that way so I can have some per-object customization, like changing color, roughness, etc., without having to manage multiple graph assets.

My question is: what is the best way to manage these graph instances so I can minimize my VRAM usage?

A situation that might happen, for example:

A 3-story building with 9 similar apartments, each with 3 identical showers. For each shower, my blueprint would create and render a graph instance, so 81 identical graph instances would be rendered (3 × 3 × 9; I’m not considering level streaming). Is there an already-implemented way to optimize this situation?

In addition, the Substance Unreal documentation mentions a “generation mode”. Is that deprecated?

Hope I made myself clear.

This functions similarly to Material Instance Dynamics (MIDs), providing a way for you to control parts of your material graph dynamically at runtime through scalar and vector parameters. The difference is that Custom Primitive Data (CPD) has the advantage of storing data on the primitives themselves rather than with the Material Instance, which lowers the number of draw calls for similarly placed geometry in your Levels (such as walls, floors, or other duplicated geometry).

I think the engine automatically unloads things that aren’t visible, but you can do it manually.
For example, if you’re in one room looking at a shower, you know you won’t be seeing any other shower, so you can unload all the rest.

That’s a really cool feature that I didn’t know about. Thanks midgunner! But I think I forgot to give more context. These graph instances are from the Substance plugin, which means they output textures, or something very similar, I believe. Therefore, it seems I can’t use them to drive Custom Primitive Data.

I also believe Unreal unloads unnecessary data. In that case, let’s say I’m directly looking at 10 identical showers, each with its own MID and graph instances. Wouldn’t there be redundant data?

I don’t use Substance, so I wouldn’t know what to do there. But if they are simple things like colors, you could try to get that into the blueprint and use it for CPD.

I would assume each MID would be more data, but not that much, just the parameters; I think the material graphs will only exist once.

With the meshes, I don’t know if identical meshes are instanced automatically or if you have to set it up manually. In the performance section on the CPD page, it says “Draw calls are reduced using the Mesh Drawing refactor that automatically dynamically instances scene primitives,” which confirms dynamic instancing. But that’s using CPD, so I don’t know if it’s the same with MID.

Graph instances are a way to generate textures procedurally. They output 2D textures that we can use to drive base color, roughness, metallic, etc.

Reading the documentation you provided and testing it myself, I noticed texture samples can’t be marked as “Use Custom Primitive Data”.

I’m considering creating a custom “Graph Instance Manager” to store and retrieve these graph instances. But before I do that, I’m trying to learn whether creating multiple dynamic graph instances is a real problem and whether there is an already-implemented way to deal with it.

Well, looking at your goal, you don’t actually need to change textures at runtime; only colors, roughness, and such. So unless you need every shower to have its own set of textures, I would just use one set, then use either MID or CPD to add variation.
In other words, use Substance only to generate the main textures, and use Unreal’s MID or CPD for variation. This is how you would do it in any regular game (e.g. armor colors in Halo, or car colors in a racing game).

Also, there are nodes in Unreal that let you add detailed variation to things (e.g. you can use the world position of the object to add variation), so I would use those instead of making a new Substance graph for each instance.

Hope you don’t mind helping me figure this out midgunner!

Perhaps the shower example doesn’t illustrate my point the way I thought it would.

I’m uploading an RGB mask, shown below. Considering that mask, I’m building a system, using Substance Designer, where I can apply different materials to my whole mesh with this single mask. So, let’s say, the black portion would be chrome metal with a low roughness, while the white portion would be a gray plastic material to represent the nozzles. The red portion would be a logo, with a third color, perhaps a different roughness, and so on. I built it in a “generic” way so I can have up to 8 different appearances plus a logo, and I can use it for showers, toilets, furniture, light fixtures, etc. Before committing to this solution with Substance, I was considering using Unreal’s Material Layers, but the documentation says they aren’t very good performance-wise.

So, for my example with showers, I could figure out a way to use CPD to change colors. But let’s say I have a branded sofa and I want to have real-life options for the fabric, or change it to leather, or switch wooden pieces for some kind of metal, etc. That’s why I’m using Substance.

My project is an ArchViz project, but I’m trying to make it as performant as I can.

So something like this? You can download this demo on their site, plus a bunch more (I actually downloaded their car configurator demo a while back and still have it on my computer now).
I don’t know how they did it, or whether they used substance or something else, but you can contact them and ask.
But in terms of performance, this shows it’s possible, so I don’t think you’ll really need to do anything too complex.

The engine was made with archviz in mind, so you should have nothing to worry about:

Before worrying about performance, I would just make it actually work first, then, if performance is an issue, optimize later. Instead of speculating and committing to something before even making it, run some tests to see which gives the best performance, and go from there.

It’s kinda working already, that’s why I’m trying to optimize it.

The problem only becomes a problem if I have multiple objects that are supposed to share the same textures. I think the path here will be to create a “texture manager actor”, which will manage my dynamic graphs and serve them as my other actors request them.
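One way that manager could work, as a hedged sketch (plain C++ with placeholder types, not the Substance plugin API): cache generated outputs by a parameter key, so requests with identical parameters — the 81 showers — resolve to one shared graph instance instead of 81.

```cpp
#include <cstddef>
#include <map>
#include <memory>
#include <string>

// Placeholder for the baked outputs of one graph instance
// (base color, roughness, etc.). Not the Substance plugin API.
struct GeneratedTextures {
    std::string sourceKey;
};

class GraphInstanceManager {
public:
    // The key must encode every parameter that changes the output,
    // e.g. "shower|slot0=chrome|rough=0.10" (format is illustrative).
    std::shared_ptr<GeneratedTextures> GetOrCreate(const std::string& key) {
        auto it = cache_.find(key);
        if (it != cache_.end())
            return it->second;  // reuse the already-generated textures
        auto tex = std::make_shared<GeneratedTextures>();
        tex->sourceKey = key;   // real code would render the graph here
        cache_.emplace(key, tex);
        return tex;
    }

    std::size_t UniqueInstances() const { return cache_.size(); }

private:
    std::map<std::string, std::shared_ptr<GeneratedTextures>> cache_;
};
```

In Unreal this would live on an actor or subsystem; a follow-up concern would be releasing entries when nothing references them anymore (the `shared_ptr` use count gives you that for free here).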

And I emailed them. Thanks for the support @midgunner66.

No problem, glad to help!