Allow "Input Material Attributes" in Materials

I’m pretty new here, so I’ve been looking at a lot of tutorials - and most of the tutorials I’ve seen for creating materials start with sampling the albedo/normal/ARD maps of Megascans materials.

Rather than wiring up 3 texture object params and 3 texture samples, it seems like it’d be easier to just use an Input Material Attributes node wired to a Break Material Attributes node. This would also allow applying the transformation to procedurally generated materials, since it wouldn’t depend on there being texture map assets.

It doesn’t seem like this is currently possible, but I think it’d be useful, especially for beginners. Do I need a higher trust level to tag this as FeatureRequest? Do you folks see any value in this? Is it already possible, and I’m just blind?

Use a material function and you can choose an input-type of material-attributes.

You can also look at Material Layers (or layered materials) if you want a way to parameterize things.

Thanks!
I’ve been looking into material functions and material layers, but I still can’t figure out a way to achieve this.

In my Material Function I can add an Input Material Attribute param, but if I add that function to my material, I can’t promote the pin to be a parameter.
Likewise, when I create a Material Layer, it only allows me to have a single Input Material Attribute, which is used to pass in the result of the preceding layers; it doesn’t allow me to have a Material Attributes param that I can set in the layer params of the material instance.

The input itself won’t be a parameter; what you plug into it will. They will have ‘param’ in their node name:

OK that helps, I’ve misunderstood the purpose of the Input Material Attributes node.

What I’m looking for is still the same, but by a different name - a Material Attributes param, so that I can just pass one material instance to the params of another material instance (or material layer), the same way as I can pass a material instance to the Landscape Material param of a landscape.

Say I have a complex material, with a bunch of logic that transforms the texture maps, and I want to add macro variation to it.
I could duplicate the code and add in the macro-variation logic (violating DRY).
I could add the macro-variation logic to the existing material behind a flag (violating SRP).
I could add a layers node to the existing material, and put the macro-variation in a layer (which I’m trying to do now, but it has a code smell).

I don’t see a clean and simple way to apply a transformation to a material instance to get a new material instance.

Talking about this is helping by crystallizing what it is that I actually want, but I’m still struggling to achieve it.

Look at the Set/Get Material Attributes nodes, the Make/Break Material Attributes nodes, and the Blend Material Attributes node.

Parameterize a master-type material that carries out the basic functions as you need them. These might be different for something solid vs transparent, etc.

The logic is up to you, so if you want some kind of impact from noise, you could (for example) pass in the noise itself as an alpha, and then inside your material function have some logic to sample the noise and do whatnot. Exactly what is up to you, as is the choice to pass in the noise as an alpha; you might choose to pass in something else, maybe tinted noise, etc.
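
To make that concrete, here’s roughly what the noise-as-alpha idea boils down to if you wrote it as HLSL (the kind of math the graph compiles to anyway). Every name in this sketch is hypothetical:

```hlsl
// Sample the noise inside the function and use it as the alpha that drives
// how strongly a variation tint is applied to the base color.
// NoiseTex is a Texture Object input; UE generates a NoiseTexSampler for it.
float noise = Texture2DSample(NoiseTex, NoiseTexSampler, UV * NoiseScale).r;
return lerp(BaseColor, BaseColor * VariationTint, noise * Strength);
```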

Here’s an example of what I mean. In my case I don’t happen to be using noise, as my function is set not to sample it in this particular case:

I have that master landscape layer wrapped with parameters specific for a path layer, or walkable layer, etc. In my case I chose to encapsulate my functions w/parameters so I can drop stuff in/out, turn them on/off with switches, etc, but again, your mileage may vary; you need to make some decisions on how you want to look at noise and pass it around, etc.

Thanks for all your help so far!

A master material is just a parameterized material, right? That’s what I meant when I was talking about materials, I didn’t realize there was a name for it, sorry for the confusion.

I think the problem might be that, as a beginner, I’m heavily relying on materials and assets from Megascans. I don’t want to create a huge complicated master material from scratch, or edit the Megascans master materials; I just want a simple way to apply a transformation to existing Megascans material instances beyond what the material instance params allow for.

Being able to chain together a bunch of material transformations seems useful, and I think that’s what Material Layers does, but the Megascans materials I’m using don’t support layers.
I was hoping there was a way around that, but it looks like I’ll just copy their master material and add layers to it - assuming I understand layers correctly.

Look at any channel (RGB, metal, specular) as a value from 0-1, and from whatever maps you sample, incorporate math that modifies it as you see fit. For example, if you had a wetness function you might take the specular and add a ‘rainyness’ (scalar parameter) value to it; this would make it shiny.
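
As a rough HLSL sketch of that wetness example (names are made up): specular lives in the 0-1 range, so you add the scalar and clamp:

```hlsl
// Push specular up by a "Rainyness" scalar parameter; saturate() clamps the
// result back into the 0-1 range so it stays a valid specular value.
float wetSpecular = saturate(BaseSpecular + Rainyness);
```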

What exactly you want the material to do functionally is going to determine what maths you include. This is where the artistry comes into play. I’m not going to be able to tell you what to do specifically, but I can suggest you try this guy’s series; it’s what I learned materials from: Landscape Material Tutorial Part 1 (Unreal Engine 4) - YouTube

As far as the master material goes, you make a decision on what set of features/qualities your game will have. Do you include roughness maps, or use a flat value? Do you tweak your normal maps or add detail blending? Decide that the basic ship-of-the-line for your game will look thusly, and make a core set of functions around that. For me, I made a global snow function that puts snow on meshes, but it can also be applied to landscapes and other things, so my snow is (visually) consistent across all my stuff.

What I am describing is a modular approach, and that is valuable in and of itself, but it’s also how I did it for my project b/c that’s what made the most sense for me. How you want to break things down will largely depend on what you do. If you are making a space-combat game, maybe you don’t need any insight into landscapes, etc. Game functionality will have a large weight here, as materials can do quite a bit and in some cases can even mirror/replace some kinds of logic, like using WorldPositionOffset to rotate a mesh vs needing to update the instance transform: https://www.tomlooman.com/unreal-engine-material-vertex-shaders/
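
As a sketch of that WorldPositionOffset trick (all names hypothetical; Angle in radians): rotate each vertex around the object’s pivot and output the difference, since WPO expects an offset rather than an absolute position:

```hlsl
// Rotate the vertex around the object's Z axis without touching the
// instance transform. ObjectPivot would come from an ObjectPosition node.
float s, c;
sincos(Angle, s, c);                                // s = sin, c = cos
float3 local = WorldPos - ObjectPivot;              // position relative to pivot
float3 rotated = float3(local.x * c - local.y * s,  // standard 2D rotation in XY
                        local.x * s + local.y * c,
                        local.z);
return rotated - local;                             // WPO is an offset, not a position
```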

Thank you so much for your patience so far.

I’m sorry, I don’t think I’ve been clear enough with what I’m asking - I’m still new to UE so I don’t know the correct terminology, and I think it’s causing confusion.

I’m looking for a way to take a Megascans MaterialInstance, and transform it, without duplicating and/or editing the Megascans master Material. The Megascans master Material in question doesn’t support layers.

I am not looking for advice on how to create my own master Material, and pass in the Megascans assets.
Nor am I looking for advice on refactoring logic into smaller composable functions, which I’m also very familiar with.

The way that I think this would look is the Decorator pattern: a MaterialInstance that could take another MaterialInstance as a param. Presumably (but not necessarily) this would mean that you could promote the input pin of the BreakMaterialAttributes node on a master Material, which I think would be a MaterialAttributes param.

Nesting and composing Decorators is a very useful strategy in programming, and I believe it would be useful here too, especially for beginners. Since Material/Blueprint Graphs are just a dataflow programming language, I’d like to be able to apply the same programming best practices and patterns.

Ahhh. ok. We’re kind of beholden to the rather narrow design of the GPU. It does certain kinds of math very well/fast, but the instruction-set is a bit narrower vs a general purpose CPU and as such only (really) supports certain kinds of programming-flow/structure.

What Unreal offers is an abstraction layer: the xml-based nodage we all tool around with. You could put, for example, case/switch/decorator-type support at the xml level, but it still has to be broken down and compiled to what the GPU can actually run (see above). So in that sense, not everything one might want is readily supported.

Functionally, what you are asking for is to take a material (rather, its output) and use that as an input to another material/material-instance. Since the nature of the GPU proggies is load/run, they don’t reach out to other programs being run on the GPU; they are kind of fire-and-forget. So mechanically, what you are asking for is to turn that material into a material function, use a Blend Material Attributes node, and set an alpha which can be defined programmatically; that is, mechanically, your ‘decorator’. So far as I am aware, we need to pack everything you might want into the single material, as that is what compiles down to a singular unit.
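
Conceptually (this is a sketch, not engine source), the Blend Material Attributes step just interpolates each attribute of the two inputs by the alpha, and the whole thing compiles into one shader:

```hlsl
// Per-attribute blend of two "layers" by a single alpha. Names are made up;
// the real node does this for every pin in the MaterialAttributes struct.
float3 baseColor = lerp(LayerA_BaseColor, LayerB_BaseColor, Alpha);
float  roughness = lerp(LayerA_Roughness, LayerB_Roughness, Alpha);
float3 normal    = normalize(lerp(LayerA_Normal, LayerB_Normal, Alpha));
```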

It kind of has to be that way owing to the nature of the GPU, so, if I do understand correctly, my response is indeed your answer, at least inside Unreal.

What you CAN do, aside from stock Unreal, is look into a Custom node. It’s a node where you can directly paste HLSL code, and that certainly offers more options vs the standard nodes.
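
For a sense of scale, the Code field of a Custom node is just an HLSL function body. Something like this (inputs BaseColor, MacroNoise, and Strength declared on the node itself; all names hypothetical) would apply a simple macro-variation brightening/darkening:

```hlsl
// Remap the 0-1 noise into a brightness multiplier around 1.0 and apply it.
float variation = lerp(1.0 - Strength, 1.0 + Strength, MacroNoise);
return BaseColor * variation;
```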

It’s not like I don’t get you, but a lot of what we want in higher-level languages won’t break down to the GPU. Until the spec gets updated to include new stuff, we’re left using what the GPU can actually do. Note that I am not an expert in HLSL, so take a look; there could be options I cannot describe to you because I might not be aware of them…

Ah, OK. I think I’m starting to understand a lot better now!
Using the Custom node and HLSL is probably a bit more than I want to take on while I’m still learning the graph, but it’s definitely something I should keep in mind for the future.

This still seems like it should be possible; it’s effectively what Material Layers do, right? I’m still obviously new to this, so I’m not sure if I’ve understood their purpose properly. It seems like you could functionally recreate any MaterialInstance as a MaterialLayerInstance, and then blend them together in a MaterialInstance that supports layers.
Hypothetically Megascans could recreate all of their surface Material Instances as Material Layer Instances, and I could achieve what I want, I think? I’m not sure what the associated performance cost would be.

But given what you describe, it was probably a lot of effort for UE to support Material Layers: when they compile a Material Instance into a shader, they’d probably need to inline all the layer code to avoid reaching out to other programs. Given that they already provide a solution for this (Material Layers), I can understand why they wouldn’t want to spend the effort on another one, especially one that would mainly benefit beginners.

Please let me know if my understanding of Material Layers is way off!
and thank you again for helping me wrap my head around this and understand some of the limitations that we face due to the necessity of running all this on the GPU!