I would definitely like the ability to switch out layers at runtime. I hardly ever use material instances within the editor but almost all of my assets create new dynamic instances of materials at runtime.
When it comes to landscape blend functions, it would be really useful to have a landscape layer sample parameter that can be exposed in the blend function. Is there anything like that?
Also is it possible to assign a material layer directly to a mesh?
I’ve only been cracking at this a couple of hours now, but I personally see this system being very useful to my workflow. I’d love to see a “randomize parameters” button next to the layer stack that would go through and randomize all the parameters attached to each material layer. I guess you would need to set a range for each parameter somewhere along the way, though. Is that something that could be implemented?
I could see this being used dynamically when objects spawn in to populate scenes, to get some slight material variations across objects in the environment every time you play the same map.
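A minimal sketch of what such a randomizer could look like, assuming each exposed parameter carries an authored min/max range (the `ParamRange` struct and function names here are hypothetical, not engine API):

```cpp
#include <map>
#include <random>
#include <string>

// Hypothetical per-parameter range, authored alongside each layer parameter.
struct ParamRange {
    float Min;
    float Max;
};

// Pick a uniform random value inside each parameter's authored range.
// A fixed seed gives reproducible variation for a given spawned object.
std::map<std::string, float> RandomizeParameters(
    const std::map<std::string, ParamRange>& Ranges, unsigned Seed)
{
    std::mt19937 Rng(Seed);
    std::map<std::string, float> Values;
    for (const auto& [Name, Range] : Ranges) {
        std::uniform_real_distribution<float> Dist(Range.Min, Range.Max);
        Values[Name] = Dist(Rng);
    }
    return Values;
}
```

Seeding per object (e.g. from a spawn index) would give the "same map, slightly different look" behavior described above while staying deterministic when needed.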
Hey, I watched the stream yesterday about material layers, and there was some hesitation to implement things such as dynamic material instances, etc. I just wanted to say that, in order for developers to adopt material layers, they need to be at least equivalent in features to what we currently have with regular materials. If material layers do not support landscapes, or do not support dynamic changes from code (all of which is possible with regular materials), then this will be a no-go (at least for me!). So please make sure you are not restricting what we can do with material layers in any way compared to the current material system; this is a major requirement for me and probably for many.
Also, at release time, please clearly state the limitations compared to regular materials. I do not want to move all my materials to the new system only to realize I am hitting a show-stopper at some point. In other words, I’ll move to material layers only when I can be sure I can do at least the same things as with regular materials.
Yesterday during the stream someone showed a new node that lets you select, from a drop-down menu, which channel of a texture will be used as a mask: R, G, B, or Alpha. I would request the option to select “Black” as well, meaning no channel selected, not even alpha. In my workflow, when I pack material masks into channels, it is sometimes useful to leave some UVs in the black areas so I can use one more material.
Hi Raildex, hopefully this one was answered somewhat in the stream we did yesterday but I’ll try to clarify the changes and some thoughts:
- Parameters in materials are still driven together if they share a name
- Parameters within a layer (or blend) are “owned” by that layer and any with the same name are shared as a single parameter, similar to materials
- Having parameters owned by layers allows you to use multiple copies of the same layer with different instances and customization, e.g. one base metal layer tweaked to gold and another to copper can be used in the same layer stack, each layer instance using its own copy of the parameters.
That said, we’re still finalizing the best way to control which parameters are owned and which are shared (globally) across all layers. This is something we feel is important to get right as it will likely be core to the final workflow; we’re not there yet in the preview, however.
Hi RyanGadz, so there are two points to address here: one Lauren discussed above and the other we talked about on stream. Again, I’ll try to clarify what I can, but this may be a little lengthy, so apologies if it’s more in-depth than you wanted:
Dynamic Material Instances:
Dynamic instances will work with materials using layer stacks. As Lauren discussed above, right now there isn’t a good interface for setting those at run-time, but the functionality works under the hood. You could expose it yourself in the 4.19 engine code, but it would require lots of hard-coded layer indices and offsets, which would be pretty painful to work with. This is something we are planning to address for the final release.
Dynamic Layers (Part 1):
So the initial idea we had for a future release, probably not the first version, is the ability to add a layer to the stack and flag it as dynamic. This would wrap the layer in a dynamic branch so you could effectively switch it on and off in a cooked game. There would be a cost here even if the layer was off, though ideally much cheaper than evaluating the whole thing and blending out the result. The major restriction here is that you would need to build the full stack of possible layers in your material, you wouldn’t be able to swap out the layer entirely for something new. For cases like damage effects or an expensive wetness effect this is likely the right tool.
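Conceptually, the dynamic flag would behave something like this sketch, with plain C++ standing in for the generated shader code (names are illustrative only): the branch test is always paid, but the expensive layer body is skipped when the flag is off.

```cpp
// Stand-in for an expensive layer evaluation (e.g. a wetness effect with
// many texture fetches and math ops behind it).
float EvaluateWetnessLayer(float Base)
{
    return Base * 0.5f + 0.25f;
}

// A stack with one layer flagged "dynamic": the runtime branch itself always
// costs a little, but the layer body only runs when the flag is on.
float EvaluateStack(float Base, bool bWetnessEnabled)
{
    float Result = Base;
    if (bWetnessEnabled) // dynamic branch inserted by the flag
        Result = EvaluateWetnessLayer(Result);
    return Result;
}
```

This also illustrates the restriction mentioned above: the wetness layer must already exist in the compiled stack; the branch can only toggle it, not substitute a different layer.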
Dynamic Layers (Part 2):
The next step would be something much more free-form where you could change the layer stack at run-time, but this isn’t something we’re evaluating in the near future. An important thing to note about materials is that when you build a graph, it gets collapsed down to code, which goes through the platform-specific compilers and processing to generate the actual shader used to draw your object. That process is very slow. When cooking and packaging a game, Unreal Engine gathers up all of your referenced materials and compiles all the variants you might need up front, so that at run-time you can swap them freely with minimal cost. Changing the layer stack requires each of those shaders to be recompiled to account for the new graph code, which isn’t something we could do dynamically without adding a huge stall every time you swapped a layer.
An alternative might be to let you create a pool of user-selected layers: we’d pre-compile all the variants for you and then, under the hood, pick the best match when you swap. That would work, though it would require some fairly invasive material code changes, and you’d probably find a surprising amount of memory eaten up by all the shaders created.
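One way such a pool could look, sketched in ordinary C++ (this models the idea only and is not engine code; all names are made up): every allowed combination of the user-selected layers is compiled up front, and a run-time swap becomes a cheap lookup.

```cpp
#include <bitset>
#include <map>
#include <string>
#include <vector>

constexpr size_t MaxPoolLayers = 8;
using LayerMask = std::bitset<MaxPoolLayers>; // which pooled layers are active

// Pretend "compiled shader": in reality a full platform-specific shader,
// which is where the memory cost comes from.
struct CompiledVariant { std::string Id; };

class VariantPool {
public:
    // Pre-compile every requested combination up front (cook time).
    void Precompile(const std::vector<LayerMask>& Combinations) {
        for (const LayerMask& M : Combinations)
            Variants[M.to_ulong()] = CompiledVariant{"shader_" + M.to_string()};
    }
    // Run-time swap is a map lookup; nullptr means the combination was never
    // pre-compiled (i.e. the stall we are trying to avoid).
    const CompiledVariant* Find(const LayerMask& M) const {
        auto It = Variants.find(M.to_ulong());
        return It == Variants.end() ? nullptr : &It->second;
    }
private:
    std::map<unsigned long, CompiledVariant> Variants;
};
```

Note how the variant map grows with every combination you want available, which is exactly the memory blow-up the reply above warns about.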
Dynamic Layers (Part 3):
A more realistic option Lauren touched on in the stream is better grouping of parameters into a single object; this would open the door to swapping compatible layer instances at run-time. As long as the layer instances match (static switches and such), the same parameters should be valid, and we could certainly try making it easier to swap them wholesale.
So it’s not to say fully dynamic layer stacks will never happen. If we all agree that’s the right move and there’s a major benefit, we’d invest the engineering time to work on a solution, but that is time that could be spent doing something else, say general material system improvements or additional features. As we tried to stress in the stream, though, feedback is key: we need to make sure we’re developing the right tools that people will benefit from, so please keep it coming!
Hi MarGon, as I mentioned in the stream, landscape compatibility isn’t something that’s been directly targeted for this preview version, but it is something we’ll aim to address for the first release. As landscapes already have their own unique system for stripping layers when not used on a landscape tile, this might warrant something unique on the layer front too, but that may end up outside the scope of the initial push.
Not at this time, but there’s no reason it couldn’t work similarly to material functions, where applying the function generates a material for you.
A cool idea, but definitely outside the scope of what we’re aiming for with the first release! It would make a pretty interesting editor plugin, though. If we explored the suggestion above in Dynamic Layers (Part 3), that might be a fairly convenient way to go about it, but it depends on whether you want authored variant instances or truly randomized ones.
Hi Eyoli, hopefully the above comments from Lauren and myself clear this up! Ultimately material layers will be compatible with all existing material functionality.
Absolutely, we’ll try to make sure the final release notes have the current limitations in there. I definitely would not recommend anyone switch their full production pipeline to material layers yet, the system is still in flux and isn’t complete. The feature is still marked experimental at this time so swapping over would be unwise.
Hi MDiamond, that was the channel mask parameter. I added it to help with the common operation of picking a single channel from a texture or color, which comes up a lot with layers; we noted that many cases ended up using static component masks for this. As each of those potentially introduces a new shader permutation, we wanted a cheaper alternative that allowed better re-use of materials, especially with an instance-heavy workflow. Can you please go into a little more detail about how the addition would work?
Internally that node wraps up a vector parameter and a dot product to pull out the channel data, so in that case a black value of 0 would always return 0, i.e. no mask. I could see that having its uses to blend something out, but if you have any good use-cases it would be nice to better understand the potential here.
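For reference, the wrap-up described above amounts to a four-component dot product. A small sketch in plain C++ (illustrative names, not engine code) shows why a “Black” option falls out naturally as an all-zero mask vector:

```cpp
#include <array>

using Vec4 = std::array<float, 4>; // RGBA

// The channel mask parameter resolves to a vector that is dotted with the
// texture sample: (1,0,0,0) picks R, (0,0,0,1) picks Alpha, and a "Black"
// option would simply be (0,0,0,0), always returning 0 (no mask).
float ApplyChannelMask(const Vec4& Texel, const Vec4& Mask)
{
    float Result = 0.0f;
    for (int i = 0; i < 4; ++i)
        Result += Texel[i] * Mask[i];
    return Result;
}
```

A mask like (1,1,0,0) would likewise pull two packed channels at once, which is the two-channel case MDiamond raises later in the thread.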
Thanks for all the feedback so far everyone!
I’ll have to dig up an example of using a black ID. For now, I guess it’s easier to educate my team to start painting material masks with black, so the bottom-most material goes in the black areas and each subsequent layer is read from the next channel.
I did, however, find an example of where adding masks from two channels could be useful. In the pictures below, the chair fabric is two-tone; when blending materials, I’m selecting two channels for the fabric part, but then I’m using the Override Base Color option and selecting just one of the channels.
This keeps my tiling, roughness, etc. values the same and saves me the instruction cost of adding another fabric layer that uses the same parameter values as the first. I guess for weird exceptions I can make my own blend function that allows dot products of more channels at once, which is the beauty of the current implementation as I see it, unless there’s a better way of handling cases like that.
That looks very useful! Great work
As RyanGadz said, the ability to switch layers at runtime would mean a lot too. Without dynamic layers, this feature is “just” a workflow improvement for manually made assets.
With dynamic layers, this feature also improves visual customization options at runtime, which is non-negligible!
I’m enjoying layers a lot
I’m not sure I have the workflow down / something’s going wrong. I tried to recreate the awesome Death Blossom master material from the Infiltrator (shown below).
But after only a couple of layers I’ve run out of texture samplers.
I’ve tried to show some examples of what I’ve put together. I think I still need to add bolts, decals, and emissives to the material, so I’m only about halfway and have already hit the 16-sampler limit, whereas the original material only used 14 of 16.
I assume it’s something to do with the way I’ve put it together.
Cheers for any ideas.
I get the following error when I try to use a Shared Input:
I want to use a Layer Blend with that Parameter, because it was suggested in the stream.
Solved my earlier issue of running out of texture samplers by putting masks into a material input collection. The problem is I think I’m using those wrong: if I want to add another type of mask (e.g. body, head, coffin section), it will overwrite the other masks in the shared material input collection…
Raildex, excuse my ignorance, but why are you using a Vertex Interpolator?
Can’t you just connect the parameter to the input, like in my pic below?
It didn’t work without the VertexInterpolator either. My original setup didn’t use the VertexInterpolator, but since the message says something about VertexParameter and PixelParameter, my assumption was that it cannot convert the SharedInput to pixel shader instructions, or something like that.
Something that would really make development faster for me would be the ability to use materials like material instances, tweaking parameters on the fly for things in the scene. Is there any way you could add a toggle to the material so that you could tweak it without recompiling (maybe in a separate window)? This would allow faster testing without creating new assets.
Not sure. I started dragging the materials, which avoids the crash, and I don’t dare press the arrow any more.
In the new Material Layer Blend, why am I not able to use operations such as TransformVector or PixelNormalWS (VertexNormalWS seems to work)? Anything tangent/normal-related seems to be unavailable; I think someone else mentioned earlier in this thread that TransformVector doesn’t work. I want a snow material to show up on the slopes of my mesh.
You need to update to Preview 4, in which that issue was fixed.
I have an issue.
In my Material Layer Blend, when I try to create an alpha based on the base color, normal, etc. of the incoming layers, I get an error in the material with the layer stack:
“(Node MakeMaterialAttributes) Error on property BaseColor”
Why doesn’t this work? If this is not possible, then a blend material is almost useless…
See the picture for the Blend Material:
Hi Showster, are you using shared samplers in your layer texture samples? There are global shared clamp and wrap samplers which will likely cover most use-cases, unless you’re after some manual mip-biasing. The one thing that could hit you here is that each texture parameter is owned by a layer, so they’ll be duplicated as objects even if set to the same value, just to maintain the ability to swap them independently. SharedInputs would be the short-term solution in the current preview, though as we mentioned in the stream we’re likely removing those as a concept, as we’d like a better option long-term; perhaps each parameter could be flagged as global/shared, in addition to the instance stack level having an option to tag parameters to link as shared.
One suggestion I’d give is to take advantage of the Layer Stack node’s input. It takes a MaterialAttributes which is then available as the input to every layer, so in some of these cases you could, e.g., set your texture parameters as BaseColor, Emissive, and Normal on a SetMaterialAttributes node, pass that into your layer stack node, and then in each layer use GetMaterialAttributes on the input to grab that data. While they have those attribute names, under the hood it’s mostly just groups of data, e.g. BaseColor is a Vector3 being passed through, so you can re-use it how you wish. In that way you can either treat it as arbitrary data or as the original intention, which was a “base attributes” input to the stack so you can get the same data in each layer.
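As a data-flow sketch of that suggestion (plain C++ with illustrative names, not engine types): the stack input is just a bundle of vectors that every layer can read, regardless of what the slot names suggest.

```cpp
#include <array>

using Vec3 = std::array<float, 3>;

// The attributes pin is effectively grouped data: each named slot is just a
// vector passed through the stack, so layers may reinterpret it freely.
struct MaterialAttributes {
    Vec3 BaseColor; // could carry any Vector3, e.g. a shared mask
    Vec3 Emissive;
    Vec3 Normal;
};

// Every layer receives the same stack input and pulls out what it needs,
// mirroring a GetMaterialAttributes node inside each layer graph.
Vec3 LayerReadsSharedData(const MaterialAttributes& StackInput)
{
    // "BaseColor" reused as arbitrary shared data rather than actual color.
    return StackInput.BaseColor;
}
```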
Hi Railex_, thanks for trying out the feature, but that definitely sounds like a bug. Are you using GetSharedInput for a few different material attributes, perhaps? The error reads as though it created a per-pixel variant of the shared Set graph but then tried to use it for a per-vertex attribute, when it should really have made one for each required case.
Hi Erik007, TransformVector is a known issue; it’s not usable in every case because the information required for the transform isn’t always there. The error is a bit overly aggressive, though, and I believe it was toned down, but perhaps that missed the 4.19 branch. We made a bug tracker entry to chase this and I’ll see that it’s in for 4.20 at least. Some of the others may be down to the attributes they’re used with. In general, BlendMA is a bit tricky because of the Normal attribute being in there, so if you have a simple graph that uses PixelNormalWS as a blend input for two MA inputs, it will try to blend the normal with the result of the blend you’re in, which expectedly fails. You can work around it by manually blending the normal with Get/SetMA nodes alongside the BlendMA node using some other type of blend, or by manually creating an alternative transition. There are also a few flags on the BlendMA node to opt in/out of pixel or vertex attribute blending, which can help clear the graph in some cases if you want to blend those groups separately across multiple blend nodes.
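The manual normal blend described here can be as simple as a lerp followed by renormalization. A hedged sketch in plain C++ (the graph equivalent would be Lerp and Normalize nodes wired around Get/SetMA; the function name is illustrative):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

// Blend two normals by Alpha and renormalize, standing in for handling the
// Normal attribute separately from the rest of the BlendMA result.
Vec3 BlendNormals(const Vec3& A, const Vec3& B, float Alpha)
{
    Vec3 N;
    for (int i = 0; i < 3; ++i)
        N[i] = A[i] + (B[i] - A[i]) * Alpha; // plain lerp per component
    float Len = std::sqrt(N[0] * N[0] + N[1] * N[1] + N[2] * N[2]);
    if (Len > 0.0f)
        for (float& C : N) C /= Len; // renormalize so it stays a unit normal
    return N;
}
```

A simple lerp-and-normalize loses some detail compared to dedicated normal blend techniques, but it avoids the circular dependency on PixelNormalWS described above.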
As an FYI for everyone using the experimental layers feature:
After a fair bit of internal discussion and much external feedback, we’re very likely going to remove the SharedInput asset and associated nodes in the next 4.19 preview build. As a support class for an experimental feature it isn’t quite checking the boxes it needs to, and it’s only building up dependencies that we’ll later have to remove. For the sake of layers in 4.19, I’d recommend using the MaterialAttributes input to the Stack node, pulling from the input within each layer, to accomplish the same thing the SharedInputs were offering. Please continue to give feedback about workflow improvements you’d like to see here.
I’ll update this post if we go ahead and have the primary post updated to show an example of that alternate workflow!
EDIT: This should be in the next preview version; I’ll work with Tim to get the release notes and main post samples updated. Anyone using shared inputs will find the nodes missing when they open their graph and will need to update the connections. The shared input collection assets will still be in your content browser but will no longer function and can simply be deleted.
The editor option for enabling material layers has moved and will need re-toggling. It’s now under “Project Settings > Rendering > Experimental” as this now also changes some run-time behavior.
Since the Material Layer Stack is just a collapsed graph, why don’t you set parameters by their name, just like it worked before layers?
Before layers, parameters in material functions were global parameters across the whole material.
I guess it’s important for many developers that you can at least drive the parameters of single layers at runtime. Building a material at runtime (adding or removing layers) shouldn’t be in that much demand, if you ask me.