If I remember correctly, there is already a way to do that, but I’m afraid it creates a new UV set altogether. So I don’t think it could serve as a lower-LOD material for the same mesh…
So I do +1 on that.
Hi there.
We cannot use PixelNormalWS in Layers or LayerBlends, even when we do not blend the Normal.
I.e. even if we blend BaseColor only, using PixelNormalWS gives us an error that PixelNormalWS is not allowed in the Hull/Domain shader.
So I’m playing with the experimental feature in 4.19 and I’m unable to find a way to display the parameters from my master material inside its instanced version (in the details panel). In the GDC session demo/tutorial, the parameter group section is visible, but on my side it isn’t. Is this a bug that has been solved since? I can see that the GDC demo uses a newer version of the feature.
Right now my workaround is to create an instanced version of this master to override the normal texture input, then use this instance as the parent of my layer instance. Quite cumbersome if you have a lot of materials.
Also, as a **suggestion**: I would love to have a search/filter field at the top of the “Layer Parameters” tab in an instance. When you duplicate an instance to swap the mask input, for example, this could be done more quickly if you didn’t have to scroll through all the parameters.
Hi all, I wanted to say thanks again to everyone trying the feature and giving us feedback on where you’d like to see it go. Material Layers are definitely still in their early stages and there’s a lot of potential for the system moving forward. Everything you feed back to us directly affects how we prioritize, expand areas, or add new functionality entirely; it makes a big difference!
The difficulty here is that this would require changing the underlying shader source code every time you change a parameter. Not only would the materials have to recompile each time you changed the option, it also has the longer-term effect of making it very easy to generate unique shaders, which take more memory and potentially render slower. Whilst building up an initial blend library is time consuming, it should become less necessary over time and is something we can hopefully supplement with more engine-included blends down the line.
For most real-time applications, it’s usually worth focusing on a core set of well-optimized, reusable components, so that a small base of master materials and instances actually generates shaders, then building the rest of your material instances on top, minimizing unique shader permutations where possible. This allows assets to be shared and re-used, and lets multiple artists jump in where needed and immediately be productive. For ultra-high-end or non-real-time uses I can see the workflow convenience being more important, but even then the overhead on the code backend to manage and maintain the new parameter sets long term probably isn’t worth the time investment over other feature improvements in the short term, especially as this problem should become less of an issue as your blend libraries build up over time.
It’s definitely something we’ll consider to see if there’s a good compromise, but I’d like to see whether this is still a major hurdle when layers leave experimental status.
Improved integration with the existing material baking tools isn’t something we’re really pushing on for the initial release, but it is something we want to work with the other teams on in the future. A robust system where layers can be rendered in real time or baked, and swapped on the fly, would be a great benefit to many engine users and really play to one of the strengths of layers.
This is most likely the tessellation outputs, can you try selecting your blend node and setting the vertex property to not blend (Use A or Use B)? The problem you’re facing has come up a few times so we’re looking at ways to make the default blend node more intuitive in the future.
Yes, definitely a bug, and one I believe has been fixed since 4.19 released. We do still have a couple of tickets open for parameters that we will fix before the feature leaves Experimental status. If you can reproduce this in a small material, please do send it through the bug submission form; it will make its way to us and we can verify this specific case is fixed: https://epicgames.formstack.com/form…ubmission_form
Love this idea! Should make the panel more consistent with others and improve usability.
Will do as soon as I find some free time.
I’m running 4.19.0, not the hotfix, however. I didn’t see anything related to material layering in the changelog, so I didn’t want to move my project yet.
That’s what I did, but the “Parameters Group” isn’t visible for me (see my previous attachments).
Hmm, not exactly, although your approach should in fact work. I will crop the image to 100% so you can see better what I did.
Also, try giving the parameters different names. Maybe there is a conflict somewhere. I had some problems when the same names were used in different parameter types in a blend function… I don’t think that’s the case here, since as you can see I have the same name used myself… but it’s worth trying…
Hello all,
I’m currently running into some issues converting a material made in the older style layered material system to the new experimental system. Instead of using a new layer for every color change we have been using lerp nodes and submasks to change the color of specific outfit pieces within the same material layer.
I’ve tried to replicate this in the new system using a Material Layer Blend Function with some Blend Material Attributes nodes and Channel Mask selectors. It appears to be working somewhat but the colors seem to be slightly off. Almost like the submask color is being multiplied over the layer underneath instead of replacing its color. Check out the images below to see what I mean. Does anyone know what could be going wrong to cause something like this? Is it a problem with my material layer blend?
Yeah, I noticed it too. Blend Material Attributes works quite differently from lerping. I also have problems adding darker layers on top of brighter ones… When I multiply the mask, it badly clamps and inverts the top layer’s colors, much more strongly than with lerping… almost as if it slightly multiplies in the color from the layer below instead of only lerping.
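The “multiplied over the layer underneath” symptom described above can be sketched numerically. This is a hypothetical illustration of the difference between a straight lerp and a blend where the bottom layer leaks in multiplicatively; it is not the engine’s actual Blend Material Attributes code:

```python
# Illustrative sketch: why a multiplicative blend darkens the top layer,
# compared to a straight lerp. Not the engine's actual blend code.

def lerp(a, b, t):
    # Standard linear interpolation: 'b' fully replaces 'a' as t goes 0 -> 1.
    return a * (1.0 - t) + b * t

def multiplied_blend(a, b, t):
    # Hypothetical faulty blend: the bottom layer's color leaks in
    # multiplicatively, tinting/darkening the masked-in top layer.
    return a * (1.0 - t) + (a * b) * t

bottom = 0.8   # bright bottom-layer channel value
top    = 0.3   # darker top-layer channel value
mask   = 1.0   # top layer fully masked in

print(lerp(bottom, top, mask))              # 0.3  -- top layer replaces bottom
print(multiplied_blend(bottom, top, mask))  # 0.24 -- top layer darkened by bottom
```

With a full mask, a correct lerp returns exactly the top layer’s value; if the result is still influenced by the bottom layer (as in the second function), the blend is doing more than a lerp.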
Hmmm, I’m getting the same kinds of issues here. Things are not behaving in a consistent, predictable manner, and I’m not sure if it is user error or the system is still too buggy for prime time.
For those interested, here is my Uber shader: https://drive.google.com/open?id=1ANKUVruc6TiEkH7-IuAhevz2ETk0uETg
Feel free to use it as you please.
Quick question, maybe an easy one, but I couldn’t find a technical answer for it. In the Paragon texturing pipeline they use texture masks for blended materials, using material functions and changing some parameter values. In the Substance Atlantis demo, they use a few master materials and work more with textures instead of blending materials together. Is there a big difference in the cost of the two approaches? In Paragon, is the main reason for their choice that they sacrifice some cost to be able to quickly make skins / new models? Another reason I see is that working with blended materials lets me use fewer textures, but I need a lot more shader instructions for that. Is there a way to find out which one would be better to work with? Thanks
I noticed that different Material Instances with the same layers are compiled separately. This should be optimized: if the Layers and LayerBlends are the same, they should be reused instead of compiling new shaders.
The cause can be very simple. You have to un-tick the checkboxes in the layer instances (see image below). This way you will use the default settings from the instances; otherwise the material will override the original settings and treat the layers as entirely new instances. This is actually annoying, since the checkboxes are set to true when you drop the layers into slots, and that should be addressed as a bug.
This is actually a good question. It all depends on what platform you’re aiming for, how much disk space your game/project can afford, and what video card it will have to run on.
Material layering is definitely not a good idea for mobile devices, where you have a very limited budget of texture samplers and shader instructions.
There it’s better to lerp individual textures, limiting lerping to only what you have to, in a single, not overly complicated material.
On the other hand, material layering is a good idea for PC games, where you don’t care as much about shader instructions. Also, when a game will be streamed or stored online and you are tight on space, with limited internet connections in mind, disk space becomes very important, so you would want to reuse the same textures over and over again.
As for Substance, I don’t really know much about what they do in their plugin or how expensive it is performance-wise.
Not everyone can buy it, and I think you can obtain similar results with material layering in the end…
This is what I think at least.
I’m having an issue with parameters embedded in material functions not being editable with the new system. The options are exposed and you can visually see the changes in the small material viewport, but the changes don’t take effect on the asset itself. Anyone have any idea how to make this work with a material function?
If I take what is in the material function and place everything in the material layer like normal it works… but it breaks once I use a material function.
Here you can see the change take place in the material viewport, but not on the asset itself.
https://forums.unrealengine.com/filedata/fetch?filedataid=136797&type=thumb
Here is the material function within the material layer.
https://forums.unrealengine.com/filedata/fetch?filedataid=136795&type=thumb
Here is how the material function is laid out.
https://forums.unrealengine.com/filedata/fetch?filedataid=136796&type=thumb
I’m noticing in the 4.19.2 update that I’m not seeing the checkbox for enabling Material Layering under Experimental. Has this been moved/removed? I do not have the option to create Material Layer assets.
Did I miss something?
It was moved to Project Settings a while ago; see the image:
Hi team, any plans to make it possible to sample landscape layers in layerblend assets?
Or alternatively, make it possible to access an individual layer outside of the stack? (So, for example, you know a layer will be present somewhere in the master node graph, you just don’t know which layer yet, in a similar way to textures.)
I feel like if there was a material layer parameter node that would be a great solution that would slot in perfectly with the old workflow.
[USER=“4894”]Tim Hobson[/USER]
I found a new issue in conjunction with Landscapes.
You can only paint Landscape Layers if the original default Layer Stack utilizes Landscape Layers. If only Material Instances utilize Landscape Layers (default Layer Stack empty, for example), they are not recognized.