Is it true that we can't blend different material shading models with material layers?

Hello,

As the title says, but to be more clear: by blending I mean blending between subsurface, hair, skin, cloth, etc. through alpha masks. It is important that the layers retain their corresponding material qualities.

I looked everywhere and couldn't find an example of this sort of material layer blending. From my tests, it seems that only materials with the same shading model can be blended correctly.

OK, never mind: after some more searching I found my answer, Layered Materials support just one Shading Model? - Rendering - Epic Developer Community Forums

So the bottom line is that I can't do what I suggested in the earlier post unless it's a multi-part object with different material IDs, which is unfortunate: if I need to use a subsurface shader together with a subsurface profile on the same mesh with masks, it will not work.

I hope that in the future we will be able to do this sort of thing, similar to offline renderers. Thanks.

The reason for this is that the renderer would render everything at half the frame rate if it supported materials that could use two different shading models at the same time.
It’s entirely a real-time performance limitation.
Once the GTX 1080 is a cheap card that you find in cereal boxes, real-time engines will probably have more features like what you want :)

For that to work, the lighting would need to be able to handle a pixel that is, for instance, both hair and skin at the same time. Conceptually, what would that even look like? The only reasonable approach would be to solve both lighting models completely and blend the resulting colors, which would require tons of extra G-buffer storage.
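To make that concrete, here is a toy sketch of why a deferred renderer pins each pixel to a single shading model. All names and the layout are illustrative, not Unreal's actual G-buffer:

```python
from dataclasses import dataclass

@dataclass
class GBufferPixel:
    base_color: tuple      # packed surface parameters would live here...
    shading_model_id: int  # ...plus exactly ONE shading model ID per pixel

# Stand-in shading functions; the real models would differ substantially.
def shade_default_lit(px): return px.base_color
def shade_subsurface(px):  return px.base_color
def shade_hair(px):        return px.base_color

def light_pixel(px: GBufferPixel):
    # The deferred lighting pass branches on the single stored ID; there is
    # no slot for a second model, so a "50% skin / 50% hair" pixel simply
    # cannot be expressed.
    dispatch = {0: shade_default_lit, 1: shade_subsurface, 2: shade_hair}
    return dispatch[px.shading_model_id](px)

px = GBufferPixel(base_color=(1.0, 0.8, 0.7), shading_model_id=1)
print(light_pixel(px))  # runs exactly one shading path

# Blending would require storing both models' parameters, shading the pixel
# twice, and lerping the results -- the extra G-buffer cost described above.
```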

The usual approach with hair and clothing in particular is to dither their edges and use Temporal AA to layer them smoothly. That way your skin, hair, and clothing are separate meshes, and each only performs the calculations relevant to its own lighting. Trying to unify them in a single material would never save you performance even if it were possible, because then every pixel would need to solve the lighting for all of those models whether they are masked out or not.
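As a rough illustration of the dither-plus-TAA idea (this is not engine code; a real shader would use an ordered screen-space pattern jittered per frame rather than random numbers):

```python
import random

def dithered_coverage(hair_opacity: float) -> bool:
    # Each frame, the pixel is entirely hair OR entirely skin, so only one
    # lighting model ever runs for it.
    return random.random() < hair_opacity

def resolve_pixel(hair_opacity: float, frames: int = 64) -> float:
    # Temporal accumulation: averaging many single-model frames approximates
    # a smooth blend without ever shading two models in one frame.
    return sum(dithered_coverage(hair_opacity) for _ in range(frames)) / frames

print(resolve_pixel(0.3))  # converges toward 0.3 hair coverage
```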

Thanks jwatte, I understand :).

Edit:

Just wanted to add that for long-time developers out there this information may be the norm (understandably so), but for many of us newcomers it may not be, especially for those of us coming from CG backgrounds. There's no way we would have guessed it would "require tons of extra G buffer storage" if no one pointed this out to us, and I don't mean spoon-feeding, don't get me wrong. For us, "layered materials" translates to just that: blending any kind of materials together with masks and so on. All I'm saying is that it would be of great help if these limitations were mentioned in the docs; even a one-sentence note would have helped in this case to avoid wasted time. I couldn't find any such info; perhaps I missed it. If anything, I found an image in the docs which shows skin being blended with metal, etc. Maybe they are referring to the old subsurface model, but it was still unintentionally deceptive, which in turn hinted that we might be doing something wrong. I spent half a day testing out the materials with no results while browsing for answers before I posted here.

Note: I understand the multi-mesh approach; in my case I was trying something very specific on a single mesh, which is why I asked.

But thanks again for your time, guys.

The confusion could be caused by the way the material editor displays material layers, which are really just naive material functions. Because the material attributes node isn’t aware of the shading model, it makes it appear as though every input is available to use by leaving all of the pins open.

By contrast, when directly editing a material and displaying the inputs normally, it's evident that many inputs are, in fact, discarded or dependent on the material's ultimate choice of shading model (e.g. Metallic is disabled by the subsurface profile model and is changed into 'Scatter' by the hair model). Even some input nodes, like light vector or scene color, are only applicable to certain shading models, blend modes, or material domains.
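A toy sketch of that point, using hypothetical names rather than Unreal's internal API: the final shading model effectively acts as a filter on which wired inputs survive compilation.

```python
# Hypothetical pin sets; not Unreal's internal tables.
INPUTS_BY_SHADING_MODEL = {
    "DefaultLit":        {"BaseColor", "Metallic", "Roughness", "Normal"},
    "SubsurfaceProfile": {"BaseColor", "Roughness", "Normal"},   # Metallic disabled
    "Hair":              {"BaseColor", "Scatter", "Roughness"},  # Metallic repurposed
}

def compile_material(shading_model: str, wired_inputs: dict) -> dict:
    # Pins left open in a layer function look usable, but anything outside
    # the final model's input set is effectively discarded at compile time.
    allowed = INPUTS_BY_SHADING_MODEL[shading_model]
    return {name: v for name, v in wired_inputs.items() if name in allowed}

print(compile_material("Hair", {"BaseColor": (1, 1, 1), "Metallic": 0.8}))
# -> {'BaseColor': (1, 1, 1)}  (the Metallic wiring silently goes nowhere)
```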

Speaking of which, rendering a pixel multiple times isn't that much of a deterrent, because the translucent blend mode necessitates it. Clearly the performance hit can be worth the price if the feature is important enough.

Hi Dementiurge,

Thanks for the input; yes, I slowly discovered the logic in the end. In this case I wanted to do something involving the subsurface shader and the subsurface profile shader: I wanted to mix the two together and blend them with a mask and some sort of additive effect. Naturally I can't have the subsurface profile shader as the base material, because it works on an entirely different method (screen space, etc., plus greyed-out slots), so that leaves the subsurface shader useless and unable to plug its parameters into the final subsurface profile material. It becomes a deadlock.

The reason is that I simply wanted to use the subsurface model's backscattering abilities and blend those onto my subsurface profile material in certain areas. Unfortunately, that was too good to be true.

Going back to the drawing board for a workaround.

Wow, this was posted in 2016 but it’s still legit. Thanks for sharing.
It drove me nuts; I couldn't find any information on this topic.

As pointed out, with offline ray tracers like Vray it was always very simple to just blend two different shading models. But yeah, real-time rendering and the G-buffer have limitations.
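Just to illustrate why it's trivial offline (a toy sketch, nothing Vray-specific): a ray tracer shades each hit point on the fly, so it can evaluate both BSDFs and lerp the results; there is no fixed-size G-buffer that has to encode two shading models at once.

```python
# Stand-in BSDF responses; a real renderer would importance-sample these.
def eval_skin_bsdf(hit):  return (0.8, 0.5, 0.4)
def eval_cloth_bsdf(hit): return (0.2, 0.2, 0.6)

def blended_shade(hit, mask: float):
    # Offline, shading happens per hit point, so both models can simply be
    # evaluated and linearly blended by the mask.
    a, b = eval_skin_bsdf(hit), eval_cloth_bsdf(hit)
    return tuple((1 - mask) * x + mask * y for x, y in zip(a, b))

print(blended_shade(hit=None, mask=0.25))  # 75% skin, 25% cloth
```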

So the two options I see:

  • splitting the geometry into separate meshes,
  • or using unique material elements (separate material IDs).

If you can't do either one, turn off your computer and get a few drinks at your local bar.
Cheers

You may be interested in the new experimental material system, Strata (later renamed Substrate). It is more like the BSDFs you'd find in offline renderers and does support mixing different shading models. It's not without limitations, since it's still optimized for real time, but it should allow for much more flexibility.

Yeah, but I'm still using UE4 for this; I'll look into Strata in UE5 one day.