Selective Blending in GBuffer Decals

Unreal Engine 5, with Nanite and Lumen, has made the biggest leap in realtime rendering since Physically Based Rendering. The era of baking bevel normal maps and global illumination is finally at its end (for AAA titles), and realtime lighting through a deferred rendering pipeline is becoming the new standard. Now is the right moment to address the way deferred decals work in Unreal Engine one more time, and hopefully the last time.

In the original thread on the Unreal Engine forums:

https://forums.unrealengine.com/t/metalness-in-dbuffer-decals/74679

three issues regarding G-Buffer decals have been discussed since 2016:

  1. Selective blending for Albedo, Metalness, Roughness, and Normals
  2. Sorting of decals within one material
  3. Sorting of decals between multiple materials

Selective Blending Problem

Currently the buffers use one channel, “Alpha”, to blend values with the underlying material. This is fine until you want to set up geometry decals that blend only normal maps in certain parts and blend metal, roughness, and albedo in other parts, which is common in most titles that use this technique: Doom, Alien Isolation, Star Citizen, etc.

The current workaround is to make two separate shaders: one that only blends the normal map information, and another that only blends albedo, roughness, and metalness. But this is a shame. It’s bad for performance, because of the extra draw call per decal material and possibly even double the amount of parallax occlusion calculations. It’s also an inefficient way of authoring geometry:

You either have to pre-stack the decals in your editing software and place them that way, which complicates adjusting the geometry of the decals, or place one version of your decal, copy it, assign the other material, and then export. And if we want to complicate it even further for ourselves, you might also want individual control over albedo, roughness, and metal blending. Most artists simply avoid this, because maintaining both the assets and your own sanity is close to impossible without resorting to vigorous automation.

Selective Blending Solution

In the forum post linked above, user And-Rad went above and beyond to provide a solution, and a clever one as well. He implemented it by adding a separate blend mode to the decal shader: “Selective”.

This opens a pin on the shader node where you can input a Vector4 whose individual channels mask each of the buffers.

As soon as values are assigned to the “Multi Opacity” slot, the regular “Opacity” slot becomes disabled. The Multi Opacity example shown here is actually how it’s implemented for D-Buffer decals, because at the time that post was written it was not yet possible to separate roughness blending from metal blending. That is something we either have to work around, or Epic has to update the implementation for that specific case.

I think part of the ingenuity in this solution is that it keeps all currently implemented decals as they are and circumvents any issues when updating to the new system!

This is what user And-Rad has to say about it:

“-…- The good news is that I got selective blending to work with regular decals as well, and the performance impact is pretty minimal - … - The most severe limitation is that metallic opacity and roughness opacity will always be the same. The way gbuffer decals work seems to be coupled heavily with the general layout of UE’s gbuffer, where roughness, specularity and metalness are stored in the same render target. Without heavy modifications to the whole render pipeline, opacity can be set separately only for each render target, which means I can blend [color], [normal] and [roughness+metal] separately.
A somewhat mitigating factor is that this limitation is noticeable only in specific circumstances and can be worked around in most of those. Still, it’s not optimal.”

So, to wrap up this part, there are two things to look into:

  1. Use And-Rad’s design and implement it in Unreal
  2. Check if separating the opacity for metal and roughness is an option

Decal Sorting

Sorting between Multiple Decal Materials:

Sorting is based on the order in which materials are exported from our editing software. There are workarounds, such as ordering the exported materials by name, so a material that needs to render on top is named, for example, AA_… and one at the bottom ZZ_…. But ideally, sorting would follow an order you can set in the shader, much like the sort priority for translucent materials.

A solution could be:

  1. Unreal Engine says fix it in the exporter
  2. Unreal Engine adds sorting functionality per material

Sorting between Decals on the same Material:

Sorting is based on the vertex index: vertices that are created later (and have a higher index) are rendered over vertices with a lower index. Workaround: duplicate the decals that you want to render over the others.


A fix for this is hard to determine; it could be done in the editing software with a tool that reorders the vertex indices.

A solution could be:

  1. Unreal Engine says use workaround/fix in editing software
  2. Unreal Engine finds a good solution

Closing thoughts

As we explained in the intro, with Nanite and Lumen added to the toolset, Unreal Engine is close to becoming the leading game engine on the market. Now, the mesh decal workflow is more relevant than ever: we want to create huge open worlds, detailed down to the tiniest parts, without blowing up our texture and memory budgets.

We have been working around the decal issue for years, which shows our dedication to the software; we are not going anywhere! But please help us out with this solution. You do not owe this to us; we just want to make Unreal Engine better!

Special thanks to Laurens 't Jong for precisely identifying the problems and solutions for this topic!

It’s a shame that there’s no one answering this. I’m really interested in using this technique in my workflow.

It’s become less relevant than ever. The mesh decal use cases highlighted above solve the problem of large meshes with lots of details blowing past texture budgets. Since Nanite is Epic’s answer to that problem, the only response you’ll be hearing from them is “Use Nanite”.

Whatever chance there was of Epic implementing these features for mesh decals, it’s gone. Everybody should accept that and adapt.

Yes, Nanite is a great technology, but the production of assets for it cannot be called fast, especially if you factor in proper UV unwrapping. It is also worth considering the amount of data if all the details are done with geometry.

At the same time, the decal approach lets you use resources optimally, both geometry and textures. It also works great with Lumen lighting and can be combined very well with Nanite. I see no reason not to implement this little feature if it will help hundreds of developers around the world.

I agree with @Sergey_Tyapkin. Nanite is good and all, but authoring these types of detail with geometry is more time- and labor-intensive than simply stamping on a decal. Star Citizen, Alien Isolation, and Doom use this technique heavily, and I don’t see why it’s taking such a long time to implement. As mentioned in this github pull request here, the code is somewhat there, but not implemented. I think having the “option” alongside Nanite is more reasonable than forcing everyone to Nanite.

Great topic! We’re currently working with these kinds of workarounds too, and I was just browsing the forum to make sure that Epic hadn’t released some new features that I may have missed.
I’m also quite surprised that this kind of feature currently requires an implementation that doesn’t seem optimal for performance, especially considering that this method of applying detail is fairly common.

As @Sergey_Tyapkin and @TalosX have already mentioned, this isn’t necessarily about running into performance barriers that Nanite would solve. It’s a great timesaver for integrating detail and creating asset variations non-destructively.
Unless I am missing some other kind of functionality that works at the mesh level, I simply can’t see how you could rival the speed of this workflow for the time being.
