
Metalness in DBuffer decals

Are there any plans to add functionality for using the metalness material input when using DBuffer decals? I think that would be very helpful, especially since the introduction of mesh decals in 4.13.

I know of course that you can use metalness when using the translucent and stain blend modes, but the problem with those is that they don’t work with static lighting. For that we need to use the DBuffer decals, which don’t support metalness. This makes it effectively impossible to use the decal system for things like rivets, screws or scratches exposing metal under an object’s paint.

What’s Epic’s stance on this? Is that on the roadmap? What kinds of feasible workarounds are there? I currently use a masked material applied to a dedicated decal mesh (consisting largely of quads and tris laid over the mesh geometry). This works well, but such a material setup is a little more costly than the deferred decal one.

I feel like using deferred decals for this kind of added mesh detail should be the canonical way, but current limitations still get in the way of that.

Best regards,
Rames

Hey ramesjandi,

Thanks for taking the time to write up this feature request. I do see how exposing the Metallic value for Dbuffer decals could be useful for examples like the ones mentioned in your post. I have gone ahead and written up a feature request ticket to have it reviewed as a potential future feature in the engine. For tracking purposes, you can follow and vote on the issue via the link below to our Public Issues tracker.

UE-36896 - Expose Metallic Input for Dbuffer Decals

Let me know if you have further questions or need additional assistance.

Cheers,

Andrew Hurley

The issue linked above doesn’t seem to be public any more. Is it still planned to allow this?

I’m using Dbuffer decals to blend in tyre tracks with puddles. It mostly works but the reflection in the puddles is very dim, because I can’t alter the metalness of the underlying dirt material. For now I’m working around it by using a masked material applied to a plane but obviously it’s harder to blend nicely and requires a flat surface.

Feature requests were removed from the issue tracker at some point in the past.

Meanwhile, I’d like to add my name to the request for metalness for decals.

Any news about this issue?

Any news guys? It’s been 3 years and still no metalness here

Since this thread got necroed a couple of times over the years, I thought I might as well post here. I managed to implement metallic output for dbuffer decals. Below are some screenshots of how stuff looks. All the white dots and sprinkles in the buffer image are fully metallic decals. There is no dynamic lighting in the scene.

Note how this also allows placing non-metallic decals on metallic surfaces. This wasn’t possible before, when decals would always inherit the metalness value of the underlying surface, meaning you couldn’t place a sign or a poster made of paper on a metal wall without the poster becoming metal, too. The dark screws in the screenshots below are correctly rendered as non-metallic even when placed on the metal floor.

How does it work?

There’s a new option in the material settings under Decal Blend Mode: DBuffer Translucent Color,Normal,Roughness,Metal. Choosing this enables the metallic pin in the material attributes. From there it works as every other material does:

  1. Connect pin
  2. Save material
  3. Profit!

I have not yet measured the performance impact, but it should be minimal. Connecting the metallic pin increases instruction count, but that’s the case with every non-metallic vs. metallic material. There are some additional instructions in the shader files that are necessary to actually read the metallic value and write it into the respective buffers. But that’s just some assignment and basic arithmetic operations. It doesn’t get cheaper than this.

One reason I haven’t measured performance yet is that the material is not optimized. I’ll write about that in another post, to keep these posts from getting too long.

So, what’s the bad news?

There is unfortunately no way to get this working without modifications to UE source code. I tried to get this done by modifying the shaders only, but it simply does not suffice. Modifications are minimal, Git tells me I only had to change 24 lines of C++ code. But still, if you want to use this you have to use a custom engine build, as it is also not possible to pack this in a plugin.

Once I’m done with all I have planned, I’ll tell you which commits you have to cherry-pick in order to integrate this in your engine builds. Alternatively, you can go pull the whole engine code from GitHub. The branch to check out is named decals-plus. This might not always be up to date and major breakage might occur. You have been warned.

What can you do?

If you’d be so inclined, I’d like your help testing this. There shouldn’t really be any problems, but I would still like to know if there are issues on specific platforms or hardware configurations. I have compiled a demo project that you can play with. You can get it here:

Windows_64
Linux_64

Setups the project was successfully tested on:
Windows, Nvidia GTX 1060
Windows, Radeon HD 8850M
Ubuntu 18.04, Radeon HD 8850M

The Radeon card is a GCN 1.0 Southern Islands chipset released in 2013. On Ubuntu the free amdgpu driver was used, running through Mesa 18.1. I haven’t tested either the radeon driver or any of the proprietary drivers, and I didn’t test with Nvidia on a Linux system at all. I suspect it works just as well, but I’d rather keep my Linuxes free of NV if I can help it.

I can’t compile for Mac or consoles and if anyone who can would like to compile the modified UE code and give it a spin, I’d be tremendously grateful.

Future work

I should talk about that in another post. This one’s long enough as it is.

Goal: Selective Blending

Ultimately, what I want to achieve is the ability to selectively blend different channels of the decal with the underlying material. Examples:

A metal bolt overwrites roughness, normal, metalness (usually) and color (in most cases) values.
A crack might overwrite normal information and maybe roughness because of dust and dirt that settle in cracks over time.
A panel might overwrite or blend normal information at the edges and retain it on the surface.
A layer of paint overwrites color, maybe roughness and might overwrite or blend or retain the underlying normal information.

The screenshots above show this quite nicely. The screws overwrite all the information of the underlying material. The strip of color overwrites the color and roughness, but keeps normal information. The recess in the middle keeps the underlying normals intact and nearly overwrites all of it only at the incline.

Hold on. Isn’t this exactly what we want? Yes and no. It is, but only because I cheated. First, look at the rim of the screws where the surface is recessed and gets a little darker than everywhere else. The color roughly matches the color of the tiles, but only because I manually set it so. Look at what happens when I set the background color inside the decal material to a bright green:

Nice. It is obvious that this method doesn’t scale, especially with very colorful surfaces. We don’t want to have to set a background color at all, we want the decal to automatically use the one that’s provided by the underlying surface. Like with the large recess in the middle and the strip of paint at the bottom.

This leads to the second problem with this setup. The piece of wall above uses 6 materials! Two of those are the metal beams and the tiles on the wall (which could easily be packed into one), the other four are for the different types of decals, using the different types of dbuffer decal blend modes:

Screws: Color,Normal,Roughness,Metal
Recess: Normal
Paint: Color,Roughness
Recess (metal panel): Normal,Roughness

This means that what many of us want - to selectively blend the different material attributes between decal and underlying surface - has already been possible with vanilla UE for years. The drawback is that the number of draw calls would rise spectacularly if you wanted to do this for your whole environment. That sucks, especially for fully dynamic lighting which requires an already increased amount of draw calls compared to statically lit games. Even then it’s not perfect, as the example with the screws demonstrates.

The Plan

So the plan is simple: Implement selective blending/masking inside the decal shader so that the four different decal materials above can be replaced by one.

This might be easier than I originally thought. While I was working on getting metalness to work, I stumbled upon pieces of code that suggest getting the actual blending to work is a matter of modifying no more than 4 or 5 lines of shader code. The shader even incorporates a concept of “multi opacity”, where opacity is stored as a vector containing separate opacity values for color, roughness and normals. It’s not used like that for now, but changing it does seem pretty trivial at this point. I’m very optimistic for now.

The problem lies in getting the data to the shader. This is not really a technical problem, but a conceptual one. It also impacts performance (so it probably is a technical one…). Right now, the decal material provides only one opacity value via the Opacity pin of the Material Attributes node. This one float value is used for every blending operation on all the attributes that interest us.
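To make the problem concrete, here is a minimal C++ sketch of the difference between the current single-opacity blend and the per-attribute blend I’m after. All type and function names are made up for illustration; this is not actual engine code, and the attributes are collapsed to single floats for brevity.

```cpp
#include <cassert>
#include <cmath>

// Illustrative stand-ins, not actual UE symbols.
struct FAttributes {
    float Color;      // stand-in for a full RGB color
    float Roughness;
    float Normal;     // stand-in for a packed normal
    float Metallic;
};

struct FMultiOpacity { float Color, Roughness, Normal, Metallic; };

static float Lerp(float A, float B, float T) { return A + (B - A) * T; }

// Vanilla behavior: one float drives the blend for all attributes.
FAttributes BlendSingle(const FAttributes& Surface, const FAttributes& Decal, float Opacity) {
    return { Lerp(Surface.Color,     Decal.Color,     Opacity),
             Lerp(Surface.Roughness, Decal.Roughness, Opacity),
             Lerp(Surface.Normal,    Decal.Normal,    Opacity),
             Lerp(Surface.Metallic,  Decal.Metallic,  Opacity) };
}

// Selective blending: a separate opacity per attribute.
FAttributes BlendMulti(const FAttributes& Surface, const FAttributes& Decal, const FMultiOpacity& O) {
    return { Lerp(Surface.Color,     Decal.Color,     O.Color),
             Lerp(Surface.Roughness, Decal.Roughness, O.Roughness),
             Lerp(Surface.Normal,    Decal.Normal,    O.Normal),
             Lerp(Surface.Metallic,  Decal.Metallic,  O.Metallic) };
}
```

With BlendMulti, a decal can fully overwrite color, half-blend roughness and leave the underlying normals untouched, all in one material, which is exactly what the screws-and-paint example above needs.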

The Question

The question is: How do we provide four (color,roughness,metal,normal) different values for opacity to our material while adding the least amount of overhead and performance impact to the shader?

The answer to this will impact how comfortable it is for artists to work with the new decal material. Performance-wise, it’s a matter of minimizing the instructions needed and the data transferred. Floats are faster than texture samplers, but providing texture masks is simpler for artists than computing crazy bitmasks inside their node graph. Also, this should all be done with the least amount of changes to UE source code.

What doesn’t work is utilizing the alpha channels that already exist on texture samplers for color and normal opacity, respectively. You might think that the alpha channel of a texture node is passed to the shader when plugged into the BaseColor pin, but the compiled code only ever uses the RGB values and discards the alpha. Same for the normal map. Changing that is completely out of the question.

There’s also the issue of blending vs. masking. Should we blend between decal and underlying surface or is it sufficient to completely mask the decal once its opacity reaches below a configurable threshold? Masking would be cheaper and could probably be done using only one float as input for all four attributes. But there is no doubt that blending would look better, be more versatile and easier to work with. I tend towards blending, which is also closer to how the shader handles it now.
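Per attribute, the two approaches boil down to the following (an illustrative C++ sketch, not engine code):

```cpp
#include <cassert>

// Masking: a hard cut once opacity crosses a configurable threshold.
// Cheap (one comparison), but produces hard edges.
float MaskAttribute(float Surface, float Decal, float Opacity, float Threshold) {
    return (Opacity >= Threshold) ? Decal : Surface;
}

// Blending: a linear interpolation between surface and decal.
// Slightly more math, but soft transitions and more versatile.
float BlendAttribute(float Surface, float Decal, float Opacity) {
    return Surface + (Decal - Surface) * Opacity;
}
```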

This means the Material Attributes node has to provide 3 new pins or one pin for a 3-component vector. Here’s where I’ll be investigating during the next few days. The node already provides 2 custom data pins. They’re disabled by default, but that’s easily changed. The problem is that those two are floats only and therefore not sufficient. There seems to be a way to define completely new custom pins. I only ever heard of it, but if the rumours are true, this might be the way to go.

Goals: Addendum

One thing I forgot: once I have all of this working for dbuffer decals, I’d like to implement it for regular deferred decals, too. I actually started with those, but getting them to blend with instead of overwriting underlying materials turned out to be more complicated than I thought. Once I’m more familiar with Unreal’s render pipeline, I’ll have another go at it.

The rendering steps leading up to regular decals seem pretty clear to me: The gbuffer is filled during the prepass and the basepass, and right before lighting is applied, deferred decals are rendered on top of the geometry. You can see it in RenderDoc: No decals at one stage and all decals at the next.

Still, I need to do some more digging until I’ll be able to figure this out. The point is, I would like to have selective blending working not only for dbuffer decals, but for regular ones as well.

But Why?

Why would I want to do this when dbuffer decals already give me all I need? There are two reasons.

The first one is really simple, as it can be explained by two numbers: 99 and 51. The default base material in its simplest form requires 99 instructions in the base pass if static lighting and dbuffer decals are enabled in the project settings (they are in a new blank project). The same material requires only 51 base pass instructions if both options are turned off. That’s roughly half.

Which means that projects that don’t make use of static lighting can usually save a good number of shader instructions if they disable it. If you don’t use static lighting, though, there is usually no need to have dbuffer decals enabled, since their main (only?) advantage over regular decals is that they can be used with baked lighting. But if you wished to make use of selective blending in your decal material, you’d be forced to enable dbuffer decals only for this feature. And your game will be less performant because of it.

That’s reason number one why I’d like to implement selective blending for regular decals as well.

The second reason is that dbuffer decals still don’t provide access to the same material attributes that regular decals do, most notably the Emissive pin. There is no chance I will try to enable emission in dbuffer decals. Doing this would require adding another render target to the dbuffer which I want to avoid at all costs. As it stands, there is just no space left to fit the emissive information. The last remaining space that the dbuffer had to offer went into storing metalness.

Good ol’ regular decals don’t suffer from this limitation and offer a more complete package as a result. Getting selective blending to work for them would offer people a performant alternative to dbuffer decals in games that make use of dynamic lighting only.

Interlude: Translucency Sort Order

One problem I noticed when stacking decals is that the sort order wouldn’t respect geometry. You can see it in the image below. Even though the screws are placed well above the paint strip, the paint gets rendered after the screws and as a result overwrites the screws’ color information. That’s not what I want.

It’s true that UE allows users to define the sort order for translucent geometry and decal actors, but only relative to other actors. The problem is that the above does not show two actors fighting for the Z, it’s two materials on the SAME actor. None of the settings available to us handle a situation like that. It gets even more annoying when it’s the same material that contains stacked decal geometry.

There are several possible solutions for this problem. The first one would be to split overlapping decals into separate actors. Looking at the image above, I could import the wall as the first mesh, the paint strip as a second and the screws as the third mesh. I’d place all of them in the level and could set the translucency sort order in the actor settings so that the screws would always render on top of the paint.

I’d also quickly start to hate the **** out of it. Having three separate actors for what should be only one, always taking care to move and rotate them in sync, and maintaining sort order values that don’t conflict with other translucent pieces of geometry that might enter the scene just sounds like trouble. I’m doing this to make my life easier.

So I took a look at what determines the order in which different materials on the same mesh are rendered, and - to the surprise of probably no one - it’s the order of the material slots on the mesh. The top-most material is rendered before the one below it and so on. Which means all I have to do is make sure that the order of the material slots corresponds to the intended sort order of the decals.

Maintaining Material Slot Order

As far as I could find out, the order of material slots on an imported mesh is determined by the application that exports the fbx file. The Unreal editor preserves that order on import. So this part is not really related to UE but to Maya, Max, Blender or whatever application you use to build and export your 3D models.

I use Blender, which is free software and thus made it easy to spot where the order is determined. It’s not something the artist has any immediate control over, so I modified the code in the fbx exporter to allow for some user control. The modified exporter is on GitHub, but please be aware that I also changed the way how the exporter handles root bones on exported skeletons. This is a longstanding cause of confusion especially for people new to Blender and Unreal, but I don’t want to derail this thread, so I invite you to google it. Just keep in mind that if you use the modified exporter for skeletal meshes, an additional root bone is never created and UE imports it without modifications to the skeleton.

The way Blender’s fbx exporter sorts meshes now: If you’re using the binary exporter and “Selected Objects” is ticked, the meshes inside the fbx file will be sorted alphabetically, according to their names as shown in the outliner.

This allows you to easily control the order of material slots. You could call the wall mesh “AA”, the paint mesh “AB” and the screw mesh “ZZ” and it would guarantee that the screws are rendered last. The result is this:

Note that it doesn’t really matter what the other meshes are called, as long as the first instance of a paint mesh sorts alphabetically before the first instance of the screw mesh; and if you don’t have overlapping decals on a mesh, you can ignore all of this completely. Also note that Blender sorts case-sensitively. I realize that giving meshes semi-random names does not fit everyone’s workflow. Mine is highly modular even inside Blender, using instanced meshes and linked duplicates, so I usually don’t care about the names of the meshes that are about to be exported.
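If you want to sanity-check the ordering outside Blender, a plain byte-wise string sort reproduces it (mesh names here are hypothetical; the point is that in ASCII every uppercase letter sorts before every lowercase one):

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Byte-wise (case-sensitive) alphabetical sort. std::string's operator<
// compares bytes, which matches a case-sensitive alphabetical sort.
std::vector<std::string> SortMeshNames(std::vector<std::string> Names) {
    std::sort(Names.begin(), Names.end());
    return Names;
}
```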

I don’t know if Maya or Max users have similar problems with sorting material slots and unfortunately I lack the time to dig into it. I hope you are fine, but if not, you can always switch to Blender :stuck_out_tongue:

Looks like I’m done. The two screenshots below show the same decal before and after implementing multi opacity. In the first image opacity is 1 for the whole material. That’s how it looks when the opacity pin is just left alone. The second image shows multi opacity applied, providing different opacity values for each attribute. You can find more detailed breakdowns of the images into the different buffers below.

Providing Opacity

In order to drive the opacity for color, normals, roughness and metalness I use a single texture. It works pretty much as it always has been, only now all four channels of the texture are read when connected to the pin instead of just one. I made sure this is “backwards-compatible”, though. If you connect a constant value, all of the decal is visible for every attribute. If you connect a constant vector, you can control the opacity for each attribute separately, but the values will be the same across the whole surface of the decal. If you connect a single texture channel, this channel drives the opacity for every attribute. This is analogous to how it works with the regular Opacity pin.

Note that the Append node is necessary because, as I wrote earlier, texture samplers return a 3-component vector and discard the alpha, so I have to put it back in, so to speak, before it connects to the pin. Append is not the only way to do this; other nodes can achieve the same thing. The important part is that whatever goes into the Multi Opacity pin has to be either a 1-component value (i.e. a scalar) or a 4-component vector.
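As a sketch of the accepted input shapes (illustrative stand-in types, not actual shader code): Append re-attaches the discarded alpha to the sampled RGB, and a scalar input is simply replicated across all four channels, which is the backwards-compatible behavior described above.

```cpp
#include <cassert>

// Illustrative stand-ins for shader vector types.
struct float3 { float r, g, b; };
struct float4 { float r, g, b, a; };

// What the Append node does: re-attach a fourth component to the
// 3-component result of a texture sample.
float4 Append(float3 RGB, float A) { return { RGB.r, RGB.g, RGB.b, A }; }

// Backwards-compatible scalar input: one value drives all four channels.
float4 ToMultiOpacity(float Scalar) { return { Scalar, Scalar, Scalar, Scalar }; }
```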

The opacity texture I’m using looks like this:

You can see that most parts are either fully white or fully black. Blending happens here and there, most of it in the roughness channel, never in metalness. The two screws and the clip thing overwrite the underlying surface on every channel. The pink dots only ever affect the base color and leave every other attribute alone.

The normal channel also overwrites the underlying normals for recesses, cracks and panels, but only where the actual incline is, leaving the surface inside alone. This preserves the underlying information nicely. You can see that on the large one in the screenshots above, and you can also see that it might have been better to increase the normal opacity on the inside of the two smaller recesses: with the underlying normals shining through as strongly as they do here, it looks a bit awkward.

The roughness channel provides an opportunity to add some form of wear to cracks and seams. By blending the roughness opacity just slightly and providing a roughness value close to 1 for that area of the decal, I can simulate how recesses gather dust over time and lose a bit of their shininess. Alternatively, if I set the roughness value close to 0, I can simulate how edges get smoother over time from all the friction. A low roughness opacity here means the effect will be very subtle. It also doesn’t matter how rough the underlying surface actually is. All I’m saying here is, “Make this area slightly rougher (or smoother) than it is now”.

The metalness opacity is relatively boring, all in all. It should almost always be either black or white and never blend. It controls where the decal can overwrite the metalness value of the surface beneath it. A thing to note is that this means reversing metalness, too. If a surface is made of metal and I wanted to have some screws on it that are not, the metalness opacity for those screws should be white. Because only then do the screws overwrite the metalness, in this case setting it to 0. The Metalness giveth and the Metalness taketh away.

Providing Values

There is an important distinction to be kept in mind, and the last paragraph hinted at it a little. What this texture provides is opacity, not value.

Nowhere does it say how rough, how metallic a surface should be or which color it should have. The metalness channel up there doesn’t say that the two screws and the clip should be made of metal. It says that for those 3 objects, the decal is allowed to overwrite the metalness of the underlying surface. Whether they actually are metallic depends on what you connect to the Metalness input pin of the decal material. If you connect a single constant value of 1, it won’t make your whole decal metallic. It will make it metallic only where the texture above allows it to. So even though I made the screws metallic in this specific example, as seen in the screenshots above, they would cease to be so the moment I disconnected the Metallic pin.

That’s not to say you can’t use this texture to drive values, of course. If you were certain that all of your screws will always be made of metal, you could just plug the channel into the Metalness pin and be done with it.

Before & After

So here are the screenshots of how stuff looks before I implemented multi opacity and after that. It’s the exact same decal and the same material. I noticed a little too late that I changed the roughness value at some point between the two captures. That’s why the decal is rougher in the first set of images. Its roughness changed from 0.75 to 0.45, but apart from that, all material parameters are the same.

We can see in the first set that the decal opacity completely overwrites the information below. Its opacity is set to the same value for every material attribute, 1 in this case. The second set shows how the multi opacity texture drives each attribute independently. The color of the pink dots blends nicely with the underlying color, while the other attributes don’t care about those dots at all. Roughness, normals and metalness are still taken from the surface below. We can also observe the roughness blending while not affecting the color or the normals at all. All in all, this is some nice selective blending with all the freedom we wish for.

What I find exciting for effects and stuff is that we can also drive those opacity values - separately - by gameplay code. Think about some simple things like a puddle of spilled paint: It can dry up over time, changing its roughness to that of the underlying material while preserving its color. Procedural and animated decals become a whole lot more versatile with this.

Technicalities

Turns out I did have to implement an additional render target for the dbuffer after all. It took some time for me to understand certain things about UE’s render pipeline, but once I did, it became clear that the only way to do this cleanly is to use an additional render target for the metal opacity.

This problem only exists because I want metallic dbuffer decals as well as selective blending. Either one of those alone would be possible with only 3 dbuffer render targets, as in vanilla UE, but since I aim to implement both, there’s no reasonable way around a fourth texture. I’ll spare you the details for now, but I can go into them if there’s demand.

Adding a render target to the buffer turned out to be way easier than I thought, so Epic probably deserves a lot of credit there. Documentation is sparse, but the existing code provides enough useful hints to get by. Optimization is next on my to-do list and once I’m done with that, I will know how much of a performance impact this step turns out to be. The additional render target should increase GPU memory usage by about 8 MB for 1080p displays and 32 MB for 4k.
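Those memory figures are easy to verify with back-of-the-envelope math, assuming a 4-byte-per-pixel render target format (that byte count is my assumption here):

```cpp
#include <cassert>
#include <cstddef>

// Uncompressed render target size: width * height * bytes per pixel.
std::size_t RenderTargetBytes(std::size_t Width, std::size_t Height,
                              std::size_t BytesPerPixel = 4) {
    return Width * Height * BytesPerPixel;
}
```

1920 x 1080 x 4 bytes is about 8.3 MB, and 3840 x 2160 x 4 bytes is about 33 MB, which matches the estimates above.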

I also noticed that cooking for GLSL 150 (SM4 on OpenGL systems) threw a shader compilation error for WorldGridMaterial because it now exceeded the limit of 16 samplers per material. You can google what’s up with that limitation and why Epic put it in place. The point is if you package a game for Linux or Mac and you compile shaders to be compatible with OpenGL 3.3, you might hit this sampler limit.

There are 2 ways around that. Either reduce sampler usage elsewhere or compile shaders for newer versions of OpenGL. You can reduce the overall usage of texture samplers by disabling features in the project settings. I decided to disable stationary sky lights, which was enough to finish packaging the project without errors. You can disable support for older versions of OpenGL in the project settings as well.

There are basically 2 relevant versions of OpenGL that correspond to SM4 and SM5. OpenGL 3.3 is the equivalent of SM4 (D3D10) and was released in 2010. OpenGL starting at version 4.3 is the equivalent of SM5 (D3D11) and was released in 2012. The mobile AMD graphics card from 2013 which I wrote about in my first post supports OpenGL 4.3. I’m writing all of this to give you an idea of how widespread driver support for OGL 4.3 is on current Linux systems. I don’t have official numbers, but I think it is safe to assume that most Linux gaming rigs are easily capable of utilizing SM5.

All of this is completely irrelevant for Windows, which doesn’t impose such tight restrictions on the number of samplers used. I can’t test on consoles; maybe one day I’ll be able to. Does UE support decals on mobile systems at all?

As it stands now, the extended decal functionality works on Windows and Linux. I suspect it works on MacOS as well. I’d be very surprised if it didn’t work on current-gen consoles, to be honest.

Wrapping Up

That’s it for today. Next on my list is optimizing and profiling the decal system and also refining some workflow procedures. If it turns out that UE supports dbuffer decals on mobile systems I’m inclined to package a mobile project to test it for Android, too. Once again, I don’t have access to Apple hardware or I’d test on those systems, too.

I currently don’t have time to test this but wanted to stop by and take the time to say thank you for making decals awesome (the way they should have been all along)! Looking forward to working with decals again :slight_smile:

Optimization

There are 3 areas of optimization I want to talk about.

  1. Material pins: There are now two pins that deal with decal opacity: Opacity & MultiOpacity. How do they interact with each other? I decided to make it like this: If MultiOpacity is connected, the opacity will be read from this attribute, otherwise the input of the Opacity pin will be used. If you look at the compiled HLSL code for a material, you’ll find dozens of statements like these:


#if POST_PROCESS_MATERIAL
    ...
#endif

#if FEATURE_LEVEL >= FEATURE_LEVEL_SM5
   ...
#endif


When a material gets compiled, the parts inside these checks will only be compiled if the checks resolve to true. That’s one of the main mechanisms by which Unreal controls shader complexity. The material is not a post-process material? Ignore whatever it is between the #if and the #endif above. Does the platform support SM5 (D3D11)? Include the code above, otherwise, ignore it, too.

Most of those variables are set by C++ or console variables. So I created a new variable, USE_DBUFFER_MULTIOPACITY, that is set from C++ whenever the MultiOpacity pin in a material is connected. I now had a way to check for the value of this variable inside the shader. The following function returns the opacity of a material:



half GetMaterialOpacity(FPixelMaterialInputs PixelMaterialInputs)
{
    ...
}


You can read “half” as “float” for this specific example. So, the function returns a scalar, which is not what we need when dealing with multi opacity. I modified it to look like this:



#if USE_DBUFFER_MULTIOPACITY
half4 GetMaterialOpacity(FPixelMaterialInputs PixelMaterialInputs)
#else
half GetMaterialOpacity(FPixelMaterialInputs PixelMaterialInputs)
#endif
{
    ...
}


The shader compiler now changes the return type of this specific function from a scalar to a 4-component vector whenever a material has the MultiOpacity pin connected. I did this in a couple more places and the end result is that when the renderer is not dealing with decals, everything works exactly as before, while shaders that need multi opacity always get it because USE_DBUFFER_MULTIOPACITY will always be set correctly.

This means that the decal material works the same way it always has when only the Opacity pin or neither of the two is connected. There are also some bandwidth and computation savings. It also means that migrating to this custom version of Unreal Engine does not change anything about decals automatically, leaving users in control.

  2. In Unreal Engine, there are some nice optimizations that happen for decal blend modes when the Normal material pin is not connected, and I wanted to implement those for the new mode as well. This has led to the creation of an additional blend mode.

Suppose we were using the decal blend mode ColorNormalRoughness, but didn’t connect the Normal pin. When compiling this material, Unreal checks if the pin is connected and if not, “downgrades” the material to the same blend mode without the normal information. ColorNormalRoughness becomes ColorRoughness, NormalRoughness becomes Roughness and so on. For this to work with the new blend mode, ColorNormalRoughnessMetal, there had to be another blend mode, ColorRoughnessMetal, which you can find in the decal blend mode dropdown in the material properties.

This additional blend mode can be used like all others. The reason it exists is to make the aforementioned optimization possible, but since it’s already there, you can use it if you want to.
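The downgrade logic can be sketched like this (the enum and function names are illustrative, not the actual UE symbols):

```cpp
#include <cassert>

// Illustrative subset of dbuffer decal blend modes.
enum class EDecalBlend {
    ColorNormalRoughness, ColorRoughness,
    NormalRoughness, Roughness,
    ColorNormalRoughnessMetal, ColorRoughnessMetal,
};

// If the Normal pin is not connected, fall back to the same blend mode
// without the normal component; otherwise keep the mode as-is.
EDecalBlend DowngradeIfNoNormal(EDecalBlend Mode, bool bNormalConnected) {
    if (bNormalConnected) return Mode;
    switch (Mode) {
        case EDecalBlend::ColorNormalRoughness:      return EDecalBlend::ColorRoughness;
        case EDecalBlend::NormalRoughness:           return EDecalBlend::Roughness;
        case EDecalBlend::ColorNormalRoughnessMetal: return EDecalBlend::ColorRoughnessMetal;
        default:                                     return Mode;
    }
}
```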

  3. I also added a new decal response to the list of possible decal response modes in the material settings. The decal response governs what kinds of decals have an effect on a given surface material. The default response is ColorNormalRoughness, which means that if a decal is placed over a mesh with this material, the decal’s color, roughness and normal values will override the ones coming from that material. Changing the decal response is a way for a material to allow or prohibit the use of decals on itself. It also has an effect on shader complexity.

Since there is a new decal blend mode, there should also be a corresponding decal response mode. There is and it is called ColorNormalRoughnessMetal, unsurprisingly. Enabling this decal response allows decals on this material to overwrite metalness in addition to everything that ColorNormalRoughness allows. Activating this response mode also increases shader instruction count from 99 to 104 for the base material.

Here are the dropdown menus containing the new modes:

Compared to previous builds of this custom engine branch, metalness will not be visible by default on decals. For consistency with upstream, I decided to keep the default values for blend mode and decal response, which means a new material will have ColorNormalRoughness as its default decal response. In order to use the Metallic pin in a decal, its blend mode has to be ColorNormalRoughnessMetal, and in order to actually see the metalness every material that interacts with this decal must have its decal response set to ColorNormalRoughnessMetal as well.

Completeness vs. Compactness

As long as UE supported only color, normal and roughness in dbuffer decals, every possible combination of those could be selected in the blend mode menu. Now that metalness has been added, this is obviously no longer the case. I only added 2 blend modes where a complete set would have required 8. I limited the number of additional blend modes for 2 reasons.

The first one is covered by what I wrote earlier. I want to do this with as few changes to Unreal Engine as possible. There were good reasons to add the 2 modes that I did add, but in my opinion there aren’t enough to justify writing the code that adds 6 more. It would just spam the list of available blend modes, when most people are well served with 3 or 4 of the 9 that now exist for dbuffer decals.

As for the second reason: I really don’t think there’s much utility in the missing blend modes. Let’s be honest, when do you ever need a decal that overwrites only normal and metalness, yet can’t use the ColorNormalRoughnessMetal mode, which allows you to do just that? Not only do modes like RoughnessMetal and NormalMetal provide hardly any usefulness, they also go completely against established PBR guidelines and practices.

A decal that overwrites metalness, but doesn’t at least overwrite color at the same time breaks PBR, conceptually speaking. I understand that a lot of people use very stylized NPR shading techniques in their projects, and if they really want to, they can violate PBR guidelines with the decal blend modes that I already implemented. But I don’t have to be an enabler for them :slight_smile:

In short: I think the missing modes are useless.

Numbers

Let’s see how all of this affects performance. I already wrote that the new decal blend mode raises shader instruction count by 5. With static lighting enabled, the base material requires 99 instructions when decal response is left at ColorNormalRoughness and 104 when it’s set to ColorNormalRoughnessMetal. If static lighting is disabled, those numbers change to 73 and 78. The increase in instruction count seems to be consistent.

I ran the demo room (links in the first post are NOT updated, btw.) three times under the same conditions but with these differences:

  1. Unmodified Unreal Engine, all mesh decals set to ColorNormalRoughness and all material decal responses set accordingly. This acts as a kind of baseline.
  2. Modified engine, but still only using ColorNormalRoughness.
  3. Modified engine, using ColorNormalRoughnessMetal decal modes everywhere except for the graffito on the wall.

Let’s compare 1 and 2 first. The decal blend modes are the same. 2 uses the modified engine, but doesn’t use the new decal modes. The render step that renders dbuffer decals is called CompositionBeforeBasePass. There is also DeferredShadingSceneRendererDBuffer, which collects the decals in the scene and is also responsible for drawing them. These are the results:



**CompositionBeforeBasePass:**
1. 0.11ms avg.
2. 0.14ms avg.

**DeferredShadingSceneRendererDBuffer:**
1. 0.238ms avg.
2. 0.277ms avg.
(Render thread time in both cases about 5ms.)


This tells me that dbuffer decals in the modified engine have an increased base cost: they have become slower even when the new decal modes are not used. I also have an idea as to why that is. A jump from 11 to 14 is roughly equal to one from 3 to 4, and the same goes for 238 increasing to 277. I therefore suspect that the increase in render time is the result of adding a 4th render target to the dbuffer to make room for the metalness. If this is true, there should not be another significant increase in render time once we activate the new decal blend mode. Let’s check that.



**CompositionBeforeBasePass:**
2. 0.14ms avg.
3. 0.14ms avg.

**DeferredShadingSceneRendererDBuffer:**
2. 0.277ms avg.
3. 0.261ms avg.


Huh. Seems performance is pretty much the same. Who would have thought?

All in all, the render thread time has increased by about 1%, although that’s a pretty useless thing to say. Decal performance depends on the number of pixels on the screen that are affected by decals, and for a setup like the demo room, that number will always be pretty low, which means decals will make up only a small portion of the overall render time. If the new decal system was 10x slower than before, render time would still have gone from 5ms to just 5.5ms.

It’s better to go with what the comparisons above show: We can estimate that the time to process decals has increased by about 33% for the modified engine, regardless of whether the new blend modes are actually in use or not. Since this is still well below 0.1ms for rooms like the one above, I for one can live with that.

Let’s look at a couple shades of green to finish this off.

This is the decal room with static lighting enabled and decal blend mode set to ColorNormalRoughness.

This is with blend mode set to ColorNormalRoughnessMetal. You can barely see it, but it is a shade darker than the image above.

This is with static lighting disabled and decal blend mode set to ColorNormalRoughnessMetal. That is a nice, solid green I’m seeing there. Which provides a good opportunity to transition to the final part of this undertaking: Getting multi opacity to work for regular, non-dbuffer decals. This is what I’ll be spending the next couple of days with.

Awesome job on fixing the decal shaders! Do you have an engine build with this new shader, though?

@CattusEx Thanks! I’m not quite sure I understand your question. The code changes live in the GitHub repository I linked in the first post, and I use this to compile a custom build of UE. Are you asking if I have a precompiled build of the engine available for people to use? As far as I’m aware, distributing precompiled binaries of the editor violates Epic’s EULA.

Great work. These features are a must have, I hope that Epic is taking note. In the meanwhile I’ll try to integrate this into my own project

So, there’s good news and bad news. The good news is that I got selective blending to work with regular decals as well, and the performance impact is pretty minimal. The bad news is that the workflow differs a little from dbuffer decals and the two implementations don’t have feature parity.

The most severe limitation is that metallic opacity and roughness opacity will always be the same. The way gbuffer decals work seems to be coupled heavily with the general layout of UE’s gbuffer, where roughness, specularity and metalness are stored in the same render target. Without heavy modifications to the whole render pipeline, opacity can be set separately only for each render target, which means I can blend [color], [normal] and [roughness+metal] separately.
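A rough model of that constraint (hypothetical names; the point is only that blending happens per render target, and roughness and metalness share one):

```cpp
// Per-render-target blending: one opacity per target. Since roughness and
// metalness live in the same target, they are forced to share an opacity.
struct GBufferSample    { float Color, Normal, Roughness, Metallic; };
struct SelectiveOpacity { float Color, Normal, RoughnessMetal; };

static float Lerp(float Dst, float Src, float Opacity)
{
    return Src * Opacity + Dst * (1.f - Opacity);
}

GBufferSample ApplyGBufferDecal(const GBufferSample& Dst, const GBufferSample& Src,
                                const SelectiveOpacity& O)
{
    return {
        Lerp(Dst.Color,     Src.Color,     O.Color),
        Lerp(Dst.Normal,    Src.Normal,    O.Normal),
        Lerp(Dst.Roughness, Src.Roughness, O.RoughnessMetal),
        Lerp(Dst.Metallic,  Src.Metallic,  O.RoughnessMetal), // forced to follow roughness
    };
}
```

With dbuffer decals the fourth channel would get its own opacity; here the last two lines are tied together, which is exactly the limitation described above.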

A somewhat mitigating factor is that this limitation is noticeable only in specific circumstances and can be worked around in most of those. Still, it’s not optimal.

Pics or it didn’t happen

You can activate selective blending by setting the decal blend mode to Selective, as seen in the screenshot below. This is a new blend mode I created for that purpose, so that the other modes are left unchanged and work as they did before. This blend mode shares all the limitations of the other gbuffer decals, most importantly: it doesn’t work well with baked lighting.

The first set of images shows (by now) nothing special, really. It blends the same way that dbuffer decals blend and nothing has to be changed in the material graph.

All the metallic parts overwrite every attribute, the recesses and seams overwrite normals at the incline, but retain normals on the surface (like the large one to the right of the metallic strip), and most recessed shapes have a very weak roughness opacity in order to blend the underlying roughness with the one provided by the decal. You can see some faint traces of that in the dots on the bottom.

This specific material/decal combination would look exactly the same when built with either gbuffer or dbuffer decals. There is virtually no difference that an observer could notice. Let’s look at what happens when the same decal is placed on a metallic surface.

If the normals of the first image look inverted to you, it’s because I changed the light’s direction in order to better make out the surface. Unfortunately, when viewed side-by-side with the image above this creates kind of an optical illusion that suggests the normals, and not the light, have changed.

Apart from that, you might think there’s nothing wrong with the first image. You’d be sorta kinda right, but only because the decal’s roughness opacity is so weak that the adverse effects are barely noticeable. It’s still enough to observe the problem in the buffer images, though.

When looking at the metallic buffer, we can see that there’s a ton of grey, which is generally not what we want. All those grey parts come from the roughness opacity, and what happens is this: when it comes time to blend the decal with the environment, the renderer takes the metallic value of the decal (0 for those grey spots), multiplies it with the opacity (the same for roughness and metalness), and blends it with the underlying, fully metallic surface. It comes down to something like 1 * 0.9 + 0 * 0.1 = 0.9; I don’t know the exact blending operation off the dome. The point is that roughness blending always results in metalness blending, producing metalness values that are neither 0 nor 1, which violates PBR specs in most cases.
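Written out (assuming a standard lerp, which is only my guess at the operator), the grey spots fall out like this:

```cpp
// Guessing a standard lerp as the blend operator -- the exact operation
// isn't confirmed, but the shape of the result is the same either way.
float BlendMetal(float SurfaceMetal, float DecalMetal, float Opacity)
{
    return DecalMetal * Opacity + SurfaceMetal * (1.f - Opacity);
}
// Decal metal 0 over a fully metallic surface (1), with a 10% opacity
// leaked in from the roughness channel, yields a grey, non-binary
// metalness value of about 0.9.
```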

The reason the images above still look kind of okay is that metalness can be blended to very light greys without immediately producing non-PBR results. It’s okay to have light-grey metalness in areas that are covered by thin layers of dust, for example, provided the reflectance value in the base color is still correct.

Workarounds and Workwiths

In summary, unwanted effects will happen if the intended metal opacity differs from the actual roughness opacity and the metal value of the underlying surface is different than the metal value of the decal.

Let’s break that sentence apart: If roughness opacity and metal opacity are the same, there’s obviously no problem. Even though the wrong texture channel is used, metal opacity will still be calculated correctly. If roughness and metal opacity differ, but the metal value of the decal and the surface are the same, then the resulting value will stay the same. For non-metals 0 * 0.123 + 0 * 0.877 is still 0, still not metallic. The same but in reverse goes for two metals. If they differ, the amount of roughness opacity is proportional to the visible error, which means the error can be reduced if the opacity is low.
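The three cases in that paragraph condense into one formula: the error introduced into metalness equals the leaked roughness opacity times the difference between decal and surface metal values. A small sketch, under the same lerp assumption as before (`MetalError` is a made-up helper for illustration):

```cpp
// Error in the blended metalness relative to the intended (untouched)
// surface value, assuming a lerp-style blend.
float MetalError(float SurfaceMetal, float DecalMetal, float RoughnessOpacity)
{
    float Blended = DecalMetal * RoughnessOpacity + SurfaceMetal * (1.f - RoughnessOpacity);
    float Diff = Blended - SurfaceMetal; // intended result was SurfaceMetal
    return Diff < 0.f ? -Diff : Diff;    // == RoughnessOpacity * |DecalMetal - SurfaceMetal|
}
```

Identical metal values give zero error regardless of opacity; differing values give an error proportional to the opacity, which is why keeping the roughness opacity weak keeps the artifact small.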

With all that said, here are some ways to work around this problem:

1. Don’t blend roughness opacity independently. If you never blend roughness independently from metalness, you’ll never run into this problem. Their opacities will always be linked and you get by with only one texture for both. This is very limiting, though. Forget blending roughness inside of seams and cracks or anywhere really. Roughness opacity will always have near white or near black values and more even blends cannot be achieved this way.

2. Use metallic and non-metallic versions of decals. Let’s say you have a seam in your decal sheet and that seam blends roughness opacity to simulate dust and dirt that have gathered in it. If this seam is not specifically set to be fully metallic, it will look good on non-metallic surfaces and bad on metallic ones, and vice versa. The solution to this is having two variations of that seam in the decal sheet - a metallic one and a non-metallic one. By placing the metallic seam on metallic surfaces, the problem goes away. The downside to this is that you have to know/decide in advance which surfaces of a mesh are going to be metallic and which ones are not. This severely limits modularity and it might force you to rework some of your meshes if, for example, art direction changes during development.

3. Use weak roughness opacity. As stated, light greys in the metalness don’t break PBR immediately, so you can limit yourself to having roughness opacity very low where it’s independent of metal opacity. It will make metals slightly less metallic, but you might get away with it and keep the shading intact. Might also limit overall usefulness, though.

4. Ignore it. Well, you can always decide to just not give a ****. After all, most of these decals consist of narrow seams, bolts and other very small elements that generally make up only a little portion of screen space and are seldom the center of the action. Who cares about bolts and seams? There’s enemies to murder and worlds to save, and no one pauses to stare at a wall to admire the nice shading of that hex nut in the corner over there. In fact, go to Port Olisar and do just that. You will see tons of places where the decal shading might seem a bit off, but it really makes no difference to the overall experience.

5. Use dbuffer decals. DBuffer decals don’t suffer from this problem and blend every material attribute independently. You can always choose to use those instead. If your project makes use of baked lighting, you should be using dbuffer decals anyway, so this issue might not even be relevant to you at all.

Feature Comparison

I consider this expanded decal system to be feature-complete. I will look into AO since I have a feeling that it should be possible to get it working for decal materials, but that might just be my limited understanding of UE’s rendering pipeline. I also might have another go at getting emissive working for dbuffer decals. For the time being though, I’m finished and here’s how dbuffer decals differ from gbuffer decals:

DBuffer

  • available attributes: Base Color, Metallic, Roughness, Normal
  • selectively blend color, normal, roughness, metal
  • works with dynamic and static lighting
  • adds constant overhead of about 25 instructions for every default lit surface material in the project, whether decals are in the scene or not (this is the cost of enabling dbuffer decals in the project settings)

GBuffer

  • available attributes: Base Color, Metallic, Roughness, Emissive, Normal
  • selectively blends color, normal, roughness+metal
  • works flawlessly with dynamic lighting, sucks most of the time for baked lighting

Pick your poison, as they say.

Performance

Performance impact is practically negligible. I profiled FSceneRenderer_RenderMeshDecals and the frame time was 0.056 ms for the unmodified engine and 0.053 ms for selective blending, so there’s no statistically significant difference. There are no additional render targets involved here, and the only cost worth mentioning comes from evaluating a vector instead of a scalar for decal opacity.

Idiosyncrasies

There are two things I’d like to mention before closing this bit.

First, concerning the sort order of mesh planes inside the same decal material. I have not yet figured out how to reliably control which decal plane gets rendered on top of another if both are part of the same material on the same mesh. I wrote previously about how to properly sort different decal materials on the same mesh, but this doesn’t apply to stacking decal planes within the same material. For opaque and masked materials, sort order is defined by the geometry, i.e. if you place a polygon in front of another, it will be rendered in front of it, too. The same doesn’t seem to be true for translucent materials. If I have a screw and I place it on top of a panel, both belonging to the decal in one material slot, then the sort order is, for all intents and purposes, undefined. It’s not really, of course, since the rendered order is consistent and this whole thing is deterministic, after all. I just haven’t figured out yet how to control it. I suspect it is determined outside of UE, in the DCC app or the FBX exporter.

Second, if you plan on using both gbuffer and dbuffer decals alongside each other, be aware that there is a predefined stacking order that you cannot control, not even by manually adjusting the decal actor’s sort order: gbuffer decals are always rendered on top of dbuffer decals. It’s easy to see why that is. Dbuffer decals are applied before the base pass, gbuffer decals are applied before lighting, well after the base pass. When the renderer deals with gbuffer decals, the dbuffer has been processed, its render targets destroyed and there is just no way to access it again to do any kind of sorting. You can see it in the image below, where every decal is a gbuffer decal except the graffito on the right, which gets drawn behind every other piece of decal.

Oh sorry, I’d only seen the link for the compiled project :smiley:

Heads up: I merged 4.20 and everything works as far as I can see. While doing so, I noticed that there is a new decal blend mode called Ambient Occlusion. This sounds promising. But it seems that this type of decal really is for AO only, the other pins aren’t even enabled. I don’t quite see the usefulness in this, but what do I know? Anyway, maybe I can glean some insight into decal AO from this and incorporate it properly into the custom decal blend modes.

awesome progress! are you planning on a pull request so proper decal support can get incorporated into the master branch in the future?