Dynamic Decal Blending + Mesh Integration Tools

It seems CryEngine is getting some really powerful improvements to its engine tools that would go a LONG way if implemented nicely in UE4.

A good overview of these features can be seen in this video: Snowscene breakdown - YouTube

These features include decal-based blending: as seen in the video above, a single decal projects snow onto any objects under it, with control over the angle and thickness of the snow. That's both easier than a per-material setup and seems much more consistent across many objects (what's going on with their projection method? I see no seams).
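If it helps picture it, here's a minimal HLSL sketch of what the angle/thickness masking might boil down to. This is purely my guess at the technique, not CryEngine's actual code, and every name in it is invented:

// Hypothetical sketch of angle-based snow masking, not CryEngine's code.
// All parameter names are assumptions for illustration.
float3 ApplySnow(float3 baseColor, float3 worldNormal, float3 snowColor,
                 float angleThreshold, float angleFalloff, float thickness)
{
    // Fade snow in as the surface normal approaches world-up (+Z).
    float slope = saturate((worldNormal.z - angleThreshold) / angleFalloff);
    // Thickness scales how quickly full coverage is reached.
    float snowAmount = saturate(slope * thickness);
    return lerp(baseColor, snowColor, snowAmount);
}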

They also seem to have really nice features that allow material sampling from a given object (e.g. the terrain) to provide seamless transitions with custom geometry - perfect for adding detailed shapes like jagged cliffs to the soft terrain while using only the terrain material, so the mesh appears to be part of the landscape rather than a unique mesh with an obvious seam.

I know there are ways to get similar results in UE4 currently, but they are usually either too expensive or have glaring issues. A properly implemented approach from Epic for powerful, performant and easy-to-use blending is by far the number one feature request on my list.

Just think how much impact this one change could have (if implemented so that almost every piece of geometry in the scene could blend nicely with terrain/other objects) - level design would instantly get a lot more fun and powerful, moving things around and having everything update seamlessly instead of tediously painting or placing objects to hide seams. It instantly gives a big layer of polish to a level for "free".

Please consider adding some nice native support for this. All the new VR features are great, but I really don't think the number of devs that benefit from those platform-specific features comes close to the number that would benefit from this global change.

DICE have also had this tech in their engine since Battlefront, and you can see how useful and amazing this feature is here: Creating a Level in Star Wars: Battlefront - YouTube

It just looks like a glorified, tooled-up version of triplanar mapping. Do you have any performance stats with it on vs. off? I'm pretty sure it wouldn't be too hard to make a mockup in UE4 and compare the two.
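For reference, the triplanar mapping being compared against boils down to something like this in HLSL:

// Standard triplanar mapping: project a texture along the three world axes
// and blend the projections by the surface normal to hide stretching.
float3 TriplanarSample(Texture2D tex, SamplerState samp,
                       float3 worldPos, float3 worldNormal, float tiling)
{
    // Sharpen the blend so each face favors its dominant axis.
    float3 blend = pow(abs(worldNormal), 4.0);
    blend /= (blend.x + blend.y + blend.z);
    float3 x = tex.Sample(samp, worldPos.zy * tiling).rgb;
    float3 y = tex.Sample(samp, worldPos.xz * tiling).rgb;
    float3 z = tex.Sample(samp, worldPos.xy * tiling).rgb;
    return x * blend.x + y * blend.y + z * blend.z;
}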

However it’s done, it’s still a great thing to have, especially if it can match the alignment of the textures and actually make a smooth transition with polygons instead of a hard edge. That’s above anything you can get with triplanar alone.

It would be a glorious day if I never had to make a skirt for an object again when trying to place it on the terrain.

Oh you can do that with distance fields and there are quite a few guides on it. Here’s one of the first ones that I could find:
https://forums.unrealengine.com/showthread.php?131825-Terrain-Blending-Tool-(based-on-Star-Wars-Battlefront)

Using triplanar mapping and distance field contact blending, you’ll pretty much have all of your bases covered.
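If you're curious what that contact blend looks like under the hood, here's a rough sketch written as a UE4 Custom node body. WorldPosition and BlendDistance would be node inputs you wire up yourself; GetDistanceToNearestSurfaceGlobal is the engine function behind the DistanceToNearestSurface material node, so treat the exact call as an assumption:

// UE4 Custom node body sketch of distance-field contact blending.
// Inputs (assumed): WorldPosition (float3), BlendDistance (float).
float dist = GetDistanceToNearestSurfaceGlobal(WorldPosition);
// 0 right at contact, ramping to 1 once BlendDistance units away;
// feed this into the alpha of a mesh/terrain material lerp.
return saturate(dist / BlendDistance);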

You seem to have little to no idea of what is being talked about in this thread. No, you can't do that with distance fields. Distance fields give you intersection. That is all. The thread starter and the other poster here seem to be well aware of the ability to recreate the same effect on a case-by-case basis. The thread is not about that.

A proper tool like this requires sampling the terrain's normal map, height map, weightmaps, and the textures used in the landscape material.
It would be cool to be able to enable this feature for any material and automatically pass in the relevant maps depending on which landscape component / World Composition level the mesh is in.
Such an implementation is generally possible, and while at first glance it requires quite a bit of work, it should be doable.
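To make that concrete, here is a purely hypothetical sketch of what such a tool would have to do per pixel. UE4 exposes none of these bindings to mesh materials today, so every name below is invented:

// Hypothetical: project a mesh pixel into a landscape component's UV space
// and blend two layer textures by the component's weightmap. UE4 does not
// expose these bindings; names and layout are assumptions for illustration.
float3 SampleLandscapeAtMesh(Texture2D weightmap, Texture2D layerA,
                             Texture2D layerB, SamplerState samp,
                             float3 worldPos, float2 componentOrigin,
                             float componentSize)
{
    // Landscape components are axis-aligned, so XY maps straight to UV.
    float2 uv = (worldPos.xy - componentOrigin) / componentSize;
    float  w  = weightmap.Sample(samp, uv).r; // layer A's weight channel
    float3 a  = layerA.Sample(samp, uv).rgb;
    float3 b  = layerB.Sample(samp, uv).rgb;
    return lerp(b, a, w);
}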

But…

The underlying problem is that if you try to sample all of that in the mesh material you want to blend, and then add the complexity of the mesh material itself… things go grim.
Imagine the case of having to blend 5-6 landscape layers on the mesh.
Remember that the only source of information about which textures the mesh needs is the landscape components.
Now think of a case where the mesh sits in 4 landscape components simultaneously.
Each component has 3-4 moderately complex layers.
Your mesh material would need to sample 12-16 sets of textures (normal map and basecolor at least, so 24-32 texture samples), in addition to whatever you already have in the mesh's material.

That is simply unrealistic and will remain so for years.

I don't have definite information about the implementation of such a tool in the engines you've mentioned, but I'm almost certain it relies on virtual texturing.
Without it, it seems impractical.
Virtual texturing is on the roadmap and hopefully we will see it working in the foreseeable future.

I agree that a terrain/mesh blending tool would be a widely welcomed feature.

also, sampling the distance field is quite expensive. I find it really hard to justify almost doubling your instruction count just to remove a seam.

Oh boy, here comes Deathrey with his little personal grudge lol… Yes, I know plenty of what I'm talking about and you are apparently very under-informed about what you can do with that intersection… Yes, you can do exactly what Daniel described in saying "especially if it can match the alignment of the textures and actually make a smooth transition with polygons instead of a hard edge. That's above anything you can get with triplanar alone." With the area of intersection, you can then apply any sort of material or effect and get rid of the harshly joined edge, like adding in WPO (world position offset) and modifying normals.

It's still just as easy to make the effect from within the terrain material. If you have a snowy mountain area, your snow material is already inside the terrain material. Therefore, that's where you'd perform this kind of task, because you'd be wanting to add snow to the rocks that are intersecting the landscape. You can even throw in a vertex paint channel to add/remove the effect in the spots where you do/don't want it.
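As a rough illustration of that approach (my own sketch with made-up names, not anyone's production material):

// Illustration of slope-driven snow with a vertex paint override.
float3 BlendSnowOnRocks(float3 rockColor, float3 snowColor,
                        float3 worldNormal, float vertexPaint)
{
    // Upward-facing surfaces collect snow; vertexPaint (0..1) lets an
    // artist remove (0) or keep (1) the effect wherever they painted.
    float slope = saturate(worldNormal.z);   // 1 facing up, 0 at vertical
    float mask  = pow(slope, 4.0) * vertexPaint; // sharpen, then override
    return lerp(rockColor, snowColor, saturate(mask));
}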

Here we go again with the complaining about landscape problems… It's like you find ANY reason at all to hijack threads and bring this up lol. I used to have a lot of problems with tons of layers, but that was when I was working with 6 GB of RAM. Since then I've moved to 16 GB of RAM and haven't had an issue since. Hell, the level I'm working on has nine pretty complex layers and I'm not having any issues at all. If I wanted to, I could easily take it up to 16 layers. Upgrade your RAM, optimize your materials, don't use 4K textures for every single channel and you'll live. Also, once it's all compiled for standalone/packaged, it runs waaaaaaaay faster.

There are a lot of ways to tweak the blending and save a lot of performance. A dithering approach would work well for games that have a render target of 1080p or higher.
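For anyone unfamiliar, the dither pick looks something like this in plain HLSL (a generic sketch of the standard Bayer approach, not engine code):

// Screen-space dither: instead of lerping two materials, hard-pick one per
// pixel against a 4x4 Bayer threshold; resolution/AA smooths the pattern.
float DitherThreshold(uint2 pixelPos)
{
    const float bayer[16] = {
         0.0/16,  8.0/16,  2.0/16, 10.0/16,
        12.0/16,  4.0/16, 14.0/16,  6.0/16,
         3.0/16, 11.0/16,  1.0/16,  9.0/16,
        15.0/16,  7.0/16, 13.0/16,  5.0/16 };
    return bayer[(pixelPos.y % 4) * 4 + (pixelPos.x % 4)];
}

// blendFactor in [0,1]: true where material B should win this pixel.
bool UseMaterialB(float blendFactor, uint2 pixelPos)
{
    return blendFactor > DitherThreshold(pixelPos);
}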

Gasp It’s almost like you have to render an extra model!

Don't get me wrong though, it would definitely be awesome for them to expand on the decal system, but that could throw a rack's worth of spanners into the mix. Projecting a decal is one thing; projecting a decal that can access the meshes and their materials, blend between them, and even offset their geometry is a pretty big beast to squeeze into the rendering pipeline. I'd still like to see some benchmark stats of the CryEngine implementation, with that effect on and off.

an extra model would just take over, so at any given pixel you render either one material or the other. a more expensive pixel shader means you're rendering all pixels more expensively.
start doing more complex things like blending the same textures on your mesh as on the landscape below (which btw might be blending several layers at the same point) and adding more fancy effects, and things get out of hand quickly.
there's a reason the engine provides a material complexity view. you mock me when I mention shader complexity but then you yourself suggest to "optimize your materials"?

sampling the distance field is not prohibitively expensive, it's just a really big chunk of a feature (that stacks on top of others) that's really hard to justify when you get to the point where you need to sacrifice features for the sake of performance.
just because it can already be done in a certain way doesn't mean it's the only acceptable way to do it. what's being suggested here is to do things in a different (more powerful and efficient) way, taking advantage of the deferred renderer. unless you like to settle for less, there's nothing wrong with suggesting such things. especially if it's something that can be done like… Gasp other engines do!

@Deathrey @Chosker, Just don’t read those posts and don’t reply. You already know what happens in the end.

Back on topic, this feature is currently not supported in UE4 and there's nothing you can do about it. It requires lots of hard work that has to be done by the engine development team. After Frostbite 3, CryEngine is the second engine to have this feature built in. But if I've learned one thing from comparing other engines with UE4, it's that you shouldn't expect to get the cool stuff at the same low cost in UE4 as in other engines. We're still struggling with performance issues in almost anything related to landscapes (tessellation, per-layer performance, the grass tool, etc.), so it'd be very unrealistic to ask for new landscape-specific features while the basics are almost broken and not production ready.

It is, a bunch of people have been fooling around with it for over a year now, after DICE showed off the same thing for Star Wars Battlefront. So far nothing's come of it, and all attempts have ballooned into far too much of a performance hit, even if the visual results are nice.

What people are doing isn't quite what DICE did.
People are using distance fields to blend between 2 materials (the second material on the mesh being the same material as on the landscape). But what DICE did is make their meshes blend into every/any layer of the landscape they're placed on. The landscape layer underneath the mesh is sampled and bleeds into the mesh, which is not something anyone has been able to do in UE4.

I didn't mean it literally… I meant it figuratively. You're trying to create something similar to a skirt model out of a shader, in order to bypass having to model and place the skirt.

You're asking for a next-generation, "icing on the cake" type of feature and you're not willing to pay the cost for it. Make up your mind. As for optimizing materials: yes, you should always optimize them, but you need to make sure you optimize them properly. You don't need mesh-terrain blending on models that are 3000 units away, so you'd put a range on it within the material.

Like I've said many times, different engines are designed differently. Picture an inverted pyramid. Now picture having to make all sorts of changes toward the bottom point. That's what would need to be done in order to implement a lot of these features (better landscape layering, "3D" decals, etc.) in a more performant way. It would likely require rewriting half of the rendering pipeline, and that's probably not going to happen any time soon.

We have been on topic but you’re just cherry picking what you want to see.

Absolutely. The FB3 engine, if it were licensed, would probably cost studios ~5 million at the least. It's a super-AAA engine that's pretty much only used on games that are raking in tens of millions, if not hundreds of millions. CryEngine, on the other hand, has always been a leader and "trendsetter" in graphical fidelity. A platform only has so many CPU/GPU FLOPS to work with though, so where an engine is strong in some areas it will be weak in others, and that definitely shows in places. I'd rather have an engine that is a jack of all trades, master of none, than an engine that is strong in several areas and weak in many. Which is exactly why I've chosen UE4 over other engines.

your skirt analogy doesn't hold up at all. we're trying to create a skirt on top of the body that doesn't also have to replicate all the fancy features of the body behind it.

how do you put a range on something “within the material” if dynamic shader branching isn’t even there?

also, the point of using deferred decals is to greatly reduce the cost of such icing on the cake. they are deferred, so they have access to all the buffers that are already written that frame. you know what that means, right? you sample what's already there instead of having to duplicate the same functionality in a different material.
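to illustrate, here's a generic deferred-decal pixel shader sketch; the GBuffer names and layout are invented for the example, not UE4's actual bindings:

// generic sketch: a deferred decal reads what the scene already wrote
// instead of re-evaluating materials. resource names are hypothetical.
cbuffer DecalConstants : register(b0)
{
    float2 ScreenSize; // render target size in pixels (assumed)
};
Texture2D    GBufferBaseColor : register(t0);
Texture2D    GBufferNormal    : register(t1);
SamplerState PointClamp       : register(s0);

float4 SnowDecalPS(float4 svPos : SV_Position) : SV_Target
{
    float2 uv = svPos.xy / ScreenSize;
    float3 sceneColor  = GBufferBaseColor.Sample(PointClamp, uv).rgb;
    float3 sceneNormal = GBufferNormal.Sample(PointClamp, uv).xyz * 2.0 - 1.0;
    // blend snow over whatever is already there, based on slope
    float3 snow = lerp(sceneColor, float3(0.9, 0.9, 0.95),
                       saturate(sceneNormal.z));
    return float4(snow, 1.0);
}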

Alright, tablecloth ghost costume then… Unless you're using transparency on the cut-up tablecloth, it shouldn't have to sample the pixels of the "rock" underneath it. Dithering is your friend. If there is a dithered mask over top of the rock, then it should only have to pull information for the rock material on the pixels that the cut-up tablecloth ISN'T occurring on. Now if there is transparency going on, then yeah, it would have to sample the rock and do a blend. That's why you'd definitely want to avoid transparency like the plague on an effect such as this.

There is an awesome node called PixelDepth. You can compare values and determine what you want to do from there. It's commonly used to fade things out at a distance, like fading normal maps to a flat normal. This is one big way to help optimize materials.
https://docs.unrealengine.com/latest/INT/Engine/Rendering/Materials/ExpressionReference/Depth/
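Something like this, as a plain HLSL sketch of the depth-fade idea (names are mine):

// Sketch of the depth-fade technique described above: fade a detail
// normal toward a flat normal between fadeStart and fadeEnd units.
float3 FadeNormalByDepth(float3 detailNormal, float pixelDepth,
                         float fadeStart, float fadeEnd)
{
    float t = saturate((pixelDepth - fadeStart) / (fadeEnd - fadeStart));
    return normalize(lerp(detailNormal, float3(0.0, 0.0, 1.0), t));
}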

Oh I'm aware of how they work and the beauty of them (still waiting on deferred spline decals). Trying to add in a feature like this CE decal system requires a lot of work though. UE4 has a pretty simple version of deferred decals and doesn't allow WPO/displacement because of the stage at which they are rendered. After digging around, I found a quote from Andrew Hurley, on an AnswerHub post asking about WPO/displacement on decals, saying:

dithering is not my friend. for quick LOD transitions it's good, but for permanent effects up close it's ugly. the only way to mitigate its ugliness is to use TemporalAA, which in UE4 gives you horrible smearing of moving objects (or the entire screen if the camera moves) and blurs the entire scene as well.

no. there isn't an awesome node that allows you to optimize materials on the fly based on an evaluated condition.
you obviously lack this very simple piece of shader knowledge so I'll make it clear for you: Lerp samples both inputs first and then blends between them. all the time. for all pixels. always. lerp doesn't do dynamic branching, so when you "compare values and determine what to do" and "fade things over a distance", both versions that go into the lerp contribute to the shader complexity, always.
fading things over a distance is used for cosmetic purposes (fading normal maps into a flat normal, fading textures to a less tiled version to avoid repetition), but whether you know it or not, it causes extra overhead, which I'm sure you'll agree means it's the opposite of optimization. pretending it optimizes your material is simply lying to yourself out of ignorance.
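to spell it out with a trivial example (names are mine):

// both textures get sampled on every pixel; 'fade' only weights the
// results. lerp is pure arithmetic and does no branching whatsoever.
float3 DepthFadeBlend(Texture2D texA, Texture2D texB, SamplerState s,
                      float2 uv, float fade)
{
    float3 a = texA.Sample(s, uv).rgb; // always executed
    float3 b = texB.Sample(s, uv).rgb; // always executed
    return lerp(a, b, fade);
}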

“optimize your materials”. sure.

It comes down to everyone not knowing how to optimize their materials, every time. :wink:

Oh wow, I assumed that the engine already had dynamic branching implemented in it, but I tested it pretty thoroughly and it definitely doesn't… So far I've made all of my materials assuming it was in the engine and never really bothered to benchmark it to check. I mean, it's only been an HLSL feature since what, SM3? It would make sense to code an engine so that if a lerp alpha is a hard 0/1, or an IF node is a pure A/B select, it skips the work behind the unused branch. Shame on me for expecting that out of a AAA engine in 2017 lol…

That being said, does branching in a custom node work at least?

On the plus side, at least all of my materials are ready to go for when it does get implemented. I might go back and replace my IF nodes with a custom node, if it actually supports the branching.

And about the dithering: at a distance it's not really a big deal. If dynamic branching really worked, it would save a TON of performance to dither-blend two distant materials together rather than do an actual lerp between them.

Left: regular blending, Right: dithered blending

Hybrid blend where it will do a normal blend <600 units and a dithered blend beyond that:
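In HLSL terms the hybrid is something like this (my own sketch; the dither threshold would come from a Bayer matrix or noise texture as in the usual setups):

// Sketch of the hybrid: smooth lerp inside 600 units, per-pixel dithered
// pick beyond. 'ditherThreshold' is a per-pixel value in [0,1) from a
// Bayer matrix or noise texture (not shown here).
float3 HybridBlend(float3 a, float3 b, float blend,
                   float pixelDepth, float ditherThreshold)
{
    if (pixelDepth < 600.0)
        return lerp(a, b, blend);             // smooth blend up close
    return (blend > ditherThreshold) ? b : a; // hard pick at distance
}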

Update: Yes, you can put in some dynamic branching. I had some time to mess around and quickly made a simple IF-switch custom node. I put it through the wringer and it works great! To benchmark it, I set up an overkill lerpfest between four 8K textures for the complex part and just a plain color for the simple part. Within the level BP, I made a simple loop that alternates between the two states every 10 seconds (to ensure the graph settles).

One caveat that I've found is that if you have any sort of animation going on behind the "deactivated branch," like a panner, it will keep the textures "hot" in the cycles and they will still contribute to frame time. Though I only put one of these custom functions in, and it was right before the material attributes. It's a pretty cheap little function, so it probably wouldn't hurt too badly to throw more of them into the mix, like before things such as panners if needed.


// UE4 Custom node body (HLSL). Inputs: A, B (float scalars to compare),
// ThroughA, ThroughB (the values to pass through). [branch] forces a real
// dynamic branch instead of letting the compiler flatten it to a select.
[branch] if (A >= B)
{
    return ThroughA;
}
else
{
    return ThroughB;
}

When it’s on: ~90ms
When it’s off: ~67ms
So it’s definitely working!

Shouldn’t the single color branch be way faster than 67ms if it’s dynamic branching?

1080p, epic settings, dynamic lighting, a bunch of other programs open and a toaster of a video card… In a blank scene with those settings I get like 10fps lol.

EDIT: Tested it out, yep, 10fps with just the character, the dynamic lighting and those settings. Maybe I need to restart the editor because that still doesn’t seem right.