UE Showcase, landscape erosion

In this showcase they show a landscape being eroded, not only affecting the heightmap but also changing the location of the two different layers being used: Unreal Engine 4 - GDC 2014 Features Trailer [1080p] TRUE-HD QUALITY - YouTube

I’ve been using landscapes a lot in UDK and have set them up in UE4 as well, but I can’t figure out how to do that. I’ve tried to read all the documentation I could find on landscapes but couldn’t find anything about it.

Does anyone know how that is done?

Also, I’ve used the SubUV_Function to use only one of the 2×2 textures in an atlas.
Is there a way to make that result tile, without sampling what’s outside of it?
I can get it to show just one of the 4 textures, but as soon as I try to tile it, it uses the whole sheet of 4 textures.
Or is there any other way of doing it? I’m not worried about mip map problems, as I’m only planning to use it with normal maps that aren’t visible from further away.
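For what it’s worth, one common trick for tiling inside an atlas cell (not from this thread, so treat it as an assumption about the usual setup) is to wrap the tiled UVs with Frac before offsetting them into the cell, i.e. a Multiply → Frac → Multiply/Add chain in front of the texture sample. The math sketched in Python:

```python
def tile_within_atlas_cell(u, v, cell_x, cell_y, cells=2, tiling=4.0):
    """Remap a UV so it tiles repeatedly inside one cell of a cells x cells atlas.

    frac() wraps the tiled coordinate into [0, 1), and scaling/offsetting by
    the cell size keeps the sample inside the chosen sub-texture.
    """
    cell_size = 1.0 / cells
    fu = (u * tiling) % 1.0  # frac: wrap the tiled U into [0, 1)
    fv = (v * tiling) % 1.0
    return (cell_x * cell_size + fu * cell_size,
            cell_y * cell_size + fv * cell_size)
```

Note that Frac creates a hard UV discontinuity at each tile edge, which can produce mip seams, but since you’re not worried about mips that shouldn’t matter here.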

Here’s how I used the SubUV_Function.

Thanks in advance!

It looks like it’s most likely using WorldAlignedBlend as seen here: Engine Feature Examples for Unreal Engine | Unreal Engine 5.1 Documentation

Try it out; it’s really finicky to tune right. I’d definitely take their advice and set up a material instance to play with the sharpness and bias values, otherwise you’ll be watching “Compiling shaders” all day.

As Enos mentioned, they are using WorldAlignedBlend for slope detection; maybe have a look here:

Materials can get complicated with this. An easier way would be to bake out weightmaps from an external program.

Thanks! I usually use weightmaps, which works great but that feature just looked so awesome.

I’ve tried the WorldAlignedBlend function and it worked great.
It doesn’t seem to work for normal though, I get the following error: “[SM5] Function World_Aligned_Blend: (Node PixelNormalWS) Invalid node PixelNormalWS used for Normal input.”.
Does this mean it can’t be used with normals?

Hi Sitrec,

That function does indeed work for normals but there are a few details I should probably clarify for you. First, that function can blend in three different ways depending on the output pin used:

1. Alpha: This one uses PixelNormalWS. That means it uses the world-space normal-mapped surface to determine how each pixel is “facing”. This looks great for detailed things like bits of snow on your rock, brick shapes, etc. The thing to understand is that you can’t blend between 2 normal maps using this version, because that would be like creating a cyclical expression, or endless loop. If you want to blend the normals as well, use the Vertex Normals output.

2. w/ Vertex Normals: This option simply uses the mesh’s vertex normals to blend. It is by far the cheapest option, and usable for normals or anything else. It can look a bit nasty if the mesh is very low poly, much like vertex lighting.

3. w/ Explicit Normal: This option allows you to specify a unique texture that will be used. If you use the explicit normal, you will be able to blend between normal maps. It is a bit more expensive because you will be adding another texture sample.

In your case, the quick solution is just to duplicate the normal map sample, plug it into the “In Explicit Normal” input, and use the w/ Explicit Normal output pin.

Also, if you mouse over the output pins, all of that is explained in a bit more detail. Please let me know if you think there is something unclear about how any of that works.

Another note about this node: the math it returns is essentially the same as a diffuse lighting calculation, so it can be useful for any sort of custom lighting work or toon shaders. You just treat the explicit normal like a regular normal map, and “In World Vector” acts just like a directional light.
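That lighting analogy can be sketched as scalar math. The N·L dot product is the core described above; the exact bias/sharpness remap inside WorldAlignedBlend is my assumption of a typical shaping step, not the node’s verified internals:

```python
def world_aligned_blend_alpha(normal, in_world_vector=(0.0, 0.0, 1.0),
                              blend_bias=0.0, blend_sharpness=1.0):
    """Dot the surface normal against a world direction (like N.L diffuse
    lighting), then bias, sharpen, and saturate it into a 0-1 blend alpha."""
    d = sum(n * w for n, w in zip(normal, in_world_vector))  # N . L
    a = (d + blend_bias) * blend_sharpness
    return min(max(a, 0.0), 1.0)  # saturate to [0, 1]
```

With the default up-facing world vector, a flat upward normal blends fully to 1.0 and a vertical cliff face blends to 0.0, which is why it works for slope detection.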

Got it, thanks Ryan! Really helpful, it works perfectly now.