Attempting Displacement on Landscape

Hi. I am making an automaterial for an Unreal landscape.

I have the diffuse and normal working OK. They UV-scale based on distance, which is what I wanted. So far, so good.

The issue comes when I try to use the height (stored in the Diffuse alpha), which I am lerping separately, but in exactly the same way as the (working) diffuse and normal.

When I take my material functions (where the scaling happens) and try to connect them with this scaling (based on a pixel depth lerp between two copies of the same texture at different scales), I get “node pixeldepth invalid” errors within my Master Material. I have tried other ways to accomplish this using camera position etc., but the error is always the same.

Can I have an automaterial in this way with displacement? Is it possible? I have trawled and trawled and seen a few dead-end forum posts here and there, but it does not seem to have been explained conclusively.

The shots below are of my Master Material (where I Layer Blend the automaterial and the other landscape layers) and the material function.

If you can help then please do… I've even bought tutorials in the hope this would be addressed, but it never is. The World Displacement just does not seem to want to accept the input this way.

It's always tricky debugging other people's landscapes. I think you might get more joy if you use a separate material attributes node for each 'area': one for the normal, one for the albedo, one for the height, etc.

This is a good vid:

Thank you. I have seen that, but he's doing a height blend (in terms of the altitude in world space). I have an automaterial (first shot) with three material functions going into it. The diff, norm, and height (in terms of the alpha blend between two textures within the materials) all work fine, and the landscape is plenty subdivided; only the displacement, locked in the Landscape Layer Blends as it is, will not connect to the World Displacement input without the above error.

It is caused, as I understand it, by the World Displacement on the Master Material having issues with the way in which I am fading between two scales of UVs (at the Material Function level), and I can't seem to find any mention of how to do it. If you have any other ideas I'd love to hear them. Really!

I can’t get the same error as you. In any event, your problems may go away when you define the function ‘more correctly’.

A function is supposed to take inputs and transform them into outputs, so it shouldn't actually have textures inside it. It's supposed to be a reusable piece of machinery.

If you make the function like this:

![](filedata/fetch?id=1774090&d=1591801787)

then you re-use it each time like:

![](filedata/fetch?id=1774091&d=1591801808)

PS: It is the right way to use a function, but I think there may be something a bit iffy going on, both with height maps in the Alpha (unless you specifically put them there), and I can't get the camera fade to do what I think it should…

EDIT: Ok camera fade works.

Hi - back again… :wink:

Can you show your grass function, as that’s the one giving the error… thanks.

He did. The function is here.

For what it’s worth, I think the issue is that the pixel depth is used in the camera depth fade function and then later hooked into the world displacement - which does not support pixel depth?
Would love to test but I’m stuck compiling 20k shaders atm.

Thanks so much for the help, guys. I don't feel quite so alone now :)

Yes, the issue does seem to be that the World Displacement has an issue with pixel depth blends. Don't mention compile times, MostHost, they're a massive hindrance!

I've been looking to try and find a mode that it will accept, but to no avail. As I say, everything else works great; it's just that the Displacement (World Displacement) won't take this fade, and as I'm sure you fellows know, scaling UVs on distance is too good a visual tweak to just discard. If it came to that I'd sadly have to keep the distance-based tiling (controlled by the Pixel Depth fade at the MF level) and bail on the displacement… which would be a shame.

I know others have this issue, as I've mentioned above, so I'd love to solve it and share, but I think I'm missing something fundamental (like whether or not this is even possible).

Edit: This chap seems to be having the same problem. Alas, if he found a solution he did not share it. But it's essentially the same problem, only I've got my pixel depths defined in the incoming Material Functions.

Here's what I have so far. Displacement would really set it off though (and some decent foliage).

I believe that since displacement is per-vertex, a per-pixel operator would be invalid. I had to work around this by using Distance(AbsoluteWorldPos, CameraLocation). Seems to actually come out cheaper in some cases vs pixel-depth; maybe it just packs better.
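If it helps to see that spelled out, this is roughly what a distance fade like that boils down to as HLSL you could put in a Custom node (a sketch only; all the names here are placeholders):

```hlsl
// Per-vertex friendly distance fade (sketch). WorldPos would be wired from
// Absolute World Position, CamPos from Camera Position, and FadeStart /
// FadeLength would be scalar parameters - all hypothetical names.
float DistanceFade(float3 WorldPos, float3 CamPos, float FadeStart, float FadeLength)
{
    // Same idea as CameraDepthFade, but built from expressions that are
    // valid for World Displacement because nothing here is per-pixel.
    float Dist = distance(WorldPos, CamPos);
    return saturate((Dist - FadeStart) / FadeLength); // 0 near the camera, 1 far away
}
```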

Normally you do use distance and camera location.

If you want to make your material perform well, you will also make a hard UV line at about 10 or 20 m from the camera location and only add an extra texture sample to cover this hard line.

The formula is all derived from WPO and Camera placed into a Distance node.
Or you can get the vector length and do the math yourself, but the Distance node is the same.

Subtract the distance you want the line to be at. Make it a scalar and play with the material instance.

Then, since I want my mask to move with it, I use that distance ± an offset to create the area where the other texture is blended in.
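As a sketch of that mask (LineDistance and BandWidth are hypothetical scalar parameters standing in for the line position and the ± offset):

```hlsl
// 0 on the near side of the line, 1 on the far side, with a narrow blend
// band of width 2 * BandWidth straddling the hard UV line so the extra
// texture sample only has to hide that seam.
float NearFarMask(float3 WorldPos, float3 CamPos, float LineDistance, float BandWidth)
{
    float Dist = distance(WorldPos, CamPos);
    return saturate((Dist - (LineDistance - BandWidth)) / (2.0 * BandWidth));
}
```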

Another thing I used to think was good but isn't: additional texture samples.

Let's be 100% clear. An unoptimized landscape plus a bad material can eat up to 50 ms.
The first thing you should do is make sure that the landscape you are using, with no material on it, isn't pushing your frame cost above 8 ms.
If it is, you need to balance the LODs (which, with the new system since 4.19, kinda sucks, but it can be done).

After that, your landscape material really only needs two textures: a diffuse and a normal.
You pack whatever you want into their alpha channels.

For roughness and specularity you can play with one of the channel results off the diffuse.
It's not the same as using a proper texture… though I suppose, with very minimal diffuse change, you could swap the real texture for the most similar-looking channel.
The problem with doing that is that it may not always be the same pin. So avoid it and standardize by using the same pin to derive somewhat similar information.
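Purely to illustrate the 'same pin' idea, something like this (the channel, scale, and bias are placeholders you would tune per texture set):

```hlsl
// Derive a rough roughness value from a diffuse channel instead of sampling
// a dedicated roughness texture. Always read the same channel (G here) so
// every layer behaves consistently.
float RoughnessFromDiffuse(float4 Diffuse, float Scale, float Bias)
{
    return saturate(Diffuse.g * Scale + Bias);
}
```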

The alpha channels normally contain displacement, or an alpha mask.
I use the alpha off the normal as a mask to blend grass colors and match the landscape, for instance.

After a few weeks of tweaking, the result is a scene (mind you, just landscape, grass, and atmosphere/light) running above 80 fps on a 1080 Ti at 4K.

Introducing an extra packed texture into the mix takes it down below 60 fps, so the cost is very high compared to the quality of the result.

Cheers, MostHost, I really appreciate the time taken to reply. Thank you too for the emphasis on optimization and frame time; I run with stats now to check precisely that, so thanks.

I think the solution is figuring out a pixel-depth-style mask that can drive two texture scales (based on “distance”) without upsetting what the World Displacement wants to receive. I've tried, with my limited understanding, to generate an alternative to what the Pixel Depth node does in the hope of the World Displacement accepting it. I've failed so far. So many times, in fact, that I don't think it's actually possible. Do you think I'm wrong?

Considering I have it running, yes.
If you open up the CameraDepthFade function you can copy and paste your starting point.
It has Camera Position & Absolute World Position going into a Custom node that does a Length operation.

Instead of that, you can put the two position expressions into a Distance node.

Subtract from the distance the offset you want (a scalar you can change).

Plug that subtraction result into an IF statement as A
set B to the distance.

Put a UV coord x 10 in the >
Put a UV coord x 1 in the <

Plug the output to a texture and see what happens when the material builds.
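In HLSL terms, that whole chain comes out to something like this sketch (SwitchDistance stands in for the offset scalar; the ×10 / ×1 values are the ones from the steps above):

```hlsl
// Hard switch between two UV tilings based on distance from the camera.
// This is the behaviour the If node gives you; the blend-band mask from
// earlier can be used instead if you want a soft transition.
float2 DistanceScaledUV(float2 UV, float3 WorldPos, float3 CamPos, float SwitchDistance)
{
    float Dist = distance(WorldPos, CamPos);
    // Beyond the switch distance use the large tiling, up close the small one.
    return (Dist - SwitchDistance > 0.0) ? UV * 10.0 : UV * 1.0;
}
```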

I'm going to try that today. Thanks for the help, MostHost, wish me luck :)

Hi. I can see when I preview the final clamp that it is going from black to white based on distance. I have that plugged into the UV of a texture. Only the alpha I see in the preview does not seem to be driving the UVs.

The alpha channel isn't a UV coordinate.

That's why I said to drive the result of an If node with the alpha.
You need the If output to be one scaled UV for the results that are greater than, and another for the UVs which are less than.

Apologies, I meant the alpha of the output, not the actual texture alpha.

I think I'm getting somewhere now… Hard to tell as the compile times are a bit drastic, but we shall see.

Thanks again for helping :) I still find lots of this quite confusing (I've used in-house tools for a decade and I'm new to UE).

Okay, I've got it all set up and the layer blends are working. All but the World Displacement, which was the issue before.

I've not got any error this time (since I'm blending the distance UVs a different way), but whenever I plug in the World Displacement it breaks the material. If you've any ideas I'd really love to hear them :)

Oh… yeah. You can use only one blend by just using material attributes on the material.

Thought you had it set up that way already.

Each function outputs a material attributes. You can then blend the attributes and put the output straight into the material.
Or break it and make global changes: multiply by VertexNormalWS, mask G × scalar, append, back into displacement, tessellation multiplier, roughness/specularity change, etc. Then you create the new material attributes and connect again.
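For the displacement part specifically, the "break it, change it, remake it" step amounts to something like this sketch (names are placeholders; in the graph it would be Break Material Attributes → mask the height → multiply by VertexNormalWS and a scalar → World Displacement → Make Material Attributes):

```hlsl
// Turn a blended height value into a World Displacement vector by pushing
// the vertex out along its normal, scaled by a tweakable parameter.
float3 HeightToWorldDisplacement(float Height, float3 VertexNormalWS, float DisplacementScale)
{
    return VertexNormalWS * Height * DisplacementScale;
}
```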

Also, I need to see the error if it breaks…

Hi again:)

I'm not sure I follow. In the setup above I have a convoluted (because I can't see any other way) connection to the Layer Blend that is taking care of the World Displacement. Every other configuration I tried failed, as does this one. No error. It just breaks the material.

I also tried introducing a Material Attributes node after the Height lerps (as in the shot above, only with a Material Attributes node after, and I also tried before, it reaches the Layer Blend), and got an error: “Cannot mix Material Attributes with non Material Attributes nodes”. I've uploaded the material and the function if you get a chance to look at them, as I confess at this stage I am profoundly confused :(

I had to split them up as there's a 600 KB limit here.

So, look, the whole root of the problem is your WorldAlignedBlend node.
By default it uses PixelNormalWS.

This doesn't allow you to blend either normal maps or displacement.

Remove it. Realistically, it should be included within the layer functions of each layer if you want to include it, because each layer needs to have the normal map aligned with the diffuse as well.
Not doing this won't look right.

To make an auto mask you have to look at the slope value given by the B channel of VertexNormalWS.
You need two masks, so try setting one subtract to .7 and one to .8.
Then you multiply those to create a falloff, usually by high values, 10 to 20.
Clamp it.
Subtract one from the other. This creates a mid range.
Then subtract that subtraction from the first.
Clamp all 3 (the last one should not need it as it was clamped before).
And output into 3 variables.
Each will give you a different alpha value to use.
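One way to read those steps, as an HLSL sketch (the .7 / .8 thresholds and the falloff multiplier are the suggested starting values; exactly which three outputs you keep depends on the areas you want to mask):

```hlsl
// Three slope masks built from the B (Z) channel of VertexNormalWS,
// which is 1 on flat ground and falls towards 0 on cliffs.
void SlopeMasks(float SlopeZ, out float Flat, out float Mid, out float Steep)
{
    // Two masks with a sharp falloff (the 10-20 multiplier), clamped to 0..1.
    float MaskA = saturate((SlopeZ - 0.7) * 15.0); // fades in on gentler slopes
    float MaskB = saturate((SlopeZ - 0.8) * 15.0); // only the flattest ground
    Mid   = MaskA - MaskB;          // the band between the two thresholds
    Flat  = MaskB;                  // flattest areas (already clamped above)
    Steep = saturate(1.0 - MaskA);  // everything steeper than the first threshold
}
```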

With that, you can avoid the mess that world aligned blend causes.

Make yourself a custom blend function:
simply one scalar labeled Alpha, and two Material Attributes inputs, A and B. Break both, and lerp the two together for each pin you want (keep it minimal).
Make material attributes out of the lerps, and output that from the function.

With the function you can now combine the slope mask with the blend and make a simple auto material using just one node, instead of having to split everything up several times over.
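As a sketch of what that blend function works out to per pin (in the editor this is Break Material Attributes → Lerp per pin → Make Material Attributes; the struct below is only for illustration, not an actual engine type):

```hlsl
// Minimal per-pin blend of two layers by a single Alpha, mirroring the
// custom blend function described above. Only lerp the pins you use.
struct LayerAttributes
{
    float3 BaseColor;
    float3 Normal;
    float  Roughness;
    float3 WorldDisplacement;
};

LayerAttributes BlendLayers(LayerAttributes A, LayerAttributes B, float Alpha)
{
    LayerAttributes Out;
    Out.BaseColor         = lerp(A.BaseColor,         B.BaseColor,         Alpha);
    Out.Normal            = normalize(lerp(A.Normal,  B.Normal,            Alpha)); // simple normal blend
    Out.Roughness         = lerp(A.Roughness,         B.Roughness,         Alpha);
    Out.WorldDisplacement = lerp(A.WorldDisplacement, B.WorldDisplacement, Alpha);
    return Out;
}
```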