Landscape Reflections - Being PBR is Not Enough.


I’m making this new thread instead of bumping the one from 2016. **This is not a new issue, and it is not limited to UE4.**

The issue: in UE4, landscapes reflect light the way any other surface does. However, that’s not the behavior we’d expect from them. The cause and fix may differ per engine, but I’m purely discussing the unwanted effect itself.

UE4: With a roughness of 1, the landscape still reflects an unrealistic amount of light, and reducing specular makes things look wrong on another level.
Cryengine: They agreed on the issue and built their own solution internally. Not much information is available beyond the linked source.
Unity: Same issue. Their developers seem to have fixed it after a user brought it up (source linked).

I have spent a lot of time investigating and inventing workarounds to battle the issue and make it look better.

Most effective workarounds:
**1)** Use a Fresnel mask to reduce specular.
**2)** Reduce specular based on distance from the camera.
**3)** Use albedo in the specular calculation to break up the reflection.
**4)** Multiply AO/cavity maps on top of that (doesn’t help much, since those get mipped away with distance and lose detail).

At the end of the day, reducing specular to fake a proper landscape surface just adds to the existing problem, because materials need a correct specular amount. The ultimate solution is clearly not something to be done on our end, and this prevents us from achieving realistic results.

Isn’t it time to fix this once and for all?

Edit: Mods, please move to feedback forum if more appropriate. But in rendering forum discussion might pick up quicker. Thanks.

Roughness is not something that you author once and for all. It is context-dependent.

PBR shading is correct enough, but it is only correct within a given fragment. When you set the roughness of a surface, it represents microsurface detail that you cannot otherwise express with a normal. With a landscape, one fragment can cover 1 millimeter when the surface is close and 10 meters when viewing a vista. So the roughness you author remains valid only as long as your macrosurface (geometry/normals) does not bleed into the microsurface level (a single fragment).

That is exactly what happens with landscapes (in fact, with any granular surface), and it is also one of the dominant reasons why a sci-fi setting instantly looks gorgeous while natural surfaces such as wood or rock tend to give you a white specular sheen. You need to account for all the detail you averaged out while LODing down (both geometry normals and normal maps) and add it to roughness. There are ways to deal with it, such as deriving roughness from normal variance in the mip chain, but that imposes quite a few complications. The simplest and most efficient is the artistic approach: adjust roughness with distance (screen size? and/or angle?) by some fancy function.

TLDR: Increase roughness with distance and angle.
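A minimal sketch of that artistic approach, assuming made-up distance bounds and bias strength (illustrative Python, not engine code):

```python
# Sketch of the distance-based roughness workaround: bias authored
# roughness towards 1.0 as the fragment gets farther away, so averaged-out
# micro-detail is re-expressed as roughness. All parameters are assumed.

def distance_adjusted_roughness(base_roughness, distance,
                                start=500.0, end=20000.0, max_bias=0.6):
    """Blend roughness towards 1.0 between `start` and `end` distance."""
    t = min(max((distance - start) / (end - start), 0.0), 1.0)
    bias = max_bias * t                      # stand-in for "some fancy function"
    return min(base_roughness + (1.0 - base_roughness) * bias, 1.0)

for d in (100, 5000, 30000):
    print(d, round(distance_adjusted_roughness(0.5, d), 3))
```

In a real material this would live in the shader, driven by pixel depth or camera distance; the linear ramp here is only a placeholder for whatever curve looks right.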


My example uses roughness set to 1, as mentioned above, so there’s really no artistic control left: maximum roughness is already not enough, which is why the workarounds mostly involve reducing specular (which is wrong).

It’s a similar issue to what happens with foliage: microfacet shading becomes not so micro at a distance. I’ve been able to alleviate it somewhat by using mip sharpening for textures like normals and roughness, so their per-pixel detail is maintained farther from the camera. You could also try the normal composite option for the roughness texture, which is pretty similar to what Sledgehammer did for WWII, although theirs was specifically meant to solve this problem rather than to reduce specular aliasing.

Yep. But what makes you think that it is incorrect? I’d expect real world landscape to look more or less the same way, if such a featureless landscape could exist. In fact, you can find quite a few locations, exhibiting such specular on vista terrain in places such as Sahara or Aral Sea.

Besides macrosurface bleeding into microsurface, I see no issues with shading, and those should not be dealt with on the shading level.
It is not limited to your normals; directional shadow plays a pivotal role too. While ambient occlusion textures mip down to averages and give you pretty good dampening of indirect specular in the distance, there is nothing like that in place for directional shadow. All those rocks, grains of sand, blades of grass and whatnot cast shadows. With distance, you should perceive an average of shadowed and unshadowed areas, and thus averaged specular. But in the engine you get full specular, because either shadows are disabled at such distances, or, even if they are not, they still operate on a pretty smooth surface and do not affect your specular.

In the not so near future this is likely to be addressed in realtime rendering. For now, we have to think about tech-art workarounds. In this case, it is not the specular that should be adjusted but something in shadow projection (negative bias based on roughness/distance/angle? Added shadowing based on roughness/angle/distance?). Something like that. But if you take away directional shadowing, I am convinced that the shading looks close enough to ground truth to call it correct.
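A toy numeric example of the averaging argument above, with an assumed shadow fraction (nothing here comes from any engine):

```python
# Toy example: why distant terrain looks too bright once small-scale
# shadows fade out. Up close, some fraction of a patch is shadowed by
# clutter; the correct distant appearance is the area-weighted average
# of lit and shadowed specular, but the engine lights the whole patch.

shadow_fraction = 0.4     # assumed fraction of the patch in micro-shadow
specular_lit = 1.0        # specular response of a lit texel (arbitrary units)
specular_shadowed = 0.0   # direct specular is zero inside shadow

# Ground truth: average of lit and shadowed texels.
correct = (1 - shadow_fraction) * specular_lit + shadow_fraction * specular_shadowed

# What gets rendered once distant shadows fade out: fully lit.
rendered = specular_lit

print(correct, rendered)  # the rendered patch is brighter than ground truth
```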


This one’s rough: more resolution, making sure temporal AA is on, but most importantly turn on… I can’t find it and can’t remember what it’s called. It’s a texture filtering technique everyone else in the world calls Toksvig after its originator, but UE4, despite supporting it, has some specific term for it that no one else uses, just to confuse people.

Anyway, if anyone can remember what it’s called in UE4, that’d be great. It changes the roughness mip maps to take into account how rough the normal maps are, so this exact scenario happens less, alongside other unwanted things like fireflies. It’s really super cheap too; even if it’s not 100% effective, it’d definitely help anyone looking for a solution to this problem.
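For reference, the underlying Toksvig idea can be sketched roughly like this (a hedged approximation in Python; the roughness-to-exponent mapping and constants are assumptions, not UE4’s actual implementation):

```python
import numpy as np

# Sketch of Toksvig-style roughness adjustment: when a normal map is
# mipped down, averaged normals shrink below unit length; that lost
# length measures normal variance and can be folded into the roughness
# mip so distant surfaces stay appropriately rough.

def toksvig_roughness(avg_normal, base_roughness):
    """Widen roughness based on the length of the averaged (mipped) normal."""
    na = np.linalg.norm(avg_normal)           # < 1 when normals disagreed
    # Rough mapping from roughness to a Blinn-Phong-like exponent.
    s = 2.0 / max(base_roughness ** 2, 1e-4) - 2.0
    # Toksvig factor: attenuates the exponent by the normal variance.
    ft = na / (na + s * (1.0 - na) + 1e-8)
    return np.sqrt(2.0 / (ft * s + 2.0))      # back to roughness

# Four bumpy normals averaged into one distant texel:
normals = np.array([[0.5, 0, 0.866], [-0.5, 0, 0.866],
                    [0, 0.5, 0.866], [0, -0.5, 0.866]])
avg = normals.mean(axis=0)                    # length < 1
print(toksvig_roughness(avg, 0.3))            # noticeably rougher than 0.3
```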

By the way @Dethrey OP is 100% correct here, in a perfect render this doesn’t happen. It’s just an effect of mip maps “flattening” things out and making them look smoother than they should the farther away they are.

It is called Composite Texture in Unreal, and I have absolutely no idea why so many authors are not making best use of it.

But it addresses only texture-level detail, not geometry detail, which plays a pretty important role with landscapes; you will still get glare on the landscape unless something is accounted for during shadow projection.

There is one remedy I might suggest for this specific case from the rendering side: store sun microshadowing in an additional GBuffer target, so that it can be calculated in the material and, when averaged out, will give you distant directional occlusion. Microshadows would then be applied during the shadow projection pass.

All your points are correct. Maybe I didn’t phrase it clearly enough.
What the engine does is define the shading for a surface. But nothing considers whether that surface is supposed to simulate a flat, smooth plane or a very rough, noisy surface like terrain, which is why there should be some difference between a flat plastic plane and a flat terrain plane. We can call it workarounds, tricks, hacks or whatever. What I’m saying is: what do we do about it? Or what *can* we do? The suggestions about shadowing are Epic-Games-level changes, not something any of us would want to do on our own.

So what would be the best solution that’s still grounded in reality but artist-friendly? A surface self-occlusion value/texture, mostly for landscape-type surfaces?

The sad part is that, likely, there’s not much we can do, especially as content authors.

Composite textures, that’s it, thanks!

Composite textures will help; you can see in the documentation how both temporal AA and composite textures make a shiny-but-rough surface darker. But Dethrey is right: for far-away landscapes there’s still going to be a problem. Right now I don’t know what to do about it either. Far shadows with a large number of high-res cascades and making sure LODs are extremely high, or distance field shadows… but all that sounds expensive.

I’m afraid anything better than that would really need Epic to add more features, as far as I can imagine it at least.

For far-away stuff, you usually put a few layers of fog in.
It doesn’t have to be fog; maybe it’s haze. Either way, you layer that stuff in over different planes and drive the values via an MPC (Material Parameter Collection).
This won’t fix the base problem, but it does give you a way to offset it on top of a post-process.
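As a rough illustration of the layered-haze idea (plane distances and the MPC-driven density value are invented for the example):

```python
# Sketch of layering distance-driven haze planes: each plane fades in
# over its own distance band, and an MPC-style scalar tunes the overall
# density at runtime. Names and values are assumptions, not engine API.

def haze_alpha(distance, start, end, density=1.0):
    """Linear fade-in of one haze plane between start and end distance."""
    t = min(max((distance - start) / (end - start), 0.0), 1.0)
    return t * density

# Three stacked bands; a distant point picks up haze from all of them.
bands = [(2000, 8000), (8000, 20000), (20000, 50000)]
mpc_density = 0.35  # hypothetical Material Parameter Collection value

for d in (1000, 10000, 60000):
    transmitted = 1.0
    for start, end in bands:
        transmitted *= 1.0 - haze_alpha(d, start, end, mpc_density)
    print(d, round(1.0 - transmitted, 3))  # combined haze coverage
```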

Actually, a camera-distance-based post-process that tones down the intensity or bloom with a custom material may also be a viable solution to your problem.

Check out the kite demo for an example of how to do the fog banks. (Note that they forgot to turn off shadows on the meshes and that they therefore leave lines across the landscape, last time I looked it over at least).

Pretty sure that suggestion is not applicable to OP.

To sum up, at least three separate problems exist:

  1. Geometry detail, and consequently directional shadows and AO, fading out with distance. This includes both terrain mesh simplification and any kind of ground clutter. No realistic universal approach currently deals with it; sun shadow maps are the closest.

  2. Normal detail textures averaging out with distance. Easily dealt with by texture compositing, normal-length-to-roughness, and the like.

  3. Shading. Even though I am fairly convinced that the currently widely used shadow-masking function undermasks at grazing angles for rough surfaces, and I completely understand the OP, I’d still argue that it is correct enough to serve vista terrain purposes.

For point 1:
You can’t expect 1:1 fidelity at a far distance. Even with impostors, the shadows would change. Most of the time this doesn’t really matter either, because fog and haze are also an IRL thing that offsets or precludes those shadows from view. Unless you bake lighting, you also won’t really see shadows from instanced objects at far distances anyway. (Not that that’s a good thing, mind you; it’s just that the rendering cost becomes unrealistic for what little detail the shadows of something like a tree add.)
Technically, you could bake out a distance replacement material that includes the shadowed areas if you are working with a fixed light.
Either way, all of this has basically nothing to do with what the OP stated… aside from the fact that you can factor in and adjust the landscape’s texture to include those details at a distance, by way of a render target and a SceneCapture2D component you run only once at the start of a pre-baked-light level.

Unreal’s shading model has always been pretty good compared to anything else out there when it comes to game engines, so I would be surprised if they got this wrong specifically on landscapes. So first, I made a flat landscape and a simple mesh plane and aligned them next to each other with no gap. Then I put a fully rough material on both and captured a screenshot:

You can see there’s absolutely no visible seam. So we can safely assume that material shading is not handled any differently for landscapes compared to static or skeletal meshes.

Next, I introduced a PBR texture set, some classic random Megascans:

And yes, it indeed does look a bit slimy at a distance, not like rough dry grass. But here’s what happens when I turn off mip mapping for all the maps:

This is the actual roughness I’d expect to see.

This is the age-old problem of game engine texture filtering not being anywhere near the quality of offline renderers’ filtering. But handling mip-mapping imprecision at distance by introducing some weird roughness bias parameter specifically for landscapes would not be a fortunate solution, I’m afraid.

Anyway, it does not make sense to conflate this with Unity’s bug of handling the roughness of landscape materials differently depending on gamma space. Unity has a special shader for landscapes which can’t be used for static meshes, and vice versa. That shader probably had some hard-coded gamma math for textures which was not properly updated when the switch between linear and sRGB space was added to Unity. It’s definitely not the same issue, or even a similar one.

What CryEngine folks talk about could be more related to that. I myself have encountered this problem too, so I too would like to see it solved. But at the same time, I’d like to avoid attributing the issue to factors which are not responsible for it.

I don’t think it has been mentioned yet, but a big part could also be that the average visible normal of three-dimensional ground starts to shift significantly at grazing angles. The strength of that effect depends on how much height the material has relative to its detail size. I’ve done a quick test in Blender, comparing normal-mapped planes with actual geometry. (And really, this effect is in no way specific to landscapes; something like a wool sweater would display the same behavior.)

A quick solution could be to bend the normal towards the camera, but that would require further testing.

I’ve tried my theory of bending the normal. Currently, I’m just taking the dot product of the camera-to-surface direction and the vertex normal, and using that with an eyeballed curve to bend the normal towards the camera.
Example with a light shining towards the camera and roughness values of 1 and 0.25.

It’s less pronounced with a light behind the camera, but still a clear improvement (only roughness 1 here).

One thing that would help rough surfaces at grazing angles would be multiscattered specular.

Also, maybe switching to Oren-Nayar diffuse shading from the standard Lambertian model would help.

Since I’m currently rewriting my landscape material, I have put my theory to the test in Unreal, and while this is probably not “correct”, bending the normal towards the camera leads to a significant visual improvement.

Here is the material snippet I’m using to bend the normal; the values are just eyeballed to look good:
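In plain code, the described bending might look something like this (a sketch with eyeballed parameters and assumed function names, not the actual material graph):

```python
import numpy as np

# Sketch of the normal-bending trick: take the dot product of the view
# direction and the vertex normal, run it through an eyeballed curve,
# and lerp the normal towards the camera at grazing angles.

def bend_normal(normal, view_dir, strength=0.5, power=2.0):
    """Bend `normal` towards the camera; curve parameters are eyeballed."""
    n = normal / np.linalg.norm(normal)
    v = view_dir / np.linalg.norm(view_dir)        # surface -> camera
    grazing = 1.0 - max(np.dot(n, v), 0.0)         # 0 head-on, 1 at grazing
    t = strength * grazing ** power                # assumed curve shape
    bent = (1.0 - t) * n + t * v                   # lerp towards the view
    return bent / np.linalg.norm(bent)

up = np.array([0.0, 0.0, 1.0])
view = np.array([1.0, 0.0, 0.1])                   # near-grazing view
print(bend_normal(up, view))                       # tilted towards the camera
```

In a material this would use the camera vector and vertex normal nodes, with the curve replaced by whatever function looks good.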