Landscape Reflections - Being PBR is Not Enough.


  • replied
@witch-dev Nice workaround, but not suitable for every situation. Imagine looking down an extreme slope, or having a watery/wet layer on the ground. It will look very wrong.



  • replied
    Originally posted by preston42382:
    witch-dev

    Why did you base the lerp on the Saturate output? Not questioning it because I know something better, but asking because I'm learning.
    Rosegoldslugs explained it very well, though admittedly, in this case it's not really needed: the values won't go out of the 0-1 range anyway (at least with reasonable input values).
    Also, the abs node after the dot product should probably be a saturate instead.



  • replied
    Originally posted by witch-dev:
    Since I'm currently rewriting my landscape material, I have put my theory to the test in Unreal, and while this is probably not "correct", bending the normal towards the camera leads to a significant visual improvement.
    It's definitely a hack, but hey, it works. Great job!



  • replied
    Originally posted by preston42382:
    witch-dev

    Why did you base the lerp on the Saturate output? Not questioning it because I know something better, but asking because I'm learning.
    Because a linear interpolation (Lerp) generally takes an alpha value between 0 and 1. Saturate works like a clamp to the 0-1 range, so in order to avoid the Lerp extrapolating past its inputs with an out-of-range alpha, you should clamp the alpha input.

    You could always go outside of 0-1, but it depends on your use case. In most cases the specific purpose is to blend between two fixed inputs, so you clamp the alpha to never get anything outside of those two inputs.
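
    In code terms, the node setup boils down to something like this (a minimal HLSL sketch with illustrative names, not the exact material graph):

        // lerp(a, b, t) = a + t * (b - a), so a t outside [0, 1]
        // extrapolates past the two inputs instead of blending them.
        float3 BlendLayers(float3 layerA, float3 layerB, float rawAlpha)
        {
            float alpha = saturate(rawAlpha); // clamp alpha to [0, 1]
            return lerp(layerA, layerB, alpha);
        }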



  • replied
    witch-dev

    Why did you base the lerp on the Saturate output? Not questioning it because I know something better, but asking because I'm learning.



  • replied
    Nice, I love the control you can have with your solution, witch-dev. Thank you for sharing the code.



  • replied
    Since I'm currently rewriting my landscape material, I have put my theory to the test in Unreal, and while this is probably not "correct", bending the normal towards the camera leads to a significant visual improvement.
    [Image: witch201912051.jpg]
    [Image: witch201912052.jpg]

    Here is the material snippet I'm using to bend the normal; the values are just eyeballed to look good:
    [Image: witch20191205NormalBend.jpg]
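
    In rough HLSL, the snippet does something like the following (a sketch; the curve and strength constants are eyeballed placeholders, not the exact values from the graph):

        // Bend the shading normal towards the camera at grazing angles.
        // cameraVector is assumed to be the normalized direction from the
        // surface point towards the camera (UE's CameraVector node).
        float3 BendNormalTowardCamera(float3 worldNormal, float3 cameraVector)
        {
            // Saturate rather than abs, so back-facing angles clamp to 0.
            float facing = saturate(dot(cameraVector, worldNormal));

            // Eyeballed falloff: bend more as facing approaches grazing (0).
            float bend = pow(1.0 - facing, 4.0) * 0.5; // placeholder strength

            // Blend towards the camera direction and renormalize.
            return normalize(lerp(worldNormal, cameraVector, bend));
        }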



  • replied
    Also, maybe switching to Oren-Nayar diffuse shading over the standard Lambertian model would help.
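
    For reference, a compact HLSL version of the usual two-term Oren-Nayar approximation, just to illustrate what the model computes (actually swapping UE's shading model is a much bigger job):

        // Oren-Nayar diffuse, two-term approximation from the original paper.
        // rho = diffuse albedo, sigma = roughness angle in radians,
        // N/L/V = normalized normal, light, and view directions.
        float3 OrenNayarDiffuse(float3 rho, float sigma, float3 N, float3 L, float3 V)
        {
            float sigma2 = sigma * sigma;
            float A = 1.0 - 0.5 * sigma2 / (sigma2 + 0.33);
            float B = 0.45 * sigma2 / (sigma2 + 0.09);

            float NoL = saturate(dot(N, L));
            float NoV = saturate(dot(N, V));
            float thetaI = acos(NoL);
            float thetaR = acos(NoV);
            float alpha = max(thetaI, thetaR);
            float beta  = min(thetaI, thetaR);

            // cos(phiI - phiR): azimuthal angle between L and V around N.
            float denom = max(sin(thetaI) * sin(thetaR), 1e-4);
            float cosPhiDiff = max(0.0, (dot(L, V) - NoL * NoV) / denom);

            return rho / 3.14159265 * NoL
                 * (A + B * cosPhiDiff * sin(alpha) * tan(beta));
        }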



  • replied
    One thing that would help rough surfaces at grazing angles would be multiscattered specular.
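    A cheap way to get it is Kulla and Conty's energy compensation; the approximation described in Google's Filament docs, for instance, just scales the single-scattering specular using the preintegrated DFG lookup that split-sum image-based lighting already needs. A sketch, assuming dfg is that 2D LUT sampled with (NoV, roughness):

        // Kulla-Conty style multiple-scattering energy compensation,
        // using the cheap approximation from the Filament documentation.
        // dfg.y holds the single-scattering directional albedo term.
        float3 EnergyCompensation(float3 f0, float2 dfg)
        {
            // Boost the specular lobe to return the energy that the
            // single-scattering BRDF loses to masking at high roughness.
            return 1.0 + f0 * (1.0 / dfg.y - 1.0);
        }
        // usage: specular *= EnergyCompensation(f0, dfg);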



  • replied
    I've tried my theory of bending the normal. Currently, I'm just taking the dot product of the camera-to-surface-point direction and the vertex normal, and using that with an eyeballed curve to bend the normal towards the camera.
    Example with a light shining towards the camera and roughness values of 1 and 0.25.
    [Image: Terrain_and_normal_correction.gif]
    It's less pronounced with a light behind the camera, but there's still a clear improvement (only roughness 1 here):
    [Image: Terrain_and_normal_correction_2.gif]



  • replied
    I don't think it has been mentioned yet, but a big part could also be that the average visible normal of 3-dimensional ground starts to shift significantly at grazing angles. The strength of that effect depends on how much height the material has relative to its detail size. I've done a quick test for it in Blender, comparing normal-mapped planes with actual geometry. (And really, this effect is in no way specific to landscapes; something like a wool sweater would display the same behavior):
    [Image: Terrain_and_normals.jpg]
    [Image: Terrain_and_normals_sun.gif]
    A quick solution could be to bend the normal towards the camera, but that would require further testing.



  • replied
    Unreal's shading model has always been pretty good compared to anything else out there when it comes to game engines, so I would be surprised if they got this wrong specifically on landscapes. So first, I made a flat landscape and a simple mesh plane and aligned them next to each other with no gap. Then I put a fully rough material on both and captured a screenshot:
    [Image: Landscape.jpg]
    You can see there's absolutely no visible seam. So we can safely assume that material shading is not handled any differently for landscapes compared to static or skeletal meshes.

    Next, I've introduced a PBR texture set, some classic random Megascans:
    [Image: Textured.jpg]
    And yes, it indeed does look a bit slimy in the distance, not like rough dry grass. But here's what happens when I turn off mip mapping for all the maps:
    [Image: NoMip.jpg]
    This is the actual roughness I'd expect to see.

    This is the ages-old problem of game-engine texture filtering not being anywhere near the quality of the texture filtering in offline renderers. But handling mip-mapping imprecision at distance by introducing some weird roughness-bias parameter specifically for landscape use would not be a fortunate solution, I'm afraid.

    Anyway, it does not make sense to conflate this with Unity's bug of handling roughness of landscape materials differently depending on gamma space. Unity has a special shader for landscapes which can't be used for static meshes, and vice versa. That shader probably had some hard-coded gamma math for textures which was not properly updated when the switch between linear and sRGB space was added to Unity. It's definitely not the same issue, or even a similar one.

    What the CryEngine folks talk about could be more related to that. I have encountered this problem myself, so I too would like to see it solved. But at the same time, I'd like to avoid attributing the issue to factors which are not responsible for it.



  • replied
    For point 1), you can't expect 1:1 fidelity at a far distance. Even with impostors the shadows would change. Most of the time this doesn't really matter either, because fog and haze are also an IRL thing that offsets or precludes those shadows from view. Unless you bake lighting, you also won't really see shadows from instanced objects at far distances anyway. (Not that that's a good thing, mind you; it's just that the rendering load becomes unrealistic for what little detail the shadows of something like a tree add.)
    Technically, you could bake out a distance replacement material that includes the shadowed areas if you are working with a fixed light.
    Either way, all of this has pretty much nothing to do with what the OP stated... aside from the fact that you can factor those details into the landscape texture at a distance by way of a render target and a SceneCapture2D component you run only once at the start of the pre-baked light level.



  • replied
    Originally posted by MostHost LA:
    For far-away stuff, usually you put a few layers of fog in.
    Check out the Kite demo for an example of how to do the fog banks. (Note that they forgot to turn off shadows on the fog meshes, so they leave lines across the landscape; at least that was the case last time I looked it over.)
    Pretty sure that suggestion is not applicable to the OP.




    To sum up, at least three separate problems exist:

    1) Geometry details, and consequently directional shadows and AO, fading out with distance. This includes both terrain mesh simplification and any kind of ground clutter. There is no realistic universal approach to deal with it currently; sun shadow maps are the closest one.

    2) Normal detail textures averaging out with distance. Easily dealt with by texture compositing, normal-length-to-roughness and the like (see the sketch after this list).

    3) Shading. Even though I am fairly convinced that the currently widely used shadow-masking function undermasks at grazing angles for rough surfaces, and I completely understand the OP, I'd still challenge whether it is correct enough to serve vista terrain purposes.
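
    For point 2), a sketch of the usual normal-length-to-roughness trick (Toksvig-style). The filtered, unnormalized normal fetched from the mip chain gets shorter wherever the footprint averages many directions, and that shortening can be folded back into roughness. The constants and the Phong-to-GGX mapping below are common approximations, not a canonical recipe:

        // Widen roughness where mip-filtered normals have averaged out.
        // avgNormal = filtered, *unnormalized* normal from the mip chain;
        // its length drops below 1 as detail averages away with distance.
        float ToksvigRoughness(float roughness, float3 avgNormal)
        {
            float len   = clamp(length(avgNormal), 1e-3, 1.0);
            float alpha = max(roughness * roughness, 1e-3); // GGX alpha
            float s     = 2.0 / (alpha * alpha) - 2.0;      // ~Phong exponent
            float ft    = len / lerp(s, 1.0, len);          // Toksvig factor
            float alphaPrime = sqrt(2.0 / (ft * s + 2.0));  // widened alpha
            return sqrt(alphaPrime);                        // back to roughness
        }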



  • replied
    For far-away stuff, usually you put a few layers of fog in.
    It doesn't have to be fog; maybe it's haze. Either way, you layer that stuff in over different planes and drive the values via an MPC (Material Parameter Collection).
    This won't fix the base problem, but it does give you a way to offset it on top of a post-process.

    Actually, a camera-distance-based post-process that tones down the intensity or bloom with a custom material may also be a viable solution to your problem.

    Check out the Kite demo for an example of how to do the fog banks. (Note that they forgot to turn off shadows on the fog meshes, so they leave lines across the landscape; at least that was the case last time I looked it over.)

