Trying to recreate the Distance to Nearest Surface node as something that doesn’t depend on camera position.
Unfortunately the Distance to Nearest Surface node only works within a certain radius of the camera, and in my case it’s very visible when it disappears.
I think what I need is to sample the scene depth, biasing it to the height of the plane I’m interested in instead of the camera.
Maybe I’m just not thinking about this correctly because the plane is a plane and not a mesh with geometry.
Making different masks for absolute height is pretty simple: World Position → Z (the B channel) as the mask.
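For reference, that height mask boils down to per-pixel math like this (a Python sketch standing in for the material nodes; `plane_z` and `falloff` are hypothetical parameter names, not anything from the engine):

```python
def height_mask(world_z, plane_z, falloff):
    """Camera-independent mask: 1 at/below the plane, fading to 0 over
    `falloff` units above it. Mirrors Absolute World Position -> B (Z)
    -> subtract -> divide -> saturate in the material graph."""
    return max(0.0, min(1.0, (plane_z + falloff - world_z) / falloff))
```

Because it only reads world-space Z, this mask stays put no matter where the camera goes.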
I have been trying to approach the scene depth similarly, but no matter what, it just travels with the camera’s height/position. In fact, switching from Scene Depth to Pixel Depth makes absolutely no difference in the rendered result.
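That behavior makes sense: on an opaque surface, Scene Depth at the pixel being shaded is the same value as Pixel Depth, and either one is by definition measured from the camera, so it will always move with it. To get something camera-independent out of depth you have to reconstruct the world position the depth buffer saw, then read its Z. A sketch of that reconstruction (assuming UE-style depth measured along the camera forward axis, and normalized direction vectors; the function names here are mine, not engine nodes):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def world_z_from_depth(cam_pos, cam_forward, ray_dir, scene_depth):
    """Reconstruct the world-space Z of whatever the depth buffer saw.
    scene_depth is distance along cam_forward (Scene Depth-style), so
    convert it to distance along the actual per-pixel view ray first."""
    t = scene_depth / dot(ray_dir, cam_forward)
    return cam_pos[2] + ray_dir[2] * t
```

Once you have that world Z, you can compare it against your plane height exactly like the absolute-position mask, and the camera drops out of the result.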
Does anyone have any ideas on the best approach to this?
I’m 10 minutes away from saying screw this and making custom textures to use as masks…
iirc Distance to Nearest Surface uses Global Distance Fields, where the resolution is based on camera distance. Are you trying to get the height/world position of a plane to use that information in another material?
Essentially yes, I’m trying to color an intersection of meshes that is farther away than the camera can get. So far the only way that seems to work is to apply a custom texture derived from a scene capture and a post process. That causes issues too, though: I’d need something close to 8K for it to look nice enough, even as a mask.
I’m trying to solve that by blending the colors from the outside of the mask to the inside color of the mask but that too is getting to be a lot of nodes…
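One possible way around the 8K requirement, if it fits your pipeline: instead of storing the mask as a binary/alpha texture, bake it as a distance field (the classic alpha-magnification trick). A much smaller texture then gives clean edges from a single smoothstep, and the stored distance values double as the gradient source for your inside/outside blend. A sketch of the sampling side (parameter names are mine):

```python
def smoothstep(edge0, edge1, x):
    """Standard smoothstep: 0 below edge0, 1 above edge1, smooth in between."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def sdf_mask(sdf_sample, edge=0.5, softness=0.05):
    """Antialiased mask from a low-res distance-field texture.
    sdf_sample is the sampled value in [0, 1], with 0.5 at the shape's edge."""
    return smoothstep(edge - softness, edge + softness, sdf_sample)
```

In the material this is just a texture sample feeding a SmoothStep node, which is far fewer nodes than blending colors across a bitmap mask.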
Anyone have some interesting shader ideas for coloring an arbitrary area on a world-aligned texture, where the area forms a curve and the coloring is a linear gradient that follows that curve?
I think I have to go through and do some per-pixel math to just fade from one step to the next. It’s not a simple process, of that I’m sure. Ideas welcome.
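If you can get a per-pixel distance to the curve (e.g. from a baked distance texture), the “gradient that follows the curve” falls out of a simple remap, since every iso-distance line runs parallel to the curve. A sketch of that per-pixel fade (function and parameter names are mine, not engine nodes):

```python
def curve_gradient(dist, start, end):
    """Linear gradient following a curve: `dist` is the per-pixel distance
    to the curve; remap [start, end] onto [0, 1] and clamp (saturate)."""
    return max(0.0, min(1.0, (dist - start) / (end - start)))

def lerp_color(a, b, t):
    """Blend two RGB tuples by t, like a Lerp node with the gradient as alpha."""
    return tuple(ca + (cb - ca) * t for ca, cb in zip(a, b))
```

Feeding `curve_gradient` into `lerp_color` between your outside and inside colors replaces the big node network with one remap and one lerp per step.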