I’m working on a material that simulates water distortion for orthographic cameras, and I have successfully managed to blend two noises together to offset the ScreenPosition UVs and pass them to SceneColor. The result is pretty good, but when an object with the material moves near the right or bottom edge of the view I get a weird white color. I suspect the white spots come from UVs referring to an area outside SceneColor’s valid range.
Stripping the material down, it all comes down to this:
As you can see, I just plug ScreenPosition, with arbitrary coordinates added, into SceneColor, and the latter into Emissive Color.
I tried clamping between 0 and 1, dividing by the screen width, and a lot more to understand how ScreenPosition outputs its data, but to no avail. Everything seems perfectly in order until SceneColor comes into play.
The coordinates I can input are also puzzling. First, 0.1 and 0.3 don’t seem to relate to my view size in any way, and second, when I input negative values…
… I don’t get the white effect at all.
I don’t really know what to try at this point. Is there any way I can avoid this behavior by mirroring or, better, by clamping the position SceneColor is going to read from?