This issue has been driving me nuts on and off for a while, but I finally figured out what was causing it and why I was having a hard time reproducing it sometimes. Basically, I have a SingleLayerWater material and lerp between AbsorptionCoefficients to change how the water looks around a shore. However, an annoying banding effect shows up at seemingly random resolution scales. There isn't a rhyme or reason that I could spot, like it needing to be odd or even percent values, but it definitely points to some kind of rounding issue under the hood, probably in how a target buffer resolution is calculated.
Here’s a clip of what I mean, using a packaged version of the scene (you might need to increase the video quality; embedded clips seem to default to 360p):
I use the following in my master material to find the shore and lerp between AbsorptionCoefficients (the values are slightly modified from the defaults in the material instance that is used):
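For anyone who doesn’t want to squint at the graph, here’s a rough HLSL sketch of what that chunk boils down to, written Custom-node style. The parameter names (ShoreDepthFalloff, ShallowAbsorption, DeepAbsorption) are placeholders, not the actual names in my material:

```hlsl
// Approximate shore blend: mask from the water depth under each pixel,
// then a lerp between two sets of absorption coefficients.

// Inputs (placeholder names):
//   SceneDepth        - scene depth sampled behind the water surface
//   PixelDepth        - depth of the water surface pixel itself
//   ShoreDepthFalloff - distance over which the shore blend happens
//   ShallowAbsorption - absorption coefficients near the shore
//   DeepAbsorption    - absorption coefficients in deep water

float WaterDepth = SceneDepth - PixelDepth;                  // water thickness under this pixel
float ShoreMask  = saturate(WaterDepth / ShoreDepthFalloff); // 0 at the shoreline, 1 in deep water

// This result feeds the Absorption Coefficients input of the
// SingleLayerWater material output.
float3 AbsorptionCoefficients = lerp(ShallowAbsorption, DeepAbsorption, ShoreMask);

return AbsorptionCoefficients;
```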
I’ve tested with just this chunk of material code to debug, and I’m not seeing any of the banding with it, though it’s sometimes hard to tell with raw gradients. As far as I can tell, the issue seems to come entirely from how SingleLayerWater handles things during its pass.