Subsurface transmission and SMRTRayOffset

Since upgrading to 5.6, we have noticed an issue where thin geometry on our characters, such as fingers and earlobes, often glows with nuclear intensity when backlit by the sun light:

[Image Removed]

Reducing the value of r.Shadow.Virtual.ScreenRayLength diminishes this artifact, and it goes away entirely when the CVar is set all the way down to zero:

[Image Removed]

The issue appears to be that with the default ScreenRayLength, at even moderate distances from the camera, the ray offset ends up pushing the shadow ray all the way through to the opposite side of the fingers. They therefore receive no shadowing and a zero OccluderDistance, which means they transmit 100% of the light.
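The scaling behind this can be sketched in a few lines of Python. This is illustrative only, not engine code: the FOV, the finger thickness, and the exact mapping from screen-space length to world space are all assumptions, chosen just to show why a fixed screen-space ray length punches through thin geometry as camera distance grows.

```python
import math

def world_ray_offset(screen_ray_length, distance, fov_deg=45.0):
    """World-space length of a screen-space ray at a given view distance.

    A screen-space length is a fraction of the view, and the frustum at
    `distance` spans 2 * distance * tan(fov/2) world units, so the
    world-space offset grows linearly with distance from the camera.
    """
    frustum_extent = 2.0 * distance * math.tan(math.radians(fov_deg) / 2.0)
    return screen_ray_length * frustum_extent

FINGER_THICKNESS_CM = 2.0  # assumed thickness of the thin geometry

for dist_cm in (100, 500, 1000):  # 1 m, 5 m, 10 m from the camera
    offset = world_ray_offset(0.015, dist_cm)  # 0.015 as an example CVar value
    print(f"{dist_cm:>5} cm: offset {offset:6.2f} cm, "
          f"punches through a finger: {offset > FINGER_THICKNESS_CM}")
```

With these illustrative numbers the offset exceeds the assumed finger thickness within a few meters of the camera, matching the observation that the artifact shows up at even moderate distances, while a screen ray length of zero yields a zero offset at any distance.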

I believe I have tracked down the source of the change to a modification made to VirtualShadowMapProjection.usf in CL 31295145:

VSM: disable screen ray and normal bias on subsurface materials

Can cause various artifacts with extreme opacity values

Before this change, subsurface materials would go through the call to SMRTRayOffset = VirtualShadowMapScreenRayCast(…), which generated a much smaller ray offset value, limiting the “glow” to some edge artifacts that were much less visually obvious:

[Image Removed]

Based on the changelist description, was the original intent to force SMRTRayOffset to 0.0 for subsurface materials, rather than falling back to the value set in r.Shadow.Virtual.ScreenRayLength? If not, is lowering that CVar the best way to fix this, or is there something else that we can try?

[Attachment Removed]

Steps to Reproduce

  1. Enable virtual shadow maps.
  2. Create a directional light using VSM and enable the Transmission flag.
  3. Create a material that uses a Subsurface Profile and apply it to an object with thin areas of geometry.
  4. Align the light, object, and camera such that the object is backlit.
    [Attachment Removed]

Hi,

Thanks for the repro steps; I was able to replicate the issue on my side. I investigated the relevant shader code: forcing SMRTRayOffset to 0.0 for subsurface materials would prevent any control over the subsurface intensity, because changes to ScreenRayLength would no longer be taken into account (it would be equivalent to setting ScreenRayLength to 0). Since the VSM’s NormalBias is no longer used for subsurface materials, lowering ScreenRayLength is the only viable way to reduce the glowing effect.
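For reference, the CVar can be lowered project-wide via the config files rather than at the console. The value below is only an illustration and would need to be tuned against your typical character-to-camera distances:

```ini
; DefaultEngine.ini -- illustrative value, tune per project
[ConsoleVariables]
r.Shadow.Virtual.ScreenRayLength=0.005
```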

Hopefully that answers your question, but if not please let me know.

Thanks,

Sam

[Attachment Removed]

This came up as part of a task to modify the VSM shadow projection to better support the Subsurface Profile lighting model, and we now have the ability to control the subsurface transmission intensity by adjusting the Extinction Scale in the Subsurface Profile asset. So this shouldn’t be a problem for us.

At this point, with all of our local changes, we’re in a good place. If the existing shader code is working as intended in the base engine configuration, then I consider this resolved. Thank you for your help.

[Attachment Removed]

Hi,

No problem. I’ll close the case, but feel free to reopen it if you have more questions.

Thanks,

Sam

[Attachment Removed]