Hi,
I noticed something confusing while inspecting material parameters using the Ray Tracing Debug visualization in Unreal Engine 5.5.4.
To reproduce the issue, I created a very simple scene:
- Two spheres
- Both spheres use white Base Color
- One material has Metallic = 0.2 and Roughness = 0.2
- The other has Metallic = 0.8 and Roughness = 0.8
Then in the viewport I switched to:
Viewport Mode → Lit → Ray Tracing Debug
and inspected the individual visualization modes.
Roughness
The displayed color values match the input values:
- Roughness 0.2 → displays ~0.2
- Roughness 0.8 → displays ~0.8
Metallic
However, Metallic behaves differently:
- Metallic 0.2 → displays around 0.48
- Metallic 0.8 → displays around 0.90
These numbers correspond almost exactly to gamma-corrected values (pow(value, 1/2.2)).
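A quick sanity check of that arithmetic (plain Python, simply evaluating the pow(value, 1/2.2) display transform against the values I observed):

```python
# Applying the display gamma pow(v, 1/2.2) to the linear Metallic inputs
# reproduces the values shown in the debug visualization.
for linear in (0.2, 0.8):
    displayed = linear ** (1 / 2.2)
    print(f"Metallic {linear} -> displayed ~{displayed:.2f}")
# Metallic 0.2 -> displayed ~0.48
# Metallic 0.8 -> displayed ~0.90
```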
Looking into the shader code
I checked the shader code in RayTracingDebug.usf and noticed the following:
- RAY_TRACING_DEBUG_VIZ_ROUGHNESS applies pow(value, 2.2)
- RAY_TRACING_DEBUG_VIZ_METALLIC outputs the value directly
Because the debug visualization is rendered before the final display transform, my assumption is:
- The visualization output will still go through the display gamma conversion
- Therefore linear values appear gamma-corrected on screen
It seems that Roughness compensates for this by applying pow(value, 2.2) so that the final displayed value matches the linear input.
However, Metallic does not appear to apply this compensation.
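To illustrate the compensation I mean, here is a plain-Python sketch of the math (not the actual .usf code; I'm approximating the display transform as a pure 1/2.2 power curve): pre-encoding with pow(value, 2.2) cancels the display gamma, so the value on screen matches the linear input.

```python
def display_transform(v):
    # Approximation of the display gamma applied after the debug pass
    return v ** (1 / 2.2)

linear = 0.2

# Roughness path: pre-compensated with pow(v, 2.2), so the round trip
# returns the linear input.
print(display_transform(linear ** 2.2))  # ~0.2

# Metallic path: no compensation, so the displayed value is gamma-shifted.
print(display_transform(linear))         # ~0.48
```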
My question
Is this behavior intentional?
If Roughness applies a correction to ensure the debug visualization reflects the linear value, shouldn’t Metallic do the same for consistency?
Right now this can be confusing for users.
For example:
- A linear value of 0.2 in Metallic appears as 0.48 in the debug visualization.
- Someone inspecting the debug view might assume the value is wrong and try to adjust the texture incorrectly.
So I’m wondering:
- Is this an intentional design decision?
- Or is this an inconsistency in the debug visualization implementation?
Thanks!