I would like to author shots on macOS or Windows and then submit them to a headless Linux instance for rendering. I've noticed that the EXRs produced on each platform from the same project, shot, render settings, etc. are quite different. Only Windows produces the result I would expect.
For my test I enabled MovieRenderQueue_WorldDepth and MovieRenderQueue_WorldNormal.
I can’t upload the EXRs yet, so I’ll describe the differences.
- WorldDepth
- Windows - Values max out at 65248, which is close to the float16 max of 65504.
- Linux and macOS - Values max out at 1
- All three platforms show heavy quantization, but Linux and macOS appear worse, especially when the sky is visible. In my Linux and macOS renders, the scene geometry has a depth of 0 while the sky has a depth of 1 (see the inspection sketch after this list).
- Where in the rendering pipeline does this quantization come from? I inspected the material graph and didn’t see anything special happening.
- WorldNormal
- macOS and Windows match
- Linux does not match
- e.g. A ground plane pointing up on macOS and Windows has a normal of (0, 0, 1), reported as RGB (0.5, 0.5, 1). On Linux it has the RGB value (0.21, 0.21, 1.0). I'm guessing this is a gamma issue, since 0.21^(1/2.2) ≈ 0.49, which is pretty close to 0.5 (see the gamma check after this list).
- Is this a color management mistake on my part? Do the default color spaces on each platform differ?