Shouldn't the tonemapper output match the color depth of the framebuffer?

Hello,

I have a question regarding the tonemapper within the post-process pipeline. As far as I understand, tonemapping takes HDR values and maps them to LDR so that a monitor can display them. When going full-screen (and assuming r.FullScreenMode 0), the framebuffer uses 10bpc; when using DirectX, the DXGI format is R10G10B10A2. The tonemapper output, however, always uses the R8G8B8A8 format, as seen in the “ComputeOutputDesc” function.
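
For reference, here is a minimal D3D11-style sketch of the mismatch I am describing (this is not UE4 source; the helper function names are hypothetical): the back buffer can be created as R10G10B10A2, while the tonemapper’s intermediate target is R8G8B8A8.

```cpp
// Illustrative sketch only (not UE4 code): a 10bpc back buffer next to an
// 8bpc intermediate LDR render target like the tonemapper output.
#include <windows.h>
#include <d3d11.h>
#include <dxgi.h>

// Hypothetical helper: full-screen swap chain with a deep-color back buffer.
DXGI_SWAP_CHAIN_DESC MakeDeepColorSwapChainDesc(HWND hwnd, UINT width, UINT height)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width  = width;
    desc.BufferDesc.Height = height;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per color channel
    desc.SampleDesc.Count  = 1;
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount       = 2;
    desc.OutputWindow      = hwnd;
    desc.Windowed          = FALSE; // full-screen, as with r.FullScreenMode 0
    return desc;
}

// Hypothetical helper: 8bpc render target, matching the reported tonemapper output.
D3D11_TEXTURE2D_DESC MakeTonemapperOutputDesc(UINT width, UINT height)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM; // 8 bits per channel, 8-bit alpha
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    return desc;
}
```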

Why is that? If my monitor supports deep color via a 10bpc signal, shouldn’t the tonemapper’s LDR output be 10bpc as well?
Is the only reason for not doing so that the alpha precision would have to be lowered, which would affect the AA pass after the tonemapper?
Please help me understand.

Thank you,

We could modify UE4 to grab a 10:10:10:2 back buffer. Even with a 10:10:10:2 back buffer we would still need to map to the output colorspace (Gamma 2.2 for Apple products, sRGB for PC, or say Rec709 for HDTV, etc.), and then apply some kind of temporal dithering to avoid quantization artifacts (aka banding) in the output, even with 10 bits per channel. Without the quantization this would be a higher priority, but with the quantization there have been other issues with a larger bang-for-buck (!/$) to fix in the engine, which are getting all the attention.
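
To illustrate the idea (a sketch of the general technique only, not engine code): temporal dithering adds a per-pixel noise offset that changes every frame, on the order of one quantization step, before the value is rounded to the output bit depth. Averaged over frames this hides banding at the cost of a small amount of noise.

```cpp
// Minimal sketch of temporal dithering before quantization (illustration only).
#include <cstdint>

// Cheap hash-based noise in [0, 1), varying per pixel and per frame.
static float DitherNoise(uint32_t x, uint32_t y, uint32_t frame)
{
    uint32_t h = x * 374761393u + y * 668265263u + frame * 2246822519u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return static_cast<float>(h & 0xFFFFFFu) / 16777216.0f;
}

// Quantize a [0,1] channel value to `bits` (e.g. 8 or 10) with temporal dithering.
static uint32_t QuantizeWithDither(float value, int bits,
                                   uint32_t x, uint32_t y, uint32_t frame)
{
    const float maxCode = static_cast<float>((1 << bits) - 1);
    // Offset in [-0.5, 0.5) of one quantization step, different each frame.
    const float dither = DitherNoise(x, y, frame) - 0.5f;
    float code = value * maxCode + dither;
    if (code < 0.0f)     code = 0.0f;
    if (code > maxCode)  code = maxCode;
    return static_cast<uint32_t>(code + 0.5f); // round to the nearest output code
}
```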

Thanks for the answer, Lottes.

Just to make sure I understand this correctly, there is currently (UE4.6.1) no temporal dithering going on after the tonemapper pass?
Can I assume that the tonemapper is the last pass in the renderer when in game mode, meaning the RGB values are not modified after the tonemapper and represent the values in the final framebuffer?
It seems that the tonemapper pass sets the alpha value for Temporal AA. Temporal AA occurs at the beginning of the post-process pipeline, so is the tonemapper’s output alpha value used in the next frame by the Temporal AA pass?

Thank you,