The issue is that when using UE 5.6 or later, the resulting colour looks off compared to what 5.5 and previous versions produce. I have made two other threads about this issue, but I didn't have a repro that worked without our software and I didn't know what the actual problem was. I have now found the cause and was even able to make a workaround on the plugin side, but I still wanted to explain what I found in case it's a bug and not intended.
The specific code that causes the problem is this in UnrealEngine.cpp line 2199:
if (IsDefaultBackBufferLinearSDR())
{
    DisplayGamma = 1.0f;
}
This was added in this commit: https://github.com/EpicGames/UnrealEngine/commit/5ca4e76b4b106a5ae1a5cccd4383e2587b80f17a
If I comment this block out, I get the colour I expect. So this sets DisplayGamma to linear (1.0) when the default back buffer is FloatRGBA. This in turn causes FDisplayClusterViewportProxy::GetResourceColorEncoding_RenderThread to return linear as the colour encoding because of this logic:
const float DefaultDisplayGamma = UTextureRenderTarget::GetDefaultDisplayGamma();
const float DisplayGamma = Contexts[0].RenderThreadData.EngineDisplayGamma; // is 1.0

if (DisplayGamma == DefaultDisplayGamma) // They don't match
{
    return { EDisplayClusterColorEncoding::Gamma };
}

// Custom gamma value is different from default
return { DisplayGamma }; // Linear encoding
This seems to create one issue: when the InternalRenderTarget resolves to the InputShaderResource, a conversion occurs because the encodings don't match, which makes the final output look brighter than expected. That's not the only issue, though.
The second consequence is that the gamma correction calculated before OCIO in FOpenColorIORendering::AddPass_RenderThread (the overload without the gamma parameter) now differs from previous versions because of the new 1.0 display gamma value:
float DisplayGamma = (View.Family->EngineShowFlags.Tonemapper == 0) || (View.Family->EngineShowFlags.PostProcessing == 0)
    ? DefaultDisplayGammaRT
    : DefaultDisplayGammaRT / EngineDisplayGamma;
Since EngineDisplayGamma is now 1.0 the resulting DisplayGamma is 2.2 instead of 1.0. This is what results in the weird colour output.
So, was this an oversight? Why is the display gamma now always 1.0 when the back buffer is FloatRGBA? I have attached captures from 5.7 and 5.5 so you can see the difference.
Capture_5_5.jpg(73.9 KB)
Capture_5_7.jpg(77.8 KB)