We are currently trying to resolve some differences between our SDR and HDR pipelines and would appreciate some help.
Problem
When switching from SDR to HDR, the artistic intent is lost due to tonemapping differences: the image loses roughly one stop of exposure, and bright emissives no longer hue shift or follow the same path to white.
Current Solution
Using the following console variables, we correct the errors in HDR and bring it closer to the artistic intent.
- r.HDR.Aces.SceneColorMultiplier 3.25 (the default is 1.5)
- r.HDR.Aces.GamutCompression 1 (fixes the shift to violet in blue highlights)
- r.HDR.Display.MidLuminance 12 (adjusts the nit output level of linear diffuse mid grey 0.18)
Questions
- Can you explain why multiplying the Scene Color is necessary for HDR? By default it is set to 1.5 with HDR output.
- Where in the RRT code of the SDR pipeline is the path to white defined? In the ACES color pipeline there is a hue skew in the six key primaries as things get brighter (e.g. red to orange to yellow to white).
- How can we add a similar hue shift to the warm tones in HDR?
- What tonemapping/gamut mapping is happening to the scene-referred lighting during the HDR pipeline?
- With SDR we have the classic S curve controls. Are there any similar controls for HDR?
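To illustrate the behaviour we mean in questions 2 and 3, here is a toy sketch of how any per-channel tone curve produces this kind of hue skew and path to white. It uses a simple Reinhard rolloff purely as a stand-in; it is not the engine's RRT code:

```python
def reinhard(x):
    # Simple per-channel rolloff curve, a stand-in for the RRT's
    # per-channel S-curve (NOT the actual ACES/UE shader code).
    return x / (1.0 + x)

# A bright, saturated emissive red in scene-linear space.
hdr_red = (8.0, 1.0, 0.1)
mapped = tuple(reinhard(c) for c in hdr_red)

# The dominant channel is compressed hardest, so the green/red ratio
# rises after mapping: the colour skews red -> orange -> yellow, and at
# very high intensities all channels approach 1.0 (the path to white).
ratio_in = hdr_red[1] / hdr_red[0]   # 0.125 before mapping
ratio_out = mapped[1] / mapped[0]    # larger after mapping
```

This per-channel behaviour is what we would like to preserve (or reproduce) on the HDR side.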
Steps to Reproduce
Create a daytime lit scene using a directional light of 120,000 lux and a sky atmosphere.
Create emissive materials in the red, green, and blue primaries and increase intensity in log2 steps until they reach max white intensity.
Create mid grey (0.18) spheres (lit and unlit materials).
Compare HDR and SDR versions.
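As a sanity check on the "roughly one stop" figure above, the SceneColorMultiplier we converged on is about 1.1 stops above the engine default (this stop arithmetic is our own, not an engine value):

```python
import math

default_mult = 1.5   # engine default for r.HDR.Aces.SceneColorMultiplier
our_mult = 3.25      # value we settled on empirically

# Ratio of multipliers expressed in photographic stops (log base 2).
stops = math.log2(our_mult / default_mult)
print(f"{stops:.2f} stops")  # ~1.12 stops
```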
The overall issue that you are seeing with HDR vs SDR has been a long-standing problem. Our HDR pipeline is separate from our SDR pipeline, and uses different tone curves to get the final pixels. So, as you’ve noticed, it is challenging to create results that “match” between the two methods. Currently, we are taking steps to improve this by using ACES2.0 as a possible option for users to create content in SDR and HDR. But this effort is only in prototype state for 5.6, with more integration planned for 5.7.
The default SDR UE look is based on ACES, but isn’t exactly ACES. There is a brightness scale factor of 1.45 that is applied, and then the ACES curve is applied for SDR.
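As a rough illustration of that ordering (brightness scale first, then the ACES curve), here is a sketch using Krzysztof Narkowicz's widely used analytic ACES fit as a stand-in; the engine's actual RRT/ODT shader code is more involved:

```python
def aces_fit(x):
    # Narkowicz's single-channel ACES approximation -- a stand-in for
    # illustration, not the engine's exact RRT/ODT implementation.
    return min(1.0, max(0.0, x * (2.51 * x + 0.03) / (x * (2.43 * x + 0.59) + 0.14)))

def sdr_tonemap(scene_linear, pre_scale=1.45):
    # Default SDR UE look, per the description above:
    # a brightness scale factor of 1.45, then the ACES curve.
    return aces_fit(scene_linear * pre_scale)

mid_grey = sdr_tonemap(0.18)  # display-linear value for 0.18 diffuse mid grey
```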
With the HDR SceneColorMultiplier that you chose, how are you doing the final comparisons of the HDR and SDR versions? It is good that you found a working value, but it would be helpful to see how you are doing that final comparison.
For question 5: there is no adjustment to the HDR “look” via curve controls. This is also planned for future integration, to allow adjustments that are effective for both SDR and HDR. There are issues with this, however: if the same curve is used for both, a rolloff could easily be baked into the curve that would adversely affect the overall brightness of HDR, for example. If you have specific suggestions for how you intend to alter the look using a single curve, that would be helpful to discuss.
Hi Rod
Thanks for getting back to me. We are doing our comparison by creating a sample scene of light and material values and swapping between HDR and SDR. We also compare this sample scene and our own in-game lighting on console, with side-by-side TV displays in HDR and SDR mode. From those tests we started to land on the values mentioned above. I have attached a photo of the type of test scene we use. It was in the mid grey values on the ground that we first noticed the difference.
Do you have any thoughts on questions 2 and 3 from my original post?
Is the HDR tonemapping a simple logarithmic output or is there more complexity to it?
[Image Removed]
The ACES shaders are in Engine/Shaders/Private/ACES, and the result you get depends on the CVars that begin with r.HDR.
The ACES code contains the HDR tonemapping algorithm, so I’ll admit I didn’t quite understand questions 2 and 3, because the HDR path is also ACES.
Can you share the various r.HDR CVAR values that your project is currently using?
Hi Rod
We have done some rebalancing and further tests in the office. It seems we arrived at our original values because we were matching against our SDR monitors, which were commonly averaging around 300 nits output for max white. When we calibrated our local SDR monitors to 100 nits and reverted to Epic default values, we got a fairly even match between SDR monitors, HDR monitors, and HDR TVs. We may need to do further tests with bright SDR TVs and monitors to see how working to a 100-nit workflow affects our SDR pipeline.
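To put numbers on the miscalibration: a 300-nit peak is roughly 1.6 stops above the 100-nit SDR reference, which is in the same ballpark as the exposure mismatch we had been compensating for (this arithmetic is our own back-of-the-envelope check, not an engine value):

```python
import math

reference_white = 100.0  # nits, SDR reference white we calibrated to
office_peak = 300.0      # nits, what our uncalibrated monitors averaged

# How many stops brighter the uncalibrated monitors were.
stops_over = math.log2(office_peak / reference_white)
print(f"{stops_over:.2f} stops above reference")  # ~1.58
```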
We are still keeping Gamut Compression at 1 and leaving Expand Gamut off for both SDR and HDR visuals to maintain visual matching.
Our test values, which we have now reverted:
- r.HDR.Aces.SceneColorMultiplier 3.25 (the default is 1.5)
- r.HDR.Display.MidLuminance 12 (adjusts the nit output level of linear diffuse mid grey 0.18; default is 15)
Thanks for linking those files. We will look further into the code to see what we can find.
Thanks for the summary. Do you want to keep this case open to share further results, or shall we close it at this time?