UE4 & High Dynamic Range Displays at NAB

According to the linked article on AVSForum, the Elemental demo was shown off by Dolby/Vizio on their Reference line of TVs doing 10-bit HDR. Did Epic contribute to this at all, or was it post-processing? I'm sure the performance might be poor for real-time games, but for archviz and movie making it would be pretty interesting. Will we see 10-bit (16-bit?) per-component frame buffers eventually?

High Dynamic-Range Displays at NAB 2015

Outstanding to see these finally gaining traction. HDR is the future!

J^2

You asked several questions here, and I don’t have the answer to all of them. But here goes.

  1. I don't know if Epic was involved.

  2. UE4 already does a post-processing effect called tone-mapping. This is used to transform the "linear" color information Unreal actually produces into the dynamic range and color gamut of our screens. It is most likely that this was changed to reflect the custom properties of that screen. (See the sketch after this list.)

  3. There should be no direct change in performance from just changing the tone-mapping, as long as they optimized it as well as the current algorithm and didn't change the color depth.

  4. The new UHD spec, in which 4K is only one aspect, is working toward provisions for either 10-bit or 12-bit panels for mass television production. This should transfer over to affordable consumer monitor panels as well. It should be noted that you can already buy a 12-bit panel from Dell for a few grand, but this new spec is focusing on three major improvements.

  5. Will we see 16-bit panels? Yes. Soon? No. The UHD spec may settle for 10-bit panels, in which case professional panels will probably only move up to 12-bit or 14-bit color per pixel. If the spec calls for 12-bit panels, then we might see 14-bit and 16-bit panels later, but no one has made a 16-bit panel that I know of. We can hope.
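To give a feel for what that tone-mapping pass conceptually does, here is a minimal sketch, assuming a simple Reinhard-style curve. This is not Epic's actual filmic tonemapper, and the function and parameter names are mine; the point is just that the curve, not the renderer, decides how scene luminance lands on a given display.

```cpp
// A minimal tone-mapping sketch, NOT Epic's actual filmic tonemapper.
// "displayPeakNits" and "sceneWhiteNits" are hypothetical parameters,
// standing in for whatever per-display calibration a real pipeline uses.
#include <cstdio>

// Reinhard-style curve: compresses linear scene luminance smoothly
// toward the display's peak instead of clipping highlights.
float ToneMap(float linearNits, float displayPeakNits, float sceneWhiteNits)
{
    float x = linearNits / sceneWhiteNits; // normalize against scene white
    float mapped = x / (1.0f + x);         // asymptotically approaches 1.0
    return mapped * displayPeakNits;       // scale into the display's range
}

int main()
{
    // The same scene values land very differently on an 80-nit sRGB
    // monitor versus a 1,000-nit HDR panel.
    const float scenes[] = { 100.0f, 1000.0f, 5000.0f };
    for (float s : scenes)
        std::printf("scene %6.0f nits -> sRGB %5.1f nits, HDR %6.1f nits\n",
                    s, ToneMap(s, 80.0f, 200.0f), ToneMap(s, 1000.0f, 200.0f));
    return 0;
}
```

Swapping that curve out for a brighter target display leaves everything upstream of it untouched, which is why the performance cost should be a wash.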

Now, you may know this already, but there is a lot of confusion around HDR versus color versus pixels. I will write another post on the benefits and downfalls as I see them.

These new standards are dealing with three major improvements: resolution, color gamut, and dynamic range. Of these three, the most noticeable to our eyes will be the dynamic range increase. This is because we see contrast much better than we see color, loath though we are to admit it. But this is also the easiest to increase, as it is just a basic measure of how bright and how dark we can make the screen.

The screens being shown off there, at the high end, are 1,000-2,000 nits. (A nit, per Wikipedia, is another name for the candela per square metre (cd/m²), "the derived SI unit of luminance. The unit is based on the candela, the SI unit of luminous intensity, and the square metre, the SI unit of area.")

To give you a reference for how bright that is: 2,000 nits, or cd/m², is the equivalent of an average cloudy sky, while 2,500 nits is the moon's surface and 5,000 nits is a typical photographic scene in full sunlight. So, going from 80 nits in the sRGB spec to 1,000-2,000 nits is a huge boost for realism, but it is still on the low side for fully recreating a real-life scene. To fully reproduce a "solar disk at horizon" or a sunrise/sunset, we would need 600,000-nit displays.
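To put those jumps in photographic terms (each stop is a doubling of luminance), a quick back-of-envelope calculation using the figures above:

```cpp
// Back-of-envelope: how many photographic stops separate these peaks?
// Each stop is a doubling of luminance, so stops = log2(a / b).
#include <cmath>
#include <cstdio>

int main()
{
    const double srgbPeak = 80.0;      // sRGB reference white, in nits
    const double hdrPeak  = 2000.0;    // high-end panels shown at NAB
    const double sunScene = 600000.0;  // "solar disk at horizon" figure above

    std::printf("sRGB -> 2,000-nit HDR: %.1f stops brighter\n",
                std::log2(hdrPeak / srgbPeak));
    std::printf("2,000-nit HDR -> sunrise/sunset: %.1f stops short\n",
                std::log2(sunScene / hdrPeak));
    return 0;
}
// Prints roughly 4.6 and 8.2: a big jump, with a long way still to go.
```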

Obviously this is a step in the right direction, and gives us as artists a much larger palette to work from, which is good. Unfortunately, the standards aren't there yet for the minimums, so some of those displays were promoting 300 nits, which already exists in the HD market.

I will deal with resolution in my next post, as it is easier to understand.

So, how much benefit do you get from a 4K screen? That depends on how far away you are from the screen. If you put that many pixels in a 5.7" display, you can view it at roughly the tip of your nose and you won't be able to see any pixels. In fact, with the resolution alone, at those dimensions, it would look better than going to the theater. That's because we perceive detail as contrast: resolution can provide that contrast directly, with two adjacent pixels of different colors, and luminosity can provide that contrast as well.

Both of these together will increase the quality of your viewing experience, whether or not the number of colors or the color gamut changes.

But what about a laptop monitor? To get the same pixel density on, say, a 17" monitor, you could sit as close as 13" and not see a hint of pixels. On a 15" monitor, you can sit 12" away. On a 13" monitor, you can sit 10" away.

Wait, nobody sits that close to a monitor, do they? (I ask as I hover 12" over my 2011 Macbook Pro non-retina display.)

What about desktop displays? On a 23", you need to sit 18" away. On a 27" monitor, you need to be sitting 21" away. Wait, aren't these numbers getting bigger? I don't know of any artist that sits 21" away from their monitor, and they usually have 30-32" monitors, if they can afford to.

Then we get into TV-sized displays. On a 40" display, you need to be 31" away from the monitor to even get the same quality of experience. I have read of several people using 40" 4K monitors to replace four 1080p displays. They aren't sitting that far back.
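For the curious, the distances above fall out of simple trigonometry: 20/20 vision resolves about one arcminute, so pixels blend together once each one subtends less than that. A sketch of the calculation, assuming 16:9 4K UHD panels:

```cpp
// Where do 4K pixels vanish for 20/20 vision? 20/20 acuity resolves
// about one arcminute; a pixel disappears once it subtends less than
// that, so distance = 1 / (PPI * tan(1 arcminute)), in inches.
#include <cmath>
#include <cstdio>

int main()
{
    const double kPi    = 3.14159265358979;
    const double arcmin = (1.0 / 60.0) * (kPi / 180.0); // radians
    const double diagPx = std::sqrt(3840.0 * 3840.0 + 2160.0 * 2160.0);

    const double sizes[] = { 5.7, 13.0, 15.0, 17.0, 23.0, 27.0, 40.0 };
    for (double inches : sizes)
    {
        double ppi  = diagPx / inches;                 // pixels per inch
        double dist = 1.0 / (ppi * std::tan(arcmin));  // "retina" distance
        std::printf("%4.1f\" 4K panel: %4.0f PPI, pixels vanish at ~%2.0f\"\n",
                    inches, ppi, dist);
    }
    return 0;
}
```

Run it and you get the same numbers quoted above: about 4" for the 5.7" phone, 13"/12"/10" for the laptops, 18" and 21" for the desktops, and 31" for the 40" display.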

So, what about the critics who say you need incredible eyes to see pixels at those distances? Well, these figures are usually calculated for normal (20/20) vision, they claim, and most people have worse than that, not to mention all those people whose eyesight is bad enough to need glasses.

There is some truth to this, as not everyone can see pixels at those distances. But let's look at all of those who need glasses. We need glasses because our vision naturally is not 20/20. So we get those glasses, and they correct our vision. But for a lot of us, they don't correct our vision to 20/20; they correct our vision to be better than 20/20, sometimes by a large margin. Yes, having glasses can be a superpower.

And if you can't see the pixels on the better display, then no problem; it is still a better display, because its higher resolution is still contributing to the image quality.

Now, this all affects performance, obviously, in a way that a wider dynamic range alone doesn't.

Next I will talk about color gamut.

Color gamut is a huge topic. But let's get some things out of the way.

Changing the color gamut to Rec 2020 or DCI-P3 from sRGB will not give us more colors. The number of colors available to display is directly determined by the number of bits per pixel. These are two different things.
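A quick count makes the distinction concrete; this just evaluates 2^(3 × bits) for an RGB pixel:

```cpp
// Bit depth sets how many colors can be encoded; the gamut only sets
// where those colors sit. For an RGB pixel: colors = 2^(3 * bits).
#include <cmath>
#include <cstdio>

int main()
{
    const int depths[] = { 8, 10, 12 };
    for (int bits : depths)
        std::printf("%2d-bit/channel: %.3g displayable colors\n",
                    bits, std::pow(2.0, 3.0 * bits));
    return 0;
}
// 8-bit: ~16.8 million; 10-bit: ~1.07 billion; 12-bit: ~68.7 billion.
```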

So, if the UHD standards call for P3 or Rec 2020 color gamut, but don’t mandate a higher color depth, then we will see little immediate benefit in the games industry. In fact, we would likely see some serious problems in the non-physically based rendering strategies currently employed.

This is because while a wider gamut won't give us more colors, it will give us new colors. Give me any RGB triplet in the current sRGB space, and the same numbers interpreted in Rec 2020 space will give you a different color.

Now, it gives us a wider range of colors, as in deeper greens and purples, and that is its main benefit: this deeper range of colors will help us get closer to what the human eye can see. But we need to increase the color depth, the bits per channel of the displays, in order both to display more colors and to get closer to some of the colors we lose when switching to these wider gamuts.
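To make that concrete, here is a sketch of the standard matrix path for moving a linear sRGB color into linear Rec 2020 via CIE XYZ. The coefficients are the commonly published D65 ones, rounded, so treat the exact digits as approximate:

```cpp
// Sketch: re-encoding a linear sRGB color as linear Rec.2020 via CIE
// XYZ, using rounded published D65 matrices. The point from the post:
// feed the *same* RGB numbers to both gamuts and you get different
// colors; to keep the color the same, you must convert.
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3 Mul(const double m[3][3], Vec3 v)
{
    return { m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z,
             m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z,
             m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z };
}

int main()
{
    // Linear sRGB -> XYZ (D65).
    const double srgbToXyz[3][3] = {
        { 0.4124, 0.3576, 0.1805 },
        { 0.2126, 0.7152, 0.0722 },
        { 0.0193, 0.1192, 0.9505 } };
    // XYZ -> linear Rec.2020 (D65).
    const double xyzToRec2020[3][3] = {
        {  1.7167, -0.3557, -0.2534 },
        { -0.6667,  1.6165,  0.0158 },
        {  0.0176, -0.0428,  0.9421 } };

    Vec3 srgbGreen = { 0.0, 1.0, 0.0 };  // the most saturated sRGB green
    Vec3 rec2020 = Mul(xyzToRec2020, Mul(srgbToXyz, srgbGreen));

    std::printf("sRGB (0,1,0) in Rec.2020: (%.3f, %.3f, %.3f)\n",
                rec2020.x, rec2020.y, rec2020.z);
    return 0;
}
```

Pure sRGB green comes out near (0.33, 0.92, 0.09): the same color, but on smaller code values, because Rec 2020's green primary is far deeper than sRGB's.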

So, to give you a comparison: any 2K digital movie you have seen in theaters that adhere to the cinema standards was shown with a P3 color gamut and 10-bit color per channel. 4K movies have the option of 10- or 12-bit color depth, still with the P3 color gamut. So that is what is about to be in your homes and on your desktops over the next 5 years. And your games can look like that, as it seems likely they will be going with the P3 color gamut, due to issues with the current implementations, few as they are, of Rec 2020 displays.

If they are going to change the gamut, I would prefer they change it to Rec 2020, which covers all of sRGB, Adobe RGB, 99% of P3, and most of ProPhoto, so that any new display could increase color fidelity simply by increasing the bits per pixel.

So we want a minimum of 10-bit, hopefully 12-bit, panel color depth.

So, how does this affect game rendering? For PBR-based games, if the materials are done correctly, then they should just work. A changed tone-mapping pass will be needed to account for the new display's properties, but since this is already needed given the nature of the current renderer and current displays, it should be able to auto-detect the type of display, or at the very least offer a toggle for the user.
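As a sketch of that auto-detect-with-a-toggle idea: none of these names come from UE4 or any real API; they are hypothetical stand-ins for a platform query (EDID, OS API, and so on) plus a user override.

```cpp
// Hypothetical sketch of "auto-detect the display, with a user toggle".
// Every name here is made up for illustration, not a real UE4 API.
#include <cstdio>

enum class DisplayGamut { SRGB, P3, Rec2020 };

struct DisplayInfo
{
    DisplayGamut gamut;
    int          bitsPerChannel;
    double       peakNits;
};

// Stand-in for a platform query (EDID, OS API, etc.).
DisplayInfo QueryDisplay()
{
    return { DisplayGamut::P3, 10, 1000.0 };
}

int main()
{
    bool userOverride = false;  // the "toggle for the user"
    DisplayInfo target = userOverride
        ? DisplayInfo{ DisplayGamut::SRGB, 8, 80.0 }  // forced legacy path
        : QueryDisplay();                             // auto-detected path

    std::printf("Tone mapping for a %d-bit panel, %.0f nits peak\n",
                target.bitsPerChannel, target.peakNits);
    return 0;
}
```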

For mobile games, I don't think things are going to change until new display standards are made for them. They are already a wild west, with most not adhering to sRGB as it is. So the same strategies used to make your mobile games look good now should continue to work, though it becomes more of a headache.

For non-PBR games, you may be stuck with two options: either go with the mobile approach and hope for the best, or set up assets for two different color profiles, one for the wide color gamut and one for sRGB.

A good way to see how this will affect your game is to go back and look at games that were made in the 8-bit era and how they look at 16- and 32-bit color depths. Likewise, 16-bit games at 32-bit color depths. The colors change, and you can't always predict how.

Now, eventually, a lot of standards will be made, and we will have a relatively calm, stable standard to work with, as we have had with sRGB for the last 15 years. And tricks will be picked up along the way. So everybody have fun!