Rec.2020 & HDR

Hi all,

I’m currently developing for VR but want to take a stab at getting an HDR signal out of UE4 into something like the LG OLED E6/G6 displays.

It seems like info is slim on what ability UE4 has to get an HDR signal out. I read a tech interview over at Eurogamer with the Gears 4 team; they’re doing HDR and make it sound easy to configure. Someone over at the AVS forum says Dolby Vision can be software-wrapped into a regular Rec. 709 signal over HDMI 1.4.

I’d love to know more about all of this, as synthesising these high dynamic range images is much easier than capturing them and should be amazing to see. The previews of Gears 4 suggest ‘better pixels’ (HDR) is much more impressive than ‘more pixels’ (4K), which I understand completely. What can one do with UE4 today with regard to HDR, colour spaces and even Dolby Vision compatible signals?

Cheers,

Matt Hermans.

HDR output isn’t supported in the engine yet; it should be there in 4.13, though (which is presumably why Gears 4 supports it). I would just watch the changelogs. It’s not really any different from a content-creation standpoint: all it’s doing is changing the output render target precision, and 3D applications have had that sort of control for a very long time now when it comes to things like compositing and color correction.

Thanks for the tip, Daniel!

Any update on this? Also, I was wondering: any chance of adding an option to dither the 10-bit/floating-point buffer down for 8-bit displays to reduce banding? A low amount of film grain already does a decent job of breaking up bands, but I imagine dithering would be better.

Unreal already does grain quantization from 16-bit float in the tone mapper by default.
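
For anyone wondering what that means in practice, here’s a minimal sketch of the idea in plain C++ (not the engine’s actual shader code, and TriangularNoise() is just a hypothetical stand-in for the grain term): noise is added to each channel before rounding to 8 bits, which trades visible banding for fine, unstructured grain.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdlib>

// Sum of two uniforms in [-0.5, 0.5) gives triangular-PDF noise,
// the usual choice for dithering ahead of quantization.
static float TriangularNoise()
{
    float U1 = (float)std::rand() / RAND_MAX - 0.5f;
    float U2 = (float)std::rand() / RAND_MAX - 0.5f;
    return U1 + U2;
}

// Quantize a linear [0,1] channel down to 8 bits, adding noise before
// rounding so neighbouring pixels straddle the quantization step.
uint8_t QuantizeWithDither(float Linear01)
{
    float Scaled = Linear01 * 255.0f + TriangularNoise();
    float Rounded = std::round(Scaled);
    if (Rounded < 0.0f)   Rounded = 0.0f;
    if (Rounded > 255.0f) Rounded = 255.0f;
    return (uint8_t)Rounded;
}
```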

Also, note that you’d need one of the most recent graphics cards that can output HDR once the engine actually supports it.

Far more importantly, your entire dev team would need HDR displays to even see it, let alone develop for it, and right now those are still ridiculously expensive. Ideally you’d also want the expanded gamut to be used for textures and not just lighting, so your entire texture pipeline would have to support that as well. Technically the changes for UE4 aren’t that difficult: a new backbuffer output, while ACES and the HDR tonemapper already support HDR output luminance as well as expansion to a larger color gamut. ASTC already supports HDR color for textures, and UE4 supports ASTC.

So the real challenge won’t be UE4 at all, which is already getting HDR support soon, but everything else really.

You can buy an HDR TV for $600; for $650 you can buy a 49" Sony. HDR support is actually on the Trello board, but it’s backlogged. The current roadmap only stretches out to November, so it could come by the end of the year. FOR CHRISTMAS, PLEASE, EPIC!

I’m actually making a project for someone that’s going to be displayed in an aquarium, and there’s a lot of sun and bloom in it that is washing out some of the darker details in the sand. HDR would fix this problem and make the project look absolutely GORGEOUS! I’m recommending both a GTX 1060 and an HDR display to them. I don’t even care if I have to go back after I’m paid to set it up properly with HDR; if I can do that, it would be amazing!

I disagree. You don’t need HDR textures. The engine allows lights to extend beyond the range of a typical 8-bit monitor, and the textures scale accordingly. If you’re happy with the 8-bit range of a typical 16-million-color texture, then you shouldn’t have a problem with the final results using a standard texture. More often than not I find myself scaling down the contrast of diffuse textures because it looks too harsh in the final result. Again, HDR lighting scales the contrast in textures automatically; games have done this for decades. Ico had HDR on the PS2 way back in 2001, and Twilight Princess released in 2006 with the effect in full force on the GameCube. Monitors just haven’t kept up: they still can’t display more than 256 values per channel. We got surround sound, HD, 120 Hz, 240 Hz, 3D, 4K, and OLED all before we got HDR. Now that we have it, I want to use it. I want to know what my lights actually look like.

The only textures that need to be HDR are cubemaps, which are supported. I suppose you could use it for emissive textures if you really wanted, but a shader multiplier is usually fine.

The textures that are used for lighting are only defined within a 0-1 range, because they are physical coefficients, not colors in the sense of values you output to the monitor; even the albedo. That said, normal maps and displacement maps can benefit from being 16-bit, and Unreal has this option.

Expanded color gamut doesn’t necessarily have anything to do with expanded luminance range. Normalized to standard real-world PBR materials, the current luminance range of textures is enough; the color gamut of textures, however, is very much out of range.

Color gamut of textures is generally irrelevant, because the color gamut of your scene lighting is effectively infinite, up to the limits of floating-point precision. You can create a light with a red value of 1 billion and a blue value of 1 over 1 billion. Your albedo texture just specifies a value that is used to multiply that light color; it has no inherent color space itself, it is (once it gets to the shader) just a linear 0-1 value. You can change your white point and chromaticity and whatever in the tone mapper and everything still works; only the light source would need to be adjusted to match.
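
To make that concrete, here’s a toy sketch in plain C++ (illustrative only, not engine code) of why the albedo itself never needs to go outside 0-1:

```cpp
struct LinearColor { float R, G, B; };

// Albedo is just a 0-1 multiplier on whatever (unbounded) light color
// reaches the surface. The scene-referred result can be arbitrarily
// large; only the tone mapper decides how it lands on the display.
LinearColor Shade(const LinearColor& Albedo,
                  const LinearColor& LightColor,
                  float LightIntensity)
{
    return { Albedo.R * LightColor.R * LightIntensity,
             Albedo.G * LightColor.G * LightIntensity,
             Albedo.B * LightColor.B * LightIntensity };
}

// e.g. Shade({0.5f, 0.5f, 0.5f}, {1e9f, 1.0f, 1e-9f}, 1.0f)
// gives {5e8, 0.5, 5e-10}: a perfectly valid scene-referred color.
```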

And if you use temperature to specify your light colors, I would expect the engine to handle that for you. It would be neat if we could define light colors as spectral distributions from measured sources and have the engine convert to RGB based on the monitor properties.
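
(UE4’s lights do already have a Kelvin temperature setting.) For the curious, a very crude version of that conversion in plain C++ just samples Planck’s law at three wavelengths and normalizes; a real implementation would integrate the spectrum against the CIE color matching functions and then transform into the display’s primaries. The sample wavelengths here are my own assumed values:

```cpp
#include <cmath>

// Planck's law: spectral radiance of a blackbody at the given
// wavelength (metres) and temperature (Kelvin).
static double Planck(double LambdaMeters, double TempKelvin)
{
    const double h = 6.62607015e-34; // Planck constant (J*s)
    const double c = 2.99792458e8;   // speed of light (m/s)
    const double k = 1.380649e-23;   // Boltzmann constant (J/K)
    double Num = 2.0 * h * c * c / std::pow(LambdaMeters, 5.0);
    return Num / (std::exp(h * c / (LambdaMeters * k * TempKelvin)) - 1.0);
}

// Crude temperature -> RGB tint: sample at rough "red", "green" and
// "blue" wavelengths and normalize so green = 1.
void BlackbodyTint(double TempKelvin, double& R, double& G, double& B)
{
    double Pr = Planck(610e-9, TempKelvin); // assumed red sample
    double Pg = Planck(550e-9, TempKelvin); // assumed green sample
    double Pb = Planck(465e-9, TempKelvin); // assumed blue sample
    R = Pr / Pg;  G = 1.0;  B = Pb / Pg;
}
```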

Doesn’t matter if your lighting is an expanded gamut, because the results still differ from what you want. Real-world materials have albedos outside the sRGB gamut, and so, therefore, should your albedo textures. The classic example is the red of a London double-decker bus: its color isn’t reproducible inside the standard sRGB gamut. Either you’d, utterly bizarrely, have to find exactly the right expanded lighting gamut to reproduce the color you see in real life, and thus make all the surroundings look bizarre; or you just have material inputs support wider color gamuts, like they already can, and get the exact result you want and would expect regardless of the lighting setup. Honestly, claiming lighting does it all is rather nonsensical; it’s like saying all albedo textures could be middle grey, so why bother, because lighting setups can get you “the right colors” anyway.
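
You can see the problem directly by converting a saturated wide-gamut color down to sRGB. A quick C++ sketch using the linear Rec.2020-to-Rec.709 matrix from ITU-R BT.2087 (sRGB shares Rec.709 primaries; the “bus red” input value is purely illustrative):

```cpp
#include <cstdio>

// Linear Rec.2020 -> linear Rec.709/sRGB matrix (ITU-R BT.2087).
// Negative outputs mean the input color sits outside what an sRGB
// display can physically reproduce.
static const float M[3][3] = {
    {  1.6605f, -0.5876f, -0.0728f },
    { -0.1246f,  1.1329f, -0.0083f },
    { -0.0182f, -0.1006f,  1.1187f },
};

int main()
{
    // A fully saturated Rec.2020 red (illustrative stand-in for a
    // deeply saturated real-world pigment).
    float In[3] = { 1.0f, 0.0f, 0.0f };
    float Out[3];
    for (int i = 0; i < 3; ++i)
        Out[i] = M[i][0] * In[0] + M[i][1] * In[1] + M[i][2] * In[2];

    // Prints roughly 1.66, -0.12, -0.02; the negative green and blue
    // components are exactly the out-of-gamut part sRGB has to clip.
    std::printf("sRGB linear: %f %f %f\n", Out[0], Out[1], Out[2]);
    return 0;
}
```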

Any update on this topic, anyone? My OLED is arriving this week and I’ve got a few examples I could show/test.

Just reading this:
Seems like it’s been around for a while, just not integrated into the Epic branch.

I think it was just shown off on one of the streams; they did just add HDR support in some form. There was a blueprint node for it, though I forget the details. Point is, they’re working on it, and I think they may have integrated that branch, since it’s NVIDIA-only right now.

So we have experimental support in 4.15!
Who’s tried it so far? I’m not sure what the exact requirements are in terms of project configuration; the notes talk about cvars and a bunch of automatic detection.
The details seem a bit vague, but I have two scenes ready to go that were constructed with out-of-bounds values in mind for HDR display. The LG OLED E6 also arrived this week, so there’s something to display on too!
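
For anyone else poking at it, this is roughly how I understand the cvar side. The cvar names below are from the 4.15 release notes; the specific values (ST-2084 output for a 1000-nit HDR10 set, Rec.2020 primaries) are my own reading, so treat them as assumptions and double-check the docs:

```cpp
#include "HAL/IConsoleManager.h"

// Sketch: flip the 4.15 experimental HDR output on from game code
// instead of typing the cvars into the console every run.
void EnableExperimentalHDROutput()
{
    IConsoleManager& CVars = IConsoleManager::Get();

    if (IConsoleVariable* Enable =
        CVars.FindConsoleVariable(TEXT("r.HDR.EnableHDROutput")))
    {
        Enable->Set(1); // turn HDR output on
    }
    if (IConsoleVariable* Device =
        CVars.FindConsoleVariable(TEXT("r.HDR.Display.OutputDevice")))
    {
        Device->Set(3); // assumed: ACES 1000-nit ST-2084 (PQ) curve
    }
    if (IConsoleVariable* Gamut =
        CVars.FindConsoleVariable(TEXT("r.HDR.Display.ColorGamut")))
    {
        Gamut->Set(2); // assumed: Rec.2020 primaries
    }
}
```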



Matt Hermans

So I’m curious: I got a TV that had HDR support added in a firmware update, and I don’t think the HDR is actually doing anything. To me it looks like an adjustment you could do in Photoshop rather than any higher range, and I can still see banding in movies. Has anyone else had a better experience where you could notice a difference? If it actually does something, then maybe someday I’ll upgrade to a current model.

It’s absolutely different, and “added in a firmware update” sounds suspicious at best; it’s a hardware feature.

Which TVs are confirmed to support this? Which ones is Epic using?

So… there are two competing HDR standards (HDR10 and Dolby Vision), and HDR10 will probably win, but anyway. They’ve only been available since last year, and only in 4K TV form; if you bought one, you almost certainly know it. Monitors supporting it are almost completely unavailable at the moment, same with laptops, OS support, etc. Basically it’ll be “supportable” at some point this year, with all-new or relatively new screens, updated OSes, and even then GPUs require the right output type.