I've been wondering the same thing for a while.
Skylight intensity is tied to sky texture brightness/color, regardless of whether the sky is a solid color, an 8-bit texture, or a 32-bit HDRI. What would a physical unit on the skylight mean in that case?
Emissive surfaces are expressed in cd/m² (nits). An HDR skybox uses the pixel intensity multiplied by the light intensity, resulting in a total luminance expressed in cd/m².
Think of the HDR pixels as a filter. If your HDR image ranges from 0 to 1.0 and the sky is set to 1000 cd/m², the resulting luminance is 1.0 × 1000 cd/m².
The only proper way to use this is to set the skylight to 1…always! Then put a multiply on your sky texture and adjust the value until the pixel inspector shows the correct luminance for the result you want to achieve. For example, for bright daylight with white clouds (not overcast, but fully lit clouds), the clouds should read around 10,000 cd/m².
At least that's the only consistent way I have found. Increasing the skylight beyond 1 gives you more light while the sky still reads the lower candela value, and that again breaks the correlation of the values. You get brighter indirect lighting and shadows while the sky itself looks way too dark, which will mess with auto-exposure as well. So I would recommend never touching it (read: only for artistic tweaks).
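To make that workflow concrete, here's a tiny back-of-envelope sketch (plain Python, no Unreal API; the cloud pixel value is just an illustrative assumption):

```python
# Sketch of the "skylight = 1, scale the sky texture" math.
# All names and sample values here are illustrative, not engine calls.

TARGET_CLOUD_LUMINANCE = 10_000.0  # cd/m^2, fully lit daylight clouds

def resulting_luminance(pixel_value, texture_multiplier, skylight_intensity=1.0):
    """Luminance the sky actually emits for one pixel."""
    return pixel_value * texture_multiplier * skylight_intensity

# Suppose the brightest cloud pixel in a 0..1 HDR reads 0.85.
# The material multiplier needed to hit the target is then:
cloud_pixel = 0.85
multiplier = TARGET_CLOUD_LUMINANCE / cloud_pixel  # ~11,765

assert abs(resulting_luminance(cloud_pixel, multiplier) - 10_000.0) < 1e-6
```

The point is that all the scaling lives on the texture side, so the pixel inspector reading and the light the sky emits stay in agreement.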
Correct. I’m glad you’re pointing these things out instead of silently shaking your head like the rest of us ;).
In this case though I tried reporting the new cd/m2 skylight labeling as a bug on issues.unrealengine.com but was directed to this thread. Thankfully I have a UDN account and was told to try there also, where I did receive a developer response.
One approach I implemented in a different life to solve this is as follows:
Problem: unpredictable sky luminance due to unknown/unstandardized values in HDR images, sometimes ranging from 0 to 1.0, 0 to 1000, or 0 to X.
Solution: internally normalize the input image to force a range from 0 to 1.0, evaluate the illuminance at an imaginary horizontal plane, and scale by a user-defined factor, resulting in lux.
Benefit: no matter what image is loaded, HDR or not, the amount of light emitted by the skybox would always be the same. You get a true "horizontal lux" control and remove the unknowns.
The sun and the sky would then both be controlled with lux values and be totally consistent.
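For what it's worth, here's a rough sketch of what that normalize-then-scale evaluation could look like for a grayscale lat-long sky image (function names and the grayscale simplification are mine, purely for illustration):

```python
import math

# Hedged sketch of the proposed pipeline: normalize the image, integrate
# the illuminance it would cast on an upward-facing plane, then derive
# the scale factor that makes it emit a user-specified lux value.

def normalize(img):
    """Force the image into a 0..1 range regardless of source scale."""
    peak = max(max(row) for row in img) or 1.0
    return [[v / peak for v in row] for row in img]

def horizontal_illuminance(img):
    """E = sum(L * cos(theta) * dOmega) over the upper hemisphere
    of a lat-long map (rows = zenith angle, cols = azimuth)."""
    h, w = len(img), len(img[0])
    d_theta, d_phi = math.pi / h, 2.0 * math.pi / w
    e = 0.0
    for row in range(h // 2):                  # upper hemisphere only
        theta = (row + 0.5) * d_theta          # angle from zenith
        d_omega = math.sin(theta) * d_theta * d_phi
        for col in range(w):
            e += img[row][col] * math.cos(theta) * d_omega
    return e

def sky_scale_for_target_lux(img, target_lux):
    """Factor the normalized image must be multiplied by so the sky
    emits exactly target_lux onto a horizontal plane."""
    return target_lux / horizontal_illuminance(normalize(img))
```

Sanity check: a uniform sky of luminance 1.0 gives a horizontal illuminance of π, so asking for 1000 lux returns a scale of about 318.3, matching the L = E/π relation discussed later in the thread.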
TL;DR: This creates a bigger problem and over-complicates things. The problem is simply that the UI shouldn't label skylights as cd/m².
If sky textures/materials/captures are normalized, then sky luminance would be unpredictable. Please don't normalize skylight input textures.
If I knew the HDR sky texture was captured at EV100 10 before, I certainly wouldn't know what it would be after normalization.
If I have two versions of the same sky, one with the sun painted out and one with the sun visible, normalizing them both would make them look very different.
Again, using skylight intensity other than 1.0 breaks the physically correct link between the visible sky and the light it emits.
Would this also normalize the captured scene? This opens a can of worms: then you'd presumably also want to normalize all reflection capture actors? Let's not.
It would be a luminance matching the defined horizontal illuminance, so a direct correlation with physical units is actually brought to the surface.
Besides, I don't understand what an HDR image captured at EV 10 means to you. Can you describe what this means in terms of HDR pixel intensity? What luminance does a pixel set to (21.56, 21.56, 21.56) represent in this case?
AFAIK the whole point of HDR imagery is to capture the full range of luminance, thus eliminating the notion of exposure entirely (a pixel value becomes fully linear, abstracting away the non-linear aspects introduced by cameras). Then what is left is a notion of scale, like z-depth or height maps.
I think I see what you’re saying. I could go outside on a cloudy day with an incident light meter and measure maybe 1,000 lux facing straight up, then capture an HDR panorama of the sky. Back in Unreal the HDR panorama will be normalized and I could plug in my measured 1,000 lux. Right? It’s not a bad idea as long as there’s still a way to get a sunny and non-sunny texture to use the same scale if I wanted (they would output/measure different results in lux obviously).
Correct, any HDR texture used as a light source should have the full range (unclipped, linear), but we still need to know how those pixels relate to real-world cd/m². Most stitching/processing software (such as PTGui) doesn't output absolute luminance in cd/m², but scales output based on the middle EV or some other scale, so the HDR is already "semi-normalized".
If I shoot an exposure bracket and merge to an HDR, and the middle exposure was at EV100 13 (let's say ISO 100, f/8, 1/160s), I know there is *some* relationship between those EV100 13 pixels and luminance. PTGui says that 1.0 in the middle EV should correspond to white in the output HDR. So presumably I could take (21.56, 21.56, 21.56) measured in my HDR, scale by 13 EV (×8192, or some number/formula), and get 176,619 cd/m² for that pixel (looks like a pixel near the sun!). Another pixel from that HDR might be from the blue sunset sky and measure 0.04 on average (0.0301, 0.0503, 0.0860), but when we scale that knowing it was captured at EV 13 we get 327.68 in luminance (246.57, 412.05, 704.51).
I'm sure I could be making some wrong assumptions about the math, but when I go through the process on HDRs I've captured with corresponding incident light readings, I'm within a stop or two of where I should be. I can also compare to Unreal's procedural atmosphere and check luminance with the pixel inspector; it's close. I'll leave the exact math to the researchers :). What I do know is that labeling sky intensity as cd/m² as it currently is will confuse people. I'm glad you folks at Epic are at least trying to figure out ways to make physical lighting units less confusing.
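For anyone following along, that scaling can be sketched like this (this reproduces the thread's back-of-envelope convention of 1.0 = white at the middle exposure, not calibrated photometry, which would also involve a meter constant):

```python
# Back-of-envelope EV scaling: assume 1.0 in the merged HDR corresponds
# to "white" at the middle exposure EV100, so luminance ~ pixel * 2**EV100.
# This mirrors the thread's math; real calibration would differ somewhat.

def pixel_to_luminance(pixel, middle_ev100):
    return pixel * 2.0 ** middle_ev100   # cd/m^2, approximately

# Near-sun pixel from an EV100 13 bracket: 21.56 * 8192 ~ 176,619.5
assert abs(pixel_to_luminance(21.56, 13) - 176_619.52) < 0.01

# Blue sunset-sky pixel (average 0.04) from the same HDR:
assert abs(pixel_to_luminance(0.04, 13) - 327.68) < 1e-6
```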
I also think we're still talking about two problems. One is removing the cd/m² label from the UI (easy). The other is making photometric units more artist-friendly and scaling these semi-normalized HDRs back to meaningful real-world units (difficult).
Using either method could work:
Evaluate the illuminance at an imaginary horizontal plane and scale by a user-defined factor, resulting in lux.
Scale the HDR intensity using the photographic settings from when it was captured (only works if you know exactly how it was authored).
Either way, that scaling would need to happen on the texture/skybox material side instead of the skylight (as @Daedalus51 mentioned). The best way to make that more intuitive would be if the skylight were visible as a light source instead of requiring a separate Static Mesh skybox.
I kinda like it! There is one big BUT from my point of view, though…and that is that most people don't have a light meter at hand to read the lux when they capture a sky. Or what happens if you just buy skies from CGSkies.com? You don't know any lux readings, as the image data is all you have. So the system needs to work without requiring fancy and expensive hardware.
I think it's a great option, but it should be just that…an option. There needs to be a working system that doesn't require you to specify that.
In Frostbite, the skylight intensity actually also adjusts the brightness of the HDR image visible on the skydome (hard in Unreal, since they are decoupled from each other…both approaches have ups and downs), so they are linked, and that's already nice! However, we still only rely on the candela readings measured from the skydome, because the stated candela values mean nothing on their own: different images could be captured differently. So we often have the situation that one sky is quite bright and another quite dark, yet both are set to…like 5000 cd/m² in the editor. So again: just measure the stuff on the dome directly and you're fine.
People shouldn't be forced to acquire extra hardware and capture their own HDR sky textures, because probably 99.95% of them just wouldn't.
Prior to this new lighting stuff there were already (over-complicated) ways of doing physically accurate lighting in UE4, but people still didn't do it, due to it being a complicated process plus extra effort.
If you develop new features for the engine, make them usable for everyone in an artist-friendly way.
Nobody is required to buy extra hardware, you would be able to use any HDR you want and give it any intensity just like you currently can. You can find reference lux values online and could plug in an appropriate scale factor using that.
Actually, you can already do that by creating a perfectly white diffuse sphere and using it as a light meter with the pixel inspector. Measure the skydome's influence on the sphere with the pixel inspector, then do some math to derive your skylight's lux (in the direction of the surface normal you sampled on the sphere). It would still be nice if the pixel inspector showed luminance in cd/m² for skylights as well. Example: I sample the top of a perfectly white diffuse sphere under skylight only and the pixel inspector reads 318.3 cd/m² surface luminance; because the sphere is perfectly white and diffuse, I can just multiply by π to determine that the skylight's illuminance is 1000 lux. (And if that is too dark or bright compared to the reference values I'm expecting, I'd have to scale my skybox material instead of touching the directional light.)
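The sphere-as-light-meter trick is just Lambert's relation: for a perfectly white (albedo 1) diffuse surface, reflected luminance L and incident illuminance E satisfy L = E/π. A minimal sketch:

```python
import math

# White-diffuse-sphere light meter: invert L = E / pi to recover the
# illuminance (lux) from the luminance the pixel inspector reports.

def illuminance_from_luminance(luminance_cd_m2):
    """Lux falling on a perfectly white Lambertian surface that
    reads the given luminance in the pixel inspector."""
    return luminance_cd_m2 * math.pi

# The example from the post: 318.3 cd/m^2 on top of the sphere -> ~1000 lux.
assert round(illuminance_from_luminance(318.3)) == 1000
```

For a surface with albedo below 1 you'd divide by the albedo as well, which is why the sphere material needs to be perfectly white for the shortcut to hold.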
I totally agree some features could still make the process more artist friendly.
No difference from today, really…you would give an arbitrary lux value just like you give an arbitrary intensity today. The only difference is that with a method like the proposed one, you would be able to have a physical reference if you happen to need one.
Will we get a physically accurate procedural atmospheric sky? Something like the current Atmospheric Fog, but something that actually outputs real lux values which can be captured by the skylight and doesn't require an HDRI texture.
The current atmospheric model works pretty well, but you need to adjust your sun intensity based on angle, and probably the atmosphere's Sun Multiplier and other settings too. Finding sky information does seem difficult, but there's some decent reference for sun intensities. Here's one: http://stjarnhimlen.se/comp/radfaq.html#10
You can then build angle-based sun intensity into a Blueprint. BP_LightStudio already adjusts color based on angle, so it's pretty easy to apply that to intensity as well.
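As a rough illustration of what such an angle-driven curve could look like (the (elevation, lux) pairs below are placeholder values I made up for the sketch; real numbers should come from tables like the radfaq link or your own meter readings):

```python
from bisect import bisect_left

# Illustrative elevation -> direct-sun lux table; values are assumptions,
# not measured data. A Blueprint could evaluate an equivalent curve asset.
SUN_LUX_BY_ELEVATION = [   # (degrees above horizon, lux)
    (0.0,      400.0),
    (5.0,   10_000.0),
    (15.0,  50_000.0),
    (30.0,  80_000.0),
    (60.0, 100_000.0),
    (90.0, 110_000.0),
]

def sun_intensity_lux(elevation_deg):
    """Linear interpolation over the reference table, clamped at the ends."""
    pts = SUN_LUX_BY_ELEVATION
    if elevation_deg <= pts[0][0]:
        return pts[0][1]
    if elevation_deg >= pts[-1][0]:
        return pts[-1][1]
    i = bisect_left([p[0] for p in pts], elevation_deg)
    (x0, y0), (x1, y1) = pts[i - 1], pts[i]
    t = (elevation_deg - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```

In Unreal this is essentially what a float Curve asset sampled by sun pitch gives you for free; the sketch just makes the interpolation explicit.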
Here's an example using the atmosphere to match a photographed HDR, but I had to use a Sun Multiplier of 25.0 to get the correct luminance/illuminance readings.
These match (closely enough) the illuminance readings I took on location:
18,900 lux in sunlight, facing sun
5,000 lux in shade, facing same direction
sun+sky - sky = 13,900 sun intensity
My actual Sun Intensity is 18,000 lux because its color is very saturated, something else to take into account.
Skylight intensity is at 1.0 “cd/m2”, although I still hope they remove the misleading unit label.
The pixel inspector shows the deepest blue sky at about 720.0 luminance (cd/m²).
The skydome material emissive had a multiplier of 23,170, or 14.5 EV.
The HDR middle exposure (mentioned earlier) was at EV100 13.32, so I guess you can't just take the EV number and plug it into your skydome emissive intensity, but it's close-ish. Again, the difference is probably due to color saturation. It's always good to have illuminance measurements to be sure, but with most HDRs you'll never know the camera settings or the illuminance at the time of shooting.
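The EV/multiplier relationship used in those numbers is just powers of two, which makes it easy to check how far apart the skydome multiplier and the capture exposure actually were:

```python
import math

# A multiplier m on a 0..1 sky texture corresponds to log2(m) stops (EV),
# and an EV corresponds to a multiplier of 2**EV.

def ev_to_multiplier(ev):
    return 2.0 ** ev

def multiplier_to_ev(multiplier):
    return math.log2(multiplier)

# The skydome multiplier of 23,170 is 14.5 EV (2**14.5 ~ 23,170)...
assert round(multiplier_to_ev(23_170), 2) == 14.5

# ...while the HDR's middle exposure was EV100 13.32, about 1.18 stops lower.
assert round(multiplier_to_ev(23_170) - 13.32, 2) == 1.18
```

Roughly a stop of discrepancy, consistent with the "within a stop or two" observation earlier in the thread.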
The results came out okay. I can use manual exposure or EV100 numbers and it looks pretty close to what I'd expect. Something else to be aware of: I'm not using pre-exposure, so numbers around 65,000+ will start clipping (at least on my hardware they do).
Yes, sort of. The proposed approach is not incompatible with this at all, and for archviz we have a requirement to create a parametric sky that delivers exactly that. It's all aligned in the same direction. A procedural sky is similar to a photographed sky dome from a pixel-intensity standpoint.
Being a long-time archviz artist, I've found that Atmospheric Fog in UE4 can work pretty well as a replacement for the Physical Sky models often found in offline render engines. Unfortunately it suffers from two severe flaws:
1. Screen-space reflections do not work correctly with it, as they need some background geometry to establish depth. This can be partially worked around by introducing a giant skybox sphere around the scene, but that often interferes with the Atmospheric Fog look.
2. The sun disc drawn by Atmospheric Fog has a completely wrong intensity. I've reported the bug on the Unreal Engine Issues and Bug Tracker (UE-58668) and the answer is "Won't Fix"…go figure. So at this time, despite Atmospheric Fog being a better sky model for archviz than Preetham or Hosek-Wilkie, its bugginess and Epic's refusal to address these issues make it unusable, and UE4 will likely require yet another Physical Sky model on top of the existing one instead.
By the way, if the intent of these efforts is to make life easier for archviz people, I would suggest implementing a one-click algorithm that finds bright spots on the HDRI map (above a defined intensity threshold), computes their radiance in proportion to the given scale, and automatically creates Directional Light(s) with the appropriate direction, intensity, color, and source angle. That would save so much time.