PBR lighting questions

What is a good directional light intensity at noon, and are there any other important settings on the light? Also, what kind of intensity should the skylight use?

Right now my sun has an intensity of 3 and the skylight is at 1 at noon.

I am using Distance field AO, ray traced distance field soft shadows, and the new dynamic GI…

Sun brightness between 6 and 12. Skylight intensity is relative to the HDR source, so I can't really give a number there, other than that a ratio of 4 to 1 between sunlight and skylight is generally a good number (per Kim Libreri, who has a lot of film experience). You can measure that by looking at a white surface with sun on it, adding a shadow-casting object, then comparing the brightness of the lit and shadowed areas. Modify the skylight until it's around 25% the value of the sun. But first you need to convert to linear space, I believe, otherwise the difference you measure will be wrong. To convert an image to linear space in Photoshop, go to Adjustments -> Levels and set the midpoint to 0.4545.
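That lit-vs-shadow check is easy to sketch in code. This is a rough approximation using a plain 2.2 gamma (which is what setting the levels midpoint to 0.4545 applies); the two sample values are hypothetical, not from the thread:

```python
# Rough sketch of the lit-vs-shadow ratio check, assuming a simple 2.2
# gamma curve (what the Photoshop levels midpoint of 0.4545 applies).
def to_linear(v, gamma=2.2):
    """Approximate sRGB -> linear conversion via a plain power curve."""
    return v ** gamma

# Hypothetical samples of a white surface, normalized from 8-bit values.
lit_srgb = 0.85      # sunlit part of the surface
shadow_srgb = 0.45   # shadowed part of the surface

ratio = to_linear(shadow_srgb) / to_linear(lit_srgb)
print(f"shadow/lit in linear space: {ratio:.3f}")  # aim for ~0.25 (4:1)
```

Note how the gamma matters: the same two samples compared in sRGB space would give 0.45/0.85 ≈ 0.53, nowhere near the 4:1 target.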

I am still learning a lot of this myself and have been meaning to go sanity-check all the numbers we used on the Kite demo. Most of the tricks for measuring brightness and real-world values of the sun came from Kim Libreri, but we were on such a tight deadline that I didn't really get to absorb it all fully. I want to say the "correct" sun value was right at 6, but I need to go redo some sanity checks to be sure.

A blog post about all this stuff in the near future would be very useful for the community, and internally as well.

Wow, I didn't expect an answer from one of the guys working on the Kite demo. I'm pretty new to making games and I have only studied about 2 years of 3D, so quite a lot is still pretty new, especially in UE.

I always wondered why the textures in the Kite demo pack were so dark. It seems a very strange way of doing things, making the midtones darker like that. But then again, PBR is pretty new to me. I've usually only tiled and removed some shadows and highlights on the texture itself and imported it straight, but I had to darken it a lot in the material because it was so bright. Then I tried lowering the intensity of the sun and skylight so I could import it straight from PS and do no color corrections in the material, but the lighting always looked off when I did that.

No wonder everything was acting so strangely under different lighting (I customized the skybox from the engine so it works as a day/night cycle, and even changed the star texture to something more appropriate for when you have no light pollution).

I've watched the Kite demo stream showcase, but I would love to see something more in-depth on how you did a lot of the things, like the de-lighting process you had, and more about the photogrammetry. I've tried Agisoft with cellphone images and very quick DSLR photos of random stuff around my house, and it is amazing how fast you can make a very nice-looking prop, but how to take away the lighting information was way beyond me. Here is my best test so far

I really want to know more so I can go out and try to recreate parts of the Norwegian landscape, like this place my dad found while on a trip near my town, and yes, those trees really are that green.




I will upload some images after I have tried out your advice in my scene.

As hard as we tried to be physically accurate, I think some of the Kite demo assets did come out a bit dark. Not orders of magnitude, but somewhere between 10 and 25%. Some assets were spot on, though. I can actually figure out how far off they are by using Nuke and our reference photography.

In order to calibrate this stuff, we use the 18% grey ball values from color charts. Then we normalize the image so that 18% grey reads exactly 0.18, and from there we can do relative measurements between UE4 and real life.

First we bring a raw EXR into Nuke and then color-grab the 18% grey chart with a box:
ff5cfec022cace52e7a8c9aeb34841ccd54844da.jpeg

Here the RGB values are 0.201, 0.236, 0.266

To “normalize” the image, we create a ColorCorrection node and enter different values for gain for each RGB channel.

For red, we enter "0.18/0.201", for green "0.18/0.236", for blue "0.18/0.266". After doing that, the 18% grey will read exactly 0.18, like this (ignore the grey look of the image; for some reason I changed view settings, it's not that big a difference):

PBR_01.png
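The normalization step is just a per-channel gain of 0.18 divided by the sampled chart value. A minimal sketch using the chart values quoted above (the helper name is mine, not anything from Nuke):

```python
# "Normalize to 18% grey": divide each channel by the sampled grey-chart
# value and multiply by 0.18 (equivalent to the Nuke gain of 0.18/sample).
chart = (0.201, 0.236, 0.266)          # sampled 18% grey chart, RGB
gains = tuple(0.18 / c for c in chart)

def normalize(pixel, gains=gains):
    """Apply the per-channel gains; rounded for readable output."""
    return tuple(round(p * g, 4) for p, g in zip(pixel, gains))

# The chart itself now reads exactly 18% grey on every channel.
print(normalize(chart))   # → (0.18, 0.18, 0.18)
```

Any other pixel sampled from the normalized image can then be compared directly against UE4 base color values.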

This allows us to sample objects in the scene now to get an average color:

7f51faa33dc420e7d23bd3cc1335c51bfa851b14.jpeg

Notice that this rock was around 0.16. Some rocks are brighter, some are darker. This is a fairly lichen-covered rock, and many of our assets in the Kite demo tended to be this type of rock, certainly the darkish ones.

You can then check your assets in the editor by using "high resolution screenshot" and exporting EXR as well. If you make a 0.18 grey emissive, unlit material, you can use that to verify the right values are being measured and to adjust your other textures accordingly.

Note that when viewing brightness in photos like the above, you also pick up specular. For a fully rough object, that adds around an extra 0.028, which means you really need to subtract 0.028 from the sampled values to get the "base color" values. Specular in UE4 will add back the missing fraction. So a rock that measures 0.16 should have a base color of around 0.132, which is much darker than the 18% grey color.
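A tiny sketch of that subtraction, using the 0.028 rough-specular offset and the 0.16 rock sample from the post (the function name is illustrative):

```python
# Remove the rough-specular contribution (~0.028 for a fully rough
# surface, per the post) from a photo-sampled brightness to recover
# the base color value to author into the texture.
SPEC_OFFSET = 0.028

def measured_to_basecolor(measured):
    """Photo-measured albedo minus the specular contribution."""
    return round(measured - SPEC_OFFSET, 3)

print(measured_to_basecolor(0.16))   # → 0.132, the rock example above
print(measured_to_basecolor(0.18))   # an 18% grey card
```

UE4's specular term then adds that missing fraction back at render time, so the lit result matches the photo again.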

It is also possible to do the above steps in Photoshop, but Nuke is much nicer for this type of thing. Nuke lets you work in linear color while "seeing" sRGB color. I haven't yet figured out if Photoshop can do that. Color space in various applications gets confusing :slight_smile:

Thanks again for making it easy to understand. It's been a while since I used Nuke. I was the only one that used Nuke instead of Composite in our compositing classes. Good thing it is free for non-commercial use now!

So from my understanding, if you match that grey value to 0.18, the rest of the colors would be pretty much OK?

As long as your other materials look correct in relation to the 0.18 grey object in your scene. But we also adjust it so the 0.18 ball reads exactly as 0.18, so that relative brightness can be measured. Then you can compare your textures to actual parts of a photograph, like how I checked the average rock brightness above. Or you can see if your grass is reasonable (but for grass you really should look at the final "lit" color, since things like subsurface will add to the final color).

Yeah, foliage has been tricky, but I'll dissect the material for the grass you used in the Kite demo; it's reasonably understandable.

How did you do specular and roughness maps? I've read several PBR tutorials, one of which uses this chart as a reference for the median values (too bad the image links are broken now). I've usually just eyeballed it to get a general feeling for what kind of reflections each type of surface might have.

http://www.marmoset.co/wp-content/uploads/materialref02.png

Here are a few screens with random CGTextures.com textures. I used the simple levels set to 0.4545, no correction except that, and it instantly looks way better than before! Maybe slightly darkish, though.

Thank you for the useful information! There's just one small part I didn't understand:
when looking at a white surface with a shadow cast on it, what exactly do we need to convert to linear space?

@fanzypantz, Sorry for the broken images. I will create another material later and upload images of the process again and let you know.

I recommend Puush for any kind of screenshots you will ever need to make. Handy little app

Nuke always works in the linear space of the image. It then outputs to the screen what the screen supports, which on most screens is sRGB.

Your Nuke greyscale workflow is a rather expensive profiling tool. If you are using Lightroom to correct your RAWs (and you probably should), use Adobe's free DNG Profiler to calibrate the image using all of the colors of that Macbeth chart, so the image is as close to the real scene as possible.

That link explains how to use it.

It should help fix some of those problems. Lightroom always works in linear space.

To get “proper” sun and sky values, try getting some RMY (representative meteorological year) data and reading the lux values for direct (sun) and indirect (sky) levels. I’m not sure what light units UE uses, but you certainly wouldn’t use the raw lux values (in the thousands). Can a dev shed some uh …light on this? How does UE scale physical light values?

PS: if you want computational models for skies, look up Preetham, Perez and Hosek.

Thanks for all the info, Ryan; I would love to hear even more. Fanzypantz, as for the de-lighting process, there is quite decent software called Bitmap2Material. It is not very accurate, of course, but it is a quick and easy solution.

Great post, thanks.

One thing to add: if you're going through all the effort to get correct colour values, make sure you have a calibrated monitor, otherwise it won't look right on your screen or transfer well to other people's screens. I do photography and have a good Dell IPS monitor which still needed calibrating so that what I saw matched other people's screens. Mobiles and most cheaper monitors really boost saturation and contrast. Another factor is the ambient light in the room your monitor is in: daylight-balanced bulbs are recommended, and any natural light from windows etc. can really change the colours on screen. Depending on the time of day and the weather, a screen can pick up yellow colour casts, for example if it's sunny. Eye fatigue can also be a factor: after an hour or so most people start to over-saturate colours, because the eye adapts to the room and becomes blind to how saturated the images are. Basically, it's a nightmare to get accurate colours on multiple screens.

There is also the creative aspect: correct white balance and colours often look dull compared to artistic colour toning. Most people like warm, bright images, so it's a creative balance between what's accurate and what looks good… but as stated, if your monitor isn't close to correct as a starting point, any tweaks you make send the colours into crazy town.

"De-lighting process"… not sure if it's right or not, but I made a copy of the photo master material provided in the Kite demo pack, then created material instances from it and added my own diffuse and normal maps. If you're taking the photos yourself, you can use non-direct flash to give shadowless lighting.

I did some messing around in the shaders to make a trivial test case and crunched the numbers against the analytical formulas.

It appears that directional light “intensity” is in units of Lux times 50. Sky Lights are in units of Lux times 50pi.

Point and spot lights have some additional arbitrary scalars that have nothing to do with physical units: the color gets multiplied by 16 on its way to the shader, and the luminous intensity conversion factor of 1/4π is not applied. But it balances out with a simple multiplier of 0.02 on the directional light; use that, and a directional light will cast the same amount of light on a surface as an equivalent point light with physical units.

It should also be noted, however, that this scaling factor would change non-linearly with your world scale, as it assumes you are using centimeters; otherwise it will not match your inverse-square falloff.

Getting physical units out of the backbuffer for realistic virtual camera rendering (or calibrated HDR displays) would be another kettle of fish, as again the units are still arbitrarily scaled on the output.

That's very interesting; I've been wondering what the units are for the directional light.

Any idea what the units are for an emissive texture?

Visualizing the HDR Eye Adaptation shows what seems like the raw HDR value (i.e. before eye adaptation/tonemapping) for the center pixel of the window.
A pure white emissive material, color [1,1,1], produces an L value of 1.000.

Testing with a pure diffuse material (0 roughness, 0 specular, [1,1,1] color), the brightest pixel on a perpendicular surface has a value of 1.000 with…
A white point light with inverse square falloff at 0.4i (intensity) and 1uu away, or 20i and 10uu away, or 80i and 20uu away.
A white spot light with inverse square falloff at 0.396i and 1uu away, or 19.85i and 10uu away, or 79i and 20uu away.
A white directional light at 3.142i.
Skylight having an intensity of 1 and an emissive skysphere of color [1,1,1].
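Those measurements line up with a plain Lambertian diffuse term, where outgoing radiance = illuminance × albedo / π. A quick sanity check on the directional and skylight figures above (the Lambert assumption is my inference from these numbers, not something confirmed in the thread):

```python
import math

# Sanity check assuming a Lambertian diffuse BRDF:
#   outgoing radiance = illuminance * albedo * cos(angle) / pi
# A directional light of intensity ~pi hitting a white (albedo 1)
# perpendicular surface should then read 1.0 in the HDR visualizer,
# matching the measured 3.142 figure.
def lambert_radiance(illuminance, albedo=1.0, cos_angle=1.0):
    return illuminance * albedo * cos_angle / math.pi

print(round(lambert_radiance(3.142), 3))    # the measured directional case

# The skylight case is consistent too: a uniform sky of radiance 1
# delivers illuminance pi, so the surface again reads 1.0.
print(round(lambert_radiance(math.pi), 3))
```

Both cases come out at 1.0, which is why intensity 3.142 on the directional light and intensity 1 on the skylight with a [1,1,1] skysphere produce the same surface brightness.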

Candelas are probably more appropriate units than lumens.

I’m thinking the key is to start with an HDRI cubemap and adjust all your lights according to that, rather than trying to use atmospheric_fog or BP_Sky_Sphere or a supposed point light’s value in “lumens”.

Physical units for Dir and Sky light are on UE4 Trello. If they don’t get backlogged we’ll finally get some proper light on our PBR materials.

The HDR visualizer is what I used to derive that scaling factor (and I modified the lighting shader to remove everything except light color and distance attenuation). Comparing the values generated by a directional and sky light with a point light that should produce the same illuminance on that surface according to the formulas.

Getting correct units from HDRI outdoors is extremely difficult, requiring lots of filters and careful calibration, because the brightness of the sun will still be clipped if you just try turning down the exposure. Measuring the illuminance of the sun or sky in lux is comparatively easy with a cheap light meter, and you can find plenty of tables online giving you illuminance at different times of day, weather conditions, etc. Luminous flux (lumens) values for different artificial light sources are also readily available from many sources, and lumens are the correct unit for small light sources, independent of surface area. These two things are critical if you want to realistically light an indoor environment with both natural and artificial lighting and get the correct balance of brightness.

Anyone care to translate some of those values into values we can use with the skylight and directional light until they update them? I'm still trying to get decent-looking results with auto exposure on, so that shadows are properly dark in a forest and the sunlight spots shining through are nice and bright. It's easy to do this with exaggerated values (32 directional light + 0.75 skylight), but it doesn't feel right if you go into a sunny clearing: the sun is way too bright and only looks "right" in the dark areas of the forest with a few sun spots here and there.

For the sun, take the measured lux value and divide by 50 (assuming you are using the 100 unreal units = 1 meter scale); that gives you the equivalent directional light intensity for UE4.
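Assuming that divide-by-50 factor holds, converting published lux tables into directional light intensities is one line of arithmetic. The lux figures below are common reference values, not engine-verified numbers:

```python
# Convert measured/tabulated lux values to UE4 directional light
# intensity using the divide-by-50 factor derived earlier in the
# thread (assumes the default 1 uu = 1 cm world scale).
LUX_PER_INTENSITY = 50.0

def lux_to_ue4(lux):
    return lux / LUX_PER_INTENSITY

# Typical reference illuminance values (rough, from common lux tables).
for name, lux in [("direct noon sun", 100_000),
                  ("overcast sky", 10_000),
                  ("office lighting", 400)]:
    print(f"{name}: {lux} lux -> intensity {lux_to_ue4(lux):g}")
```

A noon sun lands at intensity 2000, which is exactly the "super bright" range that forces you to retune the eye-adaptation histogram as described below.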

This is indeed super bright, and for eye adaptation you will need to adjust the histogram min and max to get your values into a range that the exposure is tracking. The HDR histogram visualizer is helpful here; if you have the right settings you should see a luminance "hump" that is not clipped on either side. I also find that an exposure bias of -1 looks best in most situations. From there it is a matter of setting reasonable min and max brightness ranges that preserve contrast at the extremes.