Localized-IBL implementation

I was thinking about adding another CVar to enable/disable it on translucency in a global way, but a more granular setup would also make sense, yeah.
The issue with what you suggest is that it's per-shader on the receiving actor and not in the ReflectionCapture itself. I would need to overburden the reflection capture struct just for this, which makes everything less clean (you'd be paying the cost even if you opt out of LocalIBL).

A cleaner alternative would be a bool in the translucency section of the material that allows opting into the IBL, but there are more factors to this.
Right now, since for translucency it's done in the base pass, the only place I've found to modify the compilation environment (i.e. to inject shader defines) is at material compilation time. As a result, turning the IBL on and off currently only gets re-applied to translucent shaders when recompiling them (while for every other shader it's instant, because the movable skylight updates cause the CVar to propagate in the next frame).
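For reference, the general pattern looks roughly like this (a hedged sketch only: the CVar name, define name, and hook location are illustrative, not necessarily what the patch actually uses):

```cpp
#include "HAL/IConsoleManager.h"
// FShaderCompilerEnvironment comes from the RenderCore shader headers.

// Hypothetical CVar; its value is baked into translucent material shaders
// as a compile-time define rather than being read every frame.
static TAutoConsoleVariable<int32> CVarLocalIBLTranslucency(
    TEXT("r.LocalIBL.Translucency"),
    1,
    TEXT("Enable localized IBL on translucent materials (takes effect on shader recompile)."));

// Called from wherever the material's compilation environment is assembled
// (e.g. the material environment setup in the engine source).
void SetupLocalIBLDefines(FShaderCompilerEnvironment& OutEnvironment, bool bIsTranslucent)
{
    if (bIsTranslucent)
    {
        // Because this is consumed at compile time, toggling the CVar only
        // takes effect once the translucent materials are recompiled.
        const bool bEnable = CVarLocalIBLTranslucency.GetValueOnAnyThread() != 0;
        OutEnvironment.SetDefine(TEXT("TRANSLUCENT_LOCAL_IBL"), bEnable ? 1 : 0);
    }
}
```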

Also, so far I've tried to avoid extra LocalIBL-only editor functionality, mostly because I'm not a fan of editor-exposed properties that require an obscure CVar to be enabled (LPV, I'm looking at you).
If I knew for sure the PR would get accepted and this would become a fully supported feature (exposed in the Project Settings and all), I wouldn't mind extending the editor features for it. But the more 'integrated' I make it, the more Epic will have to support and maintain, which I think means the less chance there is for the PR to even make it in (since Epic doesn't seem interested in IBL).

Dropping by to mention that this is good work so far, and in my view UE4 desperately needs localized IBL more than anything else.
If the solution were expanded to pre-capture several cubemaps into an array and blend between the two nearest slices in real time, the whole thing would be priceless.

From my understanding, ARK: Survival Evolved uses a similar technique. They showed it on Twitch if I remember correctly.

thanks :slight_smile:
reflection capture cubemaps already are an array by itself under the hood. I guess what you mean is multiple capture scenarios per capture? if I’m not mistaken are you hinting at multiple time-of-day captures blending? if so that’s kinda already possible with my system, it just involves duplicating the captures for each desired time of day scenario and capturing them with a specific light setup, and then interpolating their Contribution (new feature I added) at runtime. it however means having x*n captures where n is the amount of time-of-day captures, and the system still only allows 341 captures.
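Conceptually, the runtime side of that is just a crossfade of the Contribution weights, something like the sketch below (the property name is assumed; Contribution is added by this patch and does not exist in the stock engine):

```cpp
#include "Engine/ReflectionCapture.h"
#include "Components/ReflectionCaptureComponent.h"

// Cross-fade two probes captured for different times of day by blending the
// (patch-added, hypothetically named) Contribution value on their components.
void CrossfadeTimeOfDayCaptures(AReflectionCapture* DayCapture,
                                AReflectionCapture* NightCapture,
                                float NightAlpha /* 0 = full day, 1 = full night */)
{
    // Each time-of-day scenario has its own duplicated probe; only the
    // contribution weights change at runtime, not the captured cubemaps.
    DayCapture->GetCaptureComponent()->Contribution = 1.0f - NightAlpha;   // assumed property
    NightCapture->GetCaptureComponent()->Contribution = NightAlpha;        // assumed property
}
```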

The part about blending the two nearest slices I really don't get, though. Is it still the same thing as the Contribution feature or something else? Please clarify; maybe this brings some new feature ideas to the table :slight_smile:

WOW. That is awesome. You are creating something priceless here.

And this is what I meant by the Epic Twitch stream: Twitch, at the 46:20 mark.

One thing is interesting: if this is the same technique the ARK developers use, how did they fit an open world into the game with this type of blending and only 340 reflection captures?

Yeah, I had a look at the Ark dev kit a couple of weeks back to see it in action. I don't know exactly how they blend between probes for time of day, so I can only guess that they really just interpolate between capture data.
What I do know is that they use a more elaborate 'master' and 'child' sort of system, which enables them to re-use the capture data of one ReflectionCapture in other ReflectionCapture actors. Coupled with the fact that they do some generic captures (i.e. their forest setup is just a cubemap rendered from their own forest), this allows them to cover a lot of areas with IBL. It's much more generic than using a dedicated capture for each area, but it does the trick just fine if the areas themselves are generic (i.e. as long as you don't have shiny light-emitting crystals in your forest, you can use the generic forest capture in it and it will have… forest lighting).
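Purely as an illustration of that master/child idea (this is guesswork about their setup, not Ark's actual code nor part of this patch), the data layout could be as simple as child probes pointing at their master's cubemap slice:

```cpp
#include "Math/Vector.h"

// Hypothetical probe entry: children keep their own placement and influence
// but reference the cubemap slice captured by their 'master', so one generic
// capture (e.g. a forest) can be reused across many areas.
struct FProbeEntry
{
    FVector Position;           // world-space placement of this probe
    float   InfluenceRadius;    // area the probe affects
    int32   CubemapArrayIndex;  // slice in the shared cubemap array;
                                // child probes reuse their master's index
};
```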

So, the system is done.
I moved the code around to avoid copy-pasted shader code (which led me to some weirdness with shader defines), and now translucency works properly with LocalizedIBL (I tested all translucency shading modes, IIRC).
The only caveat is that translucent materials need to be manually recompiled after enabling the feature.

New before/after:

https://i.imgur.com/De05Z3F.gif

So for now this is it. I'll hold off on implementing any new improvements for now; I just want to wait for the pull request to be looked into by Epic instead of complicating things further :slight_smile:

Awesome work, crossing my fingers Epic is interested in implementing this!

Man this seriously looks so good! It makes me sad there isn’t more feedback on it.

If you store sky visibility in the alpha channel you can blend indoor and outdoor probes a lot more smoothly. This also allows semi-indoor areas.

Semi-indoor areas are already possible, as visible here: https://forums.unrealengine.com/deve…87#post1494687

However, things can always improve, so I'm curious what you mean. Assuming the alpha channel is free to use, can you elaborate on how sky visibility could help with the blending?

When rendering the cubemaps, store binary sky visibility in the alpha channel. Alternatively, it can be calculated by checking whether the depth value is greater than a threshold (say 10000 meters). This visibility is then downsampled for the lower mips, so it's no longer binary but a percentage of how much that direction sees the sky. When you calculate the local IBL you just use that alpha to blend how much local IBL versus sky lighting is used. DICE's engine does exactly this.
Moving Frostbite to Physically Based Rendering | PPT page 79.
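A minimal sketch of that blend (names and the threshold are assumed, not actual engine or Frostbite code; the slides above describe the real implementation):

```cpp
#include "Math/Color.h"
#include "Math/UnrealMathUtility.h"

// At capture time: mark a cubemap texel as 'sky' when its depth exceeds a far
// threshold. Mip downsampling then turns this binary mask into a fractional
// "how much of the sky does this direction see" value.
float ComputeSkyVisibility(float SceneDepth, float SkyDepthThreshold = 1000000.0f /* 10000 m in cm */)
{
    return SceneDepth > SkyDepthThreshold ? 1.0f : 0.0f;
}

// At shading time: use the downsampled visibility stored in alpha to decide
// how much of the (possibly dynamic) skylight replaces the local capture.
FLinearColor BlendLocalIBLWithSky(const FLinearColor& LocalIBL,
                                  const FLinearColor& SkyLight,
                                  float SkyVisibility)
{
    return FMath::Lerp(LocalIBL, SkyLight, SkyVisibility);
}
```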

In areas where the sky is partly visible, that would be apparent in the local IBL anyway, right? So what would be the advantage of lerping to just the skylight?

I'll have to double-check, but IIRC Unreal already has a sky threshold for probes and uses it to composite the skylight's reflection into local reflections.

But yeah, as said above, the local IBL capture will already have the distant sky captured in the probe and will project it as local light. So if I imagine it correctly, in a completely static scenario with a blue sky I'd be substituting the blue-sky pixels of the local capture with the pretty-much-identical blue-sky pixels of the skylight.

the benefit I see from this is that since the skylight can change dynamically via blended cubemaps, in a time-of-day scenario I’d have dynamically-changing skylight colors coming through into the local IBL rather than just the statically-captured probe’s visible sky.
I think it's worth doing, and since Unreal already does it for reflections it's probably trivial to do the same here. I'll look into it when I have some time to dedicate to this again :slight_smile:

Parallax correction is also different for the local IBL and the skylight. When a pixel is near the opening, the sky should be more visible there, not just those captured sky pixels stretched.

True, but you only get the parallax correction of the sky where the nearby pixel mask lets the sky in, so I don't get how the sky would be more visible.
In any case, remember this uses the highest mip of the cubemap, so in practice I doubt it will make a difference whether the sky gets better parallax correction.

Incredible work, the test images are convincing enough for me. What version of Unreal did you use to implement this? I would like to run some tests with some levels I have :slight_smile:

thanks :slight_smile:
The results can be convincing, but they will never be as good as static lighting. It can also require a lot of fine-tuning at times, but the benefits (low memory consumption, a bit of dynamism, super quick build times) make it a very valid option IMO.

The code works on 4.20, and I will try to branch it out and adapt it when 4.21 comes, and so on.

I have another test scene in the works as a showcase of the system, but I haven't had time to finish it. I'll report back when I do.

Looking forward to seeing the new test scene!

Okay, here's a new test scene: the classic Sponza Atrium.
This one uses just 9 placed probes, so the capture time is really quick (just 3 seconds).
With this scene I realized that the more granular I go, the harder it is to get light to spread around, so it's tricky to find a balance between "light bounce distance" and "how local" the IBL is.

Does it work in VR?