Where did you get this map? It's shown in many Epic Twitch livestreams but was never released on the Marketplace.
This is a test map that is used for Fortnite.
That’s from their upcoming Fortnite game
Time to install client again, thanks for posting!
The first version I made was placing Virtual Point Lights by ray tracing through the distance field scene. The result was unstable placement and no diffuse color, but a very small number of resulting VPLs to do lighting with, so it was fast: ~6ms in a Fortnite level on an AMD 6870.
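For anyone curious what "ray tracing through the distance field scene" looks like in practice, here's a minimal sphere-tracing sketch in plain C++ (not Epic's code; `SampleSceneDistance`, the hit threshold, and the step limit are just stand-ins): you step along the ray by the distance to the nearest surface and drop a VPL where you land.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  Add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  Scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float Length(Vec3 a)         { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// A virtual point light placed where a light ray hits a surface.
struct VPL { Vec3 Position; Vec3 Normal; float Flux; };

// Stand-in for sampling the global signed distance field of the scene.
// Here: a single sphere of radius 100 at the origin.
static float SampleSceneDistance(Vec3 P)
{
    return Length(P) - 100.0f;
}

// Sphere tracing: step along the ray by the distance to the nearest surface
// until we are close enough to call it a hit, then place a VPL there.
static bool TraceAndPlaceVPL(Vec3 Origin, Vec3 Dir, float Flux, std::vector<VPL>& OutVPLs)
{
    float T = 0.0f;
    for (int Step = 0; Step < 64; ++Step)
    {
        Vec3 P = Add(Origin, Scale(Dir, T));
        float Dist = SampleSceneDistance(P);
        if (Dist < 1.0f) // hit threshold in world units
        {
            // Approximate the surface normal with the distance field gradient (central differences).
            const float E = 0.5f;
            Vec3 N = {
                SampleSceneDistance({P.x + E, P.y, P.z}) - SampleSceneDistance({P.x - E, P.y, P.z}),
                SampleSceneDistance({P.x, P.y + E, P.z}) - SampleSceneDistance({P.x, P.y - E, P.z}),
                SampleSceneDistance({P.x, P.y, P.z + E}) - SampleSceneDistance({P.x, P.y, P.z - E})
            };
            float Len = Length(N);
            if (Len > 0.0f) N = Scale(N, 1.0f / Len);
            OutVPLs.push_back({P, N, Flux});
            return true;
        }
        T += Dist;
        if (T > 10000.0f) break; // ray left the scene
    }
    return false;
}
```

The appeal of sphere tracing is that the distance field itself tells you how far you can safely step, so rays converge in a small number of iterations, which is what kept the VPL count (and cost) so low in that first version.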
This version is working toward something more general purpose. Surfels (oriented disks) are generated to represent meshes, including diffuse color. They are treated as VPLs in order to transfer their lighting to pixels on the screen. This version is much slower because the number of VPLs needed is based on the surface area of the scene, instead of the projected area of the light. However it sets the framework for diffuse GI from multiple light sources and even multi-bounce GI, whereas the previous version could only do single bounce. This one is ~17ms in a Fortnite level on AMD 6870. The main optimization missing is creating a hierarchy of Surfels, and only lighting from the top of the hierarchy if that Surfel is far away.
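To make the surfel-as-VPL idea concrete, here's a rough single-bounce gather loop in plain C++. This is only a sketch under my own assumptions about what a surfel stores (position, normal, disk radius, albedo, and the direct lighting it received); visibility testing and the surfel hierarchy optimization are left out.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  Sub(Vec3 a, Vec3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b)     { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Scale(Vec3 a, float s)  { return {a.x * s, a.y * s, a.z * s}; }

// An oriented disk covering a patch of a mesh, carrying its diffuse color
// and the direct lighting it received this frame.
struct Surfel
{
    Vec3  Position;
    Vec3  Normal;
    float Radius;        // disk radius
    Vec3  Albedo;        // diffuse color of the underlying material
    Vec3  DirectLight;   // direct lighting gathered at the surfel
};

// Treat each surfel as a VPL and accumulate one bounce of diffuse GI at a
// receiver point P with normal N. No visibility term, no hierarchy.
static Vec3 GatherOneBounce(Vec3 P, Vec3 N, const std::vector<Surfel>& Surfels)
{
    Vec3 Indirect = {0, 0, 0};
    for (const Surfel& S : Surfels)
    {
        Vec3  ToReceiver = Sub(P, S.Position);
        float DistSq     = std::max(Dot(ToReceiver, ToReceiver), S.Radius * S.Radius);
        float Dist       = std::sqrt(DistSq);
        Vec3  Dir        = Scale(ToReceiver, 1.0f / Dist);

        // Cosine terms at the emitting disk and at the receiver.
        float CosEmitter  = std::max(Dot(S.Normal, Dir), 0.0f);
        float CosReceiver = std::max(-Dot(N, Dir), 0.0f);

        // Light leaving the surfel = direct light * albedo (Lambertian),
        // weighted by the disk area over squared distance (a rough form-factor
        // approximation with the distance clamped to the disk radius).
        float Area   = 3.14159265f * S.Radius * S.Radius;
        float Weight = CosEmitter * CosReceiver * Area / DistSq;
        Indirect = {Indirect.x + S.DirectLight.x * S.Albedo.x * Weight,
                    Indirect.y + S.DirectLight.y * S.Albedo.y * Weight,
                    Indirect.z + S.DirectLight.z * S.Albedo.z * Weight};
    }
    return Indirect;
}
```

With the hierarchy optimization mentioned above, distant surfels would be merged and only a representative near the top of the hierarchy would be evaluated, instead of looping over every individual surfel like this sketch does.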
There are still a lot of problems to solve - improving bounce distance, over-occlusion indoors, some leaking, point and spot lights, overall performance, etc.
17ms is pretty good considering it's running on a 5 year old card (AMD 6870) and it's not even optimized yet! Keep up the great work!
Could someone post some results of this new implementation?
Wow man, looking good so far!
I tried it on the SunTemple demo. I don't know about performance, since I don't know which stat group to look at for GI rendering time, but
what I really like is that this technique also considers indirect shadowing from the directional light, which is not a given in most other attempts at GI.
Also, it doesn't seem to affect metallic surfaces right now. In any case, great work! It is already far better than any volume technique (cough LPV, cough). I could literally light the entire SunTemple demo with the skylight intensity set to 0.
Impressive results, I have to say! If that is only the start (as you said), it's amazing - the quality has really improved.
I wish you good luck developing it and thank you for your efforts!
Kind Regards.
Before/After Pics or…
Nice update! Looking forward to seeing more!
Thanks for the screenshot! I agree they do look better than LPV.
Glad to see someone trying it out =) So far it has just been me in a test level, so it's kind of surprising to see it work with arbitrary content. One warning though - there is a known crash that will happen if you move meshes around enough with this on.
Support for material changes is in the design, I just haven’t gotten to it yet.
Instanced meshes are fully supported by all distance field lighting methods (AO, GI, shadows) in the latest version. For foliage you have to go into the instance settings and enable bAffectDistanceFieldLighting. It's off by default because stuff like grass and bushes with high instance counts can really destroy performance.
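If you'd rather flip that flag from code than per-instance in the editor, something along these lines should work. This is a sketch only: bAffectDistanceFieldLighting lives on UPrimitiveComponent, and MarkRenderStateDirty re-creates the render state so the change takes effect; the helper function itself is hypothetical.

```cpp
#include "Components/InstancedStaticMeshComponent.h"
#include "GameFramework/Actor.h"

// Hypothetical helper: enable distance field lighting on every instanced
// static mesh component of an actor, instead of toggling it in the editor.
void EnableDistanceFieldLightingOnInstances(AActor* Actor)
{
    TArray<UInstancedStaticMeshComponent*> Components;
    Actor->GetComponents<UInstancedStaticMeshComponent>(Components);

    for (UInstancedStaticMeshComponent* Component : Components)
    {
        // Off by default because high instance counts (grass, bushes) are expensive.
        Component->bAffectDistanceFieldLighting = true;
        Component->MarkRenderStateDirty(); // re-register with the renderer so the flag is picked up
    }
}
```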
I am working on landscape GI at this very moment =) My goal is to combine surfel GI with heightfield GI to get a dynamic GI solution for outdoors. It has to be a separate implementation because surfels don’t scale up to huge surface area that well, and terrains need to be represented by heightfields instead of distance fields, so a different ray tracing kernel is needed.
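For reference, the reason terrain needs its own kernel: a heightfield only gives you the surface height at an XY location, not a distance to the nearest surface, so you can't take the conservative steps that sphere tracing a distance field allows. A minimal heightfield ray march looks something like this (plain C++ sketch; `GetTerrainHeight` is a stand-in for sampling the landscape heightfield):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Stand-in for sampling the landscape heightfield at an XY position.
static float GetTerrainHeight(float X, float Y)
{
    return 10.0f * std::sin(X * 0.01f) * std::cos(Y * 0.01f);
}

// Heightfield ray marching: walk the ray in fixed steps and report a hit as
// soon as the ray drops below the terrain surface. Unlike sphere tracing a
// distance field, there is no conservative step size, so the step length is
// a quality/performance trade-off.
static bool TraceHeightfield(Vec3 Origin, Vec3 Dir, float MaxDistance, Vec3& OutHit)
{
    const float StepSize = 25.0f; // world units; smaller = more accurate, slower
    for (float T = StepSize; T < MaxDistance; T += StepSize)
    {
        Vec3 P = {Origin.x + Dir.x * T, Origin.y + Dir.y * T, Origin.z + Dir.z * T};
        if (P.z < GetTerrainHeight(P.x, P.y))
        {
            OutHit = P; // could be refined with a binary search between the last two samples
            return true;
        }
    }
    return false;
}
```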
One note - you can actually do non-uniform scaling (squishing) as long as you only squish by, say, a factor of 2 difference between dimensions. I was a bit over-strict when I wrote that; in practice the lighting looks fine with limited non-uniform scaling.
Compared to LPV here’s how I see Distance Field GI:
- Vastly higher quality indirect shadowing - leaking is actually solvable
- More detail in color bleeding - material is evaluated per surfel (disk) instead of for a huge voxel
- Much higher view distance, not limited by volume texture scaling
- Only single bounce for now (can be improved)
- Have to be able to represent your scene with distance fields / heightfields
Yeah, I haven’t done anything for indirect specular yet, and metal is fully specular. I will keep thinking about this. The problem is that one of the expensive steps (irradiance cache interpolation) is already heavy with just interpolating a float3 Irradiance; if I try to store directionality, which is needed for specular, it will get a lot slower. I'll probably have to bite the bullet on that anyway.
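To put a rough number on why directionality makes the irradiance cache heavier, here's a back-of-the-envelope sketch (my own illustration, not the engine's actual data layout): a diffuse-only record interpolates a single float3, while a directional representation such as 2nd-order spherical harmonics blends 9 coefficients per color channel, so every cache interpolation touches several times more data.

```cpp
#include <cstddef>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Diffuse-only record: one irradiance value per cache point.
struct IrradianceRecord
{
    Vec3 Position;
    Vec3 Irradiance;
};

// Directional record (illustrative): 2nd-order spherical harmonics, i.e. 9
// coefficients per color channel. Interpolation has to blend every
// coefficient instead of a single float3.
struct DirectionalRecord
{
    Vec3 Position;
    Vec3 SHCoefficients[9];
};

int main()
{
    std::printf("diffuse record:     %zu bytes\n", sizeof(IrradianceRecord));
    std::printf("directional record: %zu bytes\n", sizeof(DirectionalRecord));
    return 0;
}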
I haven’t dabbled in distance fields / heightfields yet, so could someone enlighten me on what it means for a level to be representable with distance fields / heightfields?
Hi DamirH,
You can take a look at the Distance Field documentation here:
Ray Traced Soft Shadows: Distance Field Soft Shadows in Unreal Engine | Unreal Engine 5.2 Documentation
Ambient Occlusion:
Once you’ve enabled this feature via Project Settings > Rendering > Generate Distance Fields and restarted the editor, you can enable your Movable Skylight and dynamic lights to cast shadows independent of cascaded shadow maps.
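For what it's worth, that project setting maps to the r.GenerateMeshDistanceFields console variable, so you can also switch it on directly in DefaultEngine.ini (a sketch; the section name is the usual renderer settings section):

```ini
; DefaultEngine.ini - same switch as Project Settings > Rendering > Generate Distance Fields
[/Script/Engine.RendererSettings]
r.GenerateMeshDistanceFields=True
```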
If you take a look at this image you will see what distance fields are: they are volumetric representations of your mesh. If you see any quality issues or anything that does not look correct with the shadows, you can adjust the Distance Field Resolution Scale by opening the Static Mesh Editor > Build Settings.
Distance fields are a feature that's continually being built on, and they're a great solution for shadow quality and distance compared to what cascaded shadow maps can provide.
If you’re confused or need help setting up or getting particular results, feel free to post in the Rendering section of the Forums or AnswerHub and I’ll gladly help you set up anything you’re having issues with.
Just want to say how great this thread is! GI is surely one of the most important features UE4 needs in the coming years.
Would be great if someone posted a before/after of a fully fledged demo scene like the SunTemple demo talked about above.
Especially the very end of the hall with all the flame lights (supposing it works with point lights), or else the daylight areas.
Would do it myself but i don’t have 4.7 atm.
Again, amazing work - it seems like we could end up having 4 fully fledged GI solutions by summer.
This looks wonderful.
I think a bit of noise and some imperfections are completely OK (more so if this can get better in the coming years thanks to more hardware performance). Not so long ago, software renderers in 3D apps were struggling with AO/GI flickering during animations, so it feels like a game engine should have the privilege of not being perfect.
What I wanted to ask is how these techniques will work with characters (deformable meshes). The docs mention a few times that the limitation is that it only works with static meshes. Can the DF GI light the characters (without the characters “lighting” the environment)? Are there some options, workarounds, or complementary techniques planned that could be used on deformables while the DFGI works on static geometry?
UE4 already does something like that for skeletal meshes.
But it’s possible you could use primitive shapes aligned with the bones that could be used with the distance fields. Or if you’re using a rigidly animated object (like a robot), then you could still use distance fields; it would just have to add support for that.