WHAT
On our second week of in-depth UE5 presentations, we're taking a look at Lumen, UE5's Global Illumination and Reflections system. We'll walk you through what it is, how to enable it, and what it provides. Plus, a high-level overview of how it works and a look at content from the community!
I've been leaving my computer on all night to bake lightmaps since circa 1999.
Ever since, I've heard the legend that one day the heavens would open and four archangels would show us a miracle: no more rituals, no more arcane words like "Build Lighting".
Contemplate the miracle, because it is close (or so I hope).
Was something changed about the global distance field to accommodate Lumen? It seems to be much higher quality in UE5 than in UE4, or maybe it's just that the visualizer is different?
Are there any plans to support planar reflections with Lumen?
Why does the documentation state that light functions in Lumen are only supported for directional lights? I've tested it with spot/point/rect lights and they seem to work fine; what am I missing here?
Any chance of a lightmap/lumen hybrid approach for lighting in the future?
Please cover some VR tips for getting Lumen working on a VR headset. Lumen just disappears in the headset but works fine in viewports; playing the same camera on the desktop screen produces Lumen, but swapping back to the VR display shows that the global illumination is gone.
Is it possible to generate mesh distance fields at runtime for Lumen? I'm working on a voxel game, so generating the distance fields directly from voxel data should be possible. But how could I do that in practice? Do I need to convert my procedural meshes to static meshes?
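For illustration only: the "distance fields directly from voxel data" step is simple in isolation, even though Lumen's internal mesh distance field format isn't public API and there is currently no documented way to feed it runtime-generated fields. A minimal brute-force sketch (all names hypothetical; real implementations would use fast sweeping or jump flooding instead of an O(n²) scan) might look like:

```cpp
// Hypothetical sketch: brute-force signed distance field from a voxel
// occupancy grid. This is generic geometry code, not Unreal API; Lumen's
// actual mesh distance field representation is an engine internal.
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

// For every cell, store the distance to the nearest cell of the opposite
// occupancy state; negative inside solid voxels (the usual SDF convention).
std::vector<float> VoxelsToSDF(const std::vector<bool>& solid,
                               int nx, int ny, int nz)
{
    auto idx = [&](int x, int y, int z) { return (z * ny + y) * nx + x; };
    std::vector<float> sdf(solid.size());
    for (int z = 0; z < nz; ++z)
    for (int y = 0; y < ny; ++y)
    for (int x = 0; x < nx; ++x)
    {
        const bool inside = solid[idx(x, y, z)];
        // Stays at max() if the grid is uniformly solid or uniformly empty.
        float best = std::numeric_limits<float>::max();
        for (int zz = 0; zz < nz; ++zz)
        for (int yy = 0; yy < ny; ++yy)
        for (int xx = 0; xx < nx; ++xx)
        {
            if (solid[idx(xx, yy, zz)] != inside)
            {
                const float dx = float(xx - x);
                const float dy = float(yy - y);
                const float dz = float(zz - z);
                best = std::min(best, std::sqrt(dx * dx + dy * dy + dz * dz));
            }
        }
        sdf[idx(x, y, z)] = inside ? -best : best;
    }
    return sdf;
}
```

For example, with a single solid voxel at the center of a 3×3×3 grid, the center cell gets −1 (the nearest empty cell is one voxel away) and the corner cells get √3.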
[QUESTION] In BP, will we be able to request from an actor how much light is being received by it through Lumen (e.g. for stealth mechanics akin to Thief, SplinterCell etc.)
[QUESTION] Will we be able to alter Lumen through Post-Processing? (e.g. in the same way we can grab the SceneDepth etc.)
Would it not be possible to use a capture camera around a third-person pawn from a VR perspective, complete with IK and motion controller support, then merely play as that third-person character in a Lumen-enabled world and send the cube capture, in real time, to a VR pawn with a 360° sphere around it, playing the captured material as real-time video on the sphere, thus allowing a VR pawn to see Lumen?
It might work the same way Virtual Desktop for Oculus does, but within Unreal Engine.
On the stream two weeks ago Michal Valient said that "Lumen works" in VR, which surprised me, and I'm wondering why Lumen can't be used in the new XR template (which you worked on, and I thank you for this!); I couldn't quite understand your explanation. I presume the reason is that Lumen requires deferred shading (at least for now, and I hope to hear whether this will change in the future), while the new XR template uses forward shading?
Please cover this topic in more detail on the stream tomorrow, and also discuss the performance penalty of the Lumen "pass" in stereo rendering compared to a traditional single view: does/will it scale linearly, requiring twice the processing, or are some optimisations/tricks used to reduce the load?
Thank you guys for these in-depth presentations, I’m eagerly looking forward to tomorrow!
I love the idea behind Lumen and emissive lights, but the screen-space aspect causes a lot of issues when using it. Is anything planned here?
Is there going to be any functionality like the Lightmass switch in materials, but for Lumen?