I tried to find other posts regarding this, but I didn’t really find much. I’ve tried creating a VR scene from scratch, just dragging in some assets to see how Lumen lights the scene.
While doing so I’ve set everything to use Lumen, hoping that it would work with SteamVR.
The preview in the editor looks nice, but in the actual VR preview the GI pass seems to be skipped, as all the bounce light is missing from the scene.
VR:
It is highly recommended to create your VR project using the VRTemplate in UE5, because the project settings and plugins are already configured for the best VR experience. If you create a VR project in UE5 Early Access, you must disable Lumen by setting Project Settings > Rendering > Global Illumination to None. Lumen is an Engine Default in UE5 and is not currently supported on XR platforms.
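For anyone who prefers doing this in config rather than the editor UI, the quoted setting maps to a renderer console variable. This is a sketch based on my understanding of the UE5 cvars, not an official recipe; `r.DynamicGlobalIlluminationMethod=0` selects None for GI, and I’ve included the reflection method line as an assumption since Lumen reflections usually get switched off alongside it:

```ini
; DefaultEngine.ini — disable Lumen project-wide (sketch, verify against your engine version)
[/Script/Engine.RendererSettings]
; 0 = None, 1 = Lumen, 2 = RTGI, 3 = Plugin
r.DynamicGlobalIlluminationMethod=0
; assumption: also fall back from Lumen reflections to screen-space reflections
r.ReflectionMethod=2
```

The same cvars can be toggled at runtime from the console for quick A/B testing in the VR preview.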
It was stated, however, that support should arrive at some point later.
Yeah… it’s a shame, but since they also mentioned that Nanite will be coming eventually, and he assumed that Lumen is already in there, it sounds like it’s definitely going to work eventually.
It seems to be a technical issue that may challenge the way Nanite and Lumen have been set up. I would imagine that VR would require some very specific solutions too: foveated rendering combined with a form of ‘foveated’ virtualized everything (textures, meshes [Nanite], lighting [Lumen], virtual shadow maps, etc.). In order to get perf where it needs to be with VR, it likely needs to be constructed from the ground up, but idk. If I were a skilled programmer, I’d build it and let Epic buy me out. LOL. Gotta have eye tracking for the foveated stuff to work too, and that’s very platform-specific rn.
I have gotten Lumen and Nanite to run just fine side by side on two cloned clients; the limitation is inside the single engine viewport. It cannot do split-screen modes either, since any display of a secondary camera causes this. I haven’t tested render targets yet.

I’m thinking nDisplay with VRPN set up properly might work out, given that they each run their own process, but this would take a lot of work. I’ve gotten UE5 to run nDisplay, but I cannot figure out VRPN, and using nDisplay in frame-sequential mode didn’t help when trying to run stereo or side-by-side mode. The best I could think of is a render target that’s cloned and cut a little differently with an offset for the eyes, so technically it’s one big screen made into a VR-type visual. Not perfect, but it might look OK. That’s where VRPN comes in too, and I’m hoping Unreal doesn’t have a fit with all this. It probably won’t work out well.

Also, the Quest 2 loses controller functions, but not the tracking, when in non-stereo VR mode. You can use the Virtual Desktop Xbox 360 controller trick, but then you lose tracking and only gain gamepad inputs.
Telling me they can’t just calculate Lumen for one eye and propagate it to the other? Why would they need to double up on the calculations? Bounce light is kind of universal.
Then you are going to be waiting a very long time. Epic said in their recent Lumen live stream that they will not be bringing it to VR due to performance issues with getting it properly working in VR, which is a HUGE bummer. Honestly, the fact they got Nanite and Lumen working this well is already black magic in my eyes, in time I bet they could do it, but if they do it probably won’t be for a long time. :\
An eye is a unique viewpoint that sees things the other eye cannot. Imagine a scenario where the player has their head right up to a wall edge that separates two rooms so that one eye only sees one room and the other eye only sees the other room.
Perfectly diffuse is universal, but specular is view-dependent.
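To make that concrete, here’s a toy Blinn-Phong sketch (my own illustration, nothing to do with Lumen’s actual internals): the Lambert diffuse term only uses the light and the surface normal, so both eyes agree on it, while the specular term uses the view direction, so two eyes separated by a typical ~64 mm IPD get different values at the same surface point.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def specular(eye, point, normal, light_dir, shininess=64.0):
    # Blinn-Phong: the half-vector depends on the view direction,
    # so the result changes with the viewer's position.
    view = normalize(tuple(e - p for e, p in zip(eye, point)))
    half = normalize(tuple(v + l for v, l in zip(view, light_dir)))
    return max(0.0, dot(normal, half)) ** shininess

point = (0.0, 0.0, 0.0)            # shaded surface point
normal = (0.0, 0.0, 1.0)           # surface facing the viewer
light = normalize((0.3, 0.0, 1.0)) # light coming in from one side

ipd = 0.064                        # ~64 mm interpupillary distance, metres
left_eye = (-ipd / 2, 0.0, 0.5)
right_eye = (+ipd / 2, 0.0, 0.5)

spec_l = specular(left_eye, point, normal, light)
spec_r = specular(right_eye, point, normal, light)

# Lambert diffuse ignores the viewer entirely, so it is identical per eye.
diffuse = max(0.0, dot(normal, light))
```

With these numbers the two eyes see noticeably different specular intensities from the same surface point, while the diffuse term is a single shared value, which is exactly why a fully view-independent GI share between eyes only covers part of the lighting.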
And yet, some VR is just a single screen with two lenses and a custom aspect ratio. It doesn’t have to be perfect to use the same calculations. True, indirect lighting is indirect lighting regardless of the viewport, at least within a radius of the viewport. There are solutions like VorpX that make some non-VR games enjoyable in VR. I refuse to believe a prototype isn’t possible.