Is GI in VR possible yet?

I’ve been fighting with Unreal for about two weeks now, trying to get some sort of real-time GI going in VR, but everything I try keeps coming up short.

My hardware setup is a desktop with a 1080 Ti and an MSI laptop with a 3060 in it. For a headset I’m using a wired Quest 2.

Someone on Discord suggested that I try RTXGI in 4.27, as it “supports” VR.

Outside of VR, it got even nicer results than Lumen, with none of the night-time SSGI problems (I highly recommend trying it if you haven’t already). And it performs moderately well on my 1080 Ti and super nicely on my 3060 (although the strip of metal above the keyboard on the MSI gets hotter than the surface of the sun).

But when I try it in VR, the quality goes completely to hell. None of the nice shadows under tables and other furniture show up at all, and the head movement when you look around is… jiggly? It makes you feel like a bobblehead.

I suspect the nice imagery in the viewport is a combination of lighting from RTXGI and shadowing from path tracing (the shadows and reflections have that telltale sparkly look), and that when I launch in VR the RTXGI is working but the path tracing is not. Is that correct?
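One way to test that suspicion (cvar names as they exist in 4.27, but worth double-checking in your build): toggle ray traced shadows from the console and watch whether the viewport and the VR preview react the same way.

```
r.RayTracing.Shadows 0
r.RayTracing.Shadows 1
stat gpu
```

If flipping `r.RayTracing.Shadows` changes the viewport but does nothing in the headset, the shadows in the editor are coming from a ray tracing path that isn’t running in VR. `stat gpu` brings up the GPU profiler, where the ray tracing passes show up when they’re actually active.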

Has anyone had any luck getting better results than this with RTXGI in VR?
Can anyone offer any tips or suggestions for getting the VR view to look more like the viewport view?


Good question. I haven’t dived into the VR world with UE, so I can’t answer your question. But I do have a question for you: how do you find developing for the Quest 2 using UE? Not hard?

Ok, first up, apologies for not answering syrom; it’s a pretty broad question…

Now, re GI in VR.

Still battling away. My current thinking is that RTXGI is in fact working in VR;
what’s not working is ray traced shadows.

i.e. here is what my map looks like in the editor:

and if I turn off static and dynamic shadow casting, that’s pretty much what it looks like in VR for me:

I’ve heard lots of people say that we can’t do ray tracing in VR, but that’s what doesn’t make sense to me. To explain why, I made this short video:

Here you can see two cameras in slightly different positions, both rendering with ray traced shadows without any trouble at all. I don’t understand why we can’t just do this and send one camera to each display in the VR headset.
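For what it’s worth, Unreal can already render stereo this way: with instanced stereo disabled, each eye gets its own full scene pass, which is essentially the two-camera setup in the video, at roughly twice the render cost. A minimal sketch, assuming the stock 4.27 setting name:

```ini
; DefaultEngine.ini — render settings are read at startup, not changeable at runtime
[/Script/Engine.RendererSettings]
vr.InstancedStereo=False   ; multi-pass stereo: one full scene pass per eye
```

Whether the ray tracing passes actually run per-eye in that mode is the part I can’t vouch for.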


P.S. I upgraded to a 3070 Ti

Hmm… interesting. Notice how, without ray tracing, the screenshot looks like typical VR imagery from what I’ve seen so far. Hopefully down the road it gets implemented properly. And yeah, I see your point with the two-camera deal. Congrats on the 3070 Ti! I’m still on a 2070 and waiting for an affordable 24 GB VRAM card to be available.

Does this run in actual VR on a headset? I’ve been trying to get this working, but from what I can see it’s the same situation as with many things in UE: it only renders in the left eye, and in the right eye RTXGI is just dark.
Do we need to turn off instanced stereo rendering? I’ve tried that, but it breaks 4.27 rendering completely in the right eye…

RTXGI does render in VR for me, but without proper shadows it looks pretty awful.


I guess what you mean is the GPU Lightmass bake.
If you package your game for VR, you use Forward Rendering.

When Forward Shading is activated, you don’t see ray traced shadows or GI.
Also… at runtime in VR, you usually can’t use ray tracing, because it’s too expensive.
Meaning, I haven’t seen any machine that could possibly handle it at a solid 90 fps.

To my knowledge, best practice is to use Forward Shading while keeping Ray Tracing on in your project settings, because GPU Lightmass needs DX12 and Ray Tracing in order to work.
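A minimal DefaultEngine.ini sketch of that setup (setting names per stock 4.27; treat it as a starting point rather than a definitive config):

```ini
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12   ; GPU Lightmass requires DX12

[/Script/Engine.RendererSettings]
r.SkinCache.CompileShaders=True   ; prerequisite for enabling Ray Tracing
r.RayTracing=True                 ; required by GPU Lightmass
r.ForwardShading=True             ; forward path for the packaged VR build
```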

Once this is set up… bake the lighting with GPU Lightmass; that’s your global illumination.
Unfortunately, it doesn’t look like you can use the visualization buffers when using Forward Rendering, so you can’t see the GI in the viewport. Same for the AO.

But the GPU Lightmass result is basically your baked GI information.
The only thing that doesn’t seem to work is the ambient occlusion.

I will shortly post some of my tests on my blog.

I will run more tests for the Oculus Rift, Rift S, and Quest 2.

Hope that helps a bit,
cheers

PS: I am also discussing all my development progress on my Slack channel: Join FattyBull on Slack.


Hey Bernard,

Thanks for all the info!

But no, I wasn’t talking about baked lighting…

I meant real-time GI in VR, using either RTXGI or Lumen (or something else?)

Ohh wow, yeah, that would be amazing. So far I guess this is just not possible if you want to aim for a steady 90 fps. I mean, yes… I guess you can do it at a lower frame rate. Maybe okay for pre-production.

> If you package your game for VR, you use Forward Rendering.

This isn’t necessarily the case. Deferred rendering is viable in VR, but with heightened base requirements; for more complex scenes, particularly around lighting, it can actually come out ahead of forward rendering.

I’m not aware of any fundamental limitation in Unreal forcing the use of forward rendering, aside from some buggy input motion when mirrors are present in the scene… but that can be addressed with time. If I’m missing something, I’d love to know about it, as we’re targeting VR + RTXGI under deferred in our current project.

Deferred is viable for some use cases, primarily outside of games: architecture, design/manufacturing, etc. MSAA is still the best anti-aliasing solution for VR, but Temporal Super Resolution, which you can use in UE5, works better than TAA.
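For reference, here’s what the forward + MSAA baseline looks like in DefaultEngine.ini for 4.27 (my sketch, assuming the stock setting names); in UE5 you’d instead select TSR with `r.AntiAliasingMethod=4`:

```ini
[/Script/Engine.RendererSettings]
r.ForwardShading=True
r.DefaultFeature.AntiAliasing=3   ; 0=off, 1=FXAA, 2=TAA, 3=MSAA (forward only)
r.MSAACount=4                     ; 2/4/8 samples; 4 is the usual VR compromise
```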
