So… I was wondering, does the Oculus Quest 2 even use the forward renderer? (I'm asking because it's on by default in the new OpenXR template.)
I've read somewhere that the forward renderer is exclusive to PC VR (like for SM5), i.e. it's not designed for Android.
So… if the forward renderer doesn't work on OpenGL ES 3.1 (I don't know if it's the same for Vulkan)… how is the Quest doing MSAA?
I'm not a specialist on this topic by any means, but I thought MSAA was an AA method that couldn't be used with deferred rendering.
I dunno, probably I'm missing the point and the whole mobile renderer is its own thing, but any clarification on these topics would be greatly appreciated.
Forward rendering works fine on Quest 2, OpenGL 3.1, Vulkan, and Android. It's almost always recommended over deferred because it runs faster while being slightly less feature-rich.
MSAA does require Forward rendering from what I understand.
Forward Rendering provides a faster baseline, with faster rendering passes, which may lead to better performance on VR platforms. Not only is Forward Rendering faster, it also provides better anti-aliasing options than the Deferred Renderer, which may lead to better visuals.
The forward renderer supports both MSAA and Temporal Anti-Aliasing (TAA) and in most cases, TAA is preferable because it removes both geometric aliasing and specular aliasing. In VR however, the constant sub-pixel movement introduced by head tracking generates unwanted blurriness, making MSAA a better choice.
In our tests, using MSAA instead of TAA increases GPU frame time by about 25% (actual cost will depend on your content).
Yeah, I've always made my apps for mobile VR using the forward renderer; it just occurred to me the other day that maybe it wasn't using it at all.
But yeah… I tried both renderers inside the Quest 2 HMD yesterday and I "feel" that forward has fewer jaggies.
Forward Renderer is much better for Mixed Reality than the default Deferred rendering pipeline because of the number of features that can be individually turned off.
Unreal implies that either pipeline works, but that the Forward renderer has better performance (and on Quest, that’s a necessity):
However, there are some trade-offs in using the Deferred Renderer that might not be right for all VR experiences.
Yeah, looks like it, and it would make sense.
But that line "we recommend that all PC titles use the forward shading renderer" is in part what makes me doubt it.
It might just be outdated documentation or incorrect wording on Oculus' part.
I hadn't noticed I was on the Rift's page, which is why they're generalizing about PC titles in particular rather than VR titles.
The Oculus page for Quest 2 specifically mentions using MSAA, which requires the Forward Renderer (though the page doesn't actually spell that part out).
Within project settings, if you hover over the Anti Aliasing Method: MSAA, the tooltip explicitly says:
Only supported with Forward shading.
Therefore the official recommendation from Oculus is: Quest 2 = MSAA = Forward Renderer.
That said, I would love to see a project used to benchmark several of the competing feature sets that are available in VR. A lot of stuff technically works in VR but is unfeasible for Quest development.
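For reference, forcing that Quest 2 = MSAA = Forward Renderer combination comes down to a few renderer settings in DefaultEngine.ini. This is a minimal sketch from my own understanding of the stock UE4 cvars, so double-check the names against your engine version:

```ini
[/Script/Engine.RendererSettings]
; MSAA is only supported with forward shading (see the tooltip above)
r.ForwardShading=True
; Default AA method: 0=off, 1=FXAA, 2=TAA, 3=MSAA
r.DefaultFeature.AntiAliasing=3
r.MSAACount=4
; The mobile (Quest) path has its own MSAA sample-count setting
r.MobileMSAA=4
```

Changing r.ForwardShading requires an editor restart and a shader recompile, so expect a long first load after flipping it.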
I'm currently working with the OpenXR template and Quest 2, but the TAA solution looks way better than MSAA to me. No frame rate difference so far…
The project, however, is targeted at a desktop platform (via Link cable or Air Link).
what settings are you referring to?
Nope, you run it with DX12; ray tracing has to be on for the GPU Lightmass.
I basically just use it for the GPU Lightmass.
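For anyone trying to reproduce that setup: GPU Lightmass needs the DX12 RHI plus ray tracing support enabled at the project level. A sketch of the relevant DefaultEngine.ini entries, based on a stock Windows project (section and setting names are worth verifying against your engine version):

```ini
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
; GPU Lightmass requires the D3D12 RHI
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
; Ray tracing must be enabled for GPU Lightmass to run
r.RayTracing=True
; Ray tracing in turn requires the skin cache shaders
r.SkinCache.CompileShaders=True
```

Note this only affects the editor/bake machine; the Quest build itself still ships with the mobile renderer and the baked lightmaps.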
Make sure you use forward rendering, otherwise you’ll have performance issues.
I'll post more soon on my blog; I'm running a lot of tests for Quest 2. FattyBull